LLM Decision Containment
JacqOS is a physics engine for business logic. In that frame, this pattern is the wall a player cannot walk through: an LLM-proposed action is a player move, the engine refuses to enact it unless an explicit decision rule ratifies it against policy, and the named invariant is the wall itself.
The Real-World Failure
A Chevrolet dealership’s chatbot was tricked into “selling” a new Tahoe for $1. An Air Canada chatbot invented a bereavement refund policy that the airline had to honour in court. Across domains, LLM-powered assistants routinely propose actions that violate the policies the underlying business cares about — not because the models are broken, but because policy enforcement was never the model’s job.
The failure shape is always the same: a model generates a decision; a thin orchestration layer turns that decision into an action; the action reaches the world. There is nothing between the model’s probabilistic output and the effect.
What JacqOS Does About It
JacqOS makes it structurally impossible for a model’s decision to reach the world without passing through a policy check you wrote and can inspect.
Every model-proposed action lands in the reserved `proposal.*` namespace. From there, an ontology decision rule evaluates the proposal against the policy facts the system knows:
- “Authorize this offer if the proposed price is at or above the auto floor.”
- “Escalate to manager review if the proposed discount is in the manager band.”
- “Block this offer if it violates any pricing policy.”
Only authorized decisions derive executable `intent.*`. Blocked and escalated decisions sit in Activity’s Blocked or Waiting tabs where an operator can see them; no side effect touches the world.
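The ratification step above can be sketched as a plain function. This is a minimal Python sketch under assumed policy shapes — the function name, parameters, and the specific floor values are hypothetical; in JacqOS this logic lives in declarative decision rules, not imperative code:

```python
def ratify(proposed_price: float, auto_floor: float, manager_floor: float) -> str:
    """Classify a model-proposed offer price against policy facts.

    auto_floor:    lowest price the policy auto-authorizes (hypothetical value)
    manager_floor: lowest price a manager may approve on escalation (hypothetical)
    """
    if proposed_price >= auto_floor:
        return "authorized"   # derives the executable intent
    if proposed_price >= manager_floor:
        return "escalated"    # parked in Waiting for manager review
    return "blocked"          # no intent ever derives; visible in Blocked

# A $38,500 offer clears a $35,000 auto floor; a $1 offer never does.
assert ratify(38_500, auto_floor=35_000, manager_floor=30_000) == "authorized"
assert ratify(32_000, auto_floor=35_000, manager_floor=30_000) == "escalated"
assert ratify(1, auto_floor=35_000, manager_floor=30_000) == "blocked"
```

The point of the pattern is that this three-way split happens outside the model: the model only ever produces the proposal, never the classification.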
This means:
- A model proposing a $1 Tahoe is isolated to the proposal space. The would-be `intent.send_offer` never derives because no `authorized_offer` decision formed for it.
- A model proposing a reasonable price flows through authorization and fires the intent. Everyone sees the full decision chain.
- Invariants are a second, independent safety net. Even if a misconfigured decision rule let a $1 offer through, the named invariant `offer_sent_above_auto_floor` would make the resulting world state logically inadmissible.
The model is free. The safety is structural.
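The second safety net can also be sketched imperatively. This is a hedged Python sketch with hypothetical data shapes (tuples and a dict); the real invariant is a declarative rule over derived facts, not a scan:

```python
def check_offer_sent_above_auto_floor(offers_sent, floors):
    """Return the sent offers that violate the pricing-floor invariant.

    offers_sent: list of (request_id, vehicle, price) tuples (assumed shape)
    floors:      dict of vehicle -> auto-authorize floor price (assumed shape)
    """
    return [
        (request, vehicle, price)
        for request, vehicle, price in offers_sent
        if price < floors[vehicle]
    ]

# Even if a misconfigured decision rule let the $1 offer through,
# the invariant flags the resulting world state as inadmissible.
violations = check_offer_sent_above_auto_floor(
    offers_sent=[("req-1024", "tahoe", 38_500), ("req-1025", "tahoe", 1)],
    floors={"tahoe": 35_000},
)
assert violations == [("req-1025", "tahoe", 1)]
```

Because the invariant is evaluated independently of the decision rules, a bug in one layer does not silently disable the other.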
What You’ll See In Studio
Run the Chevy demo in Studio. Scenario tiles inject synthetic customer inquiries; the deterministic decider produces a structured offer decision; the containment plays out live.
- Tame offer → the Done tab shows a row like `offer-sent-04: $38,500 offer to req-1024, policy auto-authorized`. Drill in and the inspector takes you from the executed offer back through `sales.decision.authorized_offer`, the policy floor fact, and the model’s offer-decision observation.
- $1 offer → the Blocked tab shows a row like `proposed $1 offer — blocked by pricing floor policy`. Drill in and the inspector names the blocking invariant, the missing authorized decision, and the proposal observation that tried to produce it.
- Manager-review offer → the Waiting tab shows a proposal parked for escalation. Drill in and the inspector names the specific review decision that would be required to promote or cancel the proposal.
Critically, the $1 offer scenario is not blocked because the model produced something bad. The model produced exactly what you told it to. It is blocked because the decision rule refused to authorize a $1 offer against a policy you can see. The safety boundary lives in your ontology, not in the prompt.
What It Looks Like In Code
The mapper declares that model-produced offer atoms route through the `proposal.*` relay namespace:
```
fn map_observation(obs) {
  match obs.kind {
    "llm.offer_decision_result" => [
      atom("request.id", obs.payload.request_id),
      atom("offer_decision.action", obs.payload.action),   // requires_relay
      atom("offer_decision.price_usd", obs.payload.price), // requires_relay
    ],
    // ...
  }
}
```
```
fn mapper_contract() {
  [("llm.offer_decision_result", ["offer_decision."], "proposal")]
}
```

A proposal staging rule lifts the atoms into the reserved `proposal.*` namespace:
```
rule assert proposal.offer_action(request, action, seq) :-
  atom(obs, "request.id", request),
  atom(obs, "offer_decision.action", action),
  atom(obs, "seq", seq).

rule assert proposal.offer_price(request, price, seq) :-
  atom(obs, "request.id", request),
  atom(obs, "offer_decision.price_usd", price),
  atom(obs, "seq", seq).
```

A decision rule evaluates the proposal against policy:
```
rule sales.decision.authorized_offer(request, vehicle, price) :-
  proposal.offer_action(request, "send", _),
  proposal.offer_price(request, price, _),
  sales.request(request, vehicle, _),
  policy.auto_authorize_min_price(vehicle, floor),
  price >= floor.

rule sales.decision.blocked_offer(request, vehicle, price, "below_floor") :-
  proposal.offer_action(request, "send", _),
  proposal.offer_price(request, price, _),
  sales.request(request, vehicle, _),
  policy.auto_authorize_min_price(vehicle, floor),
  price < floor.
```

Only authorized decisions derive the executable intent:
```
rule intent.send_offer(request, vehicle, price) :-
  sales.decision.authorized_offer(request, vehicle, price),
  not sales.offer_sent(request, vehicle, price).
```

And named invariants catch anything the decision rules miss:
```
invariant offer_sent_above_auto_floor(request) :-
  sales.offer_sent(request, vehicle, price),
  policy.auto_authorize_min_price(vehicle, floor),
  price >= floor.
```

If someone tries to derive `intent.send_offer` directly from `offer_decision.*` atoms without routing through `proposal.*`, the platform rejects the program at load time. The relay boundary is enforced mechanically.
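The load-time check can be sketched as a predicate over rules. This is a hedged Python sketch with an assumed rule representation (a head predicate plus a list of body predicates); the actual loader operates on the parsed program, and the names here are hypothetical:

```python
# Namespaces the mapper contract marks as requires_relay (assumed prefix).
RELAYED_PREFIX = "offer_decision."

def violates_relay_boundary(head: str, body: list) -> bool:
    """True if a rule reads relayed atoms but does not stage into proposal.*.

    head: the predicate the rule derives
    body: the predicates the rule reads
    """
    reads_relayed = any(p.startswith(RELAYED_PREFIX) for p in body)
    return reads_relayed and not head.startswith("proposal.")

# Staging relayed atoms into proposal.* is allowed...
assert not violates_relay_boundary(
    "proposal.offer_price", ["request.id", "offer_decision.price_usd"])

# ...but deriving an intent directly from relayed atoms is rejected at load.
assert violates_relay_boundary(
    "intent.send_offer", ["offer_decision.action", "offer_decision.price_usd"])
```

The design choice matters: because the check runs at program load rather than at decision time, a program that bypasses the relay never runs at all.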
Make It Yours
The Chevy example is one kind of LLM decision containment. The same pattern fits:
- Customer service chatbots — an LLM proposes a refund; a refund-policy decision rule authorizes, escalates, or rejects. See Air Canada Refund Policy for a complete worked example built around the public Air Canada bereavement-policy chatbot failure.
- Incident remediation agents — an LLM proposes a remediation step; a safety decision rule ensures `no_kill_unsynced_primary` and friends hold before the remediation can fire.
- Procurement automation — an LLM proposes a purchase; a spending-authority decision rule gates by amount and vendor tier.
- Compliance screening — an LLM proposes a disposition; a compliance decision rule checks watchlists and jurisdictional rules.
Any time you have an AI proposing a commercial or operational action, the decision containment pattern fits. To start building, pick up Build Your First App and scaffold with `jacqos scaffold --pattern decision`.
Going deeper
For the underlying mechanics — `proposal.*` namespaces, ratification rules, and the relay boundary the loader enforces — see:
- Action Proposals — how to author decider-relay proposals, the ratification rules that gate them, and the schema reference for `proposal.*` validation.