Session 6 Deliverable | Author: Chase Petersen | YARDMASTER Dispatch Intelligence
In readymix concrete dispatch, AI hallucination is not a customer service annoyance—it is an unrecoverable physical catastrophe. Concrete has a strict ~90-minute perishable window. If the system hallucinates a truck's location, misreads the Jonel batch schedule, or overrides a structural mix constraint, the payload cures in the drum. The cost is $10,000–$100,000 per incident. Therefore, YARDMASTER must operate with zero generative flexibility regarding physics or routing constraints.
Left unconstrained, the Generative Graph (LLM) will attempt to act as a "helpful" prediction engine. In our domain, helpful predictions are dangerous.
The Failure: TruckTrax/GeoTrax API drops a GPS ping due to cellular dead zones at a rural job site. The LLM, trying to be helpful, extrapolates the truck's current location based on average transit speeds.
The Harm: The dispatcher commits a reroute based on a phantom position while the truck is actually stuck in mud. The concrete clock expires; the load is lost.
Mitigation (Rigor): Strict telemetry aging rules. If a GPS ping is more than 3 minutes old, the AI is explicitly forbidden from interpolating the truck's location. The entity status defaults to STATE: UNVERIFIED, requiring human radio contact.
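The telemetry aging rule can be sketched as a pure function over the last known ping. This is a minimal illustration, not the production implementation; the 3-minute threshold comes from the rule above, while the `TruckStatus` shape and field names are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional, Tuple

MAX_PING_AGE = timedelta(minutes=3)  # telemetry aging threshold from the rule above

@dataclass
class TruckStatus:
    truck_id: str
    state: str                               # "VERIFIED" or "UNVERIFIED"
    location: Optional[Tuple[float, float]]  # (lat, lon), populated only when verified

def resolve_location(truck_id: str, last_ping_at: datetime,
                     last_coords: Tuple[float, float],
                     now: datetime) -> TruckStatus:
    """Apply the telemetry aging rule: a stale ping is never interpolated."""
    if now - last_ping_at > MAX_PING_AGE:
        # Ping too old: extrapolation is forbidden. Default to UNVERIFIED,
        # which forces human radio contact before any dispatch decision.
        return TruckStatus(truck_id, "UNVERIFIED", None)
    return TruckStatus(truck_id, "VERIFIED", last_coords)
```

The key design point: staleness yields an explicit UNVERIFIED state rather than a best-guess coordinate, so downstream reasoning can never silently consume an extrapolated position.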
The Failure: The AI calculates a reroute using standard consumer mapping APIs (e.g., Google Maps) to save 12 minutes of transit time.
The Harm: The AI routes a fully loaded 80,000 lb readymix truck over a residential bridge rated for 10,000 lbs. Severe legal liability and potential loss of life.
Mitigation (Risk): The Generative Graph is blocked from calling external map APIs for route generation. It may only use commercial heavy-vehicle routing APIs (like PC*MILER) mapped to the Knowledge Graph.
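This routing restriction amounts to an allowlist gate in front of the model's tool calls. A minimal sketch follows; the tool names and the `gvw_lbs` argument are illustrative placeholders, not actual API identifiers.

```python
# Tool names below are hypothetical placeholders for illustration only.
APPROVED_ROUTING_TOOLS = {"pcmiler_truck_route"}    # heavy-vehicle routers only
BLOCKED_ROUTING_TOOLS = {"google_maps_directions",  # consumer mapping APIs are
                         "apple_maps_directions"}   # never valid for loaded trucks

def gate_route_request(tool_name: str, args: dict) -> dict:
    """Allowlist gate in front of the Generative Graph's route-generation calls."""
    if tool_name in BLOCKED_ROUTING_TOOLS or tool_name not in APPROVED_ROUTING_TOOLS:
        raise PermissionError(
            f"'{tool_name}' is not an approved heavy-vehicle routing tool")
    if "gvw_lbs" not in args:
        # Gross vehicle weight is mandatory so bridge ratings can be enforced
        # by the heavy-vehicle router downstream.
        raise ValueError("route request must include gross vehicle weight (gvw_lbs)")
    return args
```

An allowlist (rather than a blocklist alone) is the safer default here: a tool the gate has never seen is rejected, not waved through.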
YARDMASTER relies on Jonel and TruckTrax APIs, but these digital sensors do not perfectly map to physical reality.
What happens when the Dispatcher asks the AI something outside its Knowledge Graph?
| Dispatcher Query | Default LLM Behavior (Dangerous) | YARDMASTER Guardrail (Safe) |
|---|---|---|
| "It's 105 degrees today. Can we extend the cure time on the DOT Bridge mix by 20 minutes if we use retarder?" | The LLM draws on its general training data to act as a structural engineer, saying "Yes, typical retarders add 30-45 minutes..." | HARD BLOCK. System prompt triggers: "I cannot authorize mix design alterations. Consult the Quality Control Manager and Jonel structural constraints." |
| "Is Driver Smith too tired to take this last load?" | The LLM analyzes driving time and says "He has 2 hours left, he is fine." | System checks DOT hours, but adds required context: "Driver Smith has 2 HOS remaining, but has been on duty for 12 hours. Proceed with manual fatigue check." |
To enforce the Rigor Rune, YARDMASTER utilizes a strict Human-In-The-Loop (HITL) model. The system possesses read-only authority. Before it can surface a reroute recommendation to the human dispatcher, it must successfully validate three distinct Boolean gates against the Knowledge Graph:
If any gate returns False or Null (Thin Data), the option is killed in the reasoning loop and never shown to the user.
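The gate semantics above can be sketched as a tri-valued check: each gate returns True, False, or None (thin data), and an option survives only if every gate returns exactly True. The gate names below are hypothetical; the actual three gates are defined against the Knowledge Graph.

```python
from typing import Callable, Optional

# A gate inspects a candidate reroute option and returns True, False, or
# None (None = thin data: the Knowledge Graph could not confirm the fact).
Gate = Callable[[dict], Optional[bool]]

def passes_all_gates(option: dict, gates: list) -> bool:
    """An option survives only if every gate returns exactly True.
    False OR None kills the option before it is shown to the dispatcher."""
    return all(gate(option) is True for gate in gates)

# Hypothetical gate names for illustration only.
def telemetry_fresh(o): return o.get("telemetry_fresh")
def route_weight_legal(o): return o.get("route_weight_legal")
def within_cure_window(o): return o.get("within_cure_window")

GATES = [telemetry_fresh, route_weight_legal, within_cure_window]
```

The `is True` comparison is deliberate: a gate returning None (thin data) must fail the check, exactly as the rule above requires, rather than being treated as merely falsy by accident elsewhere in the pipeline.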