AI hallucinates. How do you keep it from fucking up automations?

4 points | by Gioppix 3 hours ago

2 comments

  • storystarling 2 hours ago
    I've found the only way to make this work reliably is to treat the LLM as a fallible component inside a state machine, not as the controller. I've been using LangGraph to enforce structured outputs and to run validation checks before any side effects happen; if the output doesn't match the schema or the business logic, the graph retries or halts. It's a lot of boilerplate up front, but it's necessary if you want to trust the system with actual invoices. Rough sketch of the shape below.
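    A minimal sketch, assuming Pydantic for the output schema and LangGraph's StateGraph with conditional edges. call_llm and post_invoice_to_erp are stand-ins for your model client and ERP call, and the invoice fields and business rules are invented for illustration:

        from typing import Optional, TypedDict

        from langgraph.graph import StateGraph, END
        from pydantic import BaseModel, ValidationError


        class Invoice(BaseModel):
            customer_id: str
            amount_cents: int
            currency: str


        class GraphState(TypedDict):
            raw_text: str
            invoice: Optional[Invoice]
            error: Optional[str]
            attempts: int


        MAX_ATTEMPTS = 3


        def extract(state: GraphState) -> dict:
            # call_llm is a stand-in for whatever model client you use;
            # it should return JSON-shaped text for the Invoice schema.
            raw = call_llm(state["raw_text"])
            try:
                return {"invoice": Invoice.model_validate_json(raw),
                        "error": None, "attempts": state["attempts"] + 1}
            except ValidationError as e:
                return {"invoice": None, "error": str(e),
                        "attempts": state["attempts"] + 1}


        def validate(state: GraphState) -> dict:
            inv = state["invoice"]
            if inv is None:
                return {}
            # Business-logic checks beyond the schema: made-up rules here.
            if inv.amount_cents <= 0 or inv.currency not in {"USD", "EUR"}:
                return {"invoice": None, "error": "failed business rules"}
            return {}


        def route(state: GraphState) -> str:
            if state["invoice"] is not None:
                return "commit"
            if state["attempts"] >= MAX_ATTEMPTS:
                return "halt"  # give up: no side effects ever happen
            return "retry"


        def commit(state: GraphState) -> dict:
            # The only node with side effects, reachable only after
            # both schema and business validation have passed.
            post_invoice_to_erp(state["invoice"])  # stand-in for the real call
            return {}


        graph = StateGraph(GraphState)
        graph.add_node("extract", extract)
        graph.add_node("validate", validate)
        graph.add_node("commit", commit)
        graph.set_entry_point("extract")
        graph.add_edge("extract", "validate")
        graph.add_conditional_edges("validate", route,
                                    {"retry": "extract", "commit": "commit", "halt": END})
        graph.add_edge("commit", END)
        app = graph.compile()

    The point of the layout is that commit is the only node that touches the outside world, and every path into it goes through validation first.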
    • chrisjj 1 hour ago
      So when this issues a valid but garbage invoice, then what?
    • downboots 2 hours ago
      • Gioppix 2 hours ago
        I remember studying this in uni lol. How do you use it?