AI hallucinates. How do you keep it from fucking up automations?

4 points | by Gioppix 12 days ago

3 comments

  • storystarling 12 days ago
    I've found the only way to make this work reliably is to treat the LLM as a fallible component inside a state machine rather than as the controller. I've been using LangGraph to enforce structured outputs and to run validation checks before any side effects happen. If the output doesn't match the schema or the business logic, it just retries or halts. It feels like a lot of boilerplate at first, but it's necessary if you want to trust the system with actual invoices.
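    A minimal sketch of the pattern (the Invoice fields and the call_llm helper are hypothetical placeholders, not LangGraph's API):

        import json
        from pydantic import BaseModel, ValidationError

        class Invoice(BaseModel):
            customer_id: str
            amount_cents: int
            currency: str

        def call_llm(prompt: str) -> str:
            # Stand-in for whatever model client you actually use.
            raise NotImplementedError

        def extract_invoice(raw_text: str, max_retries: int = 3) -> Invoice:
            # Treat the model as fallible: validate before any side effect,
            # retry on failure, halt by raising if retries run out.
            for _ in range(max_retries):
                reply = call_llm("Extract this invoice as JSON:\n" + raw_text)
                try:
                    invoice = Invoice(**json.loads(reply))
                except (json.JSONDecodeError, ValidationError):
                    continue  # schema check failed, retry
                if invoice.amount_cents <= 0:
                    continue  # business-logic check failed, retry
                return invoice  # only now is it safe to trigger side effects
            raise RuntimeError("output never passed validation, halting")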
    • chrisjj 12 days ago
      So when this issues a valid but garbage invoice, then what?
    • downboots 12 days ago
      • Gioppix 12 days ago
        I remember studying this in uni lol. How do you use it?
      • nik282000 12 days ago
        If you have to manually validate everything, then what did you save by using an LLM? DIY and know it will work the first time.