• @[email protected]
    41 • 20 days ago

    I’m 100% sure he can’t, or at least not from LLMs specifically. I’m not an expert, so feel free to ignore my opinion, but from what I’ve read, “hallucinations” are an inherent feature of how LLMs work: they generate statistically plausible text rather than retrieve verified facts, so there’s no internal mechanism that distinguishes a true statement from a confident-sounding false one.

    • @[email protected]
      9 • 20 days ago

      One can have an expert system assisted by ML for classification, but that’s not an LLM. The ML model only labels the input; deterministic, hand-written rules decide what to do with the label.
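
      To make the distinction concrete, here is a minimal sketch of that architecture. Everything in it is hypothetical (the `classify` stub stands in for a real trained classifier such as a logistic-regression text model); the point is only that the ML part does classification while explicit rules, not generated text, determine the output:

      ```python
      # Hypothetical sketch: a rule-based "expert system" whose input facts
      # come from an ML classifier, not from an LLM generating free text.

      def classify(text: str) -> str:
          # Stand-in for a trained ML classifier; a trivial keyword check
          # is used here purely for illustration.
          return "billing" if "invoice" in text.lower() else "other"

      # Hand-written expert rules, keyed on the classifier's label.
      RULES = {
          "billing": "Route to accounts team and request the invoice number.",
          "other": "Route to general support queue.",
      }

      def handle(ticket_text: str) -> str:
          label = classify(ticket_text)  # ML does classification only
          return RULES[label]            # deterministic rules pick the action
      ```

      Because the output is always drawn from a fixed rule table, the system can be wrong only by misclassifying; it cannot invent an answer the way a free-text generator can.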