• @[email protected]
    14 • 2 months ago

    It lacks cohesion the longer it goes on, not so much “hallucinating” as losing the thread, losing the plot. Internal consistency goes out the window, previously made declarations are ignored, and established canon gets trampled.

    But that’s because it’s not AI, it’s just an LLM all the way down.

      • @[email protected]
        4 • 2 months ago

        Depends on the complexity and the number of elements to keep track of, and it varies between models and between people. Try it out for yourself and see! :)

      • @[email protected]
        3 • 2 months ago

        It’s kind of an exponential falloff: for a few lines it can follow concrete mathematical rules, for a few paragraphs it can remember basic story beats, and for a few pages it can just about remember your name.