• @[email protected]
    link
    fedilink
    12210 hours ago

    lol, definitely missed some important context.

    I guess it thought OOP meant “clean” as in how you dress the bird before you cook it. (As in: to “clean a fish” means to fillet it and prep it for cooking.)

      • @[email protected]
        link
        fedilink
        English
        399 hours ago

        Well obviously not for fish. Sounds like someone’s never bought fresh pigeons from the grocery store, smh.

        • @[email protected]
          link
          fedilink
          56 hours ago

          I thought it was only illegal for stores to cut those off, like some kind of consumer protection thing. That was back in the 70s, though; we don’t get that kind of thing anymore.

        • BlueKey
          20 points · 9 hours ago

          Yeah, it voids the warranty. So when you get poisoned after eating it without the label, you won’t be able to get a refund.

    • @[email protected]
      link
      fedilink
      46 hours ago

      But first it said they are usually clean, so that can’t be the context. If there even was a context. But there is no context, because AI is fucking stupid, and all these c-suite assholes pushing it like their last bowel movement will be eating crow off their golden parakeets about two years from now, when all this nonsense finally goes away and the next shiny thing comes along.

      • @[email protected]
        link
        fedilink
        3
        edit-2
        4 hours ago

        There are signs of three distinct interpretations in the result:

        • On topic: cleaning a wild bird you are trying to save
        • Preparing a store-bought turkey (removing a label)
        • Preparing a wild-caught bird

        It’s actually a pretty good illustration of how AI assembles “information-shaped text”: how smooth it can look, and yet how dumb it can be about it. Unfortunately, advocates will just say “I can’t reproduce this specific mistake when I ask it or another LLM, so there’s no problem,” even as it gets other stuff wrong. The weird part is that you’d better be able to second-guess every result, which means you can never be confident in an answer you didn’t already know; and when that’s the case, it’s not that great for factual stuff.

        For “doesn’t matter” content it may do fine (generated alternatives to stock photography, silly meme pictures, random prattle from background NPCs in a game), but for stuff that matters, generative AI is frequently more of a headache than a help.