• wurzelwerk@kbin.social
    1 year ago

    Whenever I see this I have to chuckle. Humans do this all the time, whip stuff up as they go, lie, pretend. The AI can only do what it has learned from us people. So it lies, makes stuff up and pretends. Why is this so surprising?

    • FaceDeer@kbin.social
      1 year ago

      I prefer the term “confabulation” to “lying”, both because it’s more accurate and because it’s more fun to say. Confabulation is when you don’t know that you’re lying; it’s just your dumb brain coming up with stuff that turns out not to be real. Like if you’re asked, “Are there any red cars parked on the street in the neighborhood where you live?” your brain hears “I want a memory of a red car parked on the street” and helpfully delivers exactly that.

        • FaceDeer@kbin.social
          1 year ago

          It turns out that it’s super easy to provoke the human brain to generate false memories about stuff. I’ve read about some of the research that Elizabeth Loftus has done and it’s eerie.

    • RocksForBrains@lemm.ee
      1 year ago

      I think it’s just humorous. AI chat models have no capacity to understand the subject matter; their job is simply to regurgitate their findings on request. Naturally, they’re bad liars.