Kill me now.

      • Riven@lemmy.dbzer0.com · 7 months ago

        I tried the same AI and asked it to provide a list of 20 things; it only gave me 5. When I asked for the rest, it apologized and then provided them. It’s weird that it stumbles at first but can see its error and fix it. I wonder if that’s something it ‘learned’ from the data set: people not correctly answering prompts the first time.

        • webghost0101@sopuli.xyz · 7 months ago

          Something else I often encounter with GPT-4 is asking “why did you do x or y?” out of general curiosity about how it handles the task.

          Almost every time it apologizes and fully redoes the answer, avoiding x or y.

        • Echo Dot@feddit.uk · 7 months ago

          I personally don’t think a large section of the population meets the requirements for general intelligence, so I think it’s a bit rich to expect the AI to do it as well.

      • Echo Dot@feddit.uk · 7 months ago

        It’s weird, though, because they were able to point out the absurdity of its comment and it did agree. So it’s not just algorithmic phrase matching; there is an actual “thought process” going on.

        I’ve never been able to get an AI to explain its logic though which is a shame. I’m sure it would be useful to know why they come up with the answers they do.

        • force@lemmy.world · 7 months ago (edited)
          > I’ve never been able to get an AI to explain its logic though which is a shame. I’m sure it would be useful to know why they come up with the answers they do.

          You and AI researchers both. It’s probably a trillion-dollar problem at this point.