• FishFace@piefed.social · 2 days ago

      LLMs work by picking the next word* as the most likely candidate given their training and the current context. Sometimes the model gets into a situation where its view of the “context” doesn’t change when a word is picked, so the next word is just the same one. Then the same thing happens again, and around we go. There are fail-safe mechanisms that try to prevent this, but they don’t work perfectly.

      *Token
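
      For the curious, here’s a minimal sketch of that failure mode in Python. Everything in it is a made-up stand-in (a toy model and a greedy decoder), not anything a real provider actually runs:

      ```python
      # Greedy decoding: always append the single most likely next token.
      # next_token_probs is a hypothetical stand-in for a real model: it maps
      # the context so far to {candidate token: probability}.

      def greedy_decode(next_token_probs, context, max_new_tokens=8):
          output = list(context)
          for _ in range(max_new_tokens):
              probs = next_token_probs(output)
              output.append(max(probs, key=probs.get))  # most likely token wins
          return output

      def toy_model(context):
          # Once "or" is the last token, "or" stays the most likely continuation,
          # so the context the model "sees" never meaningfully changes.
          if context[-1] == "or":
              return {"or": 0.6, ".": 0.4}
          return {"or": 0.5, "and": 0.3, ".": 0.2}

      print(" ".join(greedy_decode(toy_model, ["either", "this"])))
      # either this or or or or or or or or
      ```

      The fail-safes are things like repetition penalties and sampling temperature, which make an already-repeated token less likely instead of always taking the top pick; they usually break loops like this, but only usually.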

      • bunchberry@lemmy.world · 4 hours ago

        This happened to me a lot when I tried to run big models with small context windows. It would effectively run out of memory, so each new token wouldn’t actually be added to the context, and the model would get stuck in an infinite loop repeating the previous token. It’s possible there was a memory issue on Google’s end.
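
        Sketched in Python, with `model` as a hypothetical deterministic stand-in for the real thing, the bug looks roughly like this:

        ```python
        # If the window is full and newly generated tokens are silently dropped
        # instead of appended, the model sees the identical input every step
        # and must produce the identical output.

        def model(window):
            return window[-1]  # any deterministic function of the window will do

        def broken_generate(window, steps=6):
            out = []
            for _ in range(steps):
                out.append(model(window))
                # BUG: the new token is never appended to window, so nothing changes.
            return out

        print(broken_generate(["...", "or"]))
        # ['or', 'or', 'or', 'or', 'or', 'or']
        ```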

      • ideonek@piefed.social · 2 days ago

        That was the answer I was looking for. So it’s similar to the “seahorse” emoji case, but this time, at some point, he just glitched: the most likely next word for the sentence was “or”, and after adding that “or” it was also “or”, and after adding the next one it was also “or”, and after the 11th one… you may just as we’ll commit, since that’s the same context as with 10.

        Thanks!

          • ideonek@piefed.social · edited · 2 days ago

            Chill, dude. It’s a grammatical/translation error, not an ideological declaration. It’s an especially common mistake if your native language has “grammatical gender”. Everything has a “gender” in mine: “spoon” is a “she”, for example, but I’m not proposing to one any time soon. Not all hills are worth nitpicking on.

            • atomicbocks@sh.itjust.works · 2 days ago

              This one is. People need to stop anthropomorphizing AI. It’s a piece of software.

              I am chill; you shouldn’t assume emotion from text.

              • piccolo@sh.itjust.works · 4 hours ago

                English, being a Germanic language, used to have grammatical gender. It fell out of use during the Middle English period. But there are still traces of it, such as the common tradition of calling ships, vehicles, and other machines “she”, though some people will default to the “generic he” as well.

              • ideonek@piefed.social · 2 days ago

                As I explained, this is a specific example; I’m no more anthropomorphizing it than if I called my toilet paper a “he”. The monster you chose to charge is a windmill. So “chill” seems adequate.

                • atomicbocks@sh.itjust.works · 2 days ago

                  To be clear, using gendered pronouns for inanimate objects is the literal definition of anthropomorphization. So “chill” does not seem fair at all.

                  • percent@infosec.pub · 3 hours ago

                    In some languages, all nouns are gendered, and it’s impossible to refer to a noun without a gender. There is no “it”, only (s)he.

                    If you ever learn a language like that, you will make mistakes. If someone hears your mistake, hopefully they’ll be more forgiving about it than you are.

                  • MinnesotaGoddam@lemmy.world · 1 day ago

                    To be clear using gendered pronouns on inanimate objects is the literal definition of anthropomorphization

                    You really need to get over yourself. The universe does not revolve around you, nor around humans. The use of gendered pronouns on inanimate objects is not anthropomorphization.

                  • captcha_incorrect@lemmy.world · 1 day ago

                    It was explained as a translation error from a language that has gendered pronouns for all objects. I have to disagree with you on this one.

                • ulterno@programming.dev · 2 days ago

                  Yeah. It would have been much more productive to poke at the “well”, which was turned into “we’ll”.

                  • ideonek@piefed.social · 2 days ago

                    You brought this unmistakable “I speak louder and louder on my European vacation until the waiter who doesn’t speak English finally understands me” energy to this conversation.

                  • atomicbocks@sh.itjust.works · edited · 2 days ago

                    I don’t care that this person, who seems to be typing English on a keyboard with a different-language dictionary, misspelled some words.

                    I care that people in general keep talking about AI like it is living or capable of thinking.

              • MotoAsh@piefed.social · 1 day ago

                Using ‘he’ in a sentence is a far cry from the parts of anthropomorphizing “AI” that actually matter…

    • Ech@lemmy.ca · 2 days ago

      It’s like the text predictor on your phone. If you just keep hitting the next suggested word, you’ll usually end up in a loop at some point. Same thing here, though admittedly much more advanced.

      • vaultdweller013@sh.itjust.works · 23 hours ago

        Example of my phone doing this.

        I just want you are the only reason that you can’t just forget that I don’t have a way that I have a lot to the word you are not even going on the phone and you can call it the other way to the other one I know you are going out to talk about the time you are not even in a good place for the rest they’ll have a little bit more mechanically and the rest is.

        You can see it looping pretty damned quick with me just hitting the first suggestion after the initial I.

        • MrScottyTay@sh.itjust.works · 20 hours ago

          I think I will be in the office tomorrow so I can do it now and then I can do it now and then I can do it for you and your dad and dad and dad and dad and dad and dad and dad and dad and dad and dad

          That was mine haha
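
          You can reproduce that “dad and dad” loop with a toy bigram predictor, a rough stand-in for a keyboard’s suggestion bar (the training sentence below is invented for illustration):

          ```python
          from collections import Counter, defaultdict

          # For each word, count which word most often follows it in the
          # training text, then always accept the top suggestion, like
          # tapping the middle button over and over.
          training = "you and your dad and dad and dad and dad".split()

          follows = defaultdict(Counter)
          for a, b in zip(training, training[1:]):
              follows[a][b] += 1

          word, sentence = "you", ["you"]
          for _ in range(10):
              word = follows[word].most_common(1)[0][0]  # top suggestion only
              sentence.append(word)

          print(" ".join(sentence))
          # you and dad and dad and dad and dad and dad
          ```

          Once the top suggestion forms a cycle, always taking the most likely word can never escape it; an LLM doing greedy decoding is the same dynamic at a much larger scale.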

    • Arghblarg@lemmy.ca · 2 days ago

      The LLM showed its true nature: a probabilistic bullshit generator caught in a strange attractor of some sort within its own matrix of lies.

    • palordrolap@fedia.io · 2 days ago

      Unmentioned by other comments: The LLM is trying to follow the rule of three because sentences with an “A, B and/or C” structure tend to sound more punchy, knowledgeable and authoritative.

      Yes, I did do that on purpose.

        • luciferofastora@feddit.org · 23 hours ago

          I used to think learning stylistic devices like this was just an idle fancy, a tool simply designed to analyse poems, one of the many things you’re most certain you’ll never need but have to learn in school.

          What a fool I’ve been.