• MentalEdge@sopuli.xyz
      2 months ago

      In a couple sentences? In a way that doesn’t approach, equal or exceed the effort of training the model with that data to begin with?

      You insist these models can do new things out of nothing, and you keep saying “all you have to do, is give them something”.

      • Even_Adder@lemmy.dbzer0.com
        2 months ago

        You keep moving the goal posts and putting words in my mouth. I never said you can do new things out of nothing. Nothing I mentioned is approaching, equaling, or exceeding the effort of training a model.

        You haven’t answered a single one of my questions, and you are not arguing in good faith. We’re done here. I can’t say it’s been a pleasure.

        • MentalEdge@sopuli.xyz
          2 months ago

My argument was and is that neural models don’t produce anything truly new, and that they can’t handle things outside what is outlined by the data they were trained on.

          Are you not claiming otherwise?

          You say it’s possible to guide models into doing new things, and I can see how that’s the case, especially if the model is a very big one, meaning it is more likely that it has relevant structures to apply to the task.

But I’m also pretty damn sure they have insurmountable limits. You can’t “guide” an LLM into doing image generation, except by having it interact with an image generation model.