• Taako_Tuesday@lemmy.ca
    8 months ago

    Doesn’t it also have to do with the previous requests the LLM has received? For this thing to “learn,” it has to know what people are looking for, so I’ve always imagined the porn problem as a result of people using these tools to generate porn at a much greater volume than anything else, especially porn of women; it defaults to nude because that’s what most requests were asking for.

    • TheRealKuni@lemmy.world
      8 months ago

      Nah, most of these generative models don’t account for previous requests; each prompt is handled independently. There would be problems if they did: I’ve read that feeding generative AI output back into generative AI training creates a feedback effect that can ruin models.

      It’s just running a bunch of complicated math over the model’s previously trained weights.
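
      A toy sketch of that point (PyTorch assumed, purely illustrative, not any real product’s code): inference is just a forward pass through frozen weights, so one request can’t influence the next.

      ```python
      # Minimal sketch: generation from a pretrained model is a forward pass
      # through fixed weights, with no learning between requests.
      import torch
      import torch.nn as nn

      # Hypothetical stand-in for a pretrained generative model.
      model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 16))
      model.eval()                      # inference mode
      for p in model.parameters():
          p.requires_grad_(False)       # weights are frozen; nothing updates

      def generate(prompt_embedding: torch.Tensor) -> torch.Tensor:
          # Each request is independent: no gradient step, no memory of
          # earlier prompts, just math against the trained parameters.
          with torch.no_grad():
              return model(prompt_embedding)

      out1 = generate(torch.randn(1, 16))
      out2 = generate(torch.randn(1, 16))  # unaffected by the first request
      ```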