A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but they are particularly notable because their service plainly and openly demonstrates one of the most severe harms of generative AI tools: the ability to easily create nonconsensual pornography of ordinary people.

  • kent_eh@lemmy.ca · 8 months ago

    People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me

    Because now it’s faster, can be generated in bulk, and requires no skill from the person doing it.

    • Bob Robertson IX @discuss.tchncs.de · 8 months ago

      A kid at my high school in the early ’90s used a photocopier to literally cut and paste yearbook headshots onto porn photos. That could also be done in bulk, and it doesn’t require any skills that a 1st grader doesn’t have.

      • ChexMax@lemmy.world · 8 months ago

        Those are easily disproven. There’s no way you think that’s the same thing. If you can pull up the source photo and it’s a clear match/copy for the fake, it’s easy to disprove. AI can alter the angle, position, and expression on your face in a believable manner, making it a lot harder to link the photo to the source material.