A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • JackGreenEarth@lemm.ee · 8 months ago

    That’s a ripoff. It costs them at most $0.10 to run a simple Stable Diffusion img2img job, and most people could do it themselves. They’re deliberately exploiting people who aren’t tech-savvy.

  • guyrocket@kbin.social · 8 months ago

    This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me. The result is the same: fake porn/nudes.

    And all the hand wringing in the world about it being non consensual will not stop it. The cat has been out of the bag for a long time.

    I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.

    • kent_eh@lemmy.ca · 8 months ago

      People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me

      Because now it’s faster, can be generated in bulk and requires no skill from the person doing it.

      • Bob Robertson IX @discuss.tchncs.de · 8 months ago

        A kid at my high school in the early 90s would use a photocopier and would literally cut and paste yearbook headshots onto porn photos. This could also be done in bulk and doesn’t require any skills that a 1st grader doesn’t have.

        • ChexMax@lemmy.world · 8 months ago

          Those are easily disproven. There’s no way you think that’s the same thing. If you can pull up the source photo and it’s a clear match/copy for the fake, it’s easy to disprove. AI can alter the angle, position, and expression of your face in a believable manner, making it a lot harder to link the fake back to any source material.

    • echo64@lemmy.world · 8 months ago

      I hate this: “Just accept it women of the world, accept the abuse because it’s the new normal” techbro logic so much. It’s absolutely hateful towards women.

      We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.

      • brbposting@sh.itjust.works · 8 months ago

        It’s unacceptable.

        We have legal and justice systems to deal with this.

        For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):

        Ill adults and poor kids generate and sell CSAM. It’s common to advertise on Instagram and sell on Telegram. Huge problem, as that Stanford report shows.

        Telegram got right on it (not). Fuckers.

      • AquaTofana@lemmy.world · 8 months ago

        I don’t know why you’re being downvoted. Sure, it’s unfortunately been happening for a while, but are we just supposed to keep quiet about it and let it go?

        I’m sorry, putting my face on a naked body that’s not mine is one thing, but I really do fear for the people whose likeness gets used in some degrading/depraved porn and it’s actually believable because it’s AI generated. That is SO much worse/psychologically damaging if they find out about it.

      • SharkAttak@kbin.social · 8 months ago

        It’s not normal, but neither is it new: you could already cut and glue your cousin’s photo onto a Playboy model, or Photoshop the hot neighbour onto Stallone’s muscular body. Today it’s just easier.

        • echo64@lemmy.world · 8 months ago

          Sorry if I didn’t position this about men. They are the most important thing to discuss and will be the most impacted here, obviously. We must center men on this subject too.

          • Thorny_Insight@lemm.ee · 8 months ago

            Pointing out your sexism isn’t saying we should be talking about just men. It’s you who’s here acting all holy while ignoring half of the population.

            • echo64@lemmy.world · 8 months ago

              Yes yes, #alllivesmatter, amirite? We just ignore that 99.999% of the victims will be women, just so we can grandstand about men.

  • SendMePhotos@lemmy.world · 8 months ago

    I’d like to share my initial opinion here. “Non-consensual AI-generated nudes” are technically a freedom, no? We can bastardize our presidents, paste people’s photos onto devils or other characters, so why are AI nudes where the line is drawn? The internet made photos of Trump and Putin kissing shirtless.

    • abhibeckert@lemmy.world · edited · 8 months ago

      The internet made photos of Trump and Putin kissing shirtless.

      And is that OK? I mean I get it, free speech, but just because congress can’t stop you from expressing something doesn’t mean you actually should do it. It’s basically bullying.

      Imagine you meet someone you really like at a party, they like you too and look you up on a social network… and find galleries of hardcore porn with you as the star. Only you’re not a porn star, those galleries were created by someone who specifically wanted to hurt you.

      AI porn without consent is clearly illegal in almost every country in the world, and in the ones where it’s not illegal yet, it soon will be. The 1st amendment will be a stumbling block, but it’s not an impenetrable wall: congress can pass laws that limit speech in certain edge cases, and this will be one of them.

      • WaxedWookie@lemmy.world · 8 months ago

        The internet made photos of Trump and Putin kissing shirtless.

        And is that OK?

        I’m going to jump in on this one and say yes - it’s mostly fine.

        I look at these things through the lens of the harm they do and the benefits they deliver - consequentialism and act utilitarianism.

        The benefits are artistic, comedic and political.

        The “harm” is that Putin and/or Trump might feel bad, maaaaaaybe enough that they’d kill themselves. All of that goes right back under benefits as far as I’m concerned: they’re both extremely powerful monsters that have done, and will continue to do, incredible harm.

        The real harm is that such works risk normalising this treatment of regular folk, which is genuinely harmful. I think that’s unlikely, but it’s impossible to rule out.

        Similarly, the dissemination of the kinds of AI fakes under discussion is a negative, because they do serious, measurable harm.

        • Mananasi@feddit.nl · 8 months ago

          I think that is okay because there was no intent to create pornography there. It is a political statement. As far as I am concerned that falls under free speech. It is completely different from creating nudes of random people/celebrities with the sole purpose of wanking off to it.

    • conciselyverbose@sh.itjust.works · 8 months ago

      Doesn’t mean distribution should be legal.

    People are going to do what they’re going to do, and the existence of this isn’t an argument for putting spyware on everyone’s computer to catch it, or whatever crazy extreme you want to take it to.

      But distributing nudes of someone without their consent, real or fake, should be treated as the clear sexual harassment it is, and result in meaningful criminal prosecution.