• Just_Pizza_Crust@lemmy.world · +28 · 8 months ago (edited)

    Softcore gilf porn created by an AI to sell state lottery tickets wasn’t on my cards for 2024, but here we are.

  • tal@lemmy.today · +11 · 8 months ago

    “Our tax dollars are paying for that! I was completely shocked. It’s disturbing to say the least,” Megan explained to the Jason Rantz Show on KTTH.

    I mean, I’d assume that the state lottery is revenue-positive. It’s more like lottery players are paying for it.

  • webghost0101@sopuli.xyz · +12 / −1 · 8 months ago

    Lol, they didn’t even try to test the system if this is the result. AI isn’t intelligent, but humans still take the cake for stupidity by having brains and not using them.

    Many public Stable Diffusion models have a bias, with porn often overrepresented, but all it takes is “nude, naked, erotic, sex, nsfw” in the negative prompt, and unless the model is built to only generate porn this will never happen. Or better yet, use some of that corporate money and build their own SD model that is verified not to include any nudity in its training data.
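    A minimal sketch of that negative-prompt idea, assuming the Hugging Face `diffusers` library; the checkpoint name and helper function are illustrative, not anything the lottery’s contractor is known to have used:

```python
# Always inject an NSFW block-list into the negative prompt, regardless of
# what the rest of the application asks for.
NSFW_NEGATIVES = "nude, naked, erotic, sex, nsfw"

def with_nsfw_negatives(user_negative: str = "") -> str:
    """Force the NSFW block-list terms into whatever negative prompt is used."""
    parts = [p for p in (user_negative.strip(), NSFW_NEGATIVES) if p]
    return ", ".join(parts)

def generate(prompt: str, user_negative: str = ""):
    """Generate one image with the NSFW negatives always applied.
    (Needs a GPU; downloads the checkpoint on first use.)"""
    import torch                      # heavy deps kept out of module import
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")
    result = pipe(prompt=prompt,
                  negative_prompt=with_nsfw_negatives(user_negative))
    return result.images[0]
```

    The point is that the block-list lives in code the caller can’t bypass, rather than hoping whoever writes the prompt remembers it.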

  • Greg Clarke@lemmy.ca · +11 · 8 months ago

    I can’t verify this story with any reputable sources. Is this real or just boomerbait?

    • stoly@lemmy.world · +10 · 8 months ago

      This site is a complete right-wing boomerbait rag, never pay it any attention.

      People think of WA and think of Seattle, then extrapolate. Seattle is really no different than places like Omaha where there is a more liberal, educated populace. They are surrounded by a state full of angry, ignorant people. This “newspaper” is for the angry types.

    • Fubarberry@sopuli.xyz · +7 / −1 · 8 months ago

      The “test drive a win” where it would generate AI images of people as lottery winners was a real thing, and they have taken it down.

      The only larger news outlet I see covering it is Fox News. They cite mynorthwest.com as their main source, but they do say that they received a statement from the lottery confirming that it was shut down for that reason:

      Washington’s Lottery confirmed to Fox News Digital that it shut down the site after being made aware of the purported image.

      Obviously a lot of people don’t like Fox News, but I don’t think there’s a political agenda here that means the statement shouldn’t be trusted.

    • Pissnpink@feddit.uk · +3 / −1 · 8 months ago (edited)

      Idk, MyNorthwest is a real source, but it’s mostly dull local-news fare with some good event coverage. KIRO is that branch and it’s okay, it’s center-right; it certainly isn’t the Sinclair Broadcasting station, that’s KOMO 4. 710 Sports is more center-left, but it’s sports. 770 KTTH, where this article seems to be coming from, is obviously garbage reactionary conservative radio, but that’s what makes money in radio.

  • Nobody@lemmy.world · +9 · 8 months ago

    AI hallucinates a request for a topless photo. Nothing fundamentally wrong with this technology at all. Keep pouring billions into it.

  • Norgur@kbin.social · +5 · 8 months ago

    Can we talk less about AI inevitably doing what AI always does and fucking up, and instead talk about the website that uses AI resources to dangle an even juicier carrot in front of desperate people throwing away their money on the lottery?

  • sugar_in_your_tea@sh.itjust.works · +1 · 8 months ago

    Why wouldn’t they just generate a couple hundred images and manually review them? It’s pretty easy to automate putting someone’s face onto an existing image, so that should be totally fine.

    They could cycle the images every so often with the insane amounts of money the lottery generates.
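    That pre-generate-and-review workflow could look something like this; a rough sketch, with all paths and helper names hypothetical:

```python
# Serve only images a human reviewer has approved, rotating the pick daily.
from datetime import date
from pathlib import Path
from typing import Optional

def pick_image_for_today(approved: list[str], day: Optional[date] = None) -> str:
    """Rotate deterministically through the human-approved pool, one per day."""
    if not approved:
        raise ValueError("no approved images to serve")
    day = day or date.today()
    return approved[day.toordinal() % len(approved)]

def approved_pool(review_dir: str) -> list[str]:
    """Only images a reviewer moved into the `approved/` subfolder are served."""
    return sorted(str(p) for p in Path(review_dir, "approved").glob("*.png"))
```

    Nothing user-facing is ever generated on the fly, so nothing unreviewed can reach the site.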

  • Neato@ttrpg.network · +1 / −1 · 8 months ago

    When Megan, a 50-year-old mother based in Tumwater, visited the new AI-powered mobile site from Washington’s Lottery on March 30, she thought she was in for some frivolous fun. Test Drive A Win allows users to digitally throw a dart at a dartboard featuring dream vacations you can pay for with the money you win in the lottery. Depending on where the dart lands, you can either upload a headshot or take one on your phone to upload, and the AI superimposes your image into the vacation spot.

    Megan landed on a “swim with the sharks” dream vacation option. She was shocked at one of the AI photos Washington’s Lottery spit out. It was softcore porn.

    So I can totally see this happening. The government contracts with a genAI company, and the company drops the ball and erroneously includes the capacity for pornography, or doesn’t select correctly curated training data (I’m unsure how exactly these work). It may be quite difficult for the Washington government to spot this error if the occurrence rate is very low, or if none of their test prompts generated pornography. Perhaps it was only keyed to make porn (when not specifically prompted to) on certain subsets of matched facial features? I’m not suggesting this, but perhaps the affected user looks a lot like a popular porn star? It could also totally be the government’s fault for quickly selecting an AI package and not looking at what it could do; but with government bureaucracy, there could’ve been quite a few people with oversight.

    My bigger question is WTF is this system even doing? If you win money in the lottery, you can select to apply it to a vacation package if your random draw hits it? Why wouldn’t you just take the money and buy your own? Maaaaybe if it heavily discounts the vacations or something. Seems like an unnecessary step in the lottery process.

    • Kbin_space_program@kbin.social · +3 · 8 months ago

      It’s a core problem with image-generation models. For some fucking reason they seem to have been fed content from sites that had a lot of porn. Guessing Imgur and DeviantArt.

      Literally the first time I tried to use MS’s image generator, I was out with some friends trying a new fried chicken place and we were discussing fake Tinder profiles.

      So I thought to try it and make a fake image of “woman sensuously eating fried chicken”.
      Content warning, blah blah blah.

      Try “Man sensuously eating fried chicken”. Works fine.

      We were all mystified by that. I went back a few days later to play around. Tried seeing what it didn’t like. Tried generating “woman relaxing at park”.
      Again, content warning. Switch to a man, no problem. Eventually got it to generate with “woman enjoying sunset in a park.” Got a very dark image, because it generated a completely nude woman T-posing in the dark.

      So, with that in hand I went back and started specifying “fully clothed” for a prompt involving the word “woman”. All of a sudden all of the prompts worked. They fed the bot so much porn that it defaulted women to being nude.

      • Taako_Tuesday@lemmy.ca · +0 / −1 · 8 months ago

        Doesn’t it also have to do with the previous requests the LLM has received? In order for this thing to “learn” it has to know what people are looking for, so I’ve always imagined the porn problem as a result of people using these things to generate porn at a much greater volume than anything else, especially porn of women, so it defaults to nude because that’s what most requests were looking for.

        • TheRealKuni@lemmy.world · +2 · 8 months ago

          Nah, most of these generative models don’t account for previous requests. There would be some problems if they did. I read somewhere that including generative AI data in generative AI training has a feedback effect that can ruin models.

          It’s just running a bunch of complicated math against previously trained algorithms.

    • orclev@lemmy.world · +1 · 8 months ago

      My bigger question is WTF is this system even doing? If you win money in the lottery, you can select to apply it to a vacation package if your random draw hits it?

      No, it’s advertising. They’re trying to convince people to play the lottery so they have you roll a (virtual) wheel and upload a head shot then it generates a theoretical video of what it might look like if you went on that vacation (using your theoretical future winnings). It’s absolutely idiotic, but their target demographic isn’t exactly the sharpest tools in the shed to begin with.

    • chrash0@lemmy.world · +1 · 8 months ago

      they likely aren’t creating the model themselves. the faces are probably all the same AI girl you see everywhere. you gotta be careful with open-weight models because the open source image gen community has a… proclivity for porn. there’s not a “function” per se for porn. they may be doing some preprompting, or maybe “swim with the sharks” is just too vague of a prompt and the model was tuned on this kind of stuff. you can add an evaluation network at the end to basically ask “is this porn/violent/disturbing”, but that needs to be tuned as well. most likely it’s even dumber than that, where the contractor just subcontracted the whole AI piece and packaged it for this use case
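      That evaluation-network step can be bolted on with an off-the-shelf classifier; a sketch assuming the Hugging Face `transformers` library and one publicly available NSFW-detection checkpoint (the threshold and helper names are illustrative):

```python
# Post-generation filter: run a separate NSFW classifier over each image
# before it is ever shown to a user.

def nsfw_score(scores: dict[str, float]) -> float:
    """Pull the 'nsfw' probability out of a classifier's label->score map."""
    return scores.get("nsfw", 0.0)

def is_safe(scores: dict[str, float], threshold: float = 0.2) -> bool:
    """Reject anything the classifier rates even moderately NSFW."""
    return nsfw_score(scores) < threshold

def classify(image_path: str) -> dict[str, float]:
    """Score an image with an off-the-shelf NSFW detector
    (downloads the model on first use)."""
    from transformers import pipeline  # lazy import: heavy dependency

    clf = pipeline("image-classification",
                   model="Falconsai/nsfw_image_detection")
    return {r["label"]: r["score"] for r in clf(image_path)}
```

      As the comment notes, the classifier itself needs tuning: the threshold trades false rejections against exactly the kind of miss that made this story.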

      • Sabata11792@kbin.social · +2 · 8 months ago

        The fun part is that the image-detection models need to be trained on a lot of porn to be able to identify and filter for porn.

  • boatsnhos931@lemmy.world · +4 / −7 · 8 months ago

    Ahh I love technology… however I need the uncensored image to investigate further…in private…for 30-45 seconds…