• chrash0@lemmy.world
    8 months ago

    they likely aren’t creating the model themselves. the faces are probably all the same AI girl you see everywhere. you gotta be careful with open weight models, because the open source image gen community has a… proclivity for porn. there’s not a “function” per se for porn. they may be doing some preprompting, or maybe “swim with the sharks” is just too vague a prompt and the model was simply tuned on this kind of stuff. you can add an evaluation network at the end to basically ask “is this porn/violent/disturbing?”, but that needs to be tuned as well. most likely it’s even dumber than that: the contractor just subcontracted the whole AI piece and packaged it for this use case
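    to make that “evaluation network” idea concrete, here’s a minimal sketch in python of a post-generation safety gate. note `score_image` is a hypothetical stand-in for a real tuned classifier (an NSFW/violence detection model) — the names, labels, and threshold here are assumptions for illustration, not any specific library’s API:

    ```python
    # sketch of a post-generation safety gate: generate an image, then ask
    # a separate evaluation model "is this porn/violent/disturbing?" before
    # returning it. score_image is a hypothetical placeholder for a real
    # tuned classifier returning per-label probabilities.

    UNSAFE_LABELS = ("porn", "violence", "disturbing")
    THRESHOLD = 0.5  # picking this cutoff is itself a tuning problem


    def score_image(image) -> dict:
        # placeholder: a real implementation would run a vision classifier
        # on the image and return per-label probabilities
        return {"porn": 0.02, "violence": 0.01, "disturbing": 0.03}


    def safe_generate(generate, prompt):
        """run the generator, then gate its output through the evaluator."""
        image = generate(prompt)
        scores = score_image(image)
        flagged = [label for label in UNSAFE_LABELS
                   if scores.get(label, 0.0) >= THRESHOLD]
        if flagged:
            # refuse (or regenerate) instead of returning the image
            return None, flagged
        return image, []
    ```

    the catch the reply below points at: the evaluator itself has to be trained and tuned, so you’ve just moved the problem one model downstream.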

    • Sabata11792@kbin.social
      8 months ago

      The fun part is that the image detection model needs to be trained on a lot of porn to be able to identify and filter for porn.