Thanks for the info! I guess that's ultimately what I'm looking for more detail on: how much do we actually know about cellular traffic? Obviously with encryption we can't just read cell signals directly to find out what's being sent, so do people just record the volume of data in individual packets and make educated guesses?
It seems plausible to run a simple (non-AI) algorithm to isolate probable conversations and send stripped-down, compressed audio chunks along with normal data. I assume that's still probably too hard to hide, but if anyone out there knows of someone who's looked for this stuff, I'd love to check it out.
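For what it's worth, here's a rough sketch of the kind of metadata-only "educated guessing" I mean. It's Python, the thresholds and flows are entirely made up for illustration, and it's not based on any real capture tooling: the point is just that you can look at packet sizes, direction, and timing without ever decrypting anything, and a steady low-rate upstream trickle of uniformly sized packets would stand out against bursty normal traffic.

```python
# Toy illustration of metadata-only traffic analysis: we never see payloads,
# only packet sizes, directions, and timestamps, and try to guess what a flow
# might be. All thresholds and example flows below are invented.
from dataclasses import dataclass
from statistics import mean, pstdev


@dataclass
class Packet:
    ts: float       # seconds since flow start
    size: int       # bytes on the wire
    upstream: bool  # True = device -> network


def features(flow):
    """Summarize a flow using only upstream metadata."""
    up = [p for p in flow if p.upstream]
    if not up:
        return {"up_bytes_per_s": 0.0, "up_mean_size": 0.0, "up_size_jitter": 0.0}
    duration = (max(p.ts for p in flow) - min(p.ts for p in flow)) or 1.0
    sizes = [p.size for p in up]
    return {
        "up_bytes_per_s": sum(sizes) / duration,
        "up_mean_size": mean(sizes),
        "up_size_jitter": pstdev(sizes),
    }


def looks_like_steady_upload(flow, min_rate=2000, max_jitter=200):
    """Hypothetical heuristic: compressed voice quietly sent in the background
    would look like a steady, low-rate upstream trickle of very uniform packet
    sizes, unlike bursty web browsing. The thresholds here are made up."""
    f = features(flow)
    return f["up_bytes_per_s"] > min_rate and f["up_size_jitter"] < max_jitter


# Synthetic example flows (not real captures):
web_browsing = (
    [Packet(t * 0.5, 1400, False) for t in range(40)]           # downstream bursts
    + [Packet(t * 2.0, 300 + 400 * (t % 3), True) for t in range(10)]  # ragged requests
)
steady_trickle = [Packet(t * 0.1, 240, True) for t in range(200)]  # ~2.4 kB/s, uniform

print("web browsing flagged?  ", looks_like_steady_upload(web_browsing))    # False
print("steady trickle flagged?", looks_like_steady_upload(steady_trickle))  # True
```

Obviously real traffic-analysis work uses far richer features and statistical models than this, but that's the basic shape of what I'm asking about.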
I don’t think you’re on the right track here. There are definitely existing laws in most states regarding ‘revenge porn’, creating sexual media of minors, Photoshop porn, all kinds of things that are very similar to AI-generated deepfakes. In some cases AI deepfakes fall under existing laws, but often they don’t. Or, because of how the law is written, they exist in a legal grey area that will be argued in the courts for years.
Nowhere is anyone suggesting that making deepfakes should be prosecuted as rape; that’s just complete nonsense. The question is where new laws need to be written, or existing ones updated, to make sure AI porn is treated the same as other illegal uses of someone’s likeness to make porn.