Deepfake Porn Is Out of Control: New research shows the number of deepfake videos is skyrocketing—and the world’s biggest search engines are funneling clicks to dozens of sites dedicated to the nonconsensual fakes.
And a “video” should ruin those things why?
Literally everything you listed is only a problem because society makes a big stink about things that don’t matter.
Why should your job care … even if it’s real?
If somebody didn’t cheat, and there’s no reason to believe they did other than a suspicious video of someone who looks like them… why in the world should that end their relationship?
Not to mention, AI isn’t going to get the details right. It’s going to get the gist of it right, but anyone who’s actually seen you naked is presumably going to be able to spot details that are off.
Also, in terms of privacy, your privacy wasn’t actually violated. Someone made a caricature of you.
It’s really not. The only reason it seems that way is that video has been trustworthy for the past century, and now it’s not.
I hope you folks downvoting me have some magic ace up your sleeve, but I see no way past this other than through it. Just like when the atom bomb was invented, it’s technology that exists now, and we have to deal with it. Unlike the atom bomb, it’s just a bunch of computer code, and at some point pretty much any idiot is going to be able to get their hands on convincing versions of it. Also unlike the atom bomb, it can’t actually kill you.