The creation of sexually explicit deepfake content is likely to become a criminal offense in England and Wales as concern grows over the use of artificial intelligence to exploit and harass women.
You’re not the first to think of it, and it’s where this whole idea will fall flat on its face.
There’s just no way to actually check whether the subject of a photo consented to having their photo taken. That was difficult enough with physical cameras, and it’s far harder now that no camera is involved in generating the image.
I mean, if I were to post an image here in this comment, how could the Fediverse possibly verify that I have the right to post it?
I just imagine someone showing up to my work and presenting that contract and next thing you know I’m stuck in the dryer with only my stepson Esteban to help me…
Step one… create consent deepfake…
I don’t like that I thought of it… but it pains me to say it will be used as a defense at some point.