• warmaster@lemmy.world
    7 months ago

    UK: Making porn of unwilling celebrities is illegal.

    US: Making commercial movies with unwilling actors is perfectly fine!

  • yggstyle@lemmy.world
    7 months ago

    Step one… create consent deepfake…

    I don’t like that I thought of it… but it pains me to say it will be used as a defense at some point.

    • abhibeckert@lemmy.world
      7 months ago

      You’re not the first to think of it, and it’s where this whole idea will fall flat on its face.

      There’s just no way to actually check whether the subject of a photo consented to having their photo taken. That was difficult enough with physical cameras, and it’s so much more difficult now that no camera is involved in generating the image.

      I mean, if I were to post an image here in this comment - how can the Fediverse possibly verify that I have the right to post it?

    • Entropywins@lemmy.world
      7 months ago

      I just imagine someone showing up to my work and presenting that contract and next thing you know I’m stuck in the dryer with only my stepson Esteban to help me…

  • UnpluggedFridge@lemmy.world
    7 months ago

    This is a difficult issue to deal with, but I think the problem lies with our current acceptance of photographs as an objective truth. If a talented writer places someone in an erotic text, we immediately know that this is a product of imagination. If a talented artist sketches up a nude of someone, we can immediately recognize that this is a product of imagination. We have laws around commercial use of likenesses, but I don’t think we would make those things illegal.

    But now we have photographs that are products of imagination. I don’t have a solution for this specific issue, but we all need to calibrate how we establish trust with persons and information now that photographs, video, speech, etc can be faked by AI. I can even imagine a scenario in the not-too-distant future where face-to-face conversation cannot be immediately trusted due to advances in robotics or other technologies.

    Lying and deception are human nature, and we will always employ any new technologies for these purposes along with any good they may bring. We will always have to carefully adjust the line on what is criminal vs artistic vs non-criminal depravity.

    • CleoTheWizard@lemmy.world
      7 months ago

      I just don’t see why you’d make the creation of this stuff illegal. Right now you could easily use Photoshop to put people’s faces onto dirty pictures. It hurts zero people and takes a similarly low amount of effort. As long as you keep it to yourself, society should not care.

      Making it illegal also seems kind of dumb when you can just hold someone civilly liable for this stuff if they’re posting nude photos of you, real or not. I don’t see the issue with any of it if we treat the spread of these photos the same as if they were real and let people collect damages.