• Hawke@lemmy.world · 45 points · 10 days ago

    Better title: “Photographers complain when their use of AI is identified as such”

    • Valmond@lemmy.world · 14 points · 10 days ago

      “It was just a so little itsy bitsy teeny weeny AI edit!!”

      Please don’t flag AI please!

    • CabbageRelish@midwest.social · 7 points · 9 days ago (edited)

      People are complaining that an advanced fill tool that’s mostly used to remove a smudge or something is automatically marking the full image as an AI creation. As it stands, anyone who actually wants to bypass this “check” can simply strip the image’s metadata before uploading it.
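      (For the curious: EXIF and XMP metadata live in a JPEG’s APP1 segments, so “stripping” really is trivial. A rough, stdlib-only Python sketch of the idea, with no real metadata library involved:)

```python
def strip_app1(jpeg: bytes) -> bytes:
    """Remove APP1 segments (where EXIF and XMP metadata live) from a JPEG."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i + 4 <= len(jpeg):
        if jpeg[i] != 0xFF:
            out += jpeg[i:]          # unexpected byte: copy the rest verbatim
            break
        marker = jpeg[i + 1]
        if marker == 0xDA:           # SOS marker: entropy-coded image data follows
            out += jpeg[i:]
            break
        seg_len = int.from_bytes(jpeg[i + 2:i + 4], "big")  # length counts itself
        if marker != 0xE1:           # keep every segment except APP1
            out += jpeg[i:i + 2 + seg_len]
        i += 2 + seg_len
    return bytes(out)
```

      Real tools like `exiftool -all=` do the same job more carefully, but the point stands: the tag only survives if nobody touches the file.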

    • BigPotato@lemmy.world · 20 points · 10 days ago

      Right? I thought I went crazy when I got to “I just used Generative Fill!” Like, he didn’t just auto adjust the exposure and black levels! C’mon!

  • WatDabney@sopuli.xyz · 27 points · 10 days ago (edited)

    No - I don’t agree that they’re completely different.

    “Made by AI” would be completely different.

    “Made with AI” actually means pretty much the exact same thing as “AI was used in this image” - it’s just that the former lays it out baldly and the latter softens the impact by using indirect language.

    I can certainly see how “photographers” who use AI in their images would tend to prefer the latter, but bluntly, fuck 'em. If they can’t handle the shame of the fact that they did so they should stop doing it - get up off their asses and invest some time and effort into doing it all themselves. And if they can’t manage that, they should stop pretending to be artists.

    • Paradachshund@lemmy.today · 16 points · 10 days ago

      I personally think the wording is a bit unclear. “Made with”, despite technically meaning what you’re saying, is often colloquially used to mean “fully created by”. I don’t mind the AI tag, but I do see the photographers’ point about it implying wholesale generation instead of touch-ups.

  • pyre@lemmy.world · 25 points · 9 days ago

    or… don’t use generative fill. if all you did was remove something, regular methods do more than enough. with generative fill you can just select a part and say now add a polar bear. there’s no way of knowing how much has changed.

    • thedirtyknapkin@lemmy.world · 11 points · 9 days ago (edited)

      there’s a lot more than generative fill.

      ai denoise, ai masking, ai image recognition and sorting.

      hell, every phone is using some kind of “ai enhanced” noise reduction by default these days. these are just better versions of existing tools that have been used for decades.

  • kromem@lemmy.world · 24 points · 10 days ago

    Artists in 2023: “There should be labels on AI modified art!!”

    Artists in 2024: “Wait, not like that…”

    • Thorny_Insight@lemm.ee · 13 points · 9 days ago

      I don’t think that’s fair. AI won’t turn a bad photograph into a good one. It’s a tool that quickly and automatically does something we’ve been doing by hand until now. That’s kind of like saying a photoshopped picture isn’t “good” or “real”. They’re all photoshopped. Not a single serious photographer releases unedited photos, except perhaps the ones shooting on film.

      • Zelaf@sopuli.xyz · 2 points · 8 days ago

        Even film photographers touch up their photos, whether during development, by adjusting how long negatives sit in the chemical baths, or by varying their agitation and mixing techniques.

        If they enlarge their negatives on photo paper, they often have tools to add lightness and darkness to different areas of the paper to help with exposure, contrast and subject highlighting, a.k.a. dodging and burning, which is also available in most photo editing software today.

        There are loads of ways to improve developed photos, and doing so has always been part of what photographers and darkroom developers do. People who still push the “don’t edit photos” BS are usually not very well informed about photo history or the techniques of their own photographic inspirations.

  • glimse@lemmy.world · 15 points · 10 days ago

    This would be better suited to asklemmy; this community isn’t for opinion discussions.

  • IIII@lemmy.world · 14 points · 10 days ago

    Can’t wait for people to deliberately add the metadata to their image as a meme, such that a legit photograph without any AI used gets the unremovable “Made with AI” tag.
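    This is plausible: these flags reportedly key off provenance metadata such as the IPTC digital source type. A rough, hypothetical stdlib-only Python sketch of planting such a tag in a JPEG (which exact field a given platform reads is an assumption):

```python
# Hypothetical: embed an XMP packet claiming the IPTC "trainedAlgorithmicMedia"
# digital source type -- one of the fields AI-labelling systems reportedly read.
XMP_HEADER = b"http://ns.adobe.com/xap/1.0/\x00"   # standard APP1/XMP namespace prefix
XMP_BODY = (
    b'<x:xmpmeta xmlns:x="adobe:ns:meta/">'
    b'<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">'
    b'<rdf:Description'
    b' xmlns:Iptc4xmpExt="http://iptc.org/std/Iptc4xmpExt/2008-02-29/"'
    b' Iptc4xmpExt:DigitalSourceType='
    b'"http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia"/>'
    b'</rdf:RDF></x:xmpmeta>'
)

def tag_as_ai(jpeg: bytes) -> bytes:
    """Insert an APP1/XMP segment right after the SOI marker."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    payload = XMP_HEADER + XMP_BODY
    seg_len = (len(payload) + 2).to_bytes(2, "big")  # length field counts itself
    return jpeg[:2] + b"\xff\xe1" + seg_len + payload + jpeg[2:]
```

    Run a pristine camera JPEG through that and, if the platform trusts the metadata, it gets the AI label.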

  • A_Very_Big_Fan@lemmy.world · 12 points · 10 days ago

    We’ve been able to do this for years, way before the fill tool utilized AI. I don’t see why it should be slapped with a label that makes it sound like the whole image was generated by AI.

  • PhlubbaDubba@lemm.ee · 9 points · 10 days ago

    I agree pretty heartily with this metadata-signing approach to sussing out AI content:

    Create a cert org that verifies that a given piece of creative software properly signs work made with its tools, get eyeballs on the cert so consumers know to look for it, then watch and laugh while everyone who can’t get the cert starts claiming they’re being censored because nobody trusts any of their shit anymore.

    Bonus points if you can get the largest social media companies to only accept content that has the signing and have it flag when signs indicate photoshopping or AI work, or removal of another artist’s watermark.
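    A back-of-the-napkin sketch of the signing idea (a toy using a shared-secret HMAC from Python’s stdlib; a real provenance scheme like C2PA uses public-key certificates instead, and the key and provenance note here are made up):

```python
import hashlib
import hmac

# Hypothetical key that a certified editing tool would hold.
TOOL_KEY = b"secret-key-embedded-in-certified-tool"

def sign(image: bytes, provenance: str) -> str:
    """Sign the image bytes together with a provenance note describing the edits."""
    msg = image + provenance.encode()
    return hmac.new(TOOL_KEY, msg, hashlib.sha256).hexdigest()

def verify(image: bytes, provenance: str, signature: str) -> bool:
    """True only if neither the image nor the provenance note was altered."""
    return hmac.compare_digest(sign(image, provenance), signature)
```

    The point of binding the note to the pixels is that a platform could then flag posts whose note says “generative fill used”, and reject posts where the signature doesn’t match at all.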

    • Schmeckinger@lemmy.world · 9 points · 10 days ago (edited)

      That simply won’t work, since you could just use a tool to recreate an AI image 1:1, or extract the signing code and sign whatever you want.

      • PhlubbaDubba@lemm.ee · 2 points · 10 days ago

        There are ways to make signatures prohibitively hard to recreate, not to mention that the signature can be unique to every piece of media, meaning a fake can’t be created reliably.

        • Schmeckinger@lemmy.world · 4 points · 10 days ago

          How are you going to prevent recreating an AI image pixel by pixel, or just importing an AI image / taking a photo of one?

          • PhlubbaDubba@lemm.ee · 1 point · 9 days ago (edited)

            Importing and screen-capture software can also carry the certificate software and sign the copy with the metadata of the original file. Taking a picture of the screen with a separate device, or a pixel-by-pixel recreation, could in theory get around it. In practice, though, people would see at best a camera image being presented as a photoshopped or painted image, and at worst some loser pointing their phone at their laptop to pass something off dishonestly.

            As for pixel-by-pixel recreations: again, vetted software can apply the metadata stamp, and if sites refuse to accept unstamped content, going pixel by pixel in unvetted software just leaves you with a neat PNG file for your trouble. And if someone is hand-placing squares by hand just to slip a single deepfake through, that person is a state actor, and that’s a whole other can of worms.

            ETA: pixel-art creations can also be signed as pixel art, based on their being built from squares, so the signature notes on a post would tip people off.

      • Feathercrown@lemmy.world · 1 point · 10 days ago

        The opposite way could work, though. A label that guarantees the image isn’t [created with AI / digitally edited in specific areas / overall digitally adjusted / edited at all]. I wonder if that’s cryptographically viable? Of course it would have to start at the camera itself to work properly.

        • Trainguyrom@reddthat.com · 4 points · 9 days ago

          Signing the photo in the camera would achieve this, but ultimately that just rehashes the debate from when Photoshop was new. History shows that some will fight it, but ultimately new artistic tools create new artistic styles and niches.

  • Zelaf@sopuli.xyz · 6 points · 8 days ago

    As a photographer I’m a bit torn on this one.

    I believe AI art should definitely be labeled to minimize people being misled about the source of the art. But at the same time, the OP of the Adobe forums post did say they used it like any other tool, for touching up and fixing inconsistencies.

    If I were, for example, to arrange a photoshoot with a model and they happened to have a zit on their forehead that day, of course I’m gonna edit that out. Or if an assistant got into the shot and I don’t want to crop in, tightening the background and feel of the photo, I would gladly remove them too. Sure, Adobe already has the patch, clone and even magic eraser tools (which also use AI and may or may not mark photos) for these fix-ups, but if I can use AI that I hope is trained on data they’re actually allowed to train on, I think I’d prefer it: if I’m going to spend 10 to 30 minutes fixing blemishes, zits and whatnot, I’d much rather use the AI tools to get the job done quicker.

    If the tools were, however, used to heavily change, modify and edit the scene and subject, then for sure, it might be best to add the label.

    Wouldn’t it be better not to discourage the use of editing tools when they’re used in a way that just makes one’s job quicker? If I were to use Lightroom’s quick subject selection, should the label be slapped on then? Or if I were to use an editing preset created with AI that automatically adjusts the basic settings of an image, and then continue my editing from there? Or if I have a flat white background with a tapestry pattern and, rather than spend hours getting the alignment of the pattern just right while fixing a minor aspect-ratio issue or getting a bit more breathing room around the subject, I use the AI tool mentioned in the OP?

    The things OP mentioned in their post, and the scenarios I described, are all things you can do without AI anyway; it just sometimes takes a lot longer. There’s no cheating in using the right tool for the job, IMO. I don’t think it’s far off from a clay sculptor using an ice cream scoop with ridges to create texture, or a Dremel to touch up and fix corners, or a painter using different brushes and scrapers to finish their painting.

    Perhaps a better idea, if we want to make the labels “fair”, would be a label saying the photo has been manipulated by a program in general, or a percentage indicator showing how much of it was edited specifically with AI. Slapping an “AI” label on someone because they got the same results with another tool while doing normal touch-ups could be damaging to one’s career and credibility when it doesn’t say how much was AI or to what extent; someone looking for their next wedding photographer might be put off by the bad rep around AI.

    • parody@lemmings.world (OP) · 1 point · 7 days ago

      trained on data they’re actually allowed to train on

      That’s the ticket. For touchups, certainly, that’s the key: did theft help, or not?

      • Zelaf@sopuli.xyz · 2 points · 7 days ago

        Indeed, if the AI was trained on theft, it’s neither right on their part nor ethical on mine.

        I did some searching but sadly don’t have time to look into it more; there were some concerning articles suggesting they have either used shady practices to get their training data, or that users have to manually check an opt-out box in the app settings.

        I can’t form an opinion on that before looking into it more, but I still believe my core argument about using AI in this manner somewhat holds, even if the AI were trained only on your own data or on resources it was allowed to use.

  • Lad@reddthat.com · 5 points · 8 days ago

    I disagree with their complaints. If AI was used in any way, it should be labelled as such, no matter how small the adjustments were.

      • deafboy@lemmy.world · 3 points · 8 days ago

        To appease the artists worried about “fake” art somehow replacing the “real” art, while big social media somehow profits. They just didn’t think leopards would eat THEIR faces…

        • Schadrach@lemmy.sdf.org · 2 points · 8 days ago

          You aren’t wrong. It’s entirely about status and needing to stigmatize, penalize and limit “fake” art because the artists in question are worried it will cut into the work available to them in the form of things like commissions.