• Mohamed@lemmy.ca · ↑50 ↓3 · 3 days ago

    AI-generated CP should be outlawed even if its creation did not technically harm anyone. The reason is that it presumably looks so close to real CP that it: 1) normalizes the consumption of CP, 2) grows a market for CP, and 3) lets producers of real CP get off the hook by claiming it is AI-generated.

    While there are similar reasons to be against clearly unreal CP (e.g. hentai), that type at least does not have problem #3: there doesn’t need to be an investigation into whether a picture is real or not.

    • Nyoka@lemm.ee · ↑6 · 2 days ago

      Fun fact: it’s already illegal. If it’s indistinguishable from the real thing, it’s a crime.

      • raptir@lemmy.zip · ↑1 · 2 days ago

        I was under the impression that even clearly drawn material is already illegal, though it’s a grey area since they can say “lol, it’s a 1000-year-old demon that just looks like a child.” Is that not the case?

        • Nyoka@lemm.ee · ↑1 · 2 days ago

          Clearly drawn is hard to prosecute (and one might argue shouldn’t be prosecuted, since obscenity laws are just… weird). However, the stuff that is photorealistic can be treated, legally, like the real thing.

          • raptir@lemmy.zip · ↑1 · 2 days ago

            That’s interesting and led me down a Wikipedia rabbit hole. So the law in the US says that fictional child pornography (i.e., where it is drawn and thus not “indistinguishable” from a minor) is illegal if it is “obscene.” And the definition of “obscene” essentially comes down to “would the average member of the community find it offensive.”

            That takes “grey area” to a whole new level.

    • jacksilver@lemmy.world · ↑22 ↓1 · edited 2 days ago

      The biggest issue with this line of thinking is: how do you prove it’s CP without a victim? I suppose at a certain threshold it becomes obvious, but that can be a very blurry line (there was a famous case where a porn star had to be flown to a court case to prove the video wasn’t CP, but I can’t find the link right now).

      So you’re left with a crime that was committed with no victim and no proof, which can be really easy to abuse.

      Edit: This is the case I was thinking of - https://nypost.com/2010/04/24/a-trial-star-is-porn/

      • cmhe@lemmy.world · ↑3 · 2 days ago

        This sort of reminds me of the discussion on “what is a woman.” Is Siri a woman? Many might say so, but at the same time Siri is not even human.

        The question of how old the person in a specific generated image might be, and whether it even depicts a person at all, can only be answered by society. There is no scientific or logical answer to it.

        So this will always have grey areas and differing opinions, and can produce different rulings in different cultures.

        In the end, this is a discussion about ethics, not logic.

          • cmhe@lemmy.world · ↑1 · 2 days ago

            Well, my point is that pretty much all of our laws are built around ethical values, which are developed within a society. There is no logical or scientific reason that makes killing other people bad, but we should still have strict rules about it.

            Laws are always built around soft things like “what is obscene?”, “at what point is someone naked in public?”, “how much alcohol can a drink have before it is an alcoholic beverage?”, “did the person die of natural causes, or were they killed by some event years ago that wasn’t properly treated?”

            Society decides what is acceptable and what isn’t and that changes through time and culture.

            Your argument is therefore not a good one; you have to make a case based on ethics.

      • JohnEdwa@sopuli.xyz · ↑24 · 3 days ago

        Pictures of clothed children and naked adults.

        Nobody trained them on what things made out of spaghetti look like, but they can generate them because smushing multiple things together is precisely what they do.

        • Blackmist@feddit.uk · ↑7 · 2 days ago

          Given the “we spared no expense” attitude to the rest of the data these things are trained on, I fear that may be wishful thinking…

        • Ilovethebomb@lemm.ee · ↑7 ↓1 · 3 days ago

          Well, that’s somewhat reassuring.

          Still reprehensible that it’s being used that way, of course.

      • funkless_eck@sh.itjust.works · ↑2 ↓1 · 2 days ago

        if you have a soup of all liquids and a sieve that only lets coffee and ice cream through, it produces coffee ice cream (metaphor, don’t think too hard about it)

        that’s how gen ai works. each step sieves out raw data to get closer to the prompt.

    • swelter_spark@reddthat.com · ↑12 ↓2 · 3 days ago

      AI CP seems like a promising way to destroy demand for the real thing. How many people would risk a prison sentence making or viewing the real thing when they could push a button and have a convincing likeness for free with no children harmed? Flood the market with cheap fakes and makers of the real thing may not find it profitable enough to take the risk.

      • AnonomousWolf@lemm.ee · ↑3 · 2 days ago

        I think it would boost the market for the real thing more.

        It’s possible that some people would get into AI-generated CP if it were allowed to be advertised on NSFW websites.

        And that would lead some to seek out the real thing. I think it’s best to condemn it entirely.

    • Evotech@lemmy.world · ↑1 · 3 days ago

      It is here at least.

      If it weren’t, you could just flood everything with it, and it would be impossible to go after actual CP.

  • jaschen@lemm.ee · ↑14 ↓42 · 3 days ago

    Who actually gets hurt by AI-generated CP? The servers?

      • jaschen@lemm.ee · ↑14 ↓18 · 3 days ago

        I’m no pedo, but what you do in your own home that hurts nobody is your own business.

        • L3ft_F13ld!@lemmy.dbzer0.com · ↑18 ↓12 · 3 days ago

          Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.

          So, regardless of whether the harm is direct, harm is done at some point in the process, and it needs to be stopped before it slips and gets worse because people “get used to” it.

          • Mnemnosyne@sh.itjust.works · ↑39 ↓2 · edited 3 days ago

            AI can combine two things. It can train on completely normal pictures of children, and it can train on completely normal adult porn, and then it can put those together.

            This is the same reason it can do something like Godzilla with Sailor Moon’s hair, not because it trained on images of Godzilla with Sailor Moon’s hair, but because it can combine those two separate things.

            • RightEdofer@lemmy.ca · ↑1 ↓8 · 3 days ago

              Only, the real things are actual humans who have likely never consented to being in this database at all, let alone to having parts of their likeness used for this horrific shit. There is no moral argument for this garbage.

              • jaschen@lemm.ee · ↑12 ↓3 · 3 days ago

                Technically speaking, if you post images of your child on social media, you have consented. If you never uploaded an image of your child online, you never need to worry.

                • RightEdofer@lemmy.ca · ↑5 ↓1 · 3 days ago

                  Social media has been around a long time. It is not reasonable to expect people to think of technology they can’t imagine even existing ten years in the future when “consenting” to use a platform. Legally you are correct. Morally this is obviously terrible. Everything about how terms and conditions are communicated is designed to take advantage of people who won’t or are unable to parse its meaning. Consent needs to be informed.

          • Kusimulkku@lemm.ee · ↑9 ↓2 · 3 days ago

            I wouldn’t think it needs to have child porn in the training data to be able to generate it. It has porn in the data and it knows what kids look like; it merges the two. I think that works for anything AI knows about: make this resemble that.

              • Kusimulkku@lemm.ee · ↑4 · 3 days ago

                It seems pretty understandable that companies wouldn’t allow it, it’s more that if it is illegal (like in some places) then that gets into really sketchy territory imo.

              • jaschen@lemm.ee · ↑2 ↓1 · 3 days ago

                I agree it shouldn’t be accepted, but I disagree on being allowed. I think it should be allowed because it doesn’t hurt anyone.

          • MonkderVierte@lemmy.ml · ↑7 ↓1 · edited 3 days ago

            needs to be stopped before it slips and gets worse because people “get used to” it.

            Ah, right, I’d almost forgotten the “killer games” rhetoric.

            • L3ft_F13ld!@lemmy.dbzer0.com · ↑2 ↓2 · 3 days ago

              I also don’t agree with the killer games thing, but humans are very adaptable as a species.

              Normally that’s a good thing, but in a case like this, exposure to something shocking or upsetting can make it less shocking or upsetting over time (obviously not in every case). So, if AI is being used for something like this and being reported on, isn’t it possible that people might slowly get desensitized to it over time?

              • MonkderVierte@lemmy.ml · ↑5 · edited 3 days ago

                But what if pedophiles in therapy are less likely to commit a crime if they have access to respective porn? Even better then, if it can be AI generated, no?

                • jaschen@lemm.ee · ↑1 · 2 days ago

                  Japan is a country where drawn CP is legal. It’s available in physical stores and online. Yet Japan ranks much lower than most developed countries in terms of actual child sexual abuse.

    • Goretantath@lemm.ee · ↑6 ↓1 · 3 days ago

      Making a sexual photo of a child based off real photos is essentially using the child from the training data as the one in the act…

      • jaschen@lemm.ee · ↑5 ↓8 · 3 days ago

        But who is actually getting hurt? No kid has been hurt by gen AI.

        • OutDoeHoe@lemmy.world · ↑2 · 2 days ago

          A child whose abuse images are used to generate AI CP can be re-victimized by it, without even getting at the issues with normalizing it.

    • blind3rdeye@lemm.ee · ↑6 · 3 days ago

      Are you suggesting that this particular type of CP should be acceptable? (And suddenly “but I used AI” becomes a popular defence.)

      • jaschen@lemm.ee · ↑6 ↓4 · 3 days ago

        No CP should be acceptable. But I argue that AI-generated imagery isn’t CP.

        This is no different from someone cutting out a child’s head from a Target catalog, sticking it on a body in a Playboy magazine, and masturbating to it.

        Or someone Photoshopping a kid’s head onto a pornographic photo.

        It’s just a more accessible version of those examples.

        At the end of the day, what you do in your own home is your thing. It’s not my business what you do. As long as it doesn’t hurt/affect anyone, go ahead.

        • Ilovethebomb@lemm.ee · ↑1 ↓2 · 3 days ago

          I almost respect you for taking a stance so blatantly against what most people believe.

          Almost.

    • mbirth@lemmy.ml · ↑6 ↓1 · 3 days ago

      I don’t remember whether it was some news article or a discussion thread, but other people also suggested this might help during therapy and/or rehab. And they made the same argument: nobody gets harmed in creating these.

      As for uses outside of controlled therapy, I’d be afraid it might make people want the “real thing” at some point. And, as others already pointed out: Good luck proving to your local police that those photos on your laptop are all “fake”.

      • barnaclebutt@lemmy.world · ↑7 ↓2 · edited 3 days ago

        It fetishizes the subject’s images, and nobody knows whether it would lead to recidivism in child predators. It is generally accepted that producing drawings of CP alone is bad, let alone doing it with AI. I remember some dude getting arrested at the Canadian border for sexual drawings of Bart and Lisa. Regardless, I would say that it is quite controversial and probably not what you’d want your company to be known for…

      • jaschen@lemm.ee · ↑9 ↓1 · 3 days ago

        Gen AI doesn’t take CP content and recreate it; there would be no point to gen AI if that were the case. It knows what regular porn looks like and what a child looks like, and it generates an image. With those inputs it can create something new while hurting nobody.

        • surewhynotlem@lemmy.world · ↑7 ↓2 · 3 days ago

          Prove it. Please, show me the full training data to guarantee you’re right.

          But also, all the kids used for “kids face data” didn’t sign up to be in porn.

          • jaschen@lemm.ee · ↑6 ↓2 · 3 days ago

            I don’t need to. It’s just the way gen AI works: it takes images of things it knows and then generates NEW content based on what it thinks you want from your prompts.

            If I’m looking for an infant flying an airplane, gen AI knows what a pilot looks like and what a child looks like, and it creates something new.

            Also, “kids face data” doesn’t mean it takes the actual face of an actual child and pastes it on a body. It might take an eyebrow and a freckle from one kid, a hairstyle from another, and eyes from someone else.

            Lastly, the kids’ parents consented when they uploaded images of their kids to social media.