• deathbird@mander.xyz · +48/−1 · 5 days ago

    I mean honestly this AI era is the time for these absurd anti-piracy penalties to be enforced. Meta downloads libgen? $250,000 per book plus jail time to the person who’s responsible.

    Oh but laws aren’t for the rich and powerful you see!

  • K3zi4@lemmy.world · +162 · 7 days ago

    In theory, could you then just register as an AI company and pirate anything?

    • pdxfed@lemmy.world · +73/−1 · 7 days ago

      Well no, just the largest ones, who can pay some fine or have nearly endless legal funds to discourage challenges to their practices; this in turn forms a kind of pretend business moat. The average company won’t be able to and will get shredded.

      • CosmoNova@lemmy.world · +30 · 7 days ago

        What fine? I thought this new law allows it. Or is it one of those instances where training your AI on copyrighted material and distributing it is fine, but actually sourcing it isn’t, so you can’t legally create a model, yet nobody can do anything if you already have one and use it? That sounds legally very messy.

        • AwesomeLowlander@sh.itjust.works · +12 · 6 days ago

          You’re assuming most of the commenters here are familiar with the legal technicalities instead of just spouting whatever uninformed opinion they have.

    • Dr. Moose@lemmy.world · +8 · 6 days ago

      You can already just pirate anything. In fact, downloading copyrighted content is not illegal in most countries; only distributing it is.

      • rivalary@lemmy.ca · +1 · 5 days ago

        That would be hilarious: someone makes a website showing how they’re using pirated Nintendo games (complete with screenshots of the games, etc.) to “train” their AI, just to watch Nintendo freak out.

      • darkdemize@sh.itjust.works · +52/−3 · 7 days ago

        If they are training the AI with copyrighted data that they aren’t paying for, then yes, they are doing the same thing as traditional media piracy. While I think piracy laws have been grossly blown out of proportion by entities such as the RIAA and MPAA, these AI companies shouldn’t get a pass for doing what Joe Schmoe would get fined thousands of dollars for on a smaller scale.

        • taladar@sh.itjust.works · +16 · 7 days ago

          In fact, when you think about the way organizations like the RIAA and MPAA like to calculate damages based on lost potential sales pulled out of thin air, training an AI that might make up entire songs competing with their existing catalog should be even worse. (Not that I want to encourage more of that kind of bullshit potential-sales argument.)

        • FaceDeer@fedia.io · +7/−38 · 7 days ago

          The act of copying the data without paying for it (assuming it’s something you need to pay for to get a copy of) is piracy, yes. But the training of an AI is not piracy because no copying takes place.

          A lot of people have a very vague, nebulous concept of what copyright is all about. It isn’t a generalized “you should be able to get money whenever anyone does anything with something you thought of” law. It’s all about making and distributing copies of the data.

          • ultranaut@lemmy.world · +27 · 7 days ago

            Where the training data comes from seems like the main issue, rather than the training itself. Copying has to take place somewhere for that data to exist. I’m no fan of the current IP regime, but it seems like an obvious problem if you get caught making money with terabytes of content you don’t have a license for.

            • ferrule@sh.itjust.works · +2/−4 · 6 days ago

              The slippery slope here is that you, as an artist, hear music on the radio, in movies and TV, in commercials. All that listening is training your brain. If an AI company just plugged in an FM radio and learned from that music, I’m sure a lawsuit would follow arguing that no one can listen to anyone’s music without being tainted.

              • ultranaut@lemmy.world · +3 · 6 days ago

                That feels categorically different unless AI has legal standing as a person. We’re talking about training LLMs, there’s not anything more than people using computers going on here.

                • ferrule@sh.itjust.works · +1/−3 · edited · 6 days ago

                  So then anyone who uses a computer to make music would be in violation?

                  Or is it some amount of computer-generated content? How many notes? If it’s not a sample of a song, how does one know which of those notes should be attributed to which artist being stolen from?

                  What if I have someone else listen to a song and they generate a few bars of a song for me? Is it different that a computer listened and then generated output?

                  To me it sounds like artists are open to some types of violations but not others. If an AI model listened to the radio, most of these issues would go away, unless we’re saying that humans who listen to music and write similar songs are OK, but people who write music using computers that calculate the statistically most common song are breaking the law.

            • FaceDeer@fedia.io · +2/−14 · 6 days ago

              A lot of the griping about AI training involves data that’s been freely published. Stable Diffusion, for example, trained on public images available on the internet for anyone to view, but led to all manner of ill-informed public outrage. LLMs train on public forums and news sites. But people have this notion that copyright gives them some kind of absolute control over the stuff they “own” and they suddenly see a way to demand a pound of flesh for what they previously posted in public. It’s just not so.

              I have the right to analyze what I see. I strongly oppose any move to restrict that right.

              • AwesomeLowlander@sh.itjust.works · +6 · 6 days ago

                It’s also pretty clear they used a lot of books and other material they didn’t pay for, and obtained via illegal downloads. That’s a practice I’m fine with; I just want it legalised for everyone.

                • ferrule@sh.itjust.works · +3 · 6 days ago

                  I’m wondering: when I go to the library and read a book, does this mean I can never become an author because I’m tainted? Or am I only tainted if I stole the book?

                  To me this is only a theft case.

          • Knock_Knock_Lemmy_In@lemmy.world · +5 · 6 days ago

            the training of an AI is not piracy because no copying takes place.

            One of the first steps of training is to copy the data into the training data set.
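
            To make that point concrete, here’s a minimal sketch (in Python, with purely hypothetical paths, and not any particular company’s pipeline) of what a typical ingestion step looks like: the source documents are copied, byte for byte, onto the training machine before any training happens.

            ```python
            # Hypothetical ingestion step: copy raw documents into a local
            # training corpus before tokenization/training. Paths are made up.
            import shutil
            from pathlib import Path

            def build_corpus(source_files: list[Path], corpus_dir: Path) -> None:
                """Copy each source document, byte for byte, into the corpus directory."""
                corpus_dir.mkdir(parents=True, exist_ok=True)
                for src in source_files:
                    shutil.copy2(src, corpus_dir / src.name)  # literal copy of the file's bytes

            # Example: gather locally downloaded books into a corpus folder.
            build_corpus(list(Path("downloads/books").glob("*.txt")), Path("corpus/books"))
            ```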

          • WalnutLum@lemmy.ml · +5 · 6 days ago

            This isn’t quite correct either.

            The reality is that there’s a bunch of court cases and laws still up in the air about what AI training counts as, and until those are resolved the most we can make is conjecture and vague moral posturing.

            The closest we have is likely the court decisions on music sampling, and so far those haven’t been consistent; they’ve mostly hinged on “intent” and “effect on original copy sales”. So based on that logic, whether or not AI training counts as copyright infringement is likely going to come down to whether or not shit like “ghibli filters” actually provably (at least as far as a judge is concerned) fucks with Ghibli’s sales.

            • Knock_Knock_Lemmy_In@lemmy.world · +3 · 6 days ago

              court decisions on music sampling and so far those haven’t been consistent,

              Grand Upright Music, Ltd. v. Warner Bros. Records Inc. (1991) - Rapper Biz Markie sampled Gilbert O’Sullivan’s “Alone Again (Naturally)” without permission

              Bridgeport Music, Inc. v. Dimension Films (2005) - any unauthorized sampling, no matter how minimal, is infringement.

              VMG Salsoul v. Ciccone (2016) - to determine whether use was de minimis it must be considered whether an average audience would recognize appropriation from the original work as present in the accused work.

              • WalnutLum@lemmy.ml · +3 · 6 days ago

                Campbell v. Acuff-Rose Music, Inc. (1994) - This case established that the fact that money is made by a work does not make it impossible for fair use to apply; it is merely one of the components of a fair use analysis

      • CommanderCloon@lemmy.ml · +1 · 5 days ago

        Well I agree in principle (I disagree that AI training is necessarily “stealing”), but downloading copyrighted material for which you do not own a license is textbook piracy, regardless of intent

  • deathbird@mander.xyz · +35 · 5 days ago

    Normal people pirate: one hundred bazillion dollars fine for download The Hangover.

    One hundred bazillion dollars company pirate: special law to say it okay because poor company no can exist without pirate 😞

  • HighFructoseLowStand@lemm.ee · +25/−3 · 5 days ago

    What is the actual justification for this? Everyone has to pay for this except for AI companies, so AI can continue to develop into a universally regarded negative?

    • ryathal@sh.itjust.works · +11/−50 · 5 days ago

      AI doesn’t copy things any more than a person copies them by attending a concert or museum.

          • gradual@lemmings.world · +1 · 5 days ago

            Sigh, more censorship.

            We need better communities that let people decide for themselves what they get to see.

            • mechoman444@lemmy.world · +3 · 5 days ago

              Totally agree. This kind of crap started happening after the great reddit exodus of 23. Shitty reddit mods made their way to lemmy and this is what we get.

              If you wanna see something cool just type the word “trans” into your comment and watch the downvotes come in!

              Keep an eyeball on this comment! You’ll see!

      • mechoman444@lemmy.world · +4/−2 · 5 days ago

        This is 100% correct. You can downvote this person all you want, but they’re not wrong!

        A painter doesn’t owe anything to the estate of Rembrandt because they took inspiration from his paintings.

            • SoftestSapphic@lemmy.world · +2 · 4 days ago

              So if it can’t function properly without other people’s work deciding what the art will look like, that’s called copying.

              If human beings get shit for copying famous art or tracing we need to hold AI to the same standard.

              • mechoman444@lemmy.world · +1/−2 · 4 days ago

                Copy

                1. noun: a. an imitation, transcript, or reproduction of an original work (such as a letter, a painting, a table, or a dress); b. one of a series of especially mechanical reproductions of an original impression; c. matter to be set, especially for printing; also: something considered printable (such as an advertisement or news story)
                2. verb: a. to make a copy or copies of; b. to model oneself on; c. to transfer (data, text, etc.) from one location to another, especially in computing

                I can’t believe I just had to provide you with a definition of the word copy.

                Are you freaking serious!!!

                Being inspired by an original work and creating your own production is not the same as copying that work!!!

                By your definition of copying, because Elvis Presley was inspired by Muddy Waters, they made the exact same music!

                LLMs don’t reproduce copyrighted material; they take inspiration from the training data, so to speak. They create original productions.

                In the same way that you can envision the Mona Lisa in your head but you couldn’t paint it by hand.

                • SoftestSapphic@lemmy.world · +2 · edited · 4 days ago

                  You know copying literal brushstrokes and traces identifiable from real artists is different than being inspired. It’s amazing the level of denial you cultists will self-induce to keep it making sense.

                  Your god is not valuable enough to be given more rights than human beings. Sorry.

                  I don’t care what techbro conmen told you.

                  AI will never be a replacement for actual creativity, and is already being legislated against properly in civilized countries.

      • SoftestSapphic@lemmy.world · +1 · 5 days ago

        You need to learn how your god functions.

        If it needs training data then it is effectively copying the training data.

    • jsomae@lemmy.ml · +9/−20 · edited · 5 days ago

      why do you say AI is a universally regarded negative?

      Edit: if you’re going to downvote me, can you explain why? I am not saying AI is a good thing here. I’m just asking for evidence that it’s universally disliked, i.e. there aren’t a lot of fans. It seems there are lots of people coming to the defense of AI in this thread, so it clearly isn’t universally disliked.

        • jsomae@lemmy.ml · +5/−4 · 5 days ago

          I am aware of a lot of people who are very gung-ho about AI. I don’t know if anybody has actually tried to make a comprehensive survey about people’s disposition toward AI. I wouldn’t expect Lemmy to be representative.

      • bufalo1973@lemm.ee · +1 · 5 days ago

        I don’t know about the rest, but I hate the spending of resources to feed the AI datacenters. It’s not normal to build a nuclear power plant to feed ONE data center.

        • jsomae@lemmy.ml · +1 · 5 days ago

          You’ve explained your personal opinion, and while I think it’s a sensible opinion, I was asking about the universal opinion on AI. And I don’t think there is a consensus that it’s bad. Like I don’t even understand how that’s controversial – everywhere you look, people are talking about AI in broadly mixed terms.

        • loutr@sh.itjust.works · +7/−1 · 5 days ago

          That’s just not true; ChatGPT & co are hugely popular, which is a big part of the issue.

            • jsomae@lemmy.ml · +1 · edited · 5 days ago

              You do realize the root of this thread was this question, right?

              why do you say AI is a universally regarded negative?

              In the early 20th century, Nazism was not a universally regarded negative.

            • gradual@lemmings.world · +1/−2 · edited · 5 days ago

              Analogies are fallacies. All they do is reveal that you can’t argue the merits of the topic at hand, so you need to derail and distract by pivoting to something else.

              Now we need to debate the accuracy of your analogy, which is never 1:1, instead of talking about what we were talking about previously.

              You’re also arguing with the wrong person. You should be talking to the person who argued “AI is a negative because pretty much nobody likes it” instead of the person who says it’s not true that “nobody likes it.”

              You’re literally only looking for an angle to shit on AI so you can fit in with the average idiots.

              AI discussions at this point are litmus tests for who is average, the kind of person who lets other average people do their thinking for them. It really puts into perspective how much popular opinion should be scrutinized.

          • Ilovethebomb@lemm.ee · +1/−2 · 5 days ago

            Hugely popular, mostly with a bunch of dorks nobody likes that much.

            People are getting the message now, but when it first came out, there were so many posts about what ChatGPT had to say about the topic, and the posters never seemed to understand why nobody cared.

        • mechoman444@lemmy.world · +3 · 5 days ago

          I want it and I like it. I’ve been using LLMs for years now with great benefit to myself.

          Like any tool, one just needs to know how to use them. Apparently you don’t.

        • jsomae@lemmy.ml · +1 · edited · 5 days ago

          I think you’re mistaken: there are a large number of people who vehemently dislike it, which is probably why you think that.

  • zephorah@lemm.ee · +63/−2 · 7 days ago

    It’s like the goal is to bleed culture from humanity. Corporate is so keen on the $$$ they’re willing to sacrifice culture to it.

    I’ll bet corporate gets to keep their copyrights.

  • jsomae@lemmy.ml · +39 · 6 days ago

    Can the rest of us please use copyrighted material without permission?

  • the_q@lemm.ee · +24/−1 · 6 days ago

    I mean they were trained on copyrighted material and nothing has been done about that so…

  • SocialMediaRefugee@lemmy.world · +23 · edited · 6 days ago

    On the other hand, copyright terms have been extended to insane lengths. Sorry, but your grandkids shouldn’t profit off of you.

      • reksas@sopuli.xyz · +5 · 6 days ago

        No no, I mean people should actually start utilizing this bullshit. Anyone can start a company, and with some technical know-how you can add some kind of AI crap to it. Companies don’t have to make a profit or anything useful, so there’s no pressure to do anything with it.

        But if it comes to copyright law not applying to AI companies, why should some rich assholes be the only ones exploiting that? It might lead to some additional legal bullshit that excludes this hypothetical kind of AI company, but that would also better highlight that the law benefits only the rich.

  • StonerCowboy@lemm.ee · +19 · 6 days ago

    How funny this is gonna get when AI copyrights Nintendo stuff. Ah man I got my popcorn ready.

    • CriticalMiss@lemmy.world · +2 · 6 days ago

      They’re not gonna do anything about it, for the same reason any other litigious company hasn’t done anything thus far: they’re looking to benefit from AI by cutting costs. If the tech weren’t beneficial to these big tech conglomerates, they would’ve already sued their asses into oblivion, but since it is, they’ll let AI train on their copyrighted material.

  • rottingleaf@lemmy.world · +25/−4 · 6 days ago

    But you, the casual BitTorrent or eDonkey user (I like good old things), can’t.

    It’s literally a law allowing people doing some business to violate a right of others, or, looked at from the other side, making only people who don’t work for certain companies subject to a law…

    What I mean is: at some point in my stupid life I decided only individuals should ever be subjects of law. Where now the sides are the government and some individual, a representative of the government (or a chain of people making decisions) should be the other side, not the government in its entirety.

    For everything that happens, a specific, easily determined person should be legally responsible. Or a group of people (say, a chain from the top of a hierarchy down to that specific person).

    Because otherwise this happens: the differentiation between a person and a business and so on allows other kinds of differentiation, including a person having fewer rights than a business or some other organization. And it will always drift in that direction, because a group is stronger than an individual.

    And in this specific case somebody would be able to sue the prime minister.

    OK, it’s a utopia, similar to anarcho-capitalism, just in a different dimension: that of responsibility.