  • QuantumTickle@lemmy.zip · 24 days ago

    If “everyone will be using AI” and it’s not a bad thing, then these big companies should wear it as a badge of honor. The rest of us will buy accordingly.

    • Devial@discuss.online · 24 days ago

      If “everyone will be using AI”, AI will turn to shit.

      AI can’t create originality; it can only recycle and recontextualise existing information. And if you recycle and recontextualise the same information over and over again, it degrades a little more each time.

      It’s ironic that the very people who advocate for AI everywhere fail to realise just how dependent the quality of AI output is on having real, human-generated content to train the models on.

      • 4am@lemmy.zip · 24 days ago

        “The people who advocate for AI” are literally running around claiming that AI is Jesus and it is sacrilege to stand against it.

        And by literally, I mean Peter Thiel is giving talks actually claiming this. This is not an exaggeration, this is not hyperbole.

        They are trying to recruit techno-cultists.

      • Sl00k@programming.dev · 24 days ago

        I think the grey area is: what if you’re an indie dev who did the entire storyline and artwork yourself, but had the AI handle the more complex coding?

        To our eyes it’s entirely original, but it used AI. Where do you draw the line?

        • Devial@discuss.online · edited · 21 days ago

          The line, imo, is: are you creating it yourself and just using AI to help you make it faster/more convenient, or is AI the primary thing creating your content in the first place?

          Using AI for convenience is absolutely valid imo. I routinely use ChatGPT to do things like debugging code I wrote, rewriting data sets in different formats instead of doing it by hand, or handling more complex search-and-replace jobs when I can’t be fucked to figure out a regex to cover them.

          For these kinds of jobs, I think AI is a great tool.

          More simply said, I personally generally use AI for small subtasks that I am entirely capable of doing myself, but are annoying/boring/repetitive/time consuming to do by hand.
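
          A hypothetical illustration of the kind of regex search-and-replace job described above (the pattern and sample data are made up, not from anyone's actual workflow): converting US-style dates in a line of text to ISO format with capture groups.

```python
import re

# Made-up sample line; the point is the mechanical rewrite, not the data.
line = "start=03/07/2024, end=12/31/2024"

# Swap MM/DD/YYYY into YYYY-MM-DD by reordering the captured groups.
iso = re.sub(r"(\d{2})/(\d{2})/(\d{4})", r"\3-\1-\2", line)
print(iso)  # start=2024-03-07, end=2024-12-31
```

          Exactly the sort of small, self-contained subtask where either a quick regex or an AI assistant gets you the same result.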

          • Sl00k@programming.dev · 24 days ago

            I definitely agree, but I think that case would still get caught by the Steam AI-usage badge?

        • irmoz@reddthat.com · 23 days ago

          That’s somewhat acceptable. The ideal use of AI is as a crutch - and I mean that literally. A tool that multiplies and supports your effort, but does not replace your effort or remove the need for it.

      • CatsPajamas@lemmy.dbzer0.com · 24 days ago

        How does this model collapse thing still get spread around? It’s not true. Synthetic data has actually helped bots get smarter, not dumber. And if you think all Gemini 3 does is recycle, idk what to tell you.

        • Devial@discuss.online · edited · 22 days ago

          If the model collapse theory weren’t true, then why do LLMs need to scrape so much data from the internet for training?

          According to you, they should be able to just generate synthetic training data purely with the previous model, and then use that to train the next generation.

          So why is there even a need for human input at all, then? Why are all LLM companies fighting tooth and nail against their data scraping being restricted, if real human data is in fact so unnecessary for model training and they could just generate their own synthetic training data instead?

          You can stop models from deteriorating without new data, and you can even train them on synthetic data, but that still requires the synthetic data to be either modelled or filtered by humans to ensure its quality. If you just take a million random ChatGPT outputs with no human filtering whatsoever, use them to retrain the ChatGPT model, and then repeat that over and over again, eventually the model will turn to shit.

          Each iteration, some of the random tweaks ChatGPT makes to its output will produce low-quality results, which are then presented to the new model as a target to achieve. The new model learns that the quality of this type of bad output is actually higher, which makes it more likely to reappear in the next set of synthetic data.

          And if you turn off the random tweaks, the model may not deteriorate, but it won’t improve either, because effectively no new data is being generated.
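
          The feedback loop described above can be sketched as a toy simulation (the token names and numbers are purely illustrative, not from any real training setup): each “generation” is re-trained only on samples drawn from the previous one, and any rare output that fails to be sampled once is gone from every later generation.

```python
import random
from collections import Counter

def next_generation(dist, n_samples, rng):
    """'Re-train' by sampling from the previous model's output distribution
    and re-estimating token frequencies from those samples alone."""
    tokens, weights = zip(*dist.items())
    samples = rng.choices(tokens, weights=weights, k=n_samples)
    counts = Counter(samples)
    return {tok: c / n_samples for tok, c in counts.items()}

rng = random.Random(42)

# Generation 0: a "human" corpus of 100 equally likely tokens.
dist = {f"tok{i}": 0.01 for i in range(100)}

diversity = [len(dist)]
for _ in range(30):
    dist = next_generation(dist, n_samples=100, rng=rng)
    diversity.append(len(dist))

# The support can only shrink: a token missed in one generation can never
# reappear later, so output diversity falls generation after generation.
print(diversity[0], "->", diversity[-1])
```

          Mixing human-curated data back in each round would replenish the lost diversity; that filtering step is exactly what the unfiltered loop above lacks.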

          • CatsPajamas@lemmy.dbzer0.com · 22 days ago

            I stopped reading when you said according to me and then produced a wall of text of shit I never said.

            Synthetic data is massively helpful. You can look it up. This is a myth.

            • Devial@discuss.online · edited · 22 days ago

              That is enormously ironic, since I literally never claimed you said anything except for what you did say: namely, that synthetic data is enough to train models.

              According to you, they should be able to just generate synthetic training data purely with the previous model, and then use that to train the next generation.

              Literally, the very next sentence starts with the words “Then why”, which clearly and explicitly means I’m no longer indirectly quoting you. Everything else in my comment is quite explicitly my own thoughts on the matter, and why I disagree with that statement. So in actual fact, you’re the one making up shit I never said.

  • twinnie@feddit.uk · 24 days ago

    They don’t need to court developers, they need to court consumers. The games will be sold wherever people are buying.

    • CosmoNova@lemmy.world · 24 days ago

      Consumers have already decided mobile gambling slop is the most successful investment in the gaming industry. I don’t trust consumers to know what’s best for them.

      • Katana314@lemmy.world · edited · 24 days ago

        I think the studies showing how certain minds can be targeted and manipulated by dark gambling patterns made me think differently about gambling. I’m less likely to blame the victims now - in many ways it can be difficult or near-impossible for them to control those impulses. I’d at least like lootbox gambling slop to be regulated the same as casinos.

        Look how popular fantasy sports is now. It’s basically just the casino industry seeking out new avenues to cheat the definition of “Playing odds to win cash”.

    • rtxn@lemmy.world · 24 days ago

      consumers

      This is very much a pet peeve, but be careful about how you use “consumer” versus “customer”. They each imply completely different power dynamics.

      • warm@kbin.earth · 24 days ago

        It’s very much consumer these days, people buy literally anything marketed to them.

          • warm@kbin.earth · 24 days ago

            I like to think I hold myself to a higher standard, or at least just a standard. For general consumption, I’m not sure, but for video games, people’s standards have dropped significantly; the masses accept a lot of bullshit and even defend it.

  • Aurenkin@sh.itjust.works · edited · 24 days ago

    The ethics and utility (or lack thereof) of AI is an important discussion in its own right. In terms of Steam, though, I really don’t think it’s relevant. Players want the disclosures; that’s it, and that’s all that should really matter. Am I missing some nuance here?

    • borth@sh.itjust.works · 24 days ago

      The nuance is that Tim doesn’t give a shit what players want. He and his cronies don’t want it because it’s harder to convince someone to play AI slop when they know it’s AI slop before they even try it 😂

    • Darkcoffee@sh.itjust.works · 24 days ago

      They want it? I don’t know, the review score of Black Ops 7 begs to differ.

      Personally, I’ll give money to a hard-working indie dev who may sporadically use AI to help in their work over a big company shoving AI into everything to replace workers.

    • Sl00k@programming.dev · 24 days ago

      I posted this in another comment, but I think the nuance is really in what they used the AI for. Are they using Claude Code for the programming but did the entire artwork by hand? How many people really care about that?

      Compared to someone who tried to one shot a slop game with full AI assets and is just trying to make a quick buck.

  • minorkeys@lemmy.world · 24 days ago

    Consumers have a right to be informed of information relevant to them making purchasing decisions. AI is obviously relevant to the consumer and should be disclosed.

  • who@feddit.org · edited · 24 days ago

    “Calls to scrap” the disclosures makes it sound like a societal movement, when in fact it’s just two people with obvious bias: Tim Sweeney and some guy who promotes Tim Sweeney’s products on YouTube.

    I don’t give a flying frog what they think. When I allow someone to sell me something, I like to know what’s in it.

    • ameancow@lemmy.world · edited · 24 days ago

      Yah the more I use AI the more I can detect the absolute bullshit people on both sides spew.

      It’s the most amazingly complicated averaging machine we’ve ever invented. It will take the most interesting source materials, the most unique ideas of other people, the most creative materials, and it will find the safest, most average common qualities between them. This isn’t a model problem or input problem; it’s fundamental to how generative AI works.

      It helps with searching for things online, and it helps create guide plans for taking on new tasks like learning a new skill. It’s far better at teaching you how to do something like coding than it is when left to just code on its own while you copy and paste. It can certainly do that, but you spend so much time correcting and fixing it that you do far better learning the code yourself and how it works.

      Same with art, the people who are using it to best effect are themselves already artists and they use AI to thumbnail compositions or rough layouts, color tests and such, and then just do the work themselves but faster because they already know roughly what direction they’re going.

      But using it to write your scripts, to copy/paste code, to generate works of art… it’s literally just giving you other people’s ideas mashed together and unseasoned.

  • kazerniel@lemmy.world · 24 days ago

    I’m glad for those disclosures (because I’m not touching AI games), but tons of devs don’t disclose their AI usage, even in obvious cases, leaving us guessing :/

    • Bassman1805@lemmy.world · 24 days ago

      There’s also the massive gray area of “what do YOU define AI to mean?”

      There are legitimate use cases for machine learning and neural networks besides LLMs and “art” vomit. Like, what AI used to mean to gamers: how the computer plays the game against you. That probably isn’t going to upset many people.

    (IIRC, Steam’s AI disclosure is specifically about AI-generated graphics and music, so that ambiguity might be settled here)

        • AgentRocket@feddit.org · 23 days ago

          I’d say it depends on whether or not the voice actor whose voice the AI is imitating has agreed and is fairly compensated.

          I’m imagining a game, where instead of predefined dialog choices, you talk into your microphone and the game’s AI generates the NPCs answer.

  • Wilco@lemmy.zip · 23 days ago

    We need laws requiring AI to be clearly labeled, with severe fines for those who don’t comply. Robocalls and AI IVR phone systems should clearly tell you “this is AI”.

  • krakenx@lemmy.world · 23 days ago

    Use of AI should be disclosed the same way 3rd party DRM and EULA agreements are. And similarly it should mention some details. People are free to boycott Denuvo if they want, but people are also free to buy it anyways if they want. Disclosure is never a bad thing.

  • RampantParanoia2365@lemmy.world · 24 days ago

    …what calls? No one is calling for this. One dude said it was unnecessary. That’s not a call, it’s an opinion. He’s not out picketing for the end of fucking AI labels.

  • daniskarma@lemmy.dbzer0.com · 23 days ago

    The thing is that it’s kind of voluntary. Game developers could have used AI to develop the game, and if they didn’t want to disclose it, no one would know.

    Unless the use of AI is the very crappy “AI art” that’s easy to notice, the rest of the uses would be very hard or outright impossible to detect, so there’s no way to audit the legitimacy of the tag.

    And this will end up like r/art, where the mods deleted a post after accusing the artist of using AI when it was not AI, and the final mod answer was “change your art style so it doesn’t look like AI”. A brutal witch-hunt in the end.

  • FlashMobOfOne@lemmy.world · 24 days ago

    I heard the new Game of Thrones game is using LLMs to generate some of its content. Pisses me off.

    • nutsack@lemmy.dbzer0.com · edited · 24 days ago

      Lots of big companies are using them to generate code. I agree with what I think is your point of view, but where do you draw the line?

      • FlashMobOfOne@lemmy.world · edited · 24 days ago

        I don’t buy a lot of the big company games anyway, but if this becomes commonplace, what’ll happen is I’ll buy my big-company games second-hand so the benefit to the perpetrators is lessened.