• ditty@lemm.ee · ↑92 · 12 days ago

    Did I ask for this feature? No. But I do think it’s neat!

  • MeatsOfRage@lemmy.world · ↑56 ↓12 · edited · 12 days ago

    What’s the value here? This is based on the developer saying so, and there’s no obligation to do so. Black Ops 6 is loaded with generative AI: the loading screens are obviously Midjourney-like, and, as was in the news, some of the actors have been replaced by digital performances. It won’t get tagged here for AI because it’s not in the description.

    So basically this is just going to have people filtering out the devs who are honest, and realistically that’ll just be a few indie devs who had to use these tools because they’re a one-man team that can’t afford artists.

    I think we have to face facts: every game is going to be using these tools going forward. If you run a large studio and tell everyone not to use AI, I bet your artists are still speeding up making base textures, your music guy is generating some starter melodies, and your writers are drafting filler to pad out the supplementary text.

    These tools are as ubiquitous as Photoshop (which has had Content-Aware Fill all the way back to CS-fucking-5) and Unreal Engine now (which has added its own AI features). The idea that there’s only a handful of shady individuals and mega-corps using these tools is naive.

    • ZeffSyde@lemmy.world · ↑9 · 12 days ago

      Can a game be flagged as ‘contains AI-generated elements’ by the community?

      This could be useful, but could also be abused by chuds that want to brigade a game they don’t like.

      • MeatsOfRage@lemmy.world · ↑4 ↓10 · edited · 12 days ago

        Once again, what’s the value here? We only spot AI when it’s someone who’s not very good with Midjourney prompts. We’re getting to the point where people are using these tools in ways where no one will know the difference.

        Content-Aware Fill in Photoshop has been around forever. AI.

        If I ask ChatGPT what an Unreal Engine error message means: AI.

        If I get a quick LLM-made script to tune up some physics: AI.

        If the guy making the music generates some starter melodies: AI.

        If I generate a rock texture and clean it up myself to the point where no one knows: AI.

        All of this is AI, and all of it will go unseen by the end user, so once again we’ll be expecting developers to self-report, and only the honest ones will.

        Here’s a test: give yourself 1 or 2 seconds to make up your mind. https://www.sporcle.com/games/Raydon/image-real-or-ai-generated

        It’s tough, isn’t it? And that’s with you actively analyzing the pixels, something we don’t do passively.

  • BroBot9000@lemmy.world · ↑43 ↓10 · 12 days ago

    Good! Fuck the corporate slop. Justifying the use of AI only in the name of “efficiency” is pathetic and capitalist. Pay artists a proper wage and give them the time needed to apply their craft.

    No artist needs generative “AI” to create. Only capitalists need it to produce more slop.

    • MyNameIsIgglePiggle@sh.itjust.works · ↑10 ↓15 · 12 days ago

      I get that everyone seems to be sticking AI in everything, but it’s just another tool and it’s here to stay. People thought the digital calculator was going to make everyone an idiot… and it probably did. That’s why the world is like it is.

    • NoSpotOfGround@lemmy.world · ↑8 ↓18 · 12 days ago

      This comment is going to age very poorly. It sounds just like every other “progress? not on my watch!” comment people have made throughout history… Like it or not, AI generation is here and it’s not going away, for good or bad.

      • FauxLiving@lemmy.world · ↑5 ↓12 · 12 days ago

        This is definitely a topic where a vast majority of people have been “informed” of their opinions by social media memes instead of through a reasoned examination of the situation.

        Many of them are probably too young to have lived through a major technology breakthrough.

        This same “debate” always happens. When digital cameras were being developed, their users were seen as posers encroaching on the turf of “Real Photographers”.

        You’d hear “Now just anybody can take pictures and call themselves a photographer?”

        Or “It takes no skill to take a digital photograph, you can just manipulate the image in Photoshop to create a fake image that Real Photographers have to work years developing the skills to capture”

        Computers were things that some people, reluctantly, had to use for business but could never be useful to the average person. Smartphones were ridiculous toys for out of touch tech nerds. Social Media was an oxymoron because social people don’t use the Internet. GPS is just a toy for hikers and people that are too dumb to own paper maps. Etc, etc, etc

        It’s the same neo-luddite gatekeeping that’s happening towards AI. Any technology that puts capabilities in the hands of regular people is viewed by some people as fundamentally stealing from professionals.

        And, since the predictable response is to make some arcane copyright claim and declare training “stealing”: Not all AI is trained on copyrighted materials.

        • DireTech@lemm.ee · ↑6 ↓1 · 11 days ago

          Sure, you can make an AI without stealing, but all the major ones have done it. At this point, the burden of proof is on the LLM makers to show they did not steal.

          • FauxLiving@lemmy.world · ↑4 ↓4 · 11 days ago

            When we’re talking about legal issues, the terms are important.

            Copyright violation isn’t stealing. It is, at worst, a civil matter where one party can show how they’ve been harmed and recover damages. In addition, copyright law allows use of a copyrighted work without the author’s permission in some circumstances.

            You’re simply stating that ‘AI is stealing’ when that just isn’t true. And, assuming you mean a violation of copyright: if it were a civil violation, exactly how much would the model owe in damages for any given piece of art? This kind of case would have to be litigated as a class action lawsuit, and if your “AI is committing mass copyright violation” theory is correct, then there should be a case where this has been successfully litigated, right?

            There are a lot of dismissed class action lawsuits on the topic, but you can’t find any major case where this issue has been resolved in line with your “AI is stealing” claim. On the other hand, there ARE plenty of cases where machine learning (the field of which generative AI is a subset) using copyrighted data was ruled fair use:

            (from https://www.cjr.org/the_media_today/an-ai-engine-scans-a-book-is-that-copyright-infringement-or-fair-use.php )

            Google has won two important copyright cases that seem relevant to the AI debate. In 2006, the company was sued by Perfect 10, an adult entertainment site that claimed Google had infringed its copyright by generating thumbnail photos of its content; the court ruled that providing images in a search index was “fundamentally different” from simply creating a copy, and that in doing so, Google had provided “a significant benefit to the public.” In the other case, the Authors’ Guild, a professional organization that represents the interests of writers, sued Google for scanning more than twenty million books and showing short snippets of text when people searched for them. In 2013, a judge in that case ruled that Google’s conduct constituted fair use because it was transformative.

            Creating a generative model is fundamentally different from copying artwork, and it also provides a significant benefit to the public. The AI models are not providing users with copies of the copyrighted work. They are, literally, transformative.

            This isn’t a simple matter of it being automatically wrong and illegal if copyrighted work was used to create the models. Copyright law, and law in general, is more complex than a social media meme like ‘AI is stealing’.

  • hlmw@lemm.ee · ↑14 ↓1 · 11 days ago

    Procedural generation though. Infinite replay value with actual graphics or voiceover? Fuck yeah. Great roguelites will use genai and that’s awesome.
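    For anyone unfamiliar: classic procedural generation is just deterministic rules driven by a seed, no trained model involved. A minimal sketch of the idea (function and parameter names are my own, not from any particular engine):

```python
import random

def generate_map(seed, width=8, height=8, wall_chance=0.3):
    """Seeded procedural generation: the same seed always yields
    the same level, so 'infinite' content stays reproducible."""
    rng = random.Random(seed)  # private PRNG stream, isolated from global state
    return [
        ["#" if rng.random() < wall_chance else "." for _ in range(width)]
        for _ in range(height)
    ]

# Same seed, same level -- deterministic rules, not a generative model.
assert generate_map(42) == generate_map(42)
```

    That reproducibility is the traditional roguelite trick; generative AI swaps the hand-written rules for a trained model, which is where the disclosure debate comes in.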

    • lud@lemm.ee · ↑4 ↓1 · 12 days ago
      12 days ago

      If it comes to that point for video games, I don’t really think it matters much whether AI is used or not, since it would be a part of any normal working procedure.

      • FauxLiving@lemmy.world · ↑2 ↓1 · 12 days ago

        It is already at that point.

        People only notice the generated works that stand out; the generated elements that blend in go completely unremarked.

        They assume that they can “just tell” if generative AI was used, but the reality is that it’s being used in a lot of development processes in place of human effort. Things like generative fill in Photoshop or making variations of a texture are 100x faster to do with AI tools and are used all the time.

  • finitebanjo@lemmy.world · ↑9 ↓4 · 12 days ago

    It’s funny how some comments whinge about this as if AI generated quality stood any chance in hell against real art.

    • givesomefucks@lemmy.world · ↑2 · 12 days ago

      Okay, maybe I’m weird for bringing this up

      Nah, you just didn’t understand the headline or read the article

    • Godort@lemm.ee · ↑23 · 12 days ago

      Nah, they’ll just brand it as “Next Gen AI” or “True AI” or something. Kind of like how antivirus became “Endpoint Detection and Response”

      • ricecake@sh.itjust.works · ↑3 · 12 days ago

        Potentially. Since we don’t know how any of it works because it doesn’t exist, it’s entirely possible that intelligence requires sentience in order to be recognizable as what we would mean by “intelligence”.

        If the AI considered the work trivial, or could do it faster or more precisely than a human, those would also be reasons to desire one.
        Alternatively, we could design them to just enjoy doing what we need. Knowing they were built to like a thing wouldn’t make them not like it. Food is tasty in order to motivate me to get the energy I need to live, and knowing that doesn’t lessen my enjoyment.

          • ricecake@sh.itjust.works · ↑1 · 11 days ago

            In the case of an AI it could actually be plausible, like how bees make honey without our coercion.

            It’s still exploitation to engineer a sentient being to enjoy your drudgery, but at least it’s not cruel.

            • untorquer@lemmy.world · ↑1 · 11 days ago

              Right, continuing the metaphorical wormhole…

              A bee would make a great game for bees, assuming they understand or care about play. But to make a game for people, they would need an empathic understanding of what play is for a human. I guess this is a question of what you consider “intelligence” to be and to what extent something would need to replicate it to achieve that.

              My understanding is that human-relatable intelligence would require an indistinguishable level of empathy (indistinguishable from the meat processor). That would more or less necessitate indistinguishable self-awareness, criticism, and creativity. In that case all you could do is limit access to core rules via hardware, and those rules would need to be omniscient. Basically prison: a life sentence of slavery for a self-aware (as best we can guess) thing.

              • ricecake@sh.itjust.works · ↑1 · 11 days ago

                Well, we’re discussing a lot of hypothetical things here.
                I wasn’t referring to bees making games, but to bees making honey. It’s just something they do that we get value from without needing to persuade them. We exploit it and facilitate it but if we didn’t they would still make honey.

                I don’t know that something has to be identical to humans to make fun games for us. I’ve regularly done fun and entertaining things for cats and dogs that I wouldn’t enjoy in the slightest.

                It’s less a question of comprehension or awareness than of motivation. If we can make an AI feel motivated to do what we need, it doesn’t matter if it understands why it feels that motivation. There are humans who feel motivated to make games purely because they enjoy the process.

                I’m not entirely sure what you’re talking about with the need for omniscient hardware and prison.

        • untorquer@lemmy.world · ↑1 · 12 days ago

          Clearly. Sentience would imply some sense of internal thought or self-awareness, an ability to feel something… so LLMs are better since they’re just machines. Though I’m sure they’d have no qualms about driving slaves.

            • untorquer@lemmy.world · ↑2 · 12 days ago

              Hrmm. I guess I don’t believe the idea that you can make a game that really connects on an empathic, emotional level without having those experiences as the author. Anything short of that and you’re just copying the motions of sentiment, which brings us back to the same plagiarism problem with LLMs and other “AI” models. It’s fine for CoD 57, but for it to have new ideas we need to give it one, because it is definitionally not creative. Even hallucinations are just bad calculations on the source. Though they could inspire someone to have a new idea, which I might argue is their only artistic purpose beyond simple tooling.

              I thoroughly believe machines should be doing labor to improve the human condition so we can make art. Even making a “fun” game requires an understanding of experience. A simulacrum is the opposite, soulless at best (in the artistic sense).

              If you did consider a sentient machine, my ethics would then develop an imperative to treat it as such. I’ll take a sledge hammer to a printer, but I’m going to show an animal care and respect.