• ryper@lemmy.ca · 30 days ago

    Reminder that Bethesda is owned by Microsoft, the company that insists it’s going to end support for Windows 10 in October and wants everyone to move to Windows 11, which doesn’t officially support perfectly functional but somewhat old CPUs. So of course they don’t care about GPUs too old to support ray tracing.
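
    For anyone curious how “too old to support ray tracing” actually gets decided: on Windows, a game can simply ask D3D12 what raytracing tier the GPU reports and refuse to launch if it comes back unsupported. Here is a minimal sketch of such a check (my own illustration, not Bethesda’s actual code; it assumes the Windows SDK headers):

    ```cpp
    #include <cstdio>
    #include <d3d12.h>
    #include <wrl/client.h>
    #pragma comment(lib, "d3d12.lib")

    int main() {
        // Create a device on the default adapter; a real engine would
        // enumerate adapters through DXGI and pick one explicitly.
        Microsoft::WRL::ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device)))) {
            std::puts("No D3D12-capable GPU found.");
            return 1;
        }

        // DXR support is reported as a feature tier via OPTIONS5.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                    &options5, sizeof(options5));

        if (options5.RaytracingTier == D3D12_RAYTRACING_TIER_NOT_SUPPORTED) {
            // The branch an RT-required game turns into
            // "your GPU does not meet the minimum requirements".
            std::puts("No DXR support reported.");
            return 1;
        }
        std::puts("DXR supported; an RT-required game can proceed.");
        return 0;
    }
    ```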

    • kemsat@lemmy.world · 29 days ago

      At some point it was going to happen; this is just earlier than many thought. The real question is “when is AMD going to have an answer to Nvidia when it comes to RT performance?”

      • MystikIncarnate@lemmy.ca · 29 days ago

        Earlier than they thought?

        How long did they think it would take before RT was a requirement? It was introduced with the GeForce 20 series more than six years ago.

        For technology, six years is vintage.

        The only people this should affect are people still using GTX 10 and 16 series cards. I dunno what’s happening with AMD/Radeon; since they were purchased by AMD, the naming schemes have gotten more and more nonsensical, so I always have a hard time knowing WTF generation a card is from by the model number.

        In any case. Yeah, people using 5+ year old tech are going to be unable to play the latest AAA games. And?

        Has there ever been a time when a 5+ year old system can reasonably play a modern AAA title without it being a slide show?

        • kemsat@lemmy.world · 29 days ago

          I’m still hearing from people that they’re using an Nvidia 1000 series card. I was expecting to hear 2000 instead of 1000 before something like this happened.

          • MystikIncarnate@lemmy.ca · 29 days ago

            I have a 20 series card, albeit one of the higher tier ones, and I probably won’t be upgrading this year. I probably also won’t be playing any new AAA titles either.

            It’s fine to have an older card, but nobody in that position should be expecting to play the latest and greatest games at reasonable framerates, if at all.

            It is the way of things.

            I am personally rather miffed that if you want any real performance from a GPU, you basically need to spend $800+. Even though some cards are supposedly available for less, they almost never are, whether due to scalping or greed (which are kind of the same thing), or something else like idiotic tariffs. I don’t have nearly a grand I can burn every year to upgrade my GPU. The last GPU I bought was a 1060, and my current card was a gift. I haven’t had a budget for a decent GPU in many, many years.

            When I upgrade, I’m likely going Intel Arc, because the value proposition makes sense to me. I can actually spend less than $600 and get a card with some reasonable level of performance.

            • kemsat@lemmy.world · 29 days ago

              The current Intel GPUs aren’t better than an RTX 2070, so that won’t be an upgrade if you’re on a higher-tier 2000 series.

              I just went up to a 4070 Ti from a 2080 Ti, because it was the only worthwhile upgrade. $660 used. So you don’t need to spend $800.

              • DacoTaco@lemmy.world · 28 days ago

                Euh, no. The Intel Battlemage cards are way, way better than an RTX 2070. They even beat the 4060… for $250.
                Intel Battlemage GPUs are really good cards if you don’t need pure, raw power because everything must be in 4K and on ultra, etc.
                Which is good value, since that raw, pure power comes with an electricity bill I would not want to pay.
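
                Rough back-of-the-envelope on that bill (every number here is my own assumption for illustration, not a measurement):

                ```cpp
                #include <cstdio>

                int main() {
                    // Assumed for illustration: a ~450 W high-end card vs a ~190 W
                    // Battlemage-class card, 20 h of gaming per week, EUR 0.30/kWh.
                    const double hours_per_year = 20.0 * 52.0;  // 1040 h
                    const double price_per_kwh  = 0.30;

                    const double big   = 450.0 / 1000.0 * hours_per_year * price_per_kwh;
                    const double small = 190.0 / 1000.0 * hours_per_year * price_per_kwh;

                    std::printf("High-end card:    ~%.0f EUR/year\n", big);    // ~140
                    std::printf("Battlemage-class: ~%.0f EUR/year\n", small);  // ~59
                    return 0;
                }
                ```

                Call it roughly 80 EUR a year of difference under those assumptions; the gap scales with how hard and how often you run the card.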

                • kemsat@lemmy.world · 28 days ago

                  Yeah, I got the cards wrong. They’re around a 2080, which is about the same as a 4060. Still not much of an upgrade from an upper-end 2000 series, which to me is 2070 and up.

              • MystikIncarnate@lemmy.ca · 28 days ago

                Yeah, the gifted card I’m using is a 2080 Ti. The friend who gifted it went from a dual 2080 Ti SLI setup to a 4090, IIRC. He kept one card for his old system, so it’s still useful, but gave me the other since SLI is dead and he doesn’t need the extra card in a machine he’s not frequently using.

                11 GB of memory is an odd choice, but it was a huge uplift from the 3 GB I was using before. I had a super-budget GTX 1060 3GB (I think it was made by Palit?) before then.

                I still have to play on modest settings for anything modern, but my only real challenge has been feeding it fresh air. My PC case puts the GPU on a riser with front-to-back airflow and very little clearance front-to-back and top-to-bottom. The card uses a side intake, which is fairly typical for GPUs, so it’s basically starved for air if I install it normally. For now, I’ve got it on a riser, sitting on top of the system with the cover off, so the GPU is in open air. Not ideal, and I need to work on a better solution… but it works great otherwise.

  • ILikeBoobies@lemmy.ca · 30 days ago

    8 GB is entry-level for GPUs anyway, so that’s not a big deal.

    I suppose if you’re going to have ray tracing anyway, it cuts down development time not to have to redo the lighting for when the feature is off.

    • DacoTaco@lemmy.world · 28 days ago

      And I agree that it’s a good thing and a natural progression/evolution of tech.
      What I don’t like is Nvidia’s chokehold on the tech, with insane prices (for the card and the power draw) as a result. I know other cards are catching up and all that, but the difference is still huge because some methods and functions are locked to CUDA cores and Nvidia’s tech.
      I will not be giving Nvidia money in the next 7 years. We will see where they stand once I have to replace my (AMD) GPU.

  • baatliwala@lemmy.world · 30 days ago

    How do RT-only titles work on consoles? Their RT hardware really isn’t that powerful; aren’t they supposed to be equivalent to an RTX 2070 at best? It sounds like the graphics difference between PC and consoles will be quite large.

    • SoJB@lemmy.ml · 30 days ago

      Consoles have been seriously struggling to run anything past 30 fps without being a blurry mess for the past several years. It’s real bad on that side of the fence.

      Although PC gamers aren’t much better off, having to buy $900 GPUs every year just to run the latest AAAA blurfest at 30 FPS with AI frame gen on top of upscaling on top of interpolation frame gen.

    • alphabethunter@lemmy.world · 30 days ago

      Both current-gen consoles are RT-capable, so they’ll just use lowered graphical settings, some amount of optimization, and upscaling. Indiana Jones ran great though, way better than you’d expect. I was getting a perfectly smooth 75 fps on a 6750 XT at 1080p high, no upscaling or frame gen in use.