

Bad headline.
…As he could have done this anyway? Like, what does the shutdown have to do with it, other than optics and literal vengeance?


Does the reality even matter anymore?
Pop quiz: can you guys remember the last 20 major US shootings?
I can’t. I certainly don’t know their outcomes.
Folks will internalize the first wild rumor they hear (indirectly) from Twitter or CNN or whatever before attention moves on to the next shooting, and that's the impression that lasts.


The message I am getting is:
Brendan Carr is our scapegoat for this backfire.


Well, it’s no mystery:
It’s specifically desktop add-in boards (AIBs):
AMD’s RX 9070 XT and RX 9070 represent AMD’s new RDNA 4 architecture, competing with Nvidia’s midrange offerings. Nvidia introduced two new Blackwell-series AIBs: the GeForce RTX 5080 Super and the RTX 5070. The company also announced the RTX 500 workstation AIB. Rumors have persisted about two new AIBs from Intel, including a dual-GPU model.
It does include workstation cards like the Blackwell Pro. But it clearly does not include server silicon like the B200, H200, MI325X and so on, otherwise they would have mentioned those updates. Those are not AIBs.
I hate to obsess over this distinction, but it’s important: server sales are not skewing this data, and workstation sales volumes are pretty low. It’s probably an accurate chart for gaming GPUs.


I’m not sure the bulk of datacenter cards count as ‘discrete GPUs’ anymore, and they aren’t counted in that survey. They’re generally sold socketed into 8P servers with crazy interconnects, hyper-specialized for what they do. Nvidia does sell some repurposed gaming silicon as ‘low end’ PCIe server cards, but these don’t get a ton of use compared to the big-silicon sales.


I’ve kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.
I guess my original point was agreement: the 5000 series is not great for ‘AI’, not like everyone makes it out to be, to the point where folks who can’t drop $10K on a GPU are picking up older cards instead. But if you look at download stats for these models, there is interest in running stuff locally instead of ChatGPT, just like people are interested in internet-free games, or Lemmy instead of Reddit.


Who the fuck buys a consumer GPU for AI?
Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs (mixture-of-experts models) these days, and not everyone will drop $40K just to run Deepseek in CUDA instead of hitting an API or something.
I can (just barely) run GLM-4.5 on a single 3090 desktop.
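For a concrete sense of what that looks like, here's a minimal sketch using llama-cpp-python; the GGUF filename and the layer split are hypothetical, so tune n_gpu_layers to whatever fits your VRAM:

```python
# Minimal sketch: partial GPU offload of a big quantized MoE model.
# Most of the weights stay in system RAM; only n_gpu_layers layers
# are pushed to the GPU. Filename and numbers are hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="glm-4.5-q4_k_m.gguf",  # hypothetical local GGUF file
    n_gpu_layers=30,   # offload as many layers as your VRAM allows
    n_ctx=8192,        # context window
)

out = llm("Why run an MoE locally?", max_tokens=64)
print(out["choices"][0]["text"])
```

Because an MoE only activates a handful of experts per token, the CPU side does far less work than the raw parameter count suggests, which is how a single 3090 plus a pile of system RAM can (just barely) keep up.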


MAGA has one rule: Trump is never wrong, and is always the victim, no matter what history there is.


I buy this.
And I can’t really blame people for not diving into components and wanting stuff to just… Work.
No one (on average) knows their graphics card off the top of their head.


It mentions desktop GPUs, which are what this market share survey covers.
Basically I don’t see what the server market has to do with desktop dGPU market share. Why did you bring that up?


Ehh, 5% market share is not fine.
AMD’s server GPU division is not fine, either, so don’t bet on that saving them.
AMD’s and Intel’s graphics divisions need R&D money from sales to stay competitive, and if this continues, they’re gonna drop out of dGPUs and stick to integrated graphics (which Intel is already at severe risk of doing).


Yeah. What does that have to do with home setups? No one is putting an H200 or L40 in their homelab.


They need dGPUs worth buying for HPC, something other than servers that cost more than a house, so devs will actually target them.


I don’t get this.
Well, if this includes laptops, I get that. Just try to find a dGPU laptop with AMD or Arc these days.
…But in desktops, everyone seems to complain about Nvidia pricing, yet no one is touching Battlemage or the 9000 series? Why? For gaming specifically, they seem pretty great in their price brackets.
Maybe prebuilts are overshadowing that too?


Not as many as you’d think. The 5000 series is not great for AI because they have like no VRAM relative to their price.
4x3090 or 3060 homelabs are the standard, heh.


I’m not sure where you’re clicking, but there should always be a way to view it in .ml instead. The little rainbow link is the only button that should take you outside of .ml, which is not what you want, heh.
For instance, this post in .world for me is:
https://lemmy.world/post/35134812
But its ‘home’ on lemmy.ml is:


I mean, I am on .world, and I can comment on your .ml post like this.
You should be able to access other instances anywhere from within lemmy.ml unless you click on the wrong link, without needing any new credentials.


there are plenty of crappy indie games, too
This is a massive understatement.
There’s this fantasy that indie = high quality, but just look through Steam chronologically. 95–99% of indie games seem to be good ideas that faded into obscurity, buried under the tidal wave of other releases, games their creators probably burned out making for little in return. Many are just… not great. But others look like bad rolls of the dice.
Basically zero indies are Stardew Valleys or Rimworlds.
This is the nuance the Baldur’s Gate dev is getting at. It’s not ‘games should develop like indies’; big studios literally can’t afford a 95% flop rate.
But that doesn’t mean the metrics they use for decision making aren’t massively flawed.


He argued that both the switch to county-wide voting in 2019, which allows voters to cast a ballot at any polling site in the county, and the expected low turnout made the cuts appropriate.
Random bit, but as someone from Tarrant County, the assigned polling places thing was really annoying and confusing. Glad they got rid of that.


This is horrifying. Like, mind-bogglingly bad.
It’s also a blatant lie. Other countries are not testing nuclear weapons; they’re running simulations. We didn’t do that in the ’60s because it wasn’t possible, but the US (and presumably China/Russia) literally have giant supercomputers for this purpose now.
…And we would know if they were doing real testing because it would show up on seismographs.
But who cares about truth? Or sanity? No, all my relatives (even scientifically minded ones) won’t even bat an eye, lest some Democrat steal their retirement, ugh.