

High end workstation is the same silicon/memory as gaming cards.
They’re presumably “saving” it for the high margin server GPUs, under the presumption that memory makers will allocate more production to HBM.


100%.
I got the first Korean 1440p “overclock” monitor, and 60-> 110hz was like night and day many years ago. Sometimes it’d reset from a driver update (as the graphics driver had to be patched to work with overclocked DVI back then), and I’d immediately notice even poking around the web.
Same with phones. I got a Razer Phone 2, and 120hz was incredible. I went from that to an iPhone 16 Plus (60hz), and it feels sluggish to me.
Another caveat is that 120hz is more “convenient” and less stuttery for most video. 24fps doesn’t divide evenly into 60, but it does into 96 or 120. And once you start seeing choppiness in video, your eyes can’t unsee it.
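The cadence math is easy to check with a small sketch (a hypothetical helper, not from any real tool) that counts how many display refreshes each film frame occupies:

```python
def frame_cadence(refresh_hz, content_fps, n_frames=6):
    """Number of display refreshes each content frame stays on screen."""
    cadence, shown = [], 0
    for i in range(1, n_frames + 1):
        # refresh index at which content frame i ends (integer floor)
        target = (i * refresh_hz) // content_fps
        cadence.append(target - shown)
        shown = target
    return cadence

print(frame_cadence(60, 24))   # [2, 3, 2, 3, 2, 3] -- uneven "3:2"-style judder
print(frame_cadence(96, 24))   # [4, 4, 4, 4, 4, 4] -- perfectly even
print(frame_cadence(120, 24))  # [5, 5, 5, 5, 5, 5] -- perfectly even
```

The alternating 2/3 pattern at 60Hz is exactly the stutter that becomes impossible to unsee; at 96 or 120 every frame gets the same number of refreshes.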


Yes. 74% is the “average” point of diminishing returns to preserve the battery, according to AccuBattery’s data. The app tracks charging cycles and battery wear across many thousands of smartphones.
In fact, the reason many phones/gadgets don’t offer this feature (and that Apple sometimes charges to 100% in spite of the toggle) is likely planned obsolescence.
…To add to this, the actual charging threshold of the battery is a bit arbitrary and set by the manufacturer, as a tradeoff of capacity vs life. Fast charging is the same; charging quickly is hard on the battery, and the limits at different charge levels are configured as a “balance” between convenience and life.
…And sometimes they get those thresholds wrong.
Like Samsung rather infamously did for the exploding Galaxy Notes. Google did for the Nexus 6P. They pushed the batteries too hard and borked the phones.


The real problem is social media, and how feeds are structured now.
The ‘few trustworthy institutions’ model has been utterly obliterated because a few tech companies figured out a sea of influencers is more profitable/exploitable. Not to minimize some of the great creators out there, but one’s daily news shouldn’t come from Joe Rogan + your Facebook uncle’s reshares.


To be fair, Twitter is a HORRIBLE place to post that. There’s zero indication of who he is or his reputability.
Inhales for rant. Reporters posting stuff like ‘the country is burning down’ or ‘social media is destroying America’ on Twitter is so irritating.


Why clog up .world’s front page with tabloids like rawstory and dailybeast? They just repost stuff with an angry headline and Twitter links. Why not link the original source?
Here it is, for reference:
But I’ve brought this up many times. Mods and posters don’t seem to care.
…Eh. I’m blocking this community.
I like most posts, but I’m sick of seeing tabloid outrage on the front page every day, like Reddit.


(I watched the whole ad)
Yeah…
I mean, I’m a local ML tinkerer. I’m practically an AI-loving extremist on Lemmy, and this is just… weird? Even if I were a New York Republican or whatever, it feels like a clickbait supplement ad your boomer uncle’d send you from Facebook, complete with emojis.
I think this is why many like Trump. For all his antics and trolling, it’s often authentic, where this feels like an alien trying to figure out what a young adult would tweet.
I’d say Cuomo needs to be authentic, too, but that’d probably be even worse, heh.


And this is well before Trump’s announcement of nuclear testing resumption.
If he actually goes through with it, everyone will know, because it shows up on seismographs, something he’s apparently ignorant of… And the commission will surely move it closer.
I am extremely concerned. There’s never been someone so unscientifically minded, with no one knowledgeable on nuclear warfare advising him directly, with their finger on the button like this.
And what’s annoying is that my relatives do not care. Even the scientifically minded ones. There is absolutely nothing Trump can do wrong, at least nothing that can get through without being warped and dismissed as hysteria. That in itself is scary, as there’s now no “political fallout” check on Trump’s actions until 401Ks literally evaporate.


Clearly, even AMD wants you to switch to Linux, heh.
…I’m not even totally kidding. A mass migration would be a huge boon to their business and leave Nvidia scrambling.


This is horrifying. Like, mind-bogglingly bad.
It’s also a blatant lie. Other countries are not testing nuclear weapons, they’re running simulations. We didn’t do that in the 60s because it wasn’t possible, but the US (and presumably China/Russia) literally have giant supercomputers for this purpose now.
…And we would know if they were doing real testing because it would show up on seismographs.
But who cares about truth? Or sanity? No, all my relatives (even scientifically minded ones) won’t even bat an eye, lest some Democrat steal their retirement, ugh.


Bad headline.
…As he could have done this anyway? Like, what does the shutdown have to do with it, other than optics and literal vengeance?


Does the reality even matter anymore?
Pop quiz: can you guys remember the last 20 major US shootings?
I can’t. I certainly don’t know their outcomes.
Folks will internalize the first wild rumor they heard (indirectly) from Twitter or CNN or whatever before attention moves to the next shooting, and that’s the impression that lasts.


The message I am getting is:
Brendan Carr is our scapegoat for this backfire.


Well, it’s no mystery:
It’s specifically desktop add-in boards (AIBs):
AMD’s RX 9070 XT and RX 9070 represent AMD’s new RDNA 4 architecture, competing with Nvidia’s midrange offerings. Nvidia introduced two new Blackwell-series AIBs: the GeForce RTX 5080 Super and the RTX 5070. The company also announced the RTX 500 workstation AIB. Rumors have persisted about two new AIBs from Intel, including a dual-GPU model.
It is including workstation cards like the Blackwell Pro. But this is clearly not including server silicon like the B200, H200, MI325X and so on, otherwise they would have mentioned updates. They are not AIBs.
I hate to obsess over such a distinction, but it’s important: server sales are not skewing this data, and workstation sales volumes are pretty low. It’s probably an accurate chart for gaming GPUs.


I’m not sure the bulk of datacenter cards count as ‘discrete GPUs’ anymore, and they aren’t counted in that survey. They’re generally sold socketed into 8P servers with crazy interconnects, hyper specialized to what they do. Nvidia does sell some repurposed gaming silicon as a ‘low end’ PCIe server card, but these don’t get a ton of use compared to the big silicon sales.


I’ve kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.
I guess my original point was agreement: the 5000 series is not great for ‘AI’, not like everyone makes it out to be, to the point where folks who can’t drop $10K for a GPU are picking up older cards instead. But if you look at download stats for these models, there is interest in running stuff locally instead of ChatGPT, just like people are interested in internet-free games, or Lemmy instead of Reddit.


Who the fuck buys a consumer GPU for AI?
Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run Deepseek in CUDA instead of hitting an API or something.
I can (just barely) run GLM-4.5 on a single 3090 desktop.
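To put rough numbers on “just barely,” here’s a back-of-envelope sketch. The parameter counts are GLM-4.5’s published figures (355B total, 32B active per token); the bits-per-weight value is a typical low-quant level, and treating the active parameters as the GPU-resident share is a loose approximation, not a measurement:

```python
# Rough VRAM arithmetic for MoE CPU-offloading. Parameter counts are GLM-4.5's
# published figures; 3.0 bits/weight is an assumed quant level, not measured.
TOTAL_PARAMS  = 355e9   # all parameters (most live in MoE expert tensors)
ACTIVE_PARAMS = 32e9    # parameters active per token

def gib(params, bits_per_weight):
    """Approximate quantized size in GiB."""
    return params * bits_per_weight / 8 / 2**30

full_model = gib(TOTAL_PARAMS, 3.0)   # whole model: way over a 24 GiB 3090
gpu_share  = gib(ACTIVE_PARAMS, 3.0)  # ~active share: fits in VRAM

print(f"whole model @3bpw: {full_model:.0f} GiB")
print(f"active share @3bpw: {gpu_share:.0f} GiB")
```

The gap between those two numbers is the whole trick: the bulky expert tensors get offloaded to system RAM, while the always-active layers stay on the GPU.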


MAGA has one rule: Trump is never wrong, and is always the victim, no matter what history there is.
…I wonder what it would take for Intel to get in the memory business.
Or GloFo? WTF are the euro fabs doing these days?