• 0 Posts
  • 13 Comments
Joined 2 years ago
Cake day: July 3rd, 2023


  • gamer@lemm.ee to Asklemmy@lemmy.ml: Why wouldn't this work?
    12 days ago

    This doesn’t account for blinking.

    If your friend blinks, they won’t see the light, and thus would be unable to verify whether the method works or not.

    But how does he know when to open his eyes? He can’t keep them open forever. Say you flash the light once, and that’s his signal to keep his eyes open. Okay, but how long do you wait before starting the experiment? If you do it immediately, he may not have enough time to react. If you wait too long, his eyes will dry out and he’ll blink.

    This is just not going to work. There are too many dependent variables.





  • 96 GB+ of RAM is relatively easy, but for LLM inference you want VRAM. You can achieve that on a consumer PC by using multiple GPUs, although performance will not be as good as having a single GPU with 96GB of VRAM. Swapping out to RAM during inference slows it down a lot.

    On archs with unified memory (like Apple’s latest machines), the CPU and GPU share memory, so you could actually find a system with very high memory directly accessible to the GPU. Mac Pros can be configured with up to 192GB of memory, although I doubt it’d be worth it as the GPU probably isn’t powerful enough.

    Also, the 83GB number I gave was with a hypothetical 1 bit quantization of Deepseek R1, which (if it’s even possible) would probably be really shitty, maybe even shittier than Llama 7B.

    but how can one enter TB zone?

    Data centers use NVLink to connect multiple Nvidia GPUs. Idk what the limits are, but you use it to combine multiple GPUs to pool resources much more efficiently and at a much larger scale than would be possible on consumer hardware. A single Nvidia H200 GPU has 141 GB of VRAM, so you could link them up to build some monster data centers.

    Nvidia also sells prebuilt machines like the HGX B200, which can have 1.4TB of memory in a single system. That's less than the 2.6TB for unquantized Deepseek, but for inference-only applications you could definitely quantize it enough to fit within that limit with little to no quality loss… so if you're really interested and really rich, you could probably buy one of those for your home lab.
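    To sanity-check that "it fits" claim, here's a quick back-of-the-envelope sketch (Python; the helper name is mine, the 1.4TB and 671B figures are the ones quoted above):

```python
def max_bits_per_param(budget_bytes: float, params: float) -> float:
    """How many bits per parameter a given memory budget allows for the weights."""
    return budget_bytes * 8 / params

# HGX B200's 1.4 TB budget vs. Deepseek R1's 671B parameters
bits = max_bits_per_param(1.4e12, 671e9)
print(f"{bits:.1f} bits/param")  # ~16.7, so a 16-bit (fp16/bf16) copy fits
```

    In other words, even a standard 16-bit copy of the weights squeezes under that budget, which is why quality loss would be minimal.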


  • If all you care about is response times, you can easily get those by just using a smaller model. The quality of responses will be poor, though, and it's not feasible to self-host a model like ChatGPT on consumer hardware.

    For some quick math: a small Llama model has 7 billion parameters. Unquantized, that's 4 bytes per parameter (32-bit floats), meaning it requires 28 billion bytes (28 GB) of memory. You can get that to fit in less memory with quantization, which trades quality for lower memory usage: use fewer than 32 bits per parameter, reducing both precision and footprint.

    Inference performance will still vary a lot depending on your hardware, even if you manage to fit it all in VRAM. A 5090 will be faster than an iPhone, obviously.

    … But with a model competitive with ChatGPT, like Deepseek R1, we're talking about 671 billion parameters. Even if you quantize down to a useless 1 bit per parameter, that'd be over 83 GB just to fit the model in memory (unquantized it's ~2.6TB). Running inference over that many parameters would also require serious compute, much more than a 5090 could handle. That gets into specialized high-end hardware, and it's not something a typical prosumer would be able to build (or afford).

    So the TL;DR is: no.




  • You're not arguing in good faith here. I made a good-faith argument that your focus on the "illegal" label is arbitrary and pointless, and even invited a discussion on immigration in general, but instead you came back with terrible takes and snark. I can't tell if you're a troll or a bot.

    Good thing all of you taught my why I was wrong.

    Is that why you came here? To have people teach you why you’re wrong? Because if you already know you’re wrong, why are you spending so much effort arguing against people telling you what you already know? It’s Saturday dude, spend it with friends and family, not bitterly debating politics with strangers on the internet.


  • Good to know! I also hate illegal immigration, which is why, at least on this issue, I'm voting for Democrats for the foreseeable future. The Republican party is hell-bent on increasing the amount of illegal immigration in this country, and I just can't support that. Of course, the Dems would never go so far as to eliminate illegal immigration completely (by adopting open borders), but I'm confident they're at least more open to finding a middle ground that makes most people happy.



  • but I’m struggling to find a reason we shouldnt deport illegal immigrants.

    Are you specifically concerned about illegal immigration, or just immigration in general? Because if it’s the former, that’s a silly distinction because the government (we the people, aka Elon Musk) decides what is illegal or not. If the next wave of politicians decides we should have actually open borders, then there would be no such thing as “illegal immigration”.

    I’m concerned that illegal immigrant labor is akin to H1b or prison labor, where the worker has diminished rights and is abused more than other groups.

    Do you have specific examples in mind where immigrants are exploited? If you do, look at those examples and ask yourself: “could we pass laws to protect these people from abuse?”, and you’ll find that the answer is obviously yes.

    Maybe your definition of "abuse" is that they need to work harder to earn less? Well, that's the society we live in. Capitalism has its problems, but it has worked well enough for us for the past 248 years. For the immigrant, US minimum wage is likely far better than whatever they received in their home country, and I suspect most would happily take that deal. I think that's what they'd call "the American dream", as their children will be able to go to school and have a better future than they did.

    …If your issue is with immigration in general, then I don’t know what to tell you. That’s entirely opinion based, and nobody knows what the correct answer is (despite what they might claim). 100% open borders has risks, 100% closed borders has risks.