DeepSeek R1 Is Reportedly Running Inference On Huawei's Ascend 910C AI Chips, Showing China's Growing AI Capabilities
DeepSeek's AI model is running inference workloads on Huawei's Ascend 910C chips, showing how far China's AI industry has come.
![DeepSeek R1 Is Reportedly Running Inference On Huawei's Ascend 910C AI Chips, Showing China's Growing AI Capabilities](https://lemmy.world/pictrs/image/fcee368b-8ab0-48fd-8b1f-86ed3c273705.jpeg?format=webp&thumbnail=256)
Honestly, good for them. US tech CEOs deserve to have their lunch eaten for driving the industry into stagnation with their short-sighted greed.
In one story they're using PTX on Nvidia H800s. In another they're on Huawei chips.
Which is it? Are we all just hypothesising?
I'm not the best with AI/LLM terms, but I assume the models were trained on Nvidia hardware, while inference (using the model / getting answers out of it) is done on Huawei chips.
To add: training the model is a huge one-time cost, while inference is a continuous expense.
Wait, so after you train, you don't need all those fancy Nvidia chips?
They should pick one place with an overabundance of geothermal energy and train all the models there...
An unknown quantization of R1 is running on the 3rd iteration of outdated 7nm hardware taken from Sophgo's work with TSMC last year?
Is this meant to be impressive or alarming? Because I'm neither.
I'm not going to parse this shit article. What does interference mean here? Please and thank you.
That's a very toxic attitude.
Inference is, in essence, the process of generating the AI's response. So when you run an LLM locally, you are using your GPU only for inference.
Yeah, I misread because I'm stupid. Thanks for replying, non-toxic man.
Training: Creating the model
Inference: Using the model

Inference? It's the actual running of the AI when you use it, as opposed to training.
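To make the distinction concrete, here's a minimal toy sketch in plain Python (my own illustration, nothing to do with DeepSeek's or Huawei's actual stack): training fits a tiny linear model once by gradient descent, and inference then just reuses the frozen weights for each new query.

```python
# Toy illustration of training vs. inference (not real LLM code).
# Training: the one-time, expensive phase that learns the weights.
# Inference: the cheap per-request phase that only runs a forward pass.

def train(data, lr=0.01, epochs=2000):
    """Learn w and b for y = w*x + b from (x, y) pairs via gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x   # gradient step for the weight
            b -= lr * err       # gradient step for the bias
    return w, b

def infer(params, x):
    """Forward pass only: no weight updates, so far less compute per call."""
    w, b = params
    return w * x + b

# Train once on points from y = 2x + 1 ...
params = train([(0, 1), (1, 3), (2, 5), (3, 7)])
# ... then answer as many queries as you like with the frozen weights.
print(round(infer(params, 10), 1))
```

The point of the split: `train` could run on one vendor's hardware and `infer` on another's, since inference only needs the finished weights.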
Sorry. I forgot to mention that I'm dumb.