On lemmy.world I would fully believe anyone who said there are 60+ bots boosting any given post (obviously not every post). That's not a lot when you think about it, but it's enough to give certain posts traction and ensure they stay at the top for days.
Not as weird as reddit. Go look at the front page these days. There are posts like "what's your name?" with thousands of upvotes. Completely worthless drivel is getting driven to the front page.
Definitely losing interest fast in this "reddit alternative". The politics communities are just as bad as Facebook and Twitter, if not worse. The communities are just clones of subreddits from after the 3rd party app / mod purge. The dumbass comment chains that stopped being funny 5 years ago. Bots clearly being used to influence social issues... I could go on, but it's just more wasted bandwidth.
The linear algebra done on their GPUs' tensor cores (since the Turing generation), combined with their CUDA and cuDNN software stack, gives them the fastest performance for training deep neural networks.
That may not last forever, but in dollars per FLOP it's currently the best that an average DNN developer like myself has access to.
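To make that concrete, here's a rough sketch of the kind of everyday training code that ends up on that stack (assuming PyTorch with a CUDA build; the model, sizes, and optimizer are made up purely for illustration): the autocast/fp16 matmuls are what land on the tensor cores, and the kernels underneath come from cuBLAS/cuDNN.

```python
# Rough sketch only: one mixed-precision training step in PyTorch.
# Layer sizes, batch, and optimizer are arbitrary; the point is that
# autocast/fp16 matmuls are what get scheduled onto tensor cores, via
# the cuBLAS/cuDNN kernels underneath (i.e. the Nvidia-only stack).
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Sequential(nn.Linear(512, 1024), nn.ReLU(), nn.Linear(1024, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

x = torch.randn(64, 512, device=device)
y = torch.randint(0, 10, (64,), device=device)

with torch.autocast(device_type=device,
                    dtype=torch.float16 if device == "cuda" else torch.bfloat16):
    loss = nn.functional.cross_entropy(model(x), y)

scaler.scale(loss).backward()   # loss scaling so fp16 gradients don't underflow
scaler.step(optimizer)
scaler.update()
print(f"loss: {loss.item():.4f}")
```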
People buy Nvidia no matter what, even when it isn't the best choice. Then those same people complain about the anticompetitive things Nvidia does.
The best is when people cheer for AMD making something great, only so they can buy an Nvidia card cheaper, as if the only reason AMD exists is to subsidise their Nvidia purchase!
Nvidia's greatest asset is the mindshare they have.
Well, that and CUDA. CUDA still means a load of professionals in various fields are stuck using Nvidia whether they like it or not. That means data centers are incentivised to go with Nvidia if they want those customers, which ultimately means that if you're going to work on code/tools that run in those data centers, you want the same architecture on your local machine for development and testing.
It's getting better, but the gap is still real. Hopefully the people working on SCALE can actually get it working on the CDNA GPUs one day, since data centers are where a lot of the CUDA code runs. Or perhaps the UDNA stuff AMD just announced will enable this.
The fact that this all hinges on the third party that develops SCALE should highlight that AMD still doesn't seem to be playing the same game as Nvidia, which is why we're still in this position.
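For anyone wondering what that lock-in looks like in practice, here's a purely illustrative sketch (assuming a CuPy install on an Nvidia box; the kernel and its name are made up): the kernel body is CUDA-dialect C compiled at runtime, and this style of code scattered through data-center tooling is exactly what HIP porting or a translator like SCALE has to deal with before any of it runs on AMD hardware.

```python
# Purely illustrative: a hand-written GPU kernel embedded in Python via CuPy.
# On an Nvidia machine this "just works" against the CUDA runtime; this style
# of CUDA-dialect code is what tools like HIP and SCALE exist to translate.
import cupy as cp

# ElementwiseKernel compiles the C-like snippet below at runtime.
squared_relu = cp.ElementwiseKernel(
    'float32 x',                 # input
    'float32 y',                 # output
    'y = x > 0 ? x * x : 0.0f',  # kernel body (CUDA C dialect)
    'squared_relu')              # kernel name (made up for this example)

x = cp.random.rand(1_000_000, dtype=cp.float32)
y = squared_relu(x)

cp.cuda.Device().synchronize()   # wait for the GPU before reading results
print(float(y[:5].sum()))
```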
I would have much preferred giving AMD my money instead, but even at their best in the 6000/3000 series era, the lack of DLSS and the performance it brings was a meaningful gap back when everyone thought Cyberpunk was the new standard of graphical fidelity.