Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 26 May 2024
Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post; there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned soo many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
reporter: Oh, I'm so glad you asked! The AI is IN the computer!
me: y tho?
reporter: To win the race to get AI everywhere the fastest!
me: sorry, y tho?
reporter: Oh my, you sure have a lot of questions! Look, let's not try to make any sense of this. After all, only our tech-daddies have the answers!
me: begins weeping
reporter: Don't cry! At least we're all in this together, right???
While AI being "in" a computer might sound as obvious as blue being "in" the sky, this is actually one of those things that is a Big Deal™. AI models are normally either downloaded or used online, but Microsoft has just announced an "AI computer", meaning the technology is in-built. It's the company's latest play in the overheated race to see which tech giant can get the most AI into the most places, fastest.
What does it mean? Hard to say! In case you haven't worked it out yet, this is all one big live experiment, and we're the rats. Perhaps there's some comfort in knowing we'll all find out together.
I started pre-hurting in 2018-2019 when I saw SoC vendors pitch things as AI chips because you could natively run accelerated models on them for speech and image recognition. Yes, really, that was the total pitch: it’s AI because you could run models. Where to train the models, what the total capacity was, total execution space, total throughput, etc.? No, no, none of that was practically touched at all in those docs.
And where I said pre-hurting? At the time I thought “oh god”… you can imagine how quaint that memory feels in retrospect right now.
oh yeah, doing embedded dev that was always the worst. my RISC-V SoC has an AI accelerator on it! cool, what does the datasheet say it does? TensorFlow Lite and nothing else. ok, maybe I can use that silicon for something else at least? nope, there’s no documentation on its capabilities or how it works at all, it just does TensorFlow Lite. shit, well ok, what can I do with TensorFlow Lite? you can load the example model we’ve provided and make a shitty voice activation trigger, or try to train a model but there’s no docs. well fuck me then.
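For anyone lucky enough never to have touched this: “it does TensorFlow Lite” boils down to roughly the boilerplate below. This is a rough sketch of the circa-2019 TensorFlow Lite for Microcontrollers hello_world-style flow, not any particular vendor’s SDK; the model array name (g_model), the example-model header, and the arena size are placeholders, and the exact classes (AllOpsResolver, MicroErrorReporter) have moved around between releases.

```cpp
// Rough sketch of the old TensorFlow Lite for Microcontrollers flow
// ("it does TensorFlow Lite" and nothing else). g_model, the example-model
// header, and the arena size are placeholders; API details vary by release.
#include <cstdint>

#include "tensorflow/lite/micro/all_ops_resolver.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/schema/schema_generated.h"

#include "example_model_data.h"  // the vendor's example model baked in as a C array (g_model)

constexpr int kTensorArenaSize = 10 * 1024;  // a guess; the datasheet won't tell you
static uint8_t tensor_arena[kTensorArenaSize];

int main() {
  static tflite::MicroErrorReporter error_reporter;

  // Map the flatbuffer the vendor shipped and hope the schema version matches.
  const tflite::Model* model = tflite::GetModel(g_model);

  // No docs on which ops the accelerator actually supports, so register everything.
  static tflite::AllOpsResolver resolver;

  static tflite::MicroInterpreter interpreter(
      model, resolver, tensor_arena, kTensorArenaSize, &error_reporter);
  interpreter.AllocateTensors();

  // Feed whatever the example model expects (audio features, say) and run it.
  TfLiteTensor* input = interpreter.input(0);
  input->data.f[0] = 0.0f;  // placeholder input
  interpreter.Invoke();

  // And that's your "shitty voice activation trigger".
  TfLiteTensor* output = interpreter.output(0);
  return output->data.f[0] > 0.5f ? 1 : 0;
}
```

That is more or less the entire programming surface the datasheet hands you; anything the accelerator does beyond executing that flatbuffer goes undocumented.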
And don’t you dare ask about the intended binary representation, that’s obviously some illegal hacker shit
Just render it out with Lite! What do you mean, what about in 3-5 years? Software doesn’t change, you scrub. We have a golden master toolchain right here on our FTP; it’s at ftp://nonssl.http25.pub.norsou-semi-sometimes.xyz/pub/archive/noindex, feel free to download and develop your own applications!