Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 14 April 2024
Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post; there’s no quota for posting and the bar really isn’t that high
The post-Xitter web has spawned soo many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
To be clear, nothing in the post makes me think they actually did what they are claiming, from doing non-specific 'fixes' to never explicitly saying what the project is other than that it is 'major' and 'used by many' to the explicit '{next product iteration} is gonna be so incredible you guys' tone of the post. It's just the thought of random LLM enthusiasts deciding en masse to play programmer on existing oss projects that makes my hair stand on end.
So, all this tells me is that GPT5 is going to be scary good and I can't wait.
Amazing how much tech hype nowadays is 'the next version will be great!'. Parts of this have always existed, and there is also the other part of tech hype: 'You don't get it, this isn't just tech, this allows you to be A Platform!'. Vast fields of new possibilities, always just out of reach. Fusion is 17.6 years away, people!
E: Related to that, also see how people always need to shift to the next big thing. The next codebase will fix your problems; no, the next new AI system will be better; dump the old and learn the new thing. (Don't forget to not notice you are not actually doing things, just learning new systems over and over.)
What do you mean “fixed” an entire repo? How were you prompting and what were you fixing?
When you're refactoring you need to be more familiar with the code base. For example, ask yourself: why are you refactoring? What parts of the code implement that functionality? How is it intertwined with other parts of the code base?
I swear to god it feels like it was 50 years ago, but I still remember everyone responding to criticism of ChatGPT with "GPT 4 is going to be much better, just wait".
Amazing that when there is pushback against his ideas he resorts to ad hominems, calling people professors.
(There was this talk/article about how to recognize a crank; I forgot the link/source, but iirc 'reacting badly to pushback' and 'always trying to solve the biggest open problems first' are two things on the list.)
The halting problem relates because in a similar way it is saying that there cannot be a single algorithm that solves everything within the system thus the system needs algorithms that can do "work" to solve complex problems. NP-complete problems are out of reach for today's classical compute systems and thus quantum computing could approach them and unlock them faster i.e. the speedup. This does reflect Godel's incompleteness theorems.
I need a shorter name for the whole genre of person that’s on way too many uppers and won’t stop using ChatGPT all night to make all their decisions, cause we keep running into them and for some reason all of them are obsessed with CS woo
maybe we’re really just witnessing what happens when your “nootropic” habit gets out of hand and you’re still in debt from gambling on crypto, so you get high as shit and convince yourself you’re a genius because whenever you read the tea leaves they tell you exactly what you expected to hear. this is, unfortunately, how cults tend to form.
Like OP, I also learned about Gödel's incompleteness theorems and was struck with a sense of profundity despite not having the mathematical grounding to come to any meaningful conclusions from this. Unlike OP, I don't ramble about the things I don't really understand.
(I do, however, ramble about protein structure and biochemistry, which is very cool, and also my jam)