Killing children, class systems, so many programming language names, the ridiculous ways equality and order of operations are handled in some languages. Plenty of recursion jokes to be made. Big O notation. Any other ideas?
One of the slave node's child processes failed, so the master node sent a signal to terminate the child and restart the slave
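For anyone who wants the literal, non-joke version, here's a minimal sketch (my own illustration, not tied to any particular cluster framework) of a parent process noticing a crashed child worker and restarting it:

```python
# Toy example: a parent process restarting a crashed child worker.
import multiprocessing as mp
import time

def worker():
    time.sleep(1)
    raise RuntimeError("simulated failure")  # pretend the workload crashes

if __name__ == "__main__":
    for attempt in range(3):               # cap restarts so the example terminates
        child = mp.Process(target=worker)
        child.start()
        child.join()
        if child.exitcode == 0:
            break
        print(f"child exited with code {child.exitcode}, restarting")
```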
There's a pretty solid reason my research group is pushing to use "head node and executor nodes" nomenclature rather than the old-school "master node and slave nodes" nomenclature, haha
This is basically what the bank is doing when you get a loan.
When you get a $25k loan from a bank, the banker doesn't take money from somewhere else to put it into your bank account. The banker basically just adds +$25k to your bank account, money that comes from nowhere.
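A rough, simplified illustration of what that bookkeeping looks like (my own toy example, not how any real core banking system is written): the "new money" is just a pair of entries, a loan asset and a matching deposit liability, on the bank's books.

```python
# Toy balance sheet: granting a loan creates a loan asset and a deposit liability.
bank = {"loans": 0, "deposits": 0}
your_account = 0

def grant_loan(amount):
    global your_account
    bank["loans"] += amount      # asset: you now owe the bank
    bank["deposits"] += amount   # liability: the bank now owes you the deposit
    your_account += amount       # the "+25k from nowhere"

grant_loan(25_000)
print(bank, your_account)        # {'loans': 25000, 'deposits': 25000} 25000
```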
Thanks! In computer graphics it's referred to as the "Utah teapot" because the 3D model was created at the University of Utah. But it was originally a Melitta brand teapot. It is still manufactured by the German company Friesland, which is where I bought mine.
Unfortunately it appears they recently had a fire and their webshop is temporarily closed, but I think you can also get it off of Amazon.
I'm more of a computer-science geek than a tea geek, so all I can say is that it pours without spilling. You won't get a laminar flow out of it or anything like that.
It's a double joke. For programmers, it's pretty useless unless you're in high-performance computing.
If you're working on the nitty-gritty of the OS or the CPU itself, a 0.02% optimization can mean a significant improvement in various things, but since the figure is otherwise unitless, it's equally useless to the reader.
Socrates said books were dumbing down humanity because people could just look things up instead of memorising information, and that made their brains soft.
Ever since society began, some people have been convinced the next generation’s technology was going to be society’s downfall, whether it was Socrates’ books, the telegraph in the 1800s, radio, the (land-line) telephone, dishwashers (women will become lazy and unsuitable wives and mothers), screened windows (society will collapse because you won’t hear your neighbours and pedestrians on the street, we’ll all become hermits and die holed up in our homes), comic books (they’ll rot the brains of the youth), then music, then video games… it goes on and on.
So far, those predictions have never been true. Every older generation freaks out when the ones after come of age. It’s like societal growing pains.
I think this goes one step further: technology has become so abstract and complex that people who focus on other crafts and careers are using magical black boxes. It blows my mind how my neighbour goes through life without any concept of what a phone app is. He just uses the functionality and has memorized the associated logos. I'm an engineering wizard to him.
Isn’t that true of pretty much every technology, though? I remember in the late 70s there’d occasionally be a loud pop and a puff of smoke from the television, and I’d tag along with my dad to the TV shop to buy new vacuum tubes; then we’d remove the back of the television and do minor repairs. Everyone knew how to do that.
Some technologies actually have had unintended side effects, but not always the ones we saw coming. Artificial lights are killing all the insects, which nobody really worried about, and cars do kill tons of people, which we did worry about back in the 1920s. I don't know what the deal was with leaded gasoline; that one was just bizarre.
All in all, it's just really hard to anticipate how society and technology will interact. We think about the environment now but I don't know if any systematic progress has been made on predicting the human factor.
screened windows (society will collapse because you won’t hear your neighbours and pedestrians on the street, we’ll all become hermits and die holed up in our homes)
This one has actually come true to a certain measurable degree (see Bowling Alone, written at what is now the midpoint of the trend), but I don't think it's down to window screens.
Increasing the CPU optimization by 0.02% does seem crazy to me. If you're going to spend time working on something, make it worthwhile. Also, isn't while(true) {print(money)} Microsoft, Apple and Amazon's business model?
Only if you'd already found and fixed all the other bottlenecks that would gain you more than 0.02%. And I'm not convinced there are many, if any, projects of any reasonable size where that has been the case.
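Back-of-the-envelope sketch of that point (purely illustrative numbers I made up, not from any real profile): shaving 0.02% off the CPU-bound part is dwarfed by fixing even a modest real bottleneck.

```python
# Hypothetical 100 s run: 60 s CPU-bound, 10 s in a slow I/O path.
cpu_part = 60.0
io_part = 10.0

saved_micro = cpu_part * 0.0002   # 0.02% micro-optimization -> 0.012 s
saved_bottleneck = io_part * 0.5  # halving the slow I/O path -> 5 s

print(f"micro-optimization saves {saved_micro:.3f} s, "
      f"fixing the bottleneck saves {saved_bottleneck:.1f} s")
```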