Encultured AI was started in 2022 by regulars of AI doomsday site LessWrong to work out new safety benchmarks for existential risk — whether the AI would turn us all into paperclips. [LessWrong, 20…
pivoting my startup from blog posts about how my video game’s AI could break out of the game and into the real world, just like a shitty action movie from 2009, to a medical startup that assumes auto-immune diseases are exactly like politics and we must therefore build nanomachines that debate your cells into perfect health
the rationalists have plagiarized so much from that one episode of Are You Afraid of the Dark? where the computer virus escapes VR into the real world through the serial port in the kid’s hand
e: I got the name of the cheesy 90s horror series confused for the name of a cheesy 90s survival horror game!
Oh, you're not satirizing, that's nearly a literal quote. They actually think debatebro conflict resolution solutions will be applicable to autoimmune disorders.
He was previously involved in MetaMed, a 2012 attempt to use medicine to demonstrate the power of LessWrong rationalism: telling staff not trained in medicine that their job was to out-doctor actual doctors from first principles they had learned on an amateur philosophy forum.
2012? So where's the reporting of their success? 12 years is enough to literally finish med school and get a license, surely they achieved way more in that time with their massive rat dongs brains.
Why so general? The multi-agent dynamical systems theory needed to heal internal conflicts such as auto-immune disorders may not be so different from those needed to heal external conflicts as well, including breakdowns in social and political systems.
This isn't an answer to the question "why so general?" This is aspirational philosophical goo.
"multi-agent dynamical systems theory" => you mean any theory that takes a composite view of a larger system? Like chemistry? Biology? Physics? Sociology? Economics? "Why so general" may as well be "why so uncommitted?"
I feel Bayesian rationalism has basically missed the point of inference and immediately fallen into the regression-to-the-mean trap of "the general answer to any question shouldn't say anything in particular at all."
Yeah, that's a good callout. I do feel the "meta is good" obsession is borderline definitely cultish.
There's a big difference between a committed scientist doing empirical work on specific mechanisms saying something like "wow, isn't it cool how considering a broader perspective of how unrelated parts work together explains this newly discovered set of specifics?" and someone who is committedly anti-institutional saying "see how, by me taking your money and offering vague promises of immortality, we are all enriched?"