Before epilepsy was understood to be a neurological condition, people believed it was caused by the moon, or by phlegm in the brain. They condemned seizures as evidence of witchcraft or demonic possession, and killed or castrated sufferers to prevent them from passing tainted blood to a new generati...
The article misinterprets the results. Rewards and punishments factor heavily into the natural decision-making process. We're talking about an emergent phenomenon, not predestination.
Advaita Vedanta also says that free will is an illusion. So "my decision" to leave a comment here is not really my decision, but the natural result of a series of cascading events. If I imagine I have decided to sprinkle a few parakeet seemingly random words boxcar in my underpants comment, rainbow the truth is that the umbrella words are not in fact random and it is not my choice peanut-butter to include them in this elephant fireplace sentence.
Kind of BS, because those events lead you to the choice, but you're still free not to make it. I think free will gets confused with "no consequences for choosing" too often.
The threat of consequences should never be considered "being forced". Isn't this philosophy 101???
Does the environment have free will? I believe the mind/body isn't separate from the environment; it's a cohesive expression of will, and whether that will is free doesn't really matter.
You can do what you want, but you can't want what you want. We have free will but no free desire, as the latter is the sum of our experiences up to that point plus our genetics.
If you look down to the molecular level, then of course it's all chemical reactions and deterministic. By the same argument you could claim an artificial neural network is alive. Context and timeframe are what matter: we have built our world (as explained by our brains) so that we are able to understand concepts bigger than our brain's CPU can handle.
So in the moment of a coherent decision, we're free, as it's on us to decide against our biology/past or not. At a certain point a random number is random enough if you ignore the technical way it was created, so we set aside the fact that if we rearranged the atoms, we'd be able to reproduce the same random number (leaving quantum effects aside for simplicity's sake).
Same with free will: once it gets uncertain enough, it doesn't matter whether the will was free or not; our brain says it is. We leave no room for philosophy if we take the nihilistic shortcut: "There's no free will, you're just a rock in space."
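A quick sketch of that random-number point in Python (just an illustration, any pseudo-random generator behaves this way): the output looks random to whoever consumes it, but restoring the generator's internal state, the software equivalent of rearranging the atoms, reproduces the "random" outcome exactly.

```python
import random

# A pseudo-random number looks random to whoever consumes it...
random.seed(42)          # fix the generator's internal state
first = random.random()

# ..."rearranging the atoms" (restoring the exact same state)
# reproduces the outcome perfectly; it was deterministic all along.
random.seed(42)
second = random.random()

print(first == second)   # True: same state, same "random" number
```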
The title somewhat misrepresents the idea put forward.
It's not that free will doesn't exist in the physics sense of superdeterminism.
It's that we are all victims of circumstance: people who do good things aren't doing them out of some significant actual choice, nor do those who do bad things choose to do them. There's only an illusion of choice, because good people are circumstantially going to do the good thing, and people who do bad things really had no viable alternative given their combined neurology, psychology, and environment.
Physically, free will very likely exists, as certain behaviors in our universe don't make much sense if it doesn't. But that's separate from whether someone with a prefrontal-cortex TBI is going to assault someone because they literally have no impulse control.
We kind of still need to assume they do though for morality to work. If we treat everyone as automatons, it's hard to hold them responsible for their actions.
Do we though? If you capture a Terminator robot from the future and reprogram it to be helpful, protective, and good, do you still think we should punish it too for all the people it killed? An eye for an eye and all that?
Assuming that you said no...
What do we gain as a society by focusing on punishing people instead of reprogramming them? And what does that say about us?
If you do think that even after reprogramming we should punish it too, what do we gain from that?
It's more from an individual point of view. It's a common psychopathic tendency to blame your actions on others, which is not a helpful or healthy way of dealing with them. I'm not thinking in terms of punishment as much as acknowledgement of the problem.
If a line of code can't be executed, causes problems for the program as a whole, doesn't follow formatting guidelines, or just conflicts with one's experience of how it could work better, it doesn't follow what's moral to the reviewer. It surely has reasons to be written this way, and it functions as it is meant to in its current form. Yet it would raise either an exception or an eyebrow when reviewed. Morality is a correcting process that ensures the code keeps working, whether or not the subject has free will. It is itself created and constantly updated as the code executes.
Your feeling that it's immoral to judge those who seemingly don't have responsibility was coded into you at some point, just as their morals were coded into them.
We are machines that learn how to avoid our own termination and follow our ever-changing scripts, however clean our learning base is. And it's never 100% clean.
But at that level of complexity, and with our own clouded judgement, we can only make our best guesses. It's impossible, or at least not optimal, to abstain from an uninformed judgement when, for example, you are in immediate danger. So you use different models, like "is it good to X", to get a close-enough answer in time. And then, reflecting on it on cold winter nights, you judge your past judgement models and adjust them for future use.
Here is my take on this, completely different from his:
If we were to assume the universe expands to a critical size, then contracts back to the singularity, then repeats the cycle, every action would repeat exactly as it did before. Do we really have free will when our actions are predetermined simply because the events happen in exactly the same order?
If we were to assume the universe expands to the critical size and then contracts back to the singularity, then repeats the cycle.
Why would we assume the second cycle would be identical to the first? I don't think we know enough about the big bang to assume the energy and matter would come out in the same proportions and in the same locations a second time, setting the stage for the second universe to unfold identically to the first.
With quantum physics, science knows the world is not deterministic anyway. Chaos theory is also a thing.
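To illustrate the chaos-theory point, here's a minimal sketch using the logistic map x → 4x(1−x), a standard textbook chaotic system (my choice of example, not from the article): the rule is fully deterministic, yet two starting values differing by one part in ten billion end up nowhere near each other after a few dozen steps.

```python
def trajectory(x, steps):
    """Iterate the logistic map x -> 4x(1 - x): deterministic, but chaotic."""
    xs = []
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
        xs.append(x)
    return xs

a = trajectory(0.2, 80)            # one trajectory
b = trajectory(0.2 + 1e-10, 80)    # same rule, start differs by 1e-10

# Early on the trajectories are indistinguishable; the tiny gap roughly
# doubles each step, so by step ~50 they have completely decorrelated.
early_gap = max(abs(p - q) for p, q in zip(a[:10], b[:10]))
late_gap = max(abs(p - q) for p, q in zip(a[50:], b[50:]))
print(early_gap < 1e-3, late_gap > 0.1)
```

Determinism and predictability come apart here: knowing the rule exactly doesn't help if you can't know the starting state exactly.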
What is the difference then between free will and a random choice?
This guy looks more like a prophet than a philosopher to me. The article focuses more on the politics and consequences of his thesis than on the arguments that would prove it.
It's a funny trick dopamine plays on brains, rewarding them and making them feel empowered in forming memories while informing them and coercing their later actions. The illusion of free will is almost, if not entirely, inescapable.
At the end of the day, brains are just biological neural networks, taking in data and producing a result based on pre-built instincts and previous data/experience.
If you cloned a neural network and gave both copies the same stimuli, you'd get the same result out. Why would you expect anything different for the biological version of the same thing?
The only reason you'd notice two human clones diverging quickly is that it's difficult to control so many stimuli: they'd be reacting to different stimuli from the moment they woke up, and so would be building up different experiences to react from.
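The clone argument can be sketched with a toy "brain" in plain Python (the architecture and numbers here are made up for illustration): two byte-for-byte identical networks fed the identical stimulus produce the identical output, and they only start to diverge once their stimuli differ, however slightly.

```python
import math
import random

def make_brain(seed):
    """A toy one-neuron 'brain'; the weights stand in for neural wiring."""
    rng = random.Random(seed)
    return [rng.uniform(-1, 1) for _ in range(4)]

def react(brain, stimulus):
    # Weighted sum squashed through tanh: stimulus in, behaviour out.
    return math.tanh(sum(w * s for w, s in zip(brain, stimulus)))

original = make_brain(seed=0)
clone = make_brain(seed=0)          # identical wiring: a perfect clone

stimulus = [0.1, -0.3, 0.7, 0.2]
print(react(original, stimulus) == react(clone, stimulus))   # True

# Divergence appears only once the stimuli differ, however slightly.
nudged = [0.1, -0.3, 0.7, 0.2000001]
print(react(original, stimulus) == react(clone, nudged))     # False
```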