Comments

Kaedys

Ah, there it is, right to the crux of the entire philosophical debate over sentience, free will, and the nature of intelligence. Can we truly judge the intelligence and sentience of other entities if we cannot even demonstrate that *we* are sentient, that *we* have free will? All of our science to date suggests that we ourselves are merely extremely advanced biological computers, acting out the mostly deterministic chemical "code" defined by our genes and environment. Arguments could be made on quantum-physical grounds, since that's where determinism stops, but then why are we sentient and not, say, a grain of rice, a house fly, or a salmon? The same quantum mechanical tricks apply to them, so what makes us different? Or are we really just a collection of chemical processes with delusions of grandeur?

Edit: and it gets really messy in that regard, too. Literally the entirety of our legal, ethical, and political theory is fundamentally predicated on the assumption that we act with free will. If that's not the case, if we lack true free will, can we be held responsible for our actions? You don't punish a computer program when it does something unexpected, because computer programs are deterministic. If *we* are fundamentally deterministic, even if orders of magnitude more complex, can we even *have* a concept of justice, of right and wrong? That path quickly leads to madness.

Janne Hurskainen

Brains are only deterministic in the sense that they decide based on past experience and current stimuli, so if you make a mistake once, you learn and know not to make that mistake again. I.e., the brain makes its decision based on the current knowledge it has. There is also a legal term, "not criminally responsible," for situations where it is thought that the person did not have 'free will' at the time; one plea that is frequently used is "temporary insanity." So in the case of justice, as long as you know that what you are doing is illegal and you are not insane, you are responsible for your actions. The brain knows this, and it's one more item among the inputs it has when making a decision, so it stands to reason that the brain should suffer the consequences of those actions.