Saturday, July 16, 2011

Zerairds, free will and sentience


I believe in free will. By free will, I mean that I decide what I try to do, and how hard. I may not be able to judge how hard I need to try to achieve something. I may not succeed even if I push myself to my limits.

I also mean I can choose what to believe and what not to, even though I may be grossly limited in finding out what I should believe (usually, the truth), let alone making my beliefs come true.

If I'm wrong about this, it's not my fault.


Intelligence and sentience. Sensing and feeling. Decision making and free will. They all seem intertwined.

Let us say I keep a jug of water in front of a stone. The stone can "see" it, inasmuch as, if there's light in the surroundings, light reflected by the jug will hit the stone. If the stone's surface is made of silver bromide or some such, it will even react to the light. Responding to stimuli?

When we attach a camera to a microprocessor and program it, we can make the processor print "Hey, you've kept a jug near me!" on a monitor also connected to it — or even announce it through a connected speaker.

We can do a lot more with the microprocessor: we can have different messages, chosen on the basis of the time of day, the colour of the jug — maybe the people it "sees" around itself, and so on. But is it the microprocessor's choice? No, it is a direct result of the if-elses within the program.
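A minimal sketch of such a program (the function, the messages, and the inputs are all made up for illustration) — the "choice" is nothing more than branching on the inputs:

```python
from datetime import datetime

def choose_message(jug_colour, now=None):
    """Pick a message purely by if-elses on the inputs — no choosing involved."""
    hour = (now or datetime.now()).hour
    if hour < 12:
        greeting = "Good morning!"
    elif hour < 18:
        greeting = "Good afternoon!"
    else:
        greeting = "Good evening!"
    if jug_colour == "blue":
        return greeting + " Hey, you've kept a blue jug near me!"
    return greeting + " Hey, you've kept a jug near me!"
```

Given the same inputs, the function always returns the same message; anyone who can read the program can predict its every "decision".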

The program could use random numbers to make its outcomes harder to predict. Purer random numbers can be generated from some of the external inputs the processor receives. Besides, after quantum physics, we believe there is some ultimately pure randomness too, which we can cleverly introduce (give a place of significance) into the program. Then even the programmer would be rendered completely unable to predict the processor's future actions.


Getting somewhere?

Would the programmed microprocessor be responsible for its actions?

If it is not, and we are, there is something we have that sophisticated machines don't. Let us call this "zeraird"+. The contrast between sensing (as a weighing machine also does) and feeling (as we do when we lift a baby) strikes me as so similar to the contrast between computing a decision and choosing freely that I think it, too, must be a result of our having zerairds. Intellectually speaking, we do not know whether weighing machines feel too, which is why I took this post along the thinking–acting lines rather than the sensing–feeling ones.

Thank you!

+ Sadly, even among people who would patiently read through such a load of logic and philosophy, very few would not be put off by the word 'soul'. Hence, "zerairds, free will and sentience".


Rampy said...

A decision will appear to come out of free will or pure randomness only when we don't have adequate knowledge of all the parameters and variables.
Once we know all the variables, everything is predictable; everything is governed by laws of forces and energy.
Free will and sentience don't exist!!

PS: good post Poo !! keep more coming :D

Srikant said...


I can't keep more coming if there is no free will. :( They should happen of their own accord.

Anyway, if there is free will, there are a few I plan to write soon.

Also, as far as we know nature's laws, randomness is genuinely present in nature. For example, in a mass of radioactive material, we can't say (even with infinite computational power) which atoms will decay when, though we can tell that, overall, about this many will disintegrate in this much time.
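A toy simulation makes the point concrete (the atom count and decay probability are made up, and Python's pseudo-randomness merely stands in for nature's genuine randomness): which simulated atom "decays" at which step varies from run to run, yet the surviving total closely tracks the exponential-decay law.

```python
import random

def simulate_decay(n_atoms, p_decay_per_step, steps, rng):
    """Each step, every surviving atom independently decays with probability p.
    Which atom goes when is random; only the aggregate count is lawful."""
    alive = n_atoms
    for _ in range(steps):
        alive -= sum(1 for _ in range(alive) if rng.random() < p_decay_per_step)
    return alive

rng = random.Random(0)
survivors = simulate_decay(10_000, 0.01, 50, rng)
expected = 10_000 * (1 - 0.01) ** 50  # the deterministic decay law, ≈ 6050
```

The individual events are unpredictable; the total comes out within a fraction of a percent of the law's prediction.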