I believe in free will. By free will, I mean that I decide what I try to do, and how hard. I may not be able to judge how hard I need to try to achieve something. I may not succeed even if I push myself to my limits.
I also mean I can choose what to believe and what not to, even though I may be grossly limited in finding out what I should believe (usually, the truth), let alone making my beliefs come true.
If I'm wrong about this, it's not my fault.
Intelligence and sentience. Sensing and feeling. Decision making and free will. They all seem intertwined.
Let us say I keep a jug of water in front of a stone. The stone can "see" it, inasmuch as any light in the surroundings reflected by the jug will hit the stone. If the stone's surface is made of silver bromide or some other photosensitive compound, it will even react to the light. Responding to stimuli?
When we attach a camera to a microprocessor and program it, we can make the processor print out "Hey, you've kept a jug near me!" on a monitor also connected to it, or even announce it through a connected speaker.
We can do a lot more with the microprocessor: we can have different messages, chosen on the basis of the time of day, the colour of the jug, maybe the people it "sees" around itself, and so on. But is it the microprocessor's choice? No, it is a direct result of the if-elses within the program.
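The if-else behaviour described above can be sketched as a toy Python function. The messages, the colour check, and the morning cutoff are all invented for illustration; the point is only that every "choice" traces back to a branch the programmer wrote:

```python
from datetime import datetime

def describe_jug(colour, hour=None):
    """Pick a message for a detected jug using simple if-else rules.

    `colour` would come from an image-recognition step in a real system;
    here it is just a string passed in by hand.
    """
    if hour is None:
        hour = datetime.now().hour  # time of day as one of the inputs
    if colour == "blue":
        return "Hey, you've kept a blue jug near me!"
    elif hour < 12:
        return "Good morning! I see a jug."
    else:
        return "Hey, you've kept a jug near me!"

# Given the same inputs, the output is always the same:
print(describe_jug("blue"))
```

However many branches we pile on, the mapping from inputs to message stays fully determined by the program text.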
The program could use random numbers to make its outcomes harder to predict. Purer random numbers can be generated from some of the external inputs the processor receives. Besides, after quantum physics, we believe there is some ultimately pure randomness too, which we can cleverly introduce (give a place of significance) into the program. Then the programmers themselves would be rendered completely unable to predict the processor's future actions.
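The two grades of randomness mentioned above can be contrasted in a short Python sketch. The message list is invented for illustration; `random.Random` is a seeded pseudo-random generator (predictable to anyone who knows the seed), while the `secrets` module draws on entropy the operating system gathers from unpredictable external events:

```python
import random
import secrets

messages = ["Hey, a jug!", "Water again?", "Nice jug."]

# Pseudo-random: fully reproducible given the seed, so the
# programmer can still predict every choice in advance.
rng = random.Random(42)
print(rng.choice(messages))

# OS-supplied entropy: not reproducible from the program text,
# so even the programmer cannot predict the outcome.
print(secrets.choice(messages))
```

A true hardware or quantum random source would play the same role as `secrets` here, only with stronger guarantees about where the unpredictability comes from.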
Would the programmed microprocessor be responsible for its actions?
If it is not, and we are, there is something we have that sophisticated machines don't. Let us call this "zeraird"+. I perceive the contrast between sensing (as a weighing machine does) and feeling (as we do when we lift a baby) as so striking that I think it, too, must be a result of our having zerairds. Intellectually speaking, we do not know whether weighing machines feel too, which is why I took this post along the thinking–acting lines.
+ Sadly, even among people who would patiently read through such a load of logic and philosophy, very few would not be put off by the word 'soul'. Hence, "zerairds, free will and sentience".