Matter Over Mind
After a monthly visit to the depths of Wikipedia, I learned that I am a materialist. Not in the classical sense, but in the sense that I strongly believe that a human being’s mind is the result of the structure inside their brain, coupled with things like hormones and oxygen intake, and nothing else. That every decision we make can be traced back to its roots, and that no matter how deeply we dig, we will never encounter the holy grail of countless religious and philosophical claims called “The Soul”. Because it does not exist.
This might be hard to wrap your head around, but it makes much more sense this way. If the soul were just a figment of our imagination, so many questions would be answered that frankly couldn’t be answered otherwise.
The difficulty is that all those aforementioned claims would turn out to be false. From the existence of an afterlife to the idea that an intelligent mind cannot be emulated or recreated: all false.
It goes strongly against our gut feeling. Like the notion that the human being is the climax of creation, or that the universe revolves around the earth, it is an outdated concept that links back to our ego, making it hard to abolish.
“Of course I have a soul! I am a human being, am I not? I have a free will, a consciousness, a mind, self-awareness, et cetera!”
But what is a soul, really? Is it the little spirit in us that evaporates when we die, and maybe kinda comes back if we are resuscitated? Is it the thing that preserves our character after we die, and changes color as we sin? Does it change along with us when brain damage changes our behavior? Is it what sets us apart from dogs, or maybe cats, or maybe very smart cats or stupid dogs? Does it influence our decisions towards the good or bad?
The idea that the soul does not exist answers all these questions with a nice and easy, “nope.” But it is so contrary to our most intimate beliefs that the notion almost seems evil and heartless to say out loud. You have no soul! And neither do I! We are soulless machines, and literally nothing would set us apart from a robot if we could have it mimic our behavior identically.
If we detach our physical being from some ethereal idea of identity, it would suddenly seem so trivial to copy, kill or create a mind. The very essence of our ethical beliefs would turn out to be equally trivial, a joke almost. But do we really need to tie those to things that we cannot prove? Must we really have that anchor to something we cannot disprove, so that we may use it as a beacon to keep us on the right path? I don’t think so.
Without our will to survive, co-exist, love and create rather than destroy, our very species would not have survived. I think all sane people share the will to live, and project that feeling onto others without needing a fairy tale to urge us.
I recently learned of a thought experiment usually called the “China brain”. Imagine giving everyone living in China a radio and having them exchange signals with each other the same way the neurons in a human brain interact. The net result would, in theory, be a human brain, capable of thought and emotion, albeit an extremely slow one.
It might lack chemical subtleties like hormones, but I think the conclusion holds. I really do think a thinking mind would come to exist, and that it would, in theory, be unethical to kill it.
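The mechanism behind the thought experiment can be sketched in a few lines of code. This is only a toy illustration under my own assumptions, not part of the original experiment: each “person” stands in for a single threshold neuron, and a “radio message” is a signal passed along a wired connection.

```python
# Toy sketch of the China-brain idea (names and numbers are illustrative):
# each Person acts as one threshold neuron; radio traffic is the wiring.

class Person:
    """One participant playing the role of a single threshold neuron."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.inbox = 0  # signals received this round

    def listen(self, signal):
        self.inbox += signal

    def broadcast(self):
        """Fire (return 1) if enough signals arrived this round, then reset."""
        fired = 1 if self.inbox >= self.threshold else 0
        self.inbox = 0
        return fired

def step(people, wiring, active):
    """One round of radio traffic: each active person signals their listeners."""
    for sender in active:
        for listener in wiring.get(sender, []):
            people[listener].listen(1)
    return [i for i, p in enumerate(people) if p.broadcast() == 1]

# Three people wired in a chain: 0 -> 1 -> 2.
people = [Person(threshold=1) for _ in range(3)]
wiring = {0: [1], 1: [2]}

active = [0]                            # person 0 starts by broadcasting
active = step(people, wiring, active)   # signal reaches person 1
active = step(people, wiring, active)   # signal reaches person 2
print(active)  # [2]: the signal has propagated down the chain
```

Scale the same loop up from three participants to a billion, with a realistic wiring table, and you have the thought experiment: nothing in the mechanism changes, only the size.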
Scientists are running similar experiments as we speak, with brains emulated in software, and soon, perhaps even within our lifetime, it will be possible to create a human brain from scratch, synthetically.
There will be a time, like it or not, when an android is built that behaves like a human being. We will have to set it side by side with one of our own and decide which has the soul and which does not. And when we do, we will have to decide where the line is: if the android were given a human, lab-grown brain, would that make a difference? If we augment the human with electronic devices, is he less human? What about devices he can freely turn off or remove? External ones, like glasses or a hearing aid? Are our perceptions not essential to our character?
So many questions will sooner or later need answering, when we have to decide on the rights of an artificially created citizen. They already walk among us: they might look perfectly human, but they were conceived mechanically, in a lab, before being implanted into a human surrogate mother. Still we remain blind to the changes, unwilling to answer urgent philosophical questions before it’s too late.
With beings that will greatly outmatch us in terms of strength and intelligence, we’ll want to be careful. “Too late,” in this case, could mean our own destruction, the very thing we want to avoid. Because isn’t a constructive mind the very thing that makes us human?
It frustrates me that no one in the world seems to care about this but me. No one reads this shit. If I tell people about this, they stop me and say they don’t care. And all that is their god-given right, but if the masses don’t care, who in this world will make the decisions? The companies who build the robots? The politicians? White men in business suits, in large conference halls far, far away from everyday reality.
Mark my words:
By the time the masses wake up, we will be divided into camps, much as religion divides us today. The battles will be equally fanatical, and blood will flow. Both sides will have good arguments, but neither will be able to prove them, so money will be the deciding factor. These AIs will be made.
And if we’re lucky, they won’t wipe us off the face of the earth. It will boil down to blind, simple luck, because morally, philosophically, ethically, we won’t be ready for them.