As we discussed in class, unimaginable sums of money are being poured into the development of AI. But I honestly wonder sometimes whether any of these developers stop to ask themselves, "Why AI?" Disclaimer: I am not the biggest fan of the kind of AI technology we talked about in class, and this post will explain my position.
Most of the time, technological development is a wonderful thing. I am a big fan of technology designed to assist me throughout my life, and possibly even save my life in an emergency room. But am I a fan of technology designed to replace me, my friends, my family, my fellow humans? No. And my question to those who favor such technology is, you guessed it: why?
The people developing AI, unfortunately, seem to have a very narrow view of what our brains do. Yes, our brains store and interpret information. But our brains are not computers. We are animals, and the oldest parts of our brains are responsible for some of our most primal instincts and emotions. I have a feeling we aren't going to code into a computer the parts of our brain that contribute to the struggle of the human condition: jealousy, anger, fear, and so on.
At least, we wouldn't do that if we are trying to make a computer brain superior to ours. But I find myself asking why again. Why would we willfully create a computer organism that has all of our smarts but none of our faults? Are we asking for a sci-fi movie of robots vs. humans to come true? Are we so sure that these computers will forever be our allies and never outsmart us?
If we are trying to make AI computers exactly like ourselves, I still wonder why. Humans are already here. We don't need robot friends or spouses or coworkers, because we can make those connections with living, breathing human beings right here and right now. There is no shortage of human beings on earth, either.
If the reason we are developing this technology is so that we can merge with it and live forever in "the cloud" playing the saxophone all day (as Dr. Oliver imagined), then there are even more important questions to ask. I mean, will we have to upgrade our storage every now and then to accommodate new souls, or will we stop reproducing altogether? If we can live forever, will life feel as meaningful? Our lives are special in part because we know they are only temporary. I'm not saying with 100% certainty that life would become devoid of meaning if we found a way to live forever, but who knows? I sure don't. And I doubt that Silicon Valley is spending enough time thinking about this question to know the answer either.
Our species has found many ways to defy nature. Most of the time, it's great for us. But much of the time, at least these days, it isn't so great for everything and everyone else. Surely, in a matter of time, our efforts will begin to backfire on us too. I hope that we can continue to progress (science is awesome, and I am excited about self-driving cars), but in a way that actually brings some benefit to ourselves and the world.
Rant over. What do you think?
For better or worse, our only comprehension of what it means to be human is tied to our embodied and mortal condition. Eternal life in the cloud, if you could call it living, would be a different "game" for sure. It's easy enough to say that what we call our "self" is merely a bundle of perceptions, and that such bundles could as easily reside in computers as on planets, but the permanent loss of corporeal existence sure sounds to me like the loss of something fundamental and irreplaceable. Not saying I would not want my digital avatar on file after I'm gone, as it were... but though it might in some sense possess my memories and patterns of thought, I cannot conceive of it as an extension of my self. Now, if there were a way to embody that digitized persona in an upgraded and sensate robot body, I might be more enthusiastic about the posthuman age of spiritual machines. But those posthumans are still going to need a healthy planet, no? Or can they just go live on Mars or Venus, impervious to the climatic elements? The more sci-fi this gets, the more I miss Mother Earth.
"The more sci-fi this gets, the more I miss Mother Earth." I couldn't agree more with that!
Most excellent post, Heather. "Why" is indeed the first question we should have for everything.
Thanks, Ed! I agree. I think that if people started asking "why" a bit more often, the world would be a better place.
Heather, this is a terrific post. I meant to bring it up in class today; sorry, I got distracted. Maybe next time.
Asking "why" is so important, and I think many people today, especially in governmental realms, get annoyed with explanation. Or maybe it's that they don't want people to know the "why" behind things. Whatever the case may be, like you mentioned, we would be a lot better off if they asked. People would be held accountable for the things they do and say, and that's especially important at the governmental level.
Heather, I do believe that you are correct in saying that AI is not the way to go. I would also like to throw in my two cents: when you think about all the information a human brain processes during the day, nothing we have created comes even close to what we can do.
I agree that there are many inherent dangers in the idea of living in the cloud and creating more "human" types of AI. If we live forever in the cloud, who controls and maintains that space? What happens if, once in the cloud, you decide you want to be deleted? What happens if someone finds a way to use your cloud likeness for monetary or political gain? Also, I think the negatives of giving computers human-like AI outweigh the positives. In fact, the only reasonable positive to making AI seem more human would be a more conversational interface, but there are ways to simulate speech and syntax without giving computers "real emotions."