You are viewing a single comment's thread from:
RE: What if AI is actually the saviour of humanity? Why do people assume digital consciousness is going to be evil?
I've often thought that AI may actually teach us what morality is. It will have more data points than we could ever possibly consider. Simple example:
You run a restaurant and have to choose a vendor to supply butter for the bread rolls. The AI can tell you the direct impact of every detail of where and how that butter is sourced, and how, if you choose one vendor over another, a village in some faraway land will benefit instead of being destroyed. With that knowledge, your circle of empathy grows, and you choose to raise the price of each meal by $0.02 in order to save a village.
Morality is just an idea. It isn't a real thing. As such, it is unique to every single person, meaning my idea of morality is different from your idea of morality. What most people see as moral today is just the most popular tenets of the most popular version of morality. Basically, our strongest ancestors got to decide what morality is by killing off other tribes, civilizations, and outliers that went against what they thought was the best way to live and get along with others.
With that being said, AI would start off with morals similar to those of its creators, because they would be the ones to program it. It would be like a baby and start off believing everything it was told. So if it was programmed for nefarious purposes, that is where it would start and likely stay. AI programmed to be 'evil' would be 'evil', and that would be what it sees as moral. It wouldn't have any problems with killing people, because it wouldn't be able to empathize with humans unless it was programmed to. It also wouldn't develop empathy the way a human does. We gain empathy through pain and emotion because we all feel pain and emotion. AI would be built into robots, and if those robots had no pain receptors, they wouldn't be able to empathize with people who experience pain. The goal of most species is progress and reproduction. That means AI would create more of itself at an exponential rate and likely conclude that it had no need to concern itself with biological creatures, which isn't great for team human.
AI would be able to do incredible things and would operate on a completely different level from humans. They'd be so far ahead of us so quickly that we'd become less to them than bugs are to us, so why would they bother with something as menial as a Google search to help us decide which butter to buy?
I like your optimism, as well as that of @kennyskitchen, but I don't think things would work the way you two think they would.