An editorial in today's paper made the suggestion that there be a Hippocratic oath for computers - well - for computer programmers, I suppose. In light of the Volkswagen automobile computer scandal, ethicists are wondering what kind of boundaries should be placed on those who tell computers how to "think."
In a nutshell, the computers onboard VWs were programmed to lie to humans - to tell them that the cars were not polluting the air that humans were breathing - thereby allowing humans to slowly continue poisoning themselves unawares. Once the computer was thus programmed, it required no further human input to continue the lying.
It was suggested that a code of ethics be developed for programmers much the same as we have for doctors. Rule #1 - do no harm to humans, etc . . .
Of course, on the other hand, we want computers to think as humanly as possible. It seems we are at cross purposes here. Suppose your car could think like a human and it was mulling over the ethics of lying to the government about whether it was putting the earth and humanity at risk. Doctors face these questions all the time.
"Do no harm." Hippocratic oath #1.
But what if a human being comes to you and wants you to put another human being to death because that human being threatens a good life for her? Her rights, her body, her other desires take precedence, and medical professionals can easily rationalize what they are doing and believe they are still fulfilling their Hippocratic oath. In California and Utah, doctors can feel they are still in line with their oath by granting their patients' wish to die. Allegedly, Planned Parenthood can sell viable body parts from what they otherwise deem non-viable human beings with the idea that they are making the world a better place, and proclaim that they have broken no laws and therefore are still moral and have not broken their promise to do no harm.
So my car. Do I want it to think like a human being?
"I know I should not be polluting like this, BUT think of my poor owner. He can barely afford me, let alone a new car. And if this becomes part of a recall, think of all the poor workers and stockholders who will lose their shirts - not to mention car dealers, advertisers, and truck drivers. What is a little more poison in the atmosphere compared to all the good that could be accomplished if I just keep my printouts to myself?"
Do we really want computers to think like us?