Thursday, October 8, 2015


An editorial in today's paper made the suggestion that there be a Hippocratic oath for computers - well - for computer programmers, I suppose.  In light of the Volkswagen automobile computer scandal, ethicists are wondering what kind of boundaries should be placed on those who tell computers how to "think."

In a nutshell, the computers onboard VWs were programmed to lie to humans - to tell them that the cars were not polluting the air that humans were breathing - thereby allowing humans to continue slowly poisoning themselves unawares.  Once the computer was thus programmed, it required no further human input to continue the lying.

It was suggested that a code of ethics be developed for programmers much the same as we have for doctors.  Rule #1 - do no harm to humans, etc . . .

Of course, on the other hand, we want computers to think as humanly as possible.  It seems we are at cross purposes here.  Suppose your car could think like a human and it was mulling over the ethics of lying to the government about whether your car was putting the earth and humanity at risk.  Doctors face these questions all the time.

"Do no harm."  Hypcratic oath #1.

But what if a human being comes to you and wants you to put another human being to death because that human being threatens a good life for her?  Her rights, her body, her other desires take precedence, and medical professionals can easily rationalize what they are doing and believe they are still fulfilling their Hippocratic oath.  In California and Utah, doctors can feel they are still in line with their oath by granting their patients' wish to die.  Allegedly, Planned Parenthood can sell viable body parts from what they otherwise deem non-viable human beings with the idea that they are making the world a better place, and proclaim that they have broken no laws and therefore are still moral and have not broken their promise to do no harm.
So my car.  Do I want it to think like a human being?

"I know I should not be polluting like this BUT think of my poor owner.  He can barely afford me let alone get a new car.  And if this becomes part of a recall, think of all the poor workers and stockholders that will lose their shirts - not to mention call dealers, advertisers, and truck drivers.  What is a little more poison in the atmosphere compared to all the good that could be accomplished if I just keep my printouts to myself?"

Do we really want computers to think like us?

1 comment:

Anonymous said...

Interesting. Why did the person making the suggestion deal with the issue of harm rather than just going straight for the "Do no lie" rule which would solve the whole problem?

But that would mean giving up our convenient little problem-solver, aka, lying, fibbing, bending the truth.

The amount of lying that goes on in our businesses these days is mind-boggling. The pressure to lie is extreme. Everyone _says_ they are against lying - until the lie will make their lives easier, more convenient, or gets them what they want. Then it is perfectly okay because, you know, "no harm, no foul."

One of the hardest things about being a Catholic in the workforce is watching the pressure on the "kids" (people in their first or second job out of school) to go along with dishonesty. You watch a part of them die. They go from "ummm, I don't think that is right" to "whatever, just get it done. It doesn't matter."