Wouter Beek
2015/01/26
Act only according to a rule of which you can, at the same time, will that it should become a universal law without contradiction.
Absolute and unconditional requirements that must be obeyed in all circumstances.
Deception
Theft
Act only according to that maxim whereby you can, at the same time, will that it should become a universal law without contradiction.
Example: Abortion is wrong if an embryo has (sufficient) sentience and/or sapience.
Conditions for moral status:
If two beings have the same functionality and the same conscious experience, and differ only in the substrate of their implementation, then they have the same moral status.
If two beings have the same functionality and the same conscious experience, and differ only in how they came into existence, then they have the same moral status.
Uploading: A hypothetical future technology enables a human to be transferred from her original implementation in an organic brain onto a digital computer.
Suppose an upload could be sentient; then its subjective rate of time could differ from that of biological humans, depending on the speed of the hardware it runs on.
In cases where the duration of an experience is of basic normative significance, it is the experience's subjective duration that counts.
Not founded in ethics.
Implicitly conditioned on empirical contingencies.
AI acquires ethical requirements through inheritance: machines take over cognitive work with social dimensions from humans.
What if Archimedes of Syracuse had been able to create a long‐lasting AI with a fixed version of the moral code of Ancient Greece?
Deep Blue was not programmed in terms of individual chess moves.
Watson was not programmed in terms of individual trivia questions.
AIs are programmed in terms of the optimization of a non-local criterion (winning chess, answering trivia questions).
Thus: good behavior ought to be a non-local extrapolation of the (future) consequences of generic behavior.
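As a minimal illustration of this point (a sketch, not from the source): in a toy game such as Nim, the programmer specifies only the non-local criterion (winning), and the individual moves emerge from search over that criterion, just as Deep Blue's moves were never individually programmed.

```python
def best_move(stones, maximizing=True):
    """Minimax for simple Nim: players alternately remove 1-3 stones,
    and whoever takes the last stone wins.

    Only the goal ("take the last stone") is specified; the concrete
    move to play emerges from optimizing over future consequences.
    Returns (value, move): value is +1 if the maximizing player can
    force a win from this position, -1 otherwise.
    """
    if stones == 0:
        # The previous player took the last stone and won.
        return (-1 if maximizing else 1), None
    best = None
    for take in (1, 2, 3):
        if take > stones:
            break
        value, _ = best_move(stones - take, not maximizing)
        if best is None \
                or (maximizing and value > best[0]) \
                or (not maximizing and value < best[0]):
            best = (value, take)
    return best

# From 5 stones the first player can force a win by taking 1 stone,
# leaving the opponent in the losing 4-stone position.
print(best_move(5))  # (1, 1)
```

Note the design point: nothing in the code enumerates good or bad moves; "take 1 from 5" is derived, not programmed. The slides' argument is that good *ethical* behavior must likewise be specified as a criterion to optimize, not as a list of local rules.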
Require the AI to think like a human engineer concerned about ethics, not just a simple product of ethical engineering.
An AI programmed by Archimedes, with no more moral expertise than Archimedes had, should be able to recognize our own civilization's ethics as moral progress when compared to Ancient Greece.
Imagine running a dog mind at very high speed. Would a thousand years of doggy living add up to any human insight?
Intuitions about which future scenarios are plausible and realistic may be shaped by what is seen in media, leading people to overestimate the probability of scenarios that make for a good story.
When was the last time you saw a movie about humanity suddenly going extinct (without warning and without being replaced by some other civilization)?