As we stand on the cusp of a new era, the question of whose ethics should guide the robots of tomorrow becomes increasingly pertinent. This exploration delves into the complexities of programming morality, a task as nuanced as humanity itself.
Are values and morals really about what the majority want?
When we make moral decisions, we weigh competing principles and values against one another. It's unlikely that we can fully recreate that process in AI
- One way to instill ethics in AI is to have everyone vote on what robots should do in particular situations
- In a world of machine learning, automated robots, and driverless cars that demands answers, the philosophical question is: whose answers do we choose? Whose worldview should serve as the template for AI morality?
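The "everyone votes" proposal above amounts to aggregating many individual judgments and letting the majority answer stand. A minimal sketch of that idea, with invented scenario names and responses purely for illustration (nothing here comes from the Moral Machine data):

```python
# Hypothetical sketch of "crowdsourced ethics" as simple majority voting.
# The scenario and the responses below are invented for illustration only.
from collections import Counter

def majority_choice(votes):
    """Return the action chosen by the most respondents for a scenario."""
    tally = Counter(votes)
    action, _count = tally.most_common(1)[0]
    return action

# Invented responses to one dilemma: should the car swerve or stay on course?
responses = ["swerve", "stay", "swerve", "swerve", "stay"]
print(majority_choice(responses))  # prints "swerve"
```

The sketch also makes the philosophical worry concrete: a bare majority settles every dilemma, with no weighing of principles, minority views, or the reasons behind each vote.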
Purpose of thought experiments
Moral thought experiments serve to perturb the initial conditions of a moral situation until one’s intuitions vary
- This can be philosophically useful insofar as we may be able to analyze which salient feature of the dilemma caused that variance
- Thought experiments are evidence that either corroborates or falsifies a certain hypothesis. They are not the hypothesis itself
Building ethical AI
As machines start to integrate more and more into our lives, they will inevitably face more ethical decisions
- There is no “perfect moral citizen” and there is no black and white when it comes to right and wrong
- So, how are we to square all this? Who should decide how our new machines behave?
What would you do if…?
An international team of philosophers, scientists, and data analysts gave us The Moral Machine Experiment
- 40 million responses across 233 countries and territories, recording what people said they would do in various moral dilemmas
- Most people favor humans over animals, many humans over fewer, and younger over older
- Respondents tended to prefer saving women over men, doctors over athletes, and the fit over the unfit
The democratization of values for ethical AI
How do we know what we say we’ll do will match what we’ll actually do?
- In many cases, our intentions and actions are jarringly different.
- It’s not clear that values and ethics are something that should be democratized.