Robot laws: 5 new rules that could save human lives (at least on TV)
From Battlestar Galactica to The Terminator, on-screen robots have never been above a little rule-breaking. Could our new laws of robotics keep them in line?
How are you supposed to behave if you’re a robot? According to the science fiction writer Isaac Asimov, there are three simple rules: don’t let human beings come to harm, obey orders and protect yourself – in that order of priority.
Asimov first set out these three laws of robotics in his 1942 short story Runaround, but since then they’ve become staples of the genre, violated by artificial intelligences in everything from Doctor Who to Alien. As much as they continue to dominate the conversation, however, Asimov’s laws are rapidly becoming out of date.
After all, we live in a world fast filling with actual robots: driving our cars, performing our medical procedures, influencing our elections and threatening to take our jobs. Can a new set of laws ever keep us safe?
With that in mind, we’ve drawn up a revised set of laws to help us live alongside artificial intelligence in the 21st century (to read more, see “Robot laws: Why we need a code of conduct for AI – and fast”). Would they still make for entertaining science fiction? Note, spoilers follow…
Battlestar Galactica’s notorious cylons, who wipe out most of the human race before chasing the survivors across the galaxy, would certainly fall foul of our first law: a robot may not injure a human being or allow a human being to come to harm – unless it is being supervised by another human.
In 2001: A Space Odyssey, the shipboard computer HAL 9000 dooms its crew for reasons that are left ambiguous. That would place it in contravention of our second law: a robot must be able to explain itself. Not necessarily verbally or even explicitly, but by having transparent coding that makes its actions and motivations clear.
The robot women of the 1975 film The Stepford Wives are designed to fulfil a sexist ideal of female behaviour. If we want to ensure that the machines of the future don’t perpetuate our own human prejudices, then it’s important we teach them to rise above the petty stereotyping that informs so many of our own thoughts and actions. In the words of our third law: a robot must treat all human beings equally.
The eponymous theme park where the TV show Westworld is set depends on sophisticated robots that are indistinguishable from humans. As the series progresses, in fact, the distinction becomes ever more blurry. None of this would be permitted with our fourth law: a robot must not impersonate a human.
Shoot them, freeze them, break them apart: there’s virtually no stopping the Terminator robots. If we are to ensure that robots stay in their lane, then we’ll need to institute our fifth law: a robot should always have an off switch.