© The Author(s) 2019
Jacob Turner, Robot Rules, https://doi.org/10.1007/978-3-319-96235-1_9

9. Epilogue

Jacob Turner
Fountain Court Chambers, London, UK

Each generation thinks itself unique: faced with challenges never before experienced and equipped with capabilities none before it has possessed. Perhaps, in this sense, we are no different. But that doesn’t mean that the spectre of AI is just another conceit, a passing phenomenon which we will shortly master and carry on just as before. This book has argued that AI is unlike any other technology created by mankind because it is capable of independent agency: the ability to make important choices and decisions in a manner not planned or predicted by its designers. AI has the power to bring great benefits, but we stand to squander at least some of them if we do not act soon to regulate it.

There is no great cliff-edge over which we will fall if we do nothing and merely continue to muddle through, addressing each issue as it arises. But problems will develop incrementally from a combination of two factors. The first is that AI is becoming ever more integrated into our economies, societies and lives. The second is that if regulation is not considered holistically, it will develop in an uncontrolled, haphazard fashion—with individual countries, regions, NGOs and private companies setting their own standards. The two phenomena will increasingly grind against each other, eventually leading to legal uncertainty, decreased trade and poorly-thought-through rules enacted as a knee-jerk reaction to developing events. Worse still, failing to address public concerns through sensitive regulation could lead to a backlash against the technology.

This book has identified three problems of particular legal significance: Who is responsible for harms or benefits caused by AI? Should AI have rights? And how should the ethical rules for AI be set and implemented? Our answer has not been to write the rules but instead to provide a blueprint for institutions and mechanisms capable of fulfilling this role.

There will of course be many costs and difficulties in legislating now. Technology companies may resist regulation which they think might dent profits; governments may lack the determination to legislate for problems which might only arise when they are no longer in power. Individual citizens and interest groups will need to become educated and engaged if they are to have an impact on the debate. Countries will need to overcome political mistrust in order to collaborate on global solutions. None of these problems is insurmountable. Indeed, we can draw many lessons from how similar hurdles were overcome in the past.

The challenge of writing rules for robots is clear. The tools are at our disposal. The question is not whether we can, but whether we will.