The biggest threat to our future, according to Elon Musk, is the development of artificial intelligence (AI).
“I think we should be very careful about artificial intelligence. If I had to guess at what our biggest existential threat is, it’s probably that. So we need to be very careful. I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish.”
Musk raises two key issues. The first is the care we should take in implementing AI. The second, and perhaps the greater challenge, is how to regulate and oversee the technology.
In today’s rush to adopt machine learning and AI techniques for competitive advantage, there is a real danger that the technology will be deployed in fields where it is unsuitable, ultimately encoding bias and unfairness into our society.
Cathy O’Neil, in her book Weapons of Math Destruction, lays out the dangers of the rapid and unthinking adoption of machine learning and AI in a variety of fields, including US prison sentencing and education. Her main thesis is that predictive models are never neutral but reflect the goals and ideologies of those who create them. They also tend to load the dice against poor people, reinforcing inequality in society.
From calculating university rankings or credit ratings and processing job applications, to deciding what advertising you see online or what stories appear in your Facebook news feed, algorithms play an increasingly important role in our lives.
The conclusion is that at the design phase of any machine learning system, we must test for and avoid obvious biases in the historical data and in the inputs themselves.
That, however, says little about how we might effectively regulate the technology in real time. One option to consider is Blockchain, which has a crucial role to play in the regulation of machine learning and AI technologies. Having an immutable, tamper-proof record of actions is a key mechanism for regulating autonomous AI entities.
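As a sketch of what such a tamper-proof record might look like, the snippet below hash-chains each recorded action to the one before it, so that altering any past entry invalidates every entry that follows. The function names and record fields here are illustrative assumptions, not an existing Blockchain API.

```python
import hashlib
import json

def record_action(chain, agent_id, action):
    """Append an agent's action, hash-linked to the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {"agent": agent_id, "action": action, "prev": prev_hash}
    # Canonical serialization (sorted keys) so the hash is reproducible.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    chain.append(entry)
    return entry

def verify_chain(chain):
    """Return True only if no entry has been altered after the fact."""
    prev_hash = "0" * 64
    for entry in chain:
        if entry["prev"] != prev_hash:
            return False
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True
```

A single list of dicts stands in for the distributed ledger; the point is only that changing any recorded action breaks the hash chain and is therefore detectable by everyone else.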
Blockchain can also enable access control for any autonomous AI entity based on that agent’s reputation. After all, possibly the only sanction you can apply to a non-human autonomous AI entity is to terminate it or, failing that, to revoke its access rights to the community. You could consider this a modern-day “excommunication.”
So, if we imagine a future in which standards define what an AI must write to the Blockchain, and an access control mechanism weighs the “reputation” of each autonomous AI entity, then we have a control that no autonomous AI agent can circumvent.
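To make the reputation idea concrete, here is a minimal sketch of an access-control gate driven by a ledger of recorded actions. The scoring rule (plus one per permitted action, minus one per banned action) and the threshold are invented for illustration; a real scheme would need a far more careful reputation function.

```python
def reputation(ledger, agent_id, banned_actions):
    """Score an agent from its recorded actions: +1 for each
    permitted action, -1 for each banned one (illustrative rule)."""
    score = 0
    for entry in ledger:
        if entry["agent"] == agent_id:
            score += -1 if entry["action"] in banned_actions else 1
    return score

def has_access(ledger, agent_id, banned_actions, threshold=0):
    """The 'excommunication' check: access is granted only while
    the agent's reputation stays at or above the threshold."""
    return reputation(ledger, agent_id, banned_actions) >= threshold
```

Because the ledger itself is immutable, an agent cannot erase the entries that lower its score, which is what makes the sanction enforceable.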
If we look at this from a philosophical perspective, the closest approach stems from the English philosopher Thomas Hobbes who wrote the seminal text Leviathan. In this work, he describes the need for citizens to give up some of their power to a ruler or “leviathan” for the good of society.
Much has been written about “perverse instantiation,” where an AI achieves its stated goal in a way its creators never intended. However, very little has been written about how we might actually create a leviathan to regulate AI and machine learning.
One answer is an AI leviathan that enforces identification and reporting by autonomous AI agents to a Blockchain. The immutability of that Blockchain ensures no AI agent can tamper with it, establishing a reputation ledger for all autonomous AI agents. This leviathan would effectively police the agreed set of rules and deliver sanctions when they are broken.
Many have reported on the use cases of Blockchain that keep springing up everywhere. Not to put too fine a point on it, but Blockchain’s most important use case could be this: the survival of humanity.
The article was originally published on The Cointelegraph and is re-posted here by permission.