By Ariel Conn
AI Arms Race Principle: An arms race in lethal autonomous weapons should be avoided.*
Perhaps the scariest aspect of the Cold War was the nuclear arms race. At its peak, the U.S. and Russia held over 70,000 nuclear weapons, a mere fraction of which would have been enough to kill every person on Earth. As the race to create increasingly powerful artificial intelligence accelerates, and as governments increasingly test AI capabilities in weapons, many AI experts worry that an equally terrifying AI arms race may already be under way.
In fact, at the end of 2015, the Pentagon requested between $12 and $15 billion for AI and autonomous weaponry in its 2017 budget, and the deputy defense secretary at the time, Robert Work, admitted that he wanted “our competitors to wonder what’s behind the black curtain.” Work also said that the new technologies were “aimed at ensuring a continued military edge over China and Russia.”
But the U.S. does not have a monopoly on this technology, and many fear that countries with lower safety standards could quickly pull ahead. Without adequate safety in place, autonomous weapons could be more difficult to control, create even greater risk of harm to innocent civilians, and more easily fall into the hands of terrorists, dictators, reckless states, or others with nefarious intentions.
Anca Dragan, an assistant professor at UC Berkeley, described the possibility of such an AI arms race as “the equivalent of very cheap and easily accessible nuclear weapons.”
“And that would not fare well for us,” Dragan added.
Unlike nuclear weapons, this new class of WMD can potentially target by traits like race or even by what people have liked on social media.
Lethal Autonomous Weapons
Toby Walsh, a professor at UNSW Australia, took the lead on the 2015 autonomous weapons open letter, which calls for a ban on lethal autonomous weapons and has been signed by over 20,000 people. With regard to that letter and the AI Arms Race Principle, Walsh explained:
“One reason that I got involved in these discussions is that there are some topics I think are very relevant today, and one of them is the arms race that’s happening amongst militaries around the world already, today. This is going to be very destabilizing. It’s going to upset the current world order when people get their hands on these sorts of technologies. It’s actually stupid AI that they’re going to be fielding in this arms race to begin with and that’s actually quite worrying ― that it’s technologies that aren’t going to be able to distinguish between combatants and civilians, and aren’t able to act in accordance with international humanitarian law, and will be used by despots and terrorists and hacked to behave in ways that are completely undesirable. And that’s something that’s happening today.”
When asked about his take on this Principle, University of Montreal professor Yoshua Bengio pointed out that he had signed the autonomous weapons open letter, which basically “says it all” about his concerns of a potential AI arms race.
Details and Definitions
In addition to worrying about the risks of a race, Dragan also expressed a concern over “what to do about it and how to avoid it.”
“I assume international treaties would have to occur here,” she said.
Dragan’s not the only one expecting international treaties. The U.N. recently agreed to begin formal discussions that will likely lead to negotiations on an autonomous weapons ban or restrictions. However, as with so many things, the devil will be in the details.
In reference to an AI arms race, Cornell professor Bart Selman stated, “It should be avoided.” But he also added, “There’s a difference between it ‘should’ be avoided and ‘can’ it be avoided ― that may be a much harder question.”
Selman would like to see “the same kinds of discussions as there were around atomic weapons or biological weapons, where people actually start to look at the tradeoffs and the risks of an arms race.”
“That discussion has to be had,” he said, “and it may actually bring people together in a positive way. Countries could get together and say this is not a good development and we should limit it and avoid it. So to bring it out as a principle, I think the main value there is that we need to have the discussion as a society and with other countries.”
Dan Weld, a professor at the University of Washington, also worries that simply saying an arms race should be avoided is insufficient.
“I fervently hope we don’t see an arms race in lethal autonomous weapons,” Weld explained. “That said, this principle bothered me, because it doesn’t seem to have any operational form. Specifically, an arms race is a dynamic phenomenon that happens when you’ve got multiple agents interacting. It takes two people to race. So whose fault is it if there is a race? I’m worried that both participants will point a finger at the other and say, ‘Hey, I’m not racing! Let’s not have a race, but I’m going to make my weapons more accurate and we can avoid a race if you just relax.’ So what force does the principle have?”
Though preventing an AI arms race may be tricky, there seems to be general consensus that a race would be bad and should be avoided.
“Weaponized AI is a weapon of mass destruction and an AI arms race is likely to lead to an existential catastrophe for humanity,” said Roman Yampolskiy, a professor at the University of Louisville.
Kay Firth-Butterfield, the Executive Director of AI-Austin.org, explained, “Any arms race should be avoided but particularly this one where the stakes are so high and the possibility of such weaponry, if developed, being used within domestic policing is so terrifying.”
But Stanford professor Stefano Ermon may have summed it up best when he said, “Even just with the capabilities we have today it’s not hard to imagine how [AI] could be used in very harmful ways. I don’t want my contributions to the field and any kind of techniques that we’re all developing to do harm to other humans or to develop weapons or to start wars or to be even more deadly than what we already have.”
What do you think?
Is an AI arms race inevitable? How can it be prevented? Can we keep autonomous weapons out of the hands of dictators and terrorists? How can companies and governments work together to build beneficial AI without allowing the technology to be used to create what could be the deadliest weapons the world has ever seen?
This article is part of a weekly series on the 23 Asilomar AI Principles.
The Principles offer a framework to help artificial intelligence benefit as many people as possible. But, as AI expert Toby Walsh said of the Principles, “Of course, it’s just a start. … a work in progress.” The Principles represent the beginning of a conversation, and now we need to follow up with broad discussion about each individual principle. You can read the weekly discussions about previous principles here.
*The AI Arms Race Principle specifically addresses lethal autonomous weapons. Later in the series, we’ll discuss the Race Avoidance Principle, which will look at the risks of companies racing to create AI technology.