Governments, industry and world bodies like the UN must join together to quickly develop policy that puts the brakes on killer robots and autonomous weapons systems, Clearpath Robotics CTO and co-founder Ryan Gariepy told a Waterloo audience Wednesday night.

The cost of inaction, he said, is a threat on par with nuclear, biological and chemical weapons, and one that may well prove more difficult to rein in if action isn’t undertaken soon.

“We felt like it was important for us to put our necks out there and say, as researchers, engineers, business people in the field, that this is now a very real threat,” Gariepy said, speaking at an event called Make Cool Stuff (Not Weapons), hosted by Waterloo-based Project Ploughshares at the Canadian Clay and Glass Gallery.

Kitchener-based Clearpath, in 2014, became the first robotics company in the world to pledge not to make killer robots. Since then, the company has become a member of the Campaign to Stop Killer Robots, and Gariepy has become an outspoken advocate at the United Nations and elsewhere, calling for measures to prevent the proliferation of weaponized autonomous technology.

“The reason we spoke out is because we don’t want to see our products used in this fashion,” Gariepy told the audience. “We do not want to see a world where this technology is used broadly. We do not want to see a news article one day where we see our black and yellow robots being armed with weapons and being sent to the battlefield to kill, possibly indiscriminately.

“But we can’t stop this on our own.”

Project Ploughshares, the event’s sponsor, is an internationally respected NGO with deep local roots that specializes in arms control, disarmament and international security. Like Clearpath, it too is a member of the Campaign to Stop Killer Robots and a frequent participant at the UN.

“This rapid advancement is really pushing and testing ethical boundaries, moral boundaries, legal boundaries, military boundaries,” said Ploughshares Executive Director Cesar Jaramillo. “There is little doubt that they have the potential to dramatically alter long-held paradigms about the conduct of warfare.

“We are not techno-skeptics. We recognize that many of these emerging technologies do have very concrete social and economic benefits for humanity.

“But they can also pose very serious threats.”

Gariepy described his company’s own encounters with AI that fails to work in the way that its designers expected, and based on that experience said he can envision weapons systems making mistakes that cost lives.

“Robots fail in different ways,” he said. “We do research on neural networks at [our sister company] Otto [Motors]. There was a data set we ran through, a network that we ran through, that would periodically misidentify other Ottos as furniture. Household furniture. It makes zero sense. Yet that’s what the Otto thought.

“People don’t understand that these AI algorithms that they’re hanging their hopes on will fail in different and spectacular ways.”

The tenor and content of the discussion were in keeping with the theme of Communitech’s two-day True North Conference, held last May in Kitchener, which focused on Tech for Good. The 2019 version of the event, with a similar theme, is slated for June 19-20.

Gariepy said the progress being made on autonomous weapons systems is rapid and now threatens geopolitical stability.

“Is this right? Is this ethical? What about the chain of responsibility? What about the rules of engagement?” said Gariepy.

“The mass deployment of autonomous weapons will significantly change what it means to wage war. Would the Vietnam War have stopped as relatively quickly” if machines replaced people and “if American soldiers were not dying every day?”

Gariepy said he could envision the competing algorithms of different nations generating what he called “a flash war,” much like the flash crashes periodically triggered by algorithmic trading on the stock market.

“When one of those things happens, the stock market gets rolled back, but you can’t do that with human lives,” he said.

“If you have autonomous weapons on both sides of a contentious border, of which there are dozens around the world, and one thing goes wrong, one switch that gets flipped, one test case that hasn’t been run, you could wind up with a war that escalates before humans have a chance to even step in.”

Lending Gariepy’s words additional potency and credibility was his disclosure, early in his talk, that Clearpath is a defence contractor.

“We’ve worked with major defence companies, Northrop Grumman, General Dynamics [and] others, as well as, I believe, all branches of the American and Canadian Forces.

“We do provide robots to the military. We believe that this is acceptable, that this is actually a good thing to do – providing the military serves a valid purpose in this society and this world. Even the United Nations was not designed to outlaw war.

“But what we do not believe is that autonomous weapons systems [are] acceptable. We believe there is a line, and that autonomous weapons systems cross this line.”

Anticipating the apparent contradiction of a company that sells to the military while simultaneously advocating restrictions on the weaponizing of robots, Gariepy told a story: he described being presented with pictures of his company’s robots outfitted with automatic weapons.

“To those who would say, ‘Well, you shouldn’t sell to defence contractors,’ it wouldn’t have actually helped. This robot [in the picture] was sold to a school, a university, one of the 500-odd universities we sell to. And then I believe it was loaned by that university to a private company. We had no control over this.”

In other words, militaries aren’t necessarily the organizations that concern him. He emphasized the point later, noting that because of his company’s public stand against killer robots, militaries and defence-related companies that intend to use the technology for automated weapons systems tend to “self-select” and avoid doing business with Clearpath, knowing they would be unable to obtain follow-up contracts or support for the robotic technology they purchase.

He said that technology developed for the military has often found important civilian applications: nuclear weapons research led to nuclear power for electricity and medicine; the Global Positioning System, originally built for U.S. military use, now helps the world’s civilian aircraft and ships navigate safely; and the Internet, which began as a military research network, has transformed and connected the entire world.

Likewise, said Gariepy, the products Clearpath sells to the military have applications for “search and rescue, space exploration, fire and police, ocean exploration and large swaths of industry.”

But “a little device that says, ‘Should I kill that person or should I not?’ That’s really only good for killing.”

And what steps can be taken to rein in the development of autonomous weapons? Walk away, for starters, Gariepy said.

It doesn’t follow that just “because the technology exists that it must therefore scale, as if it’s a given,” he said.

“My one piece of truth by example on this one is we had people on the moon in the 1960s and 1970s. Where is my moon cottage? We should have colonized the moon and Mars [by now]. That we haven’t done it is proof that if we put our minds to ignoring things, we can actually do that.

“Climate change is another big one,” he said, to laughter.

What we can do, and must, with respect to killer robots and autonomous weapons systems, is “make sure there is traceable, failsafe, human control.”

Thus far, 26 countries, including Austria, Brazil and China, have called for a ban on killer robots. Canada is not among them.