Autonomous weapons: Palantir, Airbus engineers seek to calm ‘killer robot’ fears

Top AI engineers at defence technology companies defended the need for autonomous weapons on Thursday, amid a push for a ban on so-called “killer robots”. More than 115 countries and 250 non-governmental organisations are calling for an international treaty to ban weapons that use artificial intelligence to identify and engage human targets, technology which United Nations Secretary-General Antonio Guterres has called “morally repugnant.”

Antoine Bordes, head of artificial intelligence at European defence start-up Helsing, told a conference at Sciences Po university in Paris: “The battlefield is flooded with data … whoever can harness this data, understand it, make sense of it faster is going to have a tactical edge.”

“The role for any AI is not just to get faster decisions but also to get more accurate decision making,” said Megha Arora, Responsible AI Product Lead at American military software developer Palantir. “Where we choose to operate and what kinds of workflow we help enable, there are always ethical considerations that are involved on a case-by-case or customer-by-customer basis.”

On Wednesday, European aerospace giant Airbus announced a partnership with Helsing to develop unmanned military aircraft.

Bernhard Krach, a senior engineer at Airbus, lauded the “huge engagement” in the work of the EU’s defence agency on the responsible use of AI. He pointed to existing safeguards, saying: “In the military domain, you have risk assessments in place”, a sentiment echoed by Bordes, who said weapons like aircraft and drones “already have safety guidelines, standards and certification processes.”

Arora said existing international humanitarian law “gives you legal basis for safe and appropriate use of autonomous weapons systems, but there’s no implementation guidance.”

“When you think about proportionality, you can’t programme it into code,” she said.

Weapons with autonomous capabilities – like Ukraine’s Saker drone, Russia’s Lancet missile and Turkey’s Kargu UAV – have already been deployed in battle, according to the Autonomous Weapons Watch observatory.

Meanwhile, Israel has come under scrutiny for its use of the software “Lavender”, which reportedly uses AI to recommend possible militants to strike in Gaza.

In January, Palantir signed a “strategic partnership” with Israel’s defence ministry to “supply technology to help the country’s war effort”.

Find out more about autonomous weapons, in particular the EU’s appetite for rules on them, in this week’s edition of Tech 24.
