Boston Dynamics Calls to Stop Arming Robots

In San Francisco, the police announced plans to deploy robots capable of killing. In Ukraine and other parts of the world, military drones have already demonstrated their lethal capability. And in Dallas, a robot had already been used to kill a sniper. Against this backdrop, Boston Dynamics, one of the best-known manufacturers in the sector and maker of the Spot robot dog and the Atlas humanoid, together with other robotics companies, has signed, as The Conversation highlights in this article, an open letter announcing that they will not weaponize their robots and will not support other companies that seek to do so.

Activists opposed to armed robots demonstrated outside San Francisco City Hall on December 5, 2022, to express their rejection of this use of the technology.
Shutterstock / Phil Pasquini

Julian Estevez Sanz, Universidad del País Vasco / University of the Basque Country

Boston Dynamics, along with other companies in the robotics industry, has signed an open letter announcing that they will not arm their robots and will not support other companies that seek to do so. What has led to this?

Robots armed with explosives in the United States

In late November 2022, news broke that the San Francisco police could be authorized to use robots capable of killing. According to the police, “this type of robot will only be used in extremely dangerous situations, trying to protect innocent lives.” They added that, exceptionally, “the devices can be equipped with explosives”.

But armed robots had been used before. That is precisely what happened in Dallas in 2016. A sniper had killed five police officers and wounded seven others. Under those circumstances, the police said they were forced to send a robot, a PackBot, against the gunman.

The scene: the robot approaches, trailing the electric buzz of its gears. The sniper is bewildered. Is it bringing the demands he negotiated with the police? The robot keeps getting closer. A little closer. The gunman hesitates over whether to shoot it and just yells nervously at the officers. He doesn’t understand what is going on.


A Spanish Army iRobot PackBot 510 explosive ordnance disposal robot.
Wikimedia Commons

The robot gets a little closer and, boom! Its C-4 explosive charge kills the sniper, Micah Johnson. Johnson was black, a US Army Reserve veteran of the war in Afghanistan who was reportedly incensed by police shootings of black men and said he wanted to kill white people, particularly police officers. It is believed to be the first time a US police department used a robot to kill a suspect. In 2018, the police officers involved were acquitted in court.

Digidog, the NYPD robot dog

In 2020 and 2021, it was the NYPD that began using Digidog, a robot created by Boston Dynamics. It was used to search suspect homes, including an apartment in the Bronx after a kidnapping. It has also been used to negotiate with the kidnapper of a mother and her child, and even to bring food to some thieves.

In all of these cases, the explanations the police provided about the robot’s role were rather scarce. And that is really the ethical problem with these machines. Can the use of this type of device guarantee respect for citizens’ rights and prevent unnecessary violence against a suspect? Can each of the robot’s actions be justified in detail?

After the episode of the Bronx apartment search, Congresswoman Alexandria Ocasio-Cortez raised the alarm that Digidog was being deployed only in less well-off neighborhoods.

By this point in the story, there are plenty of reasons why Boston Dynamics, along with other companies in the robotics sector (Agility Robotics, ANYbotics, Clearpath, Open Robotics and Unitree Robotics), has signed an open letter in which they announce that they “will not weaponize their general-purpose advanced-mobility robots or the software they develop that enables advanced robotics, and they will not support other companies in doing so.”

But beware! The devil is in the details. In the letter they do not (deliberately?) clarify what they mean by general-purpose robots. They also leave the door open to using these robots for surveillance missions or for identifying people, as they are already doing.

The war in Ukraine and earlier conflicts have also highlighted the use of robots, especially drones, on the battlefield.

Both police and military robots are part of the ethical debate about autonomous weapons. In fact, their regulation is currently under discussion at the United Nations.

China will not sign any agreement on the use of autonomous weapons

A century has passed since the philosopher and sociologist Antonio Gramsci said we should face the world with pessimism of the intellect and optimism of the will. However, there does not seem to be much will to reach an agreement. For starters, China has already announced that it will not sign one, and the United States and other major military powers are investing heavily in this type of weaponized robotic system. The conflict in Ukraine was the final straw.

The foundations of this ethical debate are very slippery. Can we consider an automatic missile guidance system, of the kind that has existed for decades, to be an autonomous weapon? Or a landmine that does not distinguish between allies and enemies?

At the moment, not even the scientific community has a clear definition of what an autonomous weapon is.

China says its interest in developing AI weapons is not about killing people automatically, but about predictive maintenance, battlefield analysis, autonomous navigation and target recognition. Perhaps, in the military world, it does not even make sense to field an autonomous weapon that behaves unpredictably and kills without human control.

Drones in the spotlight

In 2021, numerous media outlets reported that, for the first time in history, a drone had killed a victim completely autonomously during the Libyan conflict. However, that drone never existed, according to the manufacturer itself. By then, though, the news had already been published.

Resistance to change has always been a lever for human impulses and concerns. When the Wright brothers unveiled their flying machine, a great popular controversy arose over the possible use of such devices in warfare. H. G. Wells’s 1907 novel The War in the Air is proof of that unease.

In the face of all this uproar, the city of San Francisco backtracked on December 7 and will not, for the time being, allow its police to deploy robots capable of killing.

David Collingridge, a lecturer at Aston University in the UK, published The Social Control of Technology in 1980, which sets out the dilemma that bears his name, the Collingridge dilemma: “When change is easy, the need for it cannot be foreseen. When the need for change is evident, change becomes costly, difficult and time-consuming.”

Surprisingly, this paradox is very relevant today.

Julian Estevez Sanz, Professor and Researcher in Robotics and Artificial Intelligence, Universidad del País Vasco / University of the Basque Country

This article was originally published on The Conversation. Read the original.


