Although weapon systems capable of thinking are still in their infancy, growing geopolitical pressure and mistrust may push nations to deploy them prematurely.
One of the Pentagon’s main tasks is to anticipate what the wars of the future will look like, so that it can allocate the resources the United States needs to lead in those battles. Among the instruments of the next war, the most frequently mentioned are artificial intelligence, autonomous weapons, and a very different role for human soldiers during the fighting.
These technologies are still in the early stages of maturity, and militaries have not yet mastered the best ways to deploy them in combat. Military leaders in other major powers, including China and Russia, are grappling with the same questions. The world could thus find itself in the middle of a race to develop and field autonomous weapons before it understands how to use them predictably, effectively and ethically.
Cold War 2.0
We could enter a period of escalation reminiscent of the nuclear arms race between the United States and the Soviet Union during the Cold War. Experts discussed this during a Defense One/Nextgov panel on the ethics and politics of artificial intelligence.
With autonomous weapon systems, this mutual paranoia could be even worse, and their development is advancing faster than nuclear weapons development ever did. There is, however, a fundamental difference between the two. While nuclear warheads are a single type of weapon, AI is an enabling technology that can be embedded in many kinds of weapons and systems. Even a nuclear missile could be equipped with an AI system, giving it the ability to search out and destroy a specific target.
AI could also radically change the way soldiers fight. When controlling or defending against autonomous weapon systems such as drone swarms, people will need to process and act on large amounts of data quickly. That data will come from sensors mounted on weapons, on satellites and on soldiers’ bodies. The side with the best data and the fastest means of processing it will have the advantage. Fear that rivals will gain this advantage may push states to accelerate the development of autonomous systems, possibly without resolving questions of reliability or ethics.
People – the weakest link
For autonomous weapons systems to be used predictably and ethically, a human decision-maker must remain in the “kill chain.” As autonomous systems improve, however, the human is likely to become the factor that limits the system’s speed and efficiency.
The situation resembles autonomous vehicles. Today’s autonomous cars may still require a person behind the wheel, but once the technology improves, they will almost certainly prove safer than human drivers. Humans get tired and distracted; sensors and neural networks do not.
The near future of the military
The autonomous battlefield may arrive sooner than most people think. The U.S. military has been using unmanned ground vehicles since the 1990s, and it is now seriously considering funding technology companies to develop autonomous drone swarms, as well as companies developing defenses against autonomous drones.
Such drones are mostly small, light and quiet, which makes them difficult to detect and very hard to shoot down with conventional weapons. The 2020 conflict between Azerbaijan and Armenia offered a glimpse of this very different kind of war. Azerbaijani forces used Israeli IAI Harop drones, which can operate autonomously. The drones circled over the Armenian defensive line until they detected the radar or thermal signature of a missile battery or tank on the ground, then dove onto the target in a kamikaze strike. Most Armenian casualties and equipment losses were caused by drone attacks; Azerbaijani and Armenian soldiers rarely saw each other.
The vast majority of countries, however, will not have access to this technology, which makes them deeply wary of the three that do: China, Russia and the United States. If these powers continue to compete in developing ever more lethal autonomous weapons (and defenses against them), it will not be only soldiers who lose their lives. It is easy to imagine two rich countries fighting with autonomous weapons in the airspace over a poorer one.
Source: Nextech by www.nextech.sk.