Exploring the Potential of Artificial Intelligence in Warfare
Inspired by MIT Technology Review
Artificial intelligence has never been closer to us than it is today. From the moment you wake up and silence the irritating beeping of your alarm to the moment you search for your favorite musician on Google to see the latest news, AI is working quietly behind the scenes. These intangible, imperceptible systems are growing more human-like every day, and they are nearly ready to be entrusted with command in war, making crucial decisions about whom to kill or save. AI-powered technologies such as autonomous drones and cybersecurity systems have already been used in military conflicts in Ukraine, Syria, and Azerbaijan. However, given the inevitable risk of psychological and physical destruction inherent in war, the application of AI in warfare is a subject of serious deliberation in countries all over the world.
War brings destruction that can quickly bring down houses, families, and entire communities. The devastation is not limited to deaths caused by enemy fire: there are still cases in which misidentification of targets leads to the killing of civilians or friendly soldiers. Moreover, unlike in films, where the faces of hostile soldiers are stored in a vast database and identified instantly, recognition in battles between two similar ethnic groups can be exceedingly difficult. To address such problems, AI-run software has been adopted in recent wars. According to Mykhailo Fedorov, the vice prime minister of Ukraine, the military has used Clearview AI to identify the faces of dead soldiers whose bodies it has recovered. What data does Clearview use to run accurately? There is a place on the internet where vast numbers of images can be accessed without any strict verification: social media. Clearview finds matching images and retrieves related information about an individual, including address, phone number, and date of birth. Nevertheless, this is probably one of the most benign uses of artificial intelligence in warfare, for AI has also made killing easier and quicker than ever before. AI-powered autonomous weapons, as the ICRC (International Committee of the Red Cross) defines them, are weapons that can carry out lethal actions against targets without human intervention. Drones, for instance, have become some of the most widely used autonomous weapons in war. By combining image recognition technology with auto-pilot software, operators gain the convenience of merely monitoring a drone's course rather than controlling it remotely. Sensors installed in the drones analyze the surroundings and identify targets using algorithms.
The ability to carry out hazardous missions without risking soldiers' lives can be a crucial advantage in a conflict.
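The matching step described above, whether applied to scraped social-media photos or to frames from a drone's sensors, ultimately comes down to comparing feature vectors. The following is only an illustrative sketch, not Clearview's or any military system's actual code: it assumes a neural network has already converted each face image into a numerical embedding (here stood in for by random vectors), and all names and thresholds are hypothetical.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_match(query, database, threshold=0.9):
    """Return the ID of the most similar stored face, or None if nothing
    in the database clears the similarity threshold."""
    best_id, best_score = None, threshold
    for face_id, embedding in database.items():
        score = cosine_similarity(query, embedding)
        if score >= best_score:
            best_id, best_score = face_id, score
    return best_id

# Toy database of labeled embeddings (in a real system, derived from
# millions of scraped images; here, random 128-dimensional vectors).
rng = np.random.default_rng(0)
db = {name: rng.normal(size=128) for name in ["person_a", "person_b"]}

# A query embedding very close to person_a's should be identified as person_a.
query = db["person_a"] + rng.normal(scale=0.01, size=128)
print(find_match(query, db))  # prints: person_a
```

The threshold is the crux of the ethical problem the essay raises: set it too low and the system misidentifies strangers; set it too high and it fails to identify anyone, and in a weapons context either error can be fatal.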
Even though the programs that run such autonomous weapon systems are sophisticated and highly developed, it is too early to overlook the fact that they remain problematic. Human decisions, though imperfect, involve a moral sense that artificial intelligence inevitably lacks, which makes them more likely to be acceptable from an international perspective. Indeed, according to United Nations Secretary-General António Guterres, lethal autonomous weapons whose decisions are made solely by AI are "politically unacceptable" and "morally repugnant". Rapid advancements in military technology, expected to drastically lower the loss of life in war, have greatly altered the conditions of 21st-century warfare. To truly benefit from such groundbreaking developments in artificial intelligence, however, it is necessary to perfect these algorithms so that human rights and values are prioritized in war.
Works Cited
“Homepage.” Lethal Autonomous Weapons, 4 Nov. 2022, https://autonomousweapons.org/.
“What You Need to Know about Autonomous Weapons.” International Committee of the Red Cross, 2 Nov. 2022, https://www.icrc.org/en/document/what-you-need-know-about-autonomous-weapons.
Kryvenko, Pavlo. “‘Artificial Intelligence’ in the Russian-Ukrainian War.” New Geopolitics Research Network, 17 June 2022, https://www.newgeopolitics.org/2022/06/13/artificial-intelligence-in-the-russian-ukrainian-war/.