Margarita Simonian, director of the Russian news channel Russia Today and considered Putin’s steel doll, stated last week: “Either we lose in Ukraine or World War III begins. And I think the possibility of World War III is more realistic.” For this journalist, “the idea that everything will end in a nuclear attack is the most likely scenario, perhaps inevitable.” Thus another piece of Putin’s deterrent propaganda is added to fuel the nuclear escalation: a threat not so distant, with unimaginable consequences, in which artificial intelligence (AI) and its evolution, with disruptive systems such as GPT-3, gain ever more prominence.
Does the use of this technology increase the risk of nuclear war? A report by the RAND Corporation stated that advances in AI are enabling capabilities that were not feasible before, “which could exacerbate the confrontation that existed in the Cold War.”
Back then, mutually assured destruction generated strategic stability by reducing the incentives for the two blocs to take steps that could lead to nuclear war. But the scenario changed after Trump decided in 2019 to break the nuclear arms treaty that Washington had signed with Moscow in 1987. Putin, of course, was not going to be left behind.
Russian television has even speculated on the results of using the Russian arsenal if the Kremlin’s intentions are defied. One example is the nuclear torpedo named Poseidon, which would drown Britain under a 500-meter tidal wave of radioactive seawater. Some say Putin is applying the “madman theory”, a strategy described by Niccolò Machiavelli: “sometimes it is a very wise thing to simulate madness.” In 1959, Daniel Ellsberg, a theorist of nuclear strategy, explained that the leader of one country might make more effective threats against another nation if he or she is perceived as insane. It was Richard Nixon who put it into practice during the Cold War, as acknowledged by his former chief of staff in the book `The Ends of Power´. Nixon used it to gain more bargaining power, so long as his enemies believed he was capable of going to extremes.
And another nuclear threat to the UK from Russian state TV’s Dmitry Kiselyov:
He says his country’s Poseidon nuclear underwater drone could cause a tsunami that would “plunge the British Isles into the depths of the sea” and turn them into a “radioactive desert” (with subs) pic.twitter.com/usElgqHeIG
— Francis Scarr (@francis_scarr) May 1, 2022
The use of AI programs in the nuclear command and control systems of the leading powers improves the flow of information, the precision of intelligence gathering and analysis, and cybersecurity, reducing errors and misinterpretation. But this positive side has a disturbing flip side. The development by Chinese, American and Russian companies of hypersonic missiles and weapons using AI has reduced the time available to green-light a counterattack before being disabled by the enemy, undermining the condition of mutually assured destruction, and that is an advantage no nuclear power would be willing to waste. The possibilities of AI in defense are analyzed, among others, in a detailed CNA report on Russian weaponry.
“We are in a critical period. In the coming decades we will have technologies that can cause extreme damage globally, and there are reasons to believe that we lack the expertise to manage them safely,” explains Jaime Sevilla, a researcher affiliated with the Centre for the Study of Existential Risk at the University of Cambridge.
Both Jaime Sevilla and Lucía Ortiz de Zárate Alcarazo, researcher in AI ethics and governance at the UAM and collaborator of the Alternatives Foundation, give as a clear example the case of Lieutenant Colonel Stanislav Petrov, who prevented a nuclear holocaust in 1983 when the Soviet missile detection system gave a false positive, warning that five missiles were headed for Russian territory and displaying on its screen: «Attack with nuclear missiles. Launch counterattack». It almost led the USSR to start a nuclear conflict with the US, Sevilla told ABC. Petrov ignored protocol, and it cost him his job.
However, humans tend to give AI more and more credit and to be less skeptical about its warnings because, as Javier Palanca, PhD in AI from the UPV and researcher at the Valencian Institute of AI (VRAIN), points out, this technology “makes mistakes, but it makes few of them. Its level of efficiency and precision is much higher, and it is able to see patterns where humans cannot.”
Hence Michael Horowitz, professor at the University of Pennsylvania and contributor to the Bulletin of the Atomic Scientists, speaks of ‘automation bias’, a phenomenon he illustrates with a study in which American pilots stated that they would not trust an automated system warning of a fire in their plane’s engine unless there was corroborating evidence. Once immersed in the AI simulations, however, they began to do exactly that.
For all these reasons, the American nuclear deterrence experts Adam Lowther and Curtis McGiffin went so far as to propose giving AI a greater presence in the nuclear arsenal: «It may be necessary to develop a system based on artificial intelligence, capable of reacting at such a speed that the compression of attack times does not place the US in an untenable position».
The nuclear briefcase or `Cheguet´
The idea is not new: the USSR already had a semi-automated model, the Perimeter system, known as `the Dead Hand´. Although it was officially decommissioned after the Cold War, US intelligence believes it may have been maintained and made more sophisticated. This means the latent threat of AI may be reinforcing the traditional nuclear briefcase, which in Russia is called the `Cheguet´. Putin received it in 1999, after coming to power; it first entered service under Mikhail Gorbachev. At various public events Putin has shown that it always accompanies him.
There are three briefcases in total: the encrypted codes of Putin’s nuclear briefcase require the other two, which are in the possession of the Defense Minister, Sergei Shoigu, and the Chief of the General Staff, Valeri Gerasimov. The `Cheguet´ consists of a control panel with one red button and several white ones; the white button in the center is used to transmit the attack order to the nation’s General Staff, whose lieutenant general would be ultimately responsible for initiating the nuclear deployment. The mechanism relies on Kavkaz, an encrypted telecommunications system that connects it with military officials.
AI is turning these traditional systems upside down, while also learning from the data it feeds on. It can also play a deterrent role; one example is its use in nuclear war drills. Thus, in 2019 a group of experts from Princeton University created a simulation called ‘Plan A’ (link), which revealed that the result would be 91 million victims in less than five hours: 34 million immediate deaths and 57 million injured. The figures were calculated with NukeMap (link), a tool created by Alex Wellerstein that displays the damage based on the weapon’s yield and launch location. These are not gratuitous war games: as the researchers said, they are intended to draw attention to the “potentially catastrophic consequences of current US and Russian nuclear war plans.”
www.abc.es
George is Digismak’s reporter cum editor with 13 years of experience in journalism.