
America May Lag in New War Technology

Just as industrialization and the advent of flight and nuclear power dramatically changed warfare in the past, today’s cutting-edge technologies are altering how the world will fight in the future. America’s adversaries are making significant headway in these technologies, much to the nation’s peril.

Science fiction fans may recognize some of the dangers from movies such as the famous “Terminator” series.

Army Lt. Gen. Robert P. Ashley Jr., director of the Defense Intelligence Agency (DIA), is clearly concerned. At a recent conference, he noted that “Within a decade, China and Russia’s militaries will be using data visualization, artificial intelligence, machine learning and possibly quantum encryption and communications. These tools are used to collect, analyze and secure data accurately and at high speeds. Both China and Russia realize that ‘whoever can leverage the data and understands that can dominate.’ China already is moving rapidly ahead with digital advances.”

Jayshree Pandya, writing for Forbes, believes that “In the competition to lead the emerging technology race and the futuristic warfare battleground, artificial intelligence (AI) is rapidly becoming the center of the global power play. As seen across many nations, the development in autonomous weapons system (AWS) is progressing rapidly, and this increase in the weaponization of artificial intelligence seems to have become a highly destabilizing development. It brings complex security challenges for not only each nation’s decision makers but also for the future of humanity… artificial intelligence is leading us toward a new algorithmic warfare battlefield that has no boundaries or borders, may or may not have humans involved, and will be impossible to understand and perhaps control across the human ecosystem in cyberspace, geospace, and space (CGS). As a result, the very idea of the weaponization of artificial intelligence, where a weapon system that, once activated across CGS, can select and engage human and non-human targets without further intervention by a human designer or operator, is causing great fear.” 

General Ashley cited the Chinese company Huawei’s Smart City Intelligent Operation Center, which uses big data, 5G, machine learning and AI to collect, monitor and analyze data on security, transportation and emergencies, and to track people.

Sydney J. Freedberg Jr., writing for Breaking Defense, outlined how the Army War College is concerned about “What happens when Artificial Intelligence produces a war strategy too complex for human brains to understand? Do you trust the computer to guide your moves, like a traveler blindly following GPS? Or do you reject the plan and, with it, the potential for a strategy so smart it’s literally superhuman?”

“’I’m not talking about killer robots,’ said Prof. Andrew Hill, the War College’s first-ever chair of strategic leadership and one of the conference’s lead organizers, at the opening session. The Pentagon wants AI to assist human combatants, not replace them. The issue is what happens once humans start taking military advice — or even orders — from machines. The reality is this happens already, to some extent. Every time someone looks at a radar or sonar display, for example, they’re counting on complicated software to correctly interpret a host of signals no human can see. The Aegis air and missile defense system on dozens of Navy warships recommends which targets to shoot down with which weapons, and if the human operators are overwhelmed, they can put Aegis on automatic and let it fire the interceptors itself. This mode is meant to stop massive salvos of incoming missiles, but it could also shoot down manned aircraft. Now, Aegis isn’t artificial intelligence. It rigidly executes pre-written algorithms, without machine learning’s ability to improve itself. But it is a long-standing example of the kind of complex automation that is going to become more common as technology improves. While the US military won’t let a computer pull the trigger, it is developing target-recognition AI to go on everything from recon drones to tank gunsights to infantry goggles.”
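The distinction Freedberg draws between pre-written algorithms and machine learning can be made concrete. The short Python sketch below is purely illustrative: the track fields, thresholds, and function names are invented for this example and bear no relation to Aegis or any real weapon system.

```python
# A hypothetical sketch of a "pre-written algorithm": fixed if/then
# rules that never change unless a human rewrites them. All fields
# and thresholds are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Track:
    speed_mps: float   # contact's observed speed, meters per second
    closing: bool      # whether the contact is heading toward the ship
    altitude_m: float  # contact's observed altitude, meters

def recommend_engagement(track: Track) -> str:
    """Rigid rule: identical inputs always yield identical output."""
    if track.closing and track.speed_mps > 600 and track.altitude_m < 15000:
        return "RECOMMEND ENGAGE"  # matches the hand-coded threat profile
    return "MONITOR"

# A machine-learning system would instead fit its decision boundary to
# labeled data, which is part of what makes its advice harder to audit.
print(recommend_engagement(Track(speed_mps=800.0, closing=True, altitude_m=9000.0)))
# -> RECOMMEND ENGAGE
```

The point of the contrast: a human can read and audit every branch of the rule above, whereas a learned model’s reasoning may be opaque even to its designers, which is precisely the War College’s concern.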

The Report concludes tomorrow.

Illustration: Pixabay