DARPA Tests Autonomous AI F-16 Pilot
The Defense Advanced Research Projects Agency (DARPA) has disclosed the first-ever test of an AI-piloted fighter jet in simulated combat against a human-piloted F-16.
The tests took place at the Air Force Test Pilot School in California in September 2023 but were only recently revealed by DARPA. The AI system, developed under DARPA’s Air Combat Evolution (ACE) program, piloted a modified F-16 test aircraft called the X-62A, or VISTA (Variable In-flight Simulator Test Aircraft).
The Pentagon’s plan to integrate autonomous AI Ghost Jets into its arsenal has been covered previously. However, this marks the first dogfight scenario involving an AI-piloted jet, reminiscent of scenes from Top Gun.
The Flight Process
While two human pilots were onboard the X-62A, the AI retained full control as the F-16s flew within 2,000 feet of each other at speeds of 1,200 miles per hour.
The results remain undisclosed. In prior simulated matchups where AI faced human pilots, the AI emerged victorious. Officials’ silence on the result might not be mainly due to national security; it could also be to protect the human pilots’ self-esteem.
Lt. Col. Ryan Hefron, program manager for ACE, stated, “We had lots of test objectives that we were trying to achieve in that first round of tests. So, asking the question of…who won? It doesn’t necessarily capture the nuance of the testing that we accomplished. But what I will say is that the purpose of the test was really to establish a pathway to demonstrate that we can safely test these AI agents in a safety critical air combat environment.”
DARPA emphasizes that these experiments aim to foster collaboration between humans and machines while ensuring reliable autonomy. Their objective is to instill confidence in human pilots regarding their AI counterparts, ensuring seamless coordination and mutual trust.
Questions, Scenarios Playing in My Head
Let’s not hide from the truth: there are undisclosed purposes behind AI pilots.
Remember the Patriot missile launched in Iraq on March 22, 2003? It had been developed in the 1960s, yet its full details were only disclosed years later. Could a similar pattern be unfolding now?
It’s highly probable, especially considering current events in the Middle East, notably the conflict in Palestine, where AI now plays a central role. AI applications extend beyond fighter pilots and weaponry to messaging apps such as Meta’s WhatsApp. Israel’s Lavender AI system, used to target Palestinians in Gaza, might have drawn data from WhatsApp groups.
Moving beyond the obvious, AI struggles with accurate facial recognition. False positives present substantial risks, potentially resulting in wrongful arrests or reputational harm.
Shall we allow AI to dictate our future, particularly in times of war?