AI Adoption in Air Force Pilot Training

By: Evan Beebe
08/23/2024

In recent years, AI has become an unavoidable buzzword we come across in the news, books, and films. Despite its popular usage, many across business, government, and academia are still working to understand the true potential uses and impact of AI. This is no different for the U.S. DoD, which is looking to leverage AI in everything from command and control to biometrics, including military flight training.

Pilot training in the U.S. military has become a challenge for several reasons, including aging training aircraft, shortages of pilots and instructors, and constantly shifting circumstances that necessitate changes to training programs.

As Deputy A5T Pilot Training Transformation (CTO) and senior software engineer for the 19th Air Force, Brian Kirk is well versed in the impact AI could have in overcoming many of these challenges. Brian adds, however, that many organizations are still in the early phases of understanding AI's true uses.

Before arriving in San Antonio this October 29-30 for Military Flight Training, Brian sat down with IDGA to discuss the potential role of AI in flight training, information sharing between the Air Force and Navy, and what he is hoping attendees take away from the Military Flight Training Summit.

What role is AI currently playing in pilot training operations within the Air Force?

Right now, AI doesn't play a very large role in our pilot training operations. We're still in the early stages of exploring how AI can fit into our programs and how it can truly help us. One of the challenges we're facing is the need for a large and well-structured database, which AI relies on to function effectively.

In the past, we haven't structured our data to support AI integration, and unlike commercial sectors, we can't rely on commercial databases because the military's needs and environments are vastly different. These differences lead to unexpected results when applying AI in our context. So, while we're exploring AI's potential, there's still a lot of work to be done to ensure it's the right fit and that it meets the unique demands of military operations.

What do you envision as the future of AI in military pilot training over the next five to ten years?

I'm hopeful that, in the next five to ten years, we'll not only maintain the necessary funding but also develop a greater acceptance of AI in military pilot training. AI is a largely uncharted territory, and Hollywood has certainly amplified its potential risks, sometimes painting a scary picture. While those concerns aren't entirely misplaced, it's important to recognize that AI holds tremendous potential alongside its risks.

As we continue to develop AI, we need to ensure a careful balance between rapid innovation and thoughtful oversight. AI and machine learning can offer phenomenal advantages, but like any powerful tool, if not properly managed, they can lead to unintended consequences. So, our goal is to harness AI's capabilities while pairing it with strong human oversight to prevent it from 'running wild.' In essence, the future of AI in military pilot training will be about leveraging its strengths while mitigating its risks through a well-balanced approach.

How are you addressing the potential resistance to AI adoption among pilots and other personnel?

On the Air Force side, we’ve noticed that some of the resistance to AI adoption stems from miscommunication or misunderstandings about what AI can truly offer. I recall a conversation where a pilot suggested, 'Just put it into IBM Watson, and it’ll give us the answers we want.' That’s a common misconception; AI doesn’t work like a silver bullet that solves all our problems instantly.

To address this, we’re focusing on clear messaging. Our goal is to ensure that everyone understands we’re pursuing an AI experience that keeps humans in the loop—AI that makes jobs faster and easier, not replaces them. For example, we're applying AI to basic flying instruction techniques, not advanced ones yet, as we’re still ensuring that the AI provides the results we want and expect. This allows human instructors to adapt to changes and student needs more effectively.

Additionally, we’re looking at using AI to handle time-consuming tasks like scheduling. Currently, our instructor pilots spend significant time on scheduling, but AI can streamline that process. By automating these tasks, instructors can focus more on teaching, while AI handles the routine work. Over time, we hope to build gradual acceptance by demonstrating that AI is here to assist, not replace, our personnel.

How are the Air Force and the Navy collaborating to ensure interoperability and shared best practices in AI adoption for pilot training?

We communicate frequently with our Navy counterparts, but when it comes to AI interoperability for pilot training, we’re still in the early stages. Both branches have seen strong successes in training overall, but when it comes to AI, particularly on the Air Force side, we haven’t yet achieved the kind of breakthroughs that would allow us to share best practices.

AI and machine learning can be unpredictable, often delivering unexpected results, which can be concerning due to the unknowns involved. This uncertainty makes it challenging to fully embrace and share AI solutions across branches until we have more concrete success stories to rely on. Right now, our focus is on ensuring that AI operates within controlled boundaries while gradually expanding its capabilities. Once we achieve more consistent results, we’ll be in a better position to collaborate with the Navy and ensure interoperability in AI adoption for pilot training.

What are you hoping attendees of your session at the Military Flight Training USA Summit will walk away having gained?

Honestly, I’m hoping that attendees walk away with a clearer understanding of how we at the 19th Air Force are approaching AI, and that the session dispels the misconception that AI is just a plug-and-play solution. There’s a common belief that you can simply integrate AI and it will automatically deliver perfect results, but that’s far from reality.

I like to refer to a quote from Punya Mishra, the Director of Innovation and Learning Futures at Arizona State University, who described ChatGPT as 'like having a smart, drunk, somewhat biased intern.' It’s a humorous way to remind us that while AI is an incredible tool, it’s not infallible and must be carefully monitored. Just as we do with people, we should approach AI with a 'trust, but verify' mindset. I hope the session helps attendees recognize the potential of AI, but also the importance of thoughtful and responsible implementation.