“We’re losing this battle.” The United States is losing the AI race against communist China, says Nicolas Chaillan, who recently resigned from his position as the chief software officer for the U.S. Air Force.
U.S. companies still lead in technological advancements, but they are unwilling to share their technology with the Department of Defense. “If we stopped over-classifying information … they might see pretty quickly that [the communist China threat] is going to become a real problem even to their day-to-day lives,” Chaillan says.
If the United States doesn’t start catching up now, soon the situation will “pass the point of no return,” Chaillan says, due to the accelerating nature of AI development.
Jan Jekielek: Nicolas Chaillan, such a pleasure to have you on American Thought Leaders.
Nicolas Chaillan: Thanks for having me.
Mr. Jekielek: So, Nicolas, you recently resigned from the Air Force where you were the Chief Software Officer, the first time this role has existed. I’m going to want to find out how that actually came about. Before we start, the true story here, why did you leave?
Mr. Chaillan: So, first of all, I’d like to thank the men and women who have served and are serving our great country, fighting for our freedoms. It’s essential to keep in mind that I was serving them, and that was the focus of my time in the department: ensuring that we’re addressing the issues we’re facing. We have seen tremendous success in the last three years, but only pockets of success.
While I’ve been hearing Pentagon leaders say the right things, I’ve yet to see them effectively walk the talk. That’s been challenging, because what you see instead is a lack of urgency, a lack of adoption of Agile, tremendous waste of taxpayer money, and U.S. companies that are not willing to partner with the United States.
Meanwhile, China is taking off and setting the pace by mandating that their companies partner with them. At some point I had no choice but to raise the alarm, because we’re seeing that we’re losing this battle.
Mr. Jekielek: Okay. You raised so many things here that we’re going to have to hit on. For example, the civil-military fusion that the Chinese regime basically mandates throughout its system, right?
Mr. Chaillan: Yes.
Mr. Jekielek: But let’s go back to what’s happening here in the U.S. Now, it’s been said in a headline recently that you believe that the U.S. has already lost the war on AI.
Mr. Chaillan: I don’t believe that we have lost. What I said is that if we don’t act now and wake up right away, not in five to 10 years from now as some of the Pentagon reports suggest, then we have no fighting chance of succeeding 10 to 15 years from now.
Because with AI, the velocity of adoption compounds over time. So, effectively, at some point you’re going to be in a situation where you pass the point of no return. You will not be able to catch up.
So, when we say we have 10 years, or that in 10 years China is going to be leading, that’s wrong to begin with, because China is leading right now. They’re already leading in many of those fields because their companies have adopted the technology.
That’s the difference when you compare with the U.S. side, where at the end of the day, U.S. companies are leading against China, but we do not have access to that technology. That puts us behind because, effectively, we’re left unable to partner while competing with a massive country of 1.5 billion people that is not waiting for us to wake up.
Mr. Jekielek: Well, a country that also likes to steal a lot of technology, especially technology which can be found online in some way, in the cloud. There are constant reports of Chinese regime hacking efforts, very successful efforts. We know, for example, that parts of the next-generation fighter plane were obtained by China. There are many, many examples of this.
But let’s start here, AI. What is AI and how does it play into the Department Of Defense in the military, and why is it important? Why is this particular issue so paramount?
Mr. Chaillan: Artificial Intelligence is going to be what makes or breaks us in the years to come, because effectively what AI can do is make decisions for you, accelerate access to information, and come to conclusions that the human brain cannot even comprehend.
It can also drastically automate access to data and the tracking of data. It can be used, for example, to track satellite imagery, so we can detect objects and what’s going on, which can potentially prevent loss of life. We’ve seen it recently in Afghanistan. Potentially, with better AI, we could have recognized that inside this car were seven kids, and we could have known that proactively through automation.
So, effectively, it enables us to do much more by adopting Artificial Intelligence at scale across industries, and you see it everywhere around us, from text-to-speech, where you can speak to your phone like Siri, to Amazon, where you can tell Alexa what you want to eat for dinner and it will propose different locations. All these technologies are driven by and based on Artificial Intelligence, and without AI they could not exist.
Mr. Jekielek: So, AI allows us to make decisions faster, but it’s a lot more than that, isn’t it?
Mr. Chaillan: Yes. You can also take another example: we recently had a challenge with DARPA, the Defense Advanced Research Projects Agency, where we demonstrated a dogfight. We set two jet fighters against each other, with one of the jets completely flown by Artificial Intelligence and the other by the best Air Force pilot. Every single time, the human lost. And that, I would argue, is not even the most advanced AI capability that there is on the planet.
So, it’s going to drastically change the way we think, the way we do business, even the way we build weapons. Because effectively, if you know that those jet fighters will not be able to compete, what’s the point in investing more into fifth-generation or sixth-generation fighters, when you have to drastically rethink the way you’re going to design them, man them, and train people to use them, and what the end goal of these capabilities will be?
Particularly when you start combining it with cybersecurity and cyber offense, where you can take an entire grid system down without even leaving your living room.
Mr. Jekielek: So you’re saying you had a fully automated AI driven jet fighter beat the equivalent jet fighter manned by a human being every single time, every single test?
Mr. Chaillan: That’s correct.
Mr. Jekielek: For a lot of us I think that’s still the realm of science fiction, but it’s not.
Mr. Chaillan: Mm-hmm (affirmative). It’s not.
Mr. Jekielek: Why are you so sure that the Chinese Communist Party is ahead of the U.S. right now, in terms of AI development?
Mr. Chaillan: I can tell you we could change this by ensuring that U.S. companies partner more with the Department of Defense. Meanwhile, these Chinese companies have no choice but to work with the CCP.
Effectively, what you end up having is a situation where they get so much data. First, you’re facing 1.5 billion people, so by definition, based on numbers alone, you’re already losing, because they have more data, and AI is a data game. The more data and the more access to data you have, the more you can leverage rapid prototyping and rapid delivery of capabilities. That’s the other piece of the cycle.
AI learns upon itself, so the more rapidly you can deploy it, the more you can learn, and the more it’s going to be able to accelerate its learning. That’s why time compounds and is exponential, and at some point you look back and you simply have no ability to catch up.
Mr. Jekielek: You’re basically saying that the amount of available data, to the system that’s doing the learning, is actually incredibly important to the speed at which it learns, and basically to its effectiveness.
Mr. Chaillan: Yeah, and you see it with a U.S. example like Tesla. The fleet of cars we have on the street is how the system gets better week after week. Being able to send an over-the-air update every two weeks allows Tesla to accelerate its learning, get better at it, try new features, try a better algorithm, see what works and what doesn’t, and try with a subset of the fleet.
Try a new version with 5 percent of the fleet, another version with a different 5 percent, and see which one sticks. So, the more end users and the more data you get, the better the system becomes. It’s exponential.
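The staged-rollout pattern Chaillan describes, routing small slices of a fleet to candidate builds, can be sketched as a deterministic bucketing scheme. This is a hypothetical illustration (the device IDs, version names, and percentages are invented), not Tesla’s actual system:

```python
import hashlib

def rollout_bucket(device_id: str) -> int:
    """Map a device id to a stable bucket in [0, 100)."""
    digest = hashlib.sha256(device_id.encode()).hexdigest()
    return int(digest, 16) % 100

def assign_version(device_id: str) -> str:
    """Route ~5% of the fleet to each candidate build, the rest to stable."""
    bucket = rollout_bucket(device_id)
    if bucket < 5:
        return "candidate-A"
    if bucket < 10:
        return "candidate-B"
    return "stable"

# Simulate a fleet of 10,000 vehicles and count assignments.
fleet = [f"car-{n}" for n in range(10_000)]
counts: dict[str, int] = {}
for car in fleet:
    version = assign_version(car)
    counts[version] = counts.get(version, 0) + 1
print(counts)  # roughly 5% / 5% / 90%
```

Because the bucket is derived from a hash of the device ID rather than a random draw, each vehicle keeps the same assignment across runs, which is what makes comparing the two candidate versions meaningful.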
Mr. Jekielek: Why, for these military application AIs, is the number of people available, or the number of people’s data available, so important?
Mr. Chaillan: Because, effectively, improving the accuracy of an AI model is all about volume of data. The more data you have, the more accurate, precise, and effective this AI capability will be in making decisions, in detecting objects, in recognizing my French accent when I talk to Alexa.
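The “more data, more accuracy” point can be illustrated with a toy statistical sketch: the error of a quantity estimated from samples shrinks as the sample count grows. This is a generic illustration of the principle, not any specific defense model:

```python
import random
import statistics

random.seed(42)
TRUE_MEAN = 0.7  # the "ground truth" the model is trying to learn

def estimation_error(n_samples: int) -> float:
    """Estimate the mean from n noisy samples; return the absolute error."""
    samples = [random.gauss(TRUE_MEAN, 1.0) for _ in range(n_samples)]
    return abs(statistics.fmean(samples) - TRUE_MEAN)

def avg_error(n_samples: int, trials: int = 200) -> float:
    """Average the error over repeated trials so the trend is visible."""
    return statistics.fmean([estimation_error(n_samples) for _ in range(trials)])

for n in (10, 100, 1000):
    print(n, round(avg_error(n), 4))  # error shrinks as n grows
```

The error falls roughly as one over the square root of the sample count, which is the statistical backbone of the claim that whoever holds more data trains more accurate models.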
All these things are effectively driven through that automation. So, despite the fact that the United States is spending more money on defense than many other nations combined, what we fail to recognize is what it costs to do the same work in the department compared to the commercial side. I spent 20 years on the commercial side before joining the Department of Defense, and when I was estimating work, I would have to multiply the cost by 10 in DOD, because often that’s just what it costs to do business in the department.
So, effectively, when you spend $1, you get the 10 cents of value you would get on the commercial side. We say we’re spending more money, but are we spending it wisely and effectively? Are we agile enough? Is our acquisition process broken?
If we don’t adopt Agile methodologies… I started programming at 15, 22 years ago, and I was implementing Agile at the time. To this day, the U.S. government has no Agile training mandated for our acquisition workforce.
Mr. Jekielek: This is very interesting. Explain to us what Agile means, for the layperson.
Mr. Chaillan: Agile is what allows you to become more efficient and deliver value continuously and incrementally, instead of following this five-year cycle where you plan for things up front. You have requirements, you plan, and then you execute for multiple years before the capability comes to life in the hands of the war fighter or your end user.
By adopting Agile, what you do instead is continuously deliver value in small incremental pieces, so you can validate that what you’re building makes sense for your customers, or the war fighter in my case. They can test it and see if it makes sense. You can prioritize features. Every two weeks you can decide, “Hey, this feature is more important than that feature,” so you can prioritize your work and be more efficient.
You never end up in a situation where you wait five years, only to learn that a billion dollars of taxpayer money was wasted because what you were building was built in a vacuum.
Mr. Jekielek: That’s really interesting. I mean it’s a completely different philosophy of development basically, like diametrically different.
Mr. Chaillan: Yes.
Mr. Jekielek: Yeah, but it has to be done thoughtfully, because there’s no shipping at 80 percent here. You have to actually have something that functions if, as you said, war fighters are going to be using it. It’s just that the initial product is going to be a piece, perhaps, of the final product. Is that the idea?
Mr. Chaillan: Correct. That’s also the big difference. Look at SpaceX. SpaceX has 200 developers; the F-35 has 4,000. And SpaceX reuses 80 percent of its code across its nine platforms, so it’s LEGO blocks. It’s modular. It’s pieces of a puzzle.
A new platform, a new rocket, is never built from scratch. The software is reused across platforms. They can take pieces of these LEGO blocks and put them together in a new platform, just by swapping LEGO blocks and trying different things.
Compare that with the F-22 and F-35: the F-22 and F-35 share 4 percent of their code base. Four percent. There is no reuse, despite both of them being built by the same company. That leads us to what we call a monolithic architecture.
It’s very difficult to update. You’re very much locked into the entire system. You cannot cut it up. You cannot reuse pieces. If you take a sensor on a jet, the same sensor could potentially be used on a ship. But that sensor is built in a way that is so tied to the system that you cannot do that today.
If you were to build it right, as we do now with some of the initiatives I pushed, you could actually cut the systems into pieces, deliver these LEGO blocks, and share them across services.
There’s a drastic waste of taxpayer money in not enabling reuse across the Air Force, Navy, and Army, often because of egos, honestly, and bureaucracy, and titles, with these teams alleging that the mission is different.
But IT is IT, and we need to be able to reuse capabilities. Not everything is going to be the same. But even if we went from 4 percent to 50, 60, 70 percent, the improvement in delivery and the ability to reuse, and the cost savings associated with that, would be dramatic.
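The “LEGO block” idea, one sensor module written once and composed into a jet or a ship rather than re-implemented for each, can be sketched with a shared interface. The class and field names here are invented for illustration only:

```python
from typing import Protocol

class Sensor(Protocol):
    """The contract every sensor block must satisfy."""
    def read(self) -> dict: ...

class InfraredSensor:
    """One sensor module, written once, reusable anywhere."""
    def read(self) -> dict:
        return {"type": "infrared", "contact": True}

class Platform:
    """A platform is composed from interchangeable sensor blocks."""
    def __init__(self, name: str, sensors: list[Sensor]):
        self.name = name
        self.sensors = sensors

    def scan(self) -> list[dict]:
        return [sensor.read() for sensor in self.sensors]

# The same InfraredSensor block is dropped into two different platforms.
jet = Platform("jet", [InfraredSensor()])
ship = Platform("ship", [InfraredSensor()])
print(jet.scan() == ship.scan())  # prints True: same block, two platforms
```

The design point is that `Platform` depends only on the `Sensor` contract, not on any one implementation, so swapping or sharing blocks across services requires no change to the platforms themselves.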
That’s why, when I say we spend a dollar and get 10 cents of value, that’s part of the problem: the lack of agility to continuously deliver value and see what sticks, the silos, the waste, the lack of training, and the lack of investment in our airmen and war fighters to improve their knowledge of Agile.
There is no Agile training. We built it during my tenure but it was not mandated. The mandated training is still the legacy training.
Mr. Jekielek: Something that strikes me here is the huge value of having these collaborations with industry. But when you look at some of the tech giants, Google, Facebook, Amazon I guess would be another one, the culture and the outlook of the folks in these organizations strike me as really, really different from DOD, from the military.
Mr. Chaillan: Mm-hmm (affirmative).
Mr. Jekielek: How can that work together?
Mr. Chaillan: Well, first of all I think if we stopped over-classifying information, and we were able to share more what we see with some of these companies, so they understand the threat and they understand what we’re facing, I think there’s a real opportunity for them to proactively be willing to engage and partner with us.
They need to see that the freedoms they enjoy today are mostly protected thanks to the deterrence we have, thanks to our war fighters. If we don’t have that, they might see pretty quickly that it’s going to become a real problem, even in their day-to-day lives.
Mr. Jekielek: So, let me get this straight. Basically, you’re saying if you could explain to some of the leadership of these companies, these big tech companies in the Silicon Valley, the true nature of the threat, what really is facing America, you think they would come on board voluntarily, but they just simply don’t have access to that information because they’re living in a bit of a different world?
Mr. Chaillan: Yeah, they don’t have access to classified information. Most of these companies are not cleared. The most innovative AI companies or smaller companies don’t have clearances.
It’s not just the leadership. It’s very important to get to the people as well, because of what you’ve seen recently with Google, with Project Maven, which Google decided not to extend, mostly because a few employees complained about the use of Google technology inside the Department of Defense.
That’s mostly because they do not understand that by making those weapons more efficient and better, you’re also helping to prevent mistakes and save lives.
You could always say, “Well, I’m not going to help DOD and maybe they’re just going to stop using the weapons,” but that’s just not going to happen. Life exists around you. You can’t just live in your Silicon Valley bubble. You have to look at what’s going on around the world, and we have to take action sometimes.
The goal, of course, is to have these capabilities as a tremendous deterrent. We don’t want to use them but if we have to use them, we want them to be efficient. We want them to be very precise, and only technology is going to get us there.
So, if we don’t have access to the best breed of technologies, of Artificial Intelligence in the vision space, in the analytics space, data science, all these key experts to get us there, we’re not going to be able to keep up. We’re going to get behind, and we are right now, as we speak, already behind.
When I see 750-page, government-funded reports that I don’t know who’s even going to read, telling us that we have 10 years to figure this out, when in 10 years it will effectively be too late to fix it, this is what’s criminal.
Mr. Jekielek: Let’s talk about the China threat here. When it comes to the Chinese Communist Party, it’s actively committing at least one, if not three, genocides, by my personal count. We know it’s antagonistic to the United States and frankly, to freedom in general. It’s an authoritarian system.
But basically, you’re saying the people that can help aren’t fully aware of that reality, and you want to bring them on board. I want to reiterate this again.
There’s probably a lot of concern about revealing certain elements of classified information, especially to folks who might not treat it with the respect it should be treated with. This is what some people in DOD might be thinking, and maybe the reason for over-classification. So, how do you deal with that question?
Mr. Chaillan: You have to find a balance. I think we can easily declassify things by removing how we came to know them, removing some of the details. You don’t have to give all the details of the story for people to be able to grasp what’s going on.
I think we also do a poor job of cutting up classified documents, so that the entirety of a document is not classified. Then you can easily extract pieces and give people access to those pieces. That should be enough for them to understand the threat.
Quite honestly, I found many times that we classified things that I already knew on the commercial side, and things that you can find on Google. So, clearly we’re over-classifying, no doubt.
Now, the question is, is there a real threat? Some people argue right now that my talking publicly about all of this is creating what we call OPSEC (operational security) risks to the United States and the DOD.
I argue that if we think this is the extent of the knowledge and access to information of a country like China, we are drastically underestimating what they can do with their intelligence. It’s foolish and ridiculous to even think that they wouldn’t already know anything I said today. It makes no sense.
The reality, though, the real fear that no one is willing to name, is that if people stop being able to talk about these things, then one day someone is going to have to be held accountable for making these mistakes. They know that might happen one day.
So, “keeping things in the family,” as they call it, is what I did for three years. I kept it in and tried my best to convince everybody very nicely, up to the point where now we’re running out of time. Our kids, your kids, my kids, are at risk here if we don’t wake up.
Mr. Jekielek: What is the cyber threat from China, and perhaps other bad actors? Explain what that is to us.
Mr. Chaillan: The cyber threat is tremendous. I have said that our cyber defense across the government, and not just at DOD but also across critical infrastructure (power, water, and so on), is at the kindergarten level, and I mean it.
If you compare that to Google’s cybersecurity, or pick your top cybersecurity company: these facilities are understaffed and underfunded, and they had to connect their systems to the internet to be able to manage them remotely, because they can’t even afford to send people onsite.
So, now we have connected systems that were not designed to be connected, creating a tremendous cyber risk. We’ve seen it. We’ve seen countless breaches of critical infrastructure, including, recently, a water supply system in Florida. So, you see this happening already.
Honestly, if I’m China and I’m going to attack Taiwan one day, it would make a lot of sense to disable some of our power, so our military would be so busy trying to fix the situation in the United States that we wouldn’t have the bandwidth to even think about Taiwan.
Mr. Jekielek: Fascinating, and really scary.
Mr. Chaillan: Yeah, this is real life. People dismiss this as if it’s in the movies, but this is the life we live, and people need to realize that this is what’s going on around them. That’s why it matters to be able to see and have more insight into these cyber attacks: the extent of how deep these malicious actors can get into systems, and what they can do if tomorrow they turn off an oil supply, like we’ve seen recently.
By the way, the recent breach of the pipeline didn’t even directly impact the pipeline. The company turned it off because it couldn’t track consumption and billing. They didn’t actually hack the pipeline itself. Imagine if that happens one day. What if they can hack it to the point where they can make it explode? It’s not just about turning things off.
You know, we have a tremendous risk on the supply chain side, where all the chips, everything we buy, is made in China. What stops them from putting malicious code into these capabilities, where, effectively, it lies dormant up to the point where it isn’t dormant anymore?
Mr. Jekielek: Well, and I think there have been examples we’ve seen where those kinds of capabilities were discovered.
Mr. Chaillan: Mm-hmm (affirmative).
Mr. Jekielek: Are you concerned that these capabilities already exist and haven’t been identified?
Mr. Chaillan: 100 percent. I think we’re doing very poorly when it comes to supply chain management. We don’t know where things are coming from, on the hardware side or the software side. By the way, why would you even bother trying to tamper with hardware and chips when you can just do the same with the software supply chain?
You don’t have to travel. You can push a piece of software across millions of organizations. You’ve seen it with SolarWinds recently, with that massive breach. That’s going to be the target for China now. They’re going to go after companies that provide services to hundreds of organizations.
By getting into them, including the cyber companies, by the way, you have the crown jewel: if you hack the cyber company defending the other companies, you can literally see everything that’s going on and get into the other systems.
So, they’re going to become massive targets. There are a lot of startups and a lot of innovation in cyber, and many of these companies are doing a very poor job with their own cybersecurity, despite being cyber companies.
Really, at the end of the day, people are not taking the supply chain risk seriously as a whole. We see cars sitting in lots because they’re awaiting chips coming from China. How is that acceptable?
By the way, if you want to talk about AI and machine learning, how do you get to AI and machine learning dominance, and quantum computing, if you can’t have the most advanced chips? It’s all driven by compute and access to the most advanced capabilities. And if you’re building things overseas, who is to say they’re not stealing the IP we send them? How is that even acceptable?
Mr. Jekielek: Yes, indeed. The most advanced development from what I understand for these chips right now is in Taiwan. So that’s another really profound national security question.
Mr. Chaillan: Mm-hmm (affirmative).
Mr. Jekielek: So, let me get this straight, are you suggesting that some of these supply chains should be repatriated?
Mr. Chaillan: 100 percent. Without a doubt. We should never have let them leave.
Mr. Jekielek: What are the highest priorities in your mind?
Mr. Chaillan: Anything that has to do with the most advanced chips, and also the software side of the house. We really need to… And that was part of the President’s cyber executive order, recently signed by President Biden. There’s a big push to start tracking the software supply chain.
Keep in mind, when you buy a piece of software, that software comes with dependencies from other companies and open-source projects. These projects can be infiltrated by malicious actors. China is infiltrating some of these projects, with contributors who contribute code for years, with people paying less and less attention to what they do, to the point where they can potentially inject malicious code into the system.
Keep in mind, we have tools to scan code, but they’re designed mostly to scan for bad code, meaning a developer who made mistakes: quality issues, not so much code that is malicious in behavior. We have a concept in software called a time bomb, where the software is triggered based on a date, or on a specific event, to blow up the system or turn off all the software in the system. All these triggers are very alarming to me, and they could be dormant for years until the bad actor decides to push the button and say, “That’s it. It is time for us to activate this.”
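A “time bomb” is ordinary-looking code gated on a date or event, which is why quality-oriented scanners miss it. A naive behavioral check might instead flag date-gated branches. This toy heuristic, with an invented sample snippet, falls far short of a real supply-chain scanner but shows the shape of the idea:

```python
import re

# Patterns suggesting a branch is gated on a wall-clock date --
# the classic shape of a logic/time bomb.
SUSPICIOUS = [
    re.compile(r"if\s+.*date.*[><=]=?\s*[\"']?\d{4}"),  # a date compared to a year
    re.compile(r"datetime\.now\(\).*[><]"),              # "now" used in a comparison
]

def flag_time_bombs(source: str) -> list[str]:
    """Return source lines whose shape matches a date-gated trigger."""
    hits = []
    for line in source.splitlines():
        if any(pattern.search(line) for pattern in SUSPICIOUS):
            hits.append(line.strip())
    return hits

# A hypothetical dependency: the branch looks routine but hides a trigger.
sample = '''
def update_billing(record):
    if record.date > "2025-01-01":  # looks harmless...
        wipe_database()
    return record
'''
print(flag_time_bombs(sample))
```

A production tool would analyze the parsed syntax tree and data flow rather than raw lines, but the core difference from a quality linter is the same: it hunts for intent (conditional triggers) rather than mistakes.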
Mr. Jekielek: Okay, so this is very interesting. Basically, you’re suggesting that you want to have a lot more cooperation between units, but trusted units. So, keep the untrusted units outside of the system, but develop a broad trusted system. Is that the idea?
Mr. Chaillan: Mm-hmm (affirmative). Yeah, and we also have to pay attention to who is working for some of these companies. The fact is, the Chinese Communist Party is sending a lot of people to our universities and to our most innovative companies, and there’s a very big risk of exfiltration of data from within. Insider threat is probably the most underestimated threat for all these top organizations on the commercial side.
Mr. Jekielek: I want to comment on that a little bit, because even the most well-meaning people who come over, let’s say from China, are still under the auspices of the Chinese Communist Party. The Party can put pressure on their families, and can put pressure on them if they happen to end up in a situation the Party is interested in. The Chinese Communist Party always has this supremacy over every aspect of society.
It still is sort of shocking to me that people don’t understand this. We’re not necessarily talking about James Bond spies being sent over, although I’m sure there are some of those. It’s more just simple, everyday people who could at any moment be required to cooperate, under pretty significant threat to themselves or their families, and so forth.
Mr. Chaillan: It’s a real issue, and there are not many solutions. You don’t want to start saying we’re not going to allow these people to contribute to society. We need those talents if they’re willing to come and make a difference. They can actually be great assets, too, by providing more insight about their countries.
So, I think the solution will have to deal with how we help them bring their families over, and try to remove these dependencies, these kinds of side-effect risks that could spread rapidly. You have to be proactive. It’s not that simple, and like you said, these are everyday people; you can’t do that for everybody. At some point it’s a gamble, but sometimes there’s more risk in not doing it than in doing it.
Take DevSecOps in the Department of Defense, for example. It’s this concept of continuous delivery and automation. Ruthless automation. You want to automate testing, scanning, all the nuclear surety, all the assessments we do, so humans can focus on the more advanced work while the more basic pieces are automated, and we accelerate the delivery of software to multiple times a day.
So, this concept around DevSecOps enables that ruthless automation, including removing humans from these processes so they’re not part of the risk anymore. There are a lot of things automation and AI can help mitigate in terms of threats, including potentially detecting malicious behavior by monitoring what employees do inside the system, to see if they’re doing things that are malicious in nature.
So, really, the AI is again the answer here to proactively detect things that are going on.
Mr. Jekielek: Incredibly fascinating. Something struck me a little bit earlier, actually. If some of these big tech giants that have developed advanced capabilities, for example around security, which you mentioned, aren’t ready to work on, say, war-fighting equipment or software, perhaps they’d be willing to work on helping DOD maximize its own security posture.
Mr. Chaillan: Yeah, and keep in mind there is cyber offense and there is cyber defense. On cyber defense, I think everybody would agree that we need to protect our systems. But keep in mind also that AI models and AI capabilities can actually be reused for different things.
You train them with different data. You can take something used on the commercial side, which is exactly what China does: you take self-driving cars and you can turn them into self-flying jets. There are tweaks, but the models are often very similar. So, you can find a way to access the technology and do things without everybody having to be part of the engagement. Just access to the technology.
Google responded to my comment when I said we have companies like Google that don’t want to do business with DOD. I can’t quote them exactly, but they pretty much said that they have contracts with DOD, which is true. Those contracts are mostly on the business side of DOD, by which we mean the management of people and our business systems. It is not on the-
Mr. Jekielek: Google Docs, for example. Maybe, I don’t know.
Mr. Chaillan: Right. It’s certainly not on the weapons side, so that’s the difference. They even ended the statement by saying it has to follow the company’s values and terms and conditions. But the terms are clear: you cannot use those capabilities to do any harm.
Well, a weapon is going to create some harm for the wrong… for the right targets; that’s the way it’s designed. So, by definition we can’t use it for weapons, which is what we need it for. It’s just foolish to say “we have contracts with DOD” when you know very well that you don’t have contracts on the weapons side of DOD, which is why we’re here.
We’re not here just to do business and manage people who are not doing weapons. Those people are there to support the weapons. We wouldn’t need them if we were not building weapons. That’s the reason why we exist.
Mr. Jekielek: To me, TikTok is kind of this massive elephant in the room. Basically I see it advertising on Twitter all the time. We’ve talked about how every company based in China is subordinate to the Chinese regime—100 percent.
You and I have both seen, and I think many viewers have seen “The Social Dilemma” film. We know the level of fine grained information that a social media company gets on its users. I mean, how is TikTok not a massive intelligence operation for the benefit of the CCP, creating individual profiles on millions of Americans, and just kind of in plain sight?
Mr. Chaillan: Yeah, absolutely it is. That’s why it should be banned. I have no doubt TikTok should be banned in the United States and in Europe. It is used actively as an intelligence weapon. People don’t understand the breadth of the capabilities around AI.
First of all, it gets tremendous access to your phone, so it sees your pictures, videos, and your geolocation, and a lot of different things about you. But then it’s also able to see what’s inside of these pictures and be able to recognize objects behind you, if you walk around. You can be in your room, they can see what books you read. They can see what you have in the room. They can look at your mood. They can even track your mood when interacting with those videos. So, there’s a lot of intelligence around this.
This can also be used as a weapon of misinformation, where they can promote content, they can track content. The latest valuation of TikTok is $450 billion, which grew drastically in the last six months. People dismiss TikTok as a teenager app, which is absolutely not true.
You have a massive volume of adults now. It’s used by countless companies for their marketing. It’s an advertising tool for many companies, and they let, again, their profits and their interest in getting more customers get in the way of common sense.
You’ve seen President Trump try to ban TikTok. Of course, that didn’t really work out. You could imagine a world where companies like Apple and Google are proactively banning it. Of course kids will get upset but they should get over it. That will be what’s best for the nation.
Mr. Jekielek: Well, you know, for kids and these adults and these advertisers, it’s just become almost like a fixture, very rapidly.
Mr. Chaillan: Yeah, but there will be other options. Like anything in life, we move on to whatever is next.
Mr. Jekielek: We talked about this a little bit, but social media has these kinds of addictive properties, and we’ve seen in some of the recent whistleblower testimony that some harms can well be known, and still exist. So, in the hands of the Chinese Communist Party, it frankly creates a very scary proposition.
Mr. Chaillan: It compounds the problem. It’s already a privacy concern. It’s already something that we have right now in the United States, but it’s even worse if it’s not made in the United States. So guess what? If TikTok ends up being banned and we have to replace it by something else, at least let’s make sure that’s a United States based company.
Mr. Jekielek: Is DOD looking at TikTok to your knowledge?
Mr. Chaillan: Not that I’m aware of.
Mr. Jekielek: Fascinating. I mean one would think they would be looking very closely at it. So, during your tenure some things were accomplished. You laid out some of those things in the letter you put out in September, explaining that you were going to be resigning. What would you say were the most significant contributions that the DOD, or that the Air Force can build on now?
Mr. Chaillan: Well, first of all, we demonstrated that a small group of people can drastically impact change in the department, and that’s a behemoth. If you can do it there, and demonstrate you can do it with weapon systems, you can do it anywhere.
We took the F-16 and the U-2 jets, and we were able to deploy advanced Agile capabilities and AI machine learning on the jet in 12 days. We flew the jet, showed that AI could help the pilot make decisions and manage the essentials of the jet, and received an over-the-air update like Tesla does, where you get an update directly on the jet while flying it, without impacting the airworthiness of the aircraft and the safety of the people on board. All that in 12 days.
We built this capability. We open-sourced it to the world. A lot of people can now reuse this. We see five nations using it. We see dozens of other federal agencies using it. It’s probably the biggest contribution back to open source from the U.S. government.
This capability can now be used at scale. It’s built as an enterprise service. It’s actually used today by all branches of the government. It’s designed to be effectively modular and flexible, keeping us from getting locked into a cloud provider or a specific company. So, we have different options and a diversity of options.
So, when you lead the way, you can show people, “Hey, we can do this with a 60-year-old aircraft on legacy hardware.” If you can do that, you can pretty much do anything you want.
Mr. Jekielek: So, let me get this straight. This system that you’re saying has been deployed, is it throughout the DOD?
Mr. Chaillan: It’s been deployed. Yeah, with dozens of projects.
Mr. Jekielek: This is basically kind of like a platform you used in those 12 days, to create this technology to run this 60 year old aircraft.
Mr. Chaillan: Yes. It’s designed to orchestrate the entire cybersecurity stack we have, because you can’t just activate and enable an over-the-air update and hope for the best, right? That’s where you get hacked. If someone compromised your updating system remotely, you can imagine the damage you could have.
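The concern about a compromised update channel is usually addressed by cryptographically signing every update and verifying it on the receiving device before installation. The sketch below is a minimal, hypothetical illustration of that idea using Python’s standard-library `hmac` with a shared key; real over-the-air pipelines (including anything DOD would field) use asymmetric signatures anchored in a hardware root of trust, and none of these names come from Platform One.

```python
import hashlib
import hmac

# Hypothetical pre-shared key, for illustration only. Production OTA
# systems use asymmetric signatures (e.g. Ed25519), not a shared secret.
SIGNING_KEY = b"example-key-not-for-production"

def sign_update(payload: bytes) -> str:
    """Produce an HMAC-SHA256 signature over an update payload."""
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()

def verify_and_apply(payload: bytes, signature: str) -> bool:
    """Install the update only if its signature checks out."""
    expected = sign_update(payload)
    # compare_digest does a constant-time comparison to avoid timing leaks.
    if not hmac.compare_digest(expected, signature):
        return False  # reject tampered or unsigned updates
    # ... apply the update here ...
    return True

update = b"new-flight-software-v2"
sig = sign_update(update)
assert verify_and_apply(update, sig) is True
assert verify_and_apply(update + b"tampered", sig) is False
```

The point of the check is that a remotely delivered payload is never trusted on arrival: even if an attacker controls the delivery channel, a payload they modify fails verification and is discarded.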
So, the entire stack we built is the foundation of cybersecurity. We call it Zero Trust. It’s been mandated just recently with President Biden’s cybersecurity executive order. We were the first and only agency that was fully compliant with the mandate before it was even written.
That’s work we did in two years as part of the platform and the team that I founded, which was part of my job as the chief software officer. Platform One is now used across the government, but also of course across DOD, to enable first the cybersecurity stack, and then the ability to run these LEGO blocks on top of it.
You get Artificial Intelligence but you also get sensors, you can get different mission software capabilities on top of it. You can now become modular. That’s the first step to being agile, because now you’re not building this monolithic system. You can cut it into pieces and you can be more agile, more flexible. You can try things out. You can swap LEGO blocks when it doesn’t work out.
So, you’re not stuck in time. You can try new things. Back to your point, you can’t deliver a product that’s not fully finished. Well, if you start being more modular and more flexible like that, and you enable reuse of these LEGO blocks—we call it Containers. We have 900 containers that were brought in through Platform One, from industry, so commercial products, open source products, and we have thousands of these LEGO blocks built within the DOD, with DOD software, that can now be shared across DOD to enable that reuse of code and that agility.
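The “LEGO block” idea described above is, in software terms, components behind a stable interface, so one block can be replaced without touching the rest of the system. Here is a minimal, hypothetical Python sketch of that pattern; the registry and function names are illustrative only and are not Platform One APIs.

```python
from typing import Callable, Dict, List

# Registry of swappable "LEGO blocks". Every block exposes the same
# interface: a function from input text to output text.
registry: Dict[str, Callable[[str], str]] = {}

def register(name: str, block: Callable[[str], str]) -> None:
    """Add a swappable block to the registry under a name."""
    registry[name] = block

def run_pipeline(stages: List[str], data: str) -> str:
    """Run data through a sequence of named blocks, in order."""
    for name in stages:
        data = registry[name](data)
    return data

# Two interchangeable implementations of the same stage. Because they
# share an interface, swapping one for the other changes nothing else.
register("normalize_v1", lambda s: s.strip().lower())
register("normalize_v2", lambda s: " ".join(s.split()).lower())

assert run_pipeline(["normalize_v1"], "  Agile ") == "agile"
assert run_pipeline(["normalize_v2"], "  Agile ") == "agile"
```

This is what makes the system “not stuck in time”: a block that doesn’t work out is swapped for another implementation of the same interface, and the rest of the pipeline never notices.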
It’s not mandated to be used. That’s part of the problem. Like I said, we had great pockets of success across the department. It was never scaled to become the norm. What I’ve been pushing for the last two years is a way to raise that sense of urgency, that we have to stop the waste of taxpayer money. We have to do better when it comes to delivering value. We have to mandate these new Agile concepts. To this day, even for new programs that start tomorrow, this is not mandated by the Department of Defense.
Mr. Jekielek: Well, something you mentioned is incredibly important here. The Zero Trust.
Mr. Chaillan: Mm-hmm (affirmative).
Mr. Jekielek: Explain to me in the simplest terms what that means, and why it’s important? And why, even for that reason, mandating might make sense?
Mr. Chaillan: Yeah, I completely agree that mandating Zero Trust is essential for national security. Effectively, what Zero Trust is, is moving away from the traditional model of firewalls.
Back in the day you would build a wall around your system. It would be a very thick wall. You don’t want people to get into your building, whatever it is you’re protecting virtually, of course.
The difference of course is when we started to have mobile devices and cloud, what’s the wall? Where is the wall? It can’t be around your mobile, around all these things, so it doesn’t scale and it doesn’t work.
Worse, if someone gets into your mobile device, maybe because of malware downloaded by some employee on your team, now they’re inside the system and they’re fully trusted to do what we call lateral movement: move across the system to find your crown jewels, exfiltrate your data, and do malicious harm across your system. That’s the traditional perimeter model.
The new Zero Trust model is to say, “We don’t trust anything.” Whether you’re inside the system or not, we validate everything. We make sure that everything is where it needs to be. We’re going to assess the security of your device. We’re going to look at who you are as an individual. Are you using multi-factor authentication? So, you need to type your PIN and a code to get into the system.
So, all these different things give you a level of trust to then get access to what you’re supposed to see. So, if you’re not supposed to have access to healthcare data in your organization, because that’s not your job, you don’t get to see it. It’s very granular, very precise, it’s called micro-segmentation. You can cut your network into pieces.
Again, back to the LEGO block concept. With this concept, you can now say, “This LEGO block can talk to this LEGO block, but it cannot talk to that LEGO block.” So, if a malicious actor gets into the system, that limits the attack surface. They cannot laterally move between the other LEGO blocks. It’s going to be harder for them to get to the crown jewels.
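In code terms, the two ideas just described are: a Zero Trust check evaluates every request against identity, device posture, multi-factor authentication, and a per-resource policy; and micro-segmentation is an explicit allow-list of which blocks may talk to which. The following is a hypothetical sketch of both, not any real DOD implementation, with all policy names invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    role: str
    mfa_passed: bool       # multi-factor authentication completed
    device_trusted: bool   # device posture assessed as healthy
    resource: str

# Per-resource policy: which roles may access which data (illustrative).
POLICY = {
    "healthcare_data": {"medical_staff"},
    "logistics_data": {"logistics"},
}

# Micro-segmentation: explicit allow-list of block-to-block traffic.
# Anything not listed is denied, which blocks lateral movement.
ALLOWED_FLOWS = {("web", "api"), ("api", "db")}

def authorize(req: Request) -> bool:
    """Zero Trust: verify everything on every request, trust nothing."""
    if not (req.mfa_passed and req.device_trusted):
        return False
    return req.role in POLICY.get(req.resource, set())

def can_talk(src: str, dst: str) -> bool:
    """Micro-segmentation check between two LEGO blocks."""
    return (src, dst) in ALLOWED_FLOWS

ok = Request("alice", "medical_staff", True, True, "healthcare_data")
bad = Request("bob", "logistics", True, True, "healthcare_data")
assert authorize(ok) and not authorize(bad)
assert can_talk("web", "api") and not can_talk("web", "db")
```

Note that `bad` is denied even though the user is fully authenticated on a trusted device: access is granular to the resource, which is the “you don’t get to see healthcare data because that’s not your job” point above.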
Mr. Jekielek: Basically, if one piece of the system is compromised that doesn’t mean the whole thing is.
Mr. Chaillan: Exactly.
Mr. Jekielek: And frankly, a lot of the sort of big issues that we’ve had in terms of hacking have been that one little piece was compromised. Then the whole system was open and a whole bunch of data, to use your term, was exfiltrated.
Mr. Chaillan: Yes, spot on. That’s exactly what happens.
Mr. Jekielek: It’s kind of shocking that this type of Zero Trust approach wouldn’t be used in, frankly, every aspect of development, never mind just in the DOD, but especially in the DOD.
Mr. Chaillan: Yeah, particularly since I started at DHS [Department of Homeland Security] five years ago. I was the Chief Architect at DHS before joining the DOD. I pushed Zero Trust to DHS five years ago, and the leadership at DHS said they didn’t want it at the time. So, we worked without deploying Zero Trust, even though it was already best of breed on the commercial side five years ago.
So DHS said no. And DHS is also here to secure all critical infrastructure, all water and power, which are tremendous targets for Russia, China, and so on.
So, it’s borderline criminal for people to dismiss it at the time. So, I left. I went to DOD where they understood the importance of all this. Like I said, I hear all these senior leaders say the right things. Everybody agrees, right? I was never sitting in meetings where people were saying things that were completely insane. That never happened.
But unfortunately, when it was time to take action and mandate things, people were so afraid of mandating the wrong thing that, even when it was obvious to everybody in the room and no one was pushing back, no one was willing to do it. So, we were stuck in time, because some people were so concerned about legacy systems having to mandate Zero Trust and the implementation of Agile practices that they wouldn’t even do it for new programs. That’s really concerning.
[Narration]: Our team reached out to the Department of Homeland Security, but we did not immediately receive a response.
Mr. Jekielek: There seems to be an almost pathological need or compulsion for leadership, for people in positions of responsibility, to avoid having to take responsibility. This is from many conversations I’ve had. Is this the kind of thing that you’re saying you see?
Mr. Chaillan: Well, I think the issue is there is no reward for taking risks. On the commercial side, if you do well, you get bonuses, you get credit. In the government, it’s actually safer not to take risks, because you have a better chance of rising up if you don’t make noise.
Even if you end up having a large program that fails, the DOD bubble is designed, first of all, to prevent people like me from coming into the system. It’s very difficult to get clearance, very difficult to get into the system. I had to divest a lot of stock, including after starting the job, where they told me I had 24 hours to sell stock, or I would lose a lot of money overnight with no notice.
All these things are designed so people, effectively, wouldn’t want to do what I did. On top of it, you end up with people that are used to working in the government. When they leave the government, they go work for the defense industrial base, which is the same bubble. If they come back, they come back with the same knowledge.
They have never been outside of the DOD bubble. They don’t know what’s going on. They don’t know the velocity of work in a company like SpaceX or Google, where their heads would explode, literally.
Look, we were going to have an exchange program where we send majors and so on to these companies, which was a great idea. But what they missed is that if you send these people to see the light, bring them back into the same system, and are unwilling to take action to address the issues they see, all that’s going to do is push them to leave. These people come back, get more frustrated because now they see what normal is, and they end up leaving because they can’t take it anymore, right? So, we have a massive retention problem.
Leadership is really, at the end of the day, if there is no benefit to taking risks, both for their career and for the advancement within the department, then it’s just safer not to do it.
Mr. Jekielek: Because the risk, basically, is if you fail you might not get that promotion, but if you just kind of go along, you’ll just naturally kind of flow to the next natural position.
Mr. Chaillan: Of course. I mean, you’ve seen it even recently. No one is held accountable when something goes wrong. Effectively, when something goes wrong, it’s most likely going to be classified, and we can’t talk about it.
There are programs that have been wasting billions of taxpayer dollars every year, and then been revamped and wasted taxpayer money again, again and again.
From travel to basic business systems, to JEDI, the cloud contract for the DOD, which was a complete debacle for three years, because they only wanted to do a single award with a single cloud provider. All these things, effectively, happen, and most people can’t talk about it, and no one is held accountable.
You’ve seen the recent award of GBSD [Ground Based Strategic Deterrent], which is the replacement of all the nuclear ground infrastructure, replacing all the nuclear missiles in the ground across the nation. We have a very limited number of years to do it, and a lot of pressure to do it, because if we don’t, we’re losing the existing missiles. So, we have to deliver this just for the deterrent.
Even if you don’t agree with nuclear, we need the deterrent. We only had one bidder on that contract. So, how is that good for the taxpayer? How is that good for the government to negotiate the right deal? As it happens, the company that won the award was willing and eager to make this work. We managed to pull this together, which was awesome, and they’ve done a great job adopting Agile for this program. They were the first ones to adopt Platform One and help us scale.
But at the same time, should we be in a situation where we have only one bidder for a multi-billion program, one of the largest DOD programs on the planet? That’s probably not healthy.
Mr. Jekielek: At one of the highest levels of security required.
Mr. Chaillan: The highest. Yes.
Mr. Jekielek: As we finish up here, where would you like to see things go for DOD?
Mr. Chaillan: Well, first of all, I would like to make sure that we empower our war fighters. We train them, we invest in them, and we empower the lowest level to make this happen for us. We have tremendously bright people. They can do this. They can make this happen.
We have to be able to communicate better with industry. And industry is not just a DOD bubble. It’s a broader industry. It’s all the U.S. companies, startups, to join us in the fight. We have to share this knowledge. We have to be able to raise awareness.
But what’s very important is we have to stop funding reports. We have to ask Congress to stop continuously going back to DOD and asking the department to invest more money writing reports.
We need actions. We need outcomes. We need tangible value to the war fighter, and that will take, effectively, Agile becoming the norm. It’s going to take proper training. It’s going to take partnership with industry, and it’s going to take scale. It’s going to take real urgency, and we need to stop being complacent and having reports that tell us we have 10 years to figure this out, because in 10 years it will be too late to fix it.
Mr. Jekielek: Do you regret it at all? You know, at this moment you’re out. Do you regret at all having made that decision?
Mr. Chaillan: I always feel like I could have done more and better. You always think back and want to improve. I don’t regret leaving, because I was at a point where it was not healthy for me. I was not able to have enough impact. It was frustrating to talk about the same problems every day of the week when we have the solutions, and I had the solutions, but we couldn’t implement them.
I was at the point where we proved we could do it. We did it, yet we didn’t see actions to make it happen at a broader scale, so it was very frustrating. So, by leaving, now I can be more efficient on the outside, to raise awareness but also, potentially make sure that we empower the right people to make this happen.
I would never go back and sell things to the department. I despise it when people do that. I will not be doing that. I will always be very eager not to benefit personally from any of this. But I will really make sure that we wake up before it’s too late for our kids.
Mr. Jekielek: Well, Nicolas Chaillan, it’s such a pleasure to have you on the show.
Mr. Chaillan: Thanks for having me.
[Narration]: Our team reached out to the Department of Defense, but we did not immediately receive a response.
This interview has been edited for clarity and brevity. A previous version of the transcript had “DevSecOps” erroneously transcribed as “Def. PSYOPS”. The Epoch Times regrets the error.