This week, Bonnie sits down with Alex Miller, Senior Science and Technology Advisor for the Army Chief of Staff, for a deep dive into data architecture, AI implementation, and the ever-evolving landscape of technology in the military. Alex shares his insights on the concept of data meshes, bridging the gap between tactical and strategic operations, and the need for continuous transformation. Tune in to explore the challenges and potential solutions to harnessing the power of AI while ensuring data integrity and mission effectiveness.
TIMESTAMPS:
(5:38) Bottoms-up versus top-down data architecture
(7:45) What is the difference between data mesh and data fabric?
(12:24) How to overcome the challenges of multiple standards
(15:08) AI has a sex appeal problem
(21:27) Structured versus unstructured data
(26:02) How to properly train generative AI
(29:37) Building a doctrine bot
(31:38) Removing the acquisition stigma around technology upskilling
LINKS:
Follow Bonnie: https://www.linkedin.com/in/bonnie-evangelista-520747231/
CDAO: https://www.ai.mil/
Tradewinds AI: https://www.tradewindai.com/
Alex Miller [00:00:00]:
The government is pretty good about adopting technology. And I know people like when they hear that they're going to go, no, they're not. Stop it. The government's pretty good about adopting technology. That is not the same as being really good at adopting the right technology and having an exit strategy when it's no longer the right technology. We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard. Through our blood and your bonds. We crushed the Germans before he got here.
Alex Miller [00:00:35]:
You and I have a rendezvous with destiny.
Bonnie Evangelista [00:00:39]:
All right. Good morning, Alex. How are you doing today?
Alex Miller [00:00:42]:
I am awesome on a Monday morning in DC. How are you?
Bonnie Evangelista [00:00:46]:
Good, good. And are you local to DC, or are you in town for the conference that you were just telling me about?
Alex Miller [00:00:53]:
No, I'm a local. I'm at headquarters, at Headquarters, Army, so I'm not a townie for DC proper, but I am close.
Bonnie Evangelista [00:01:01]:
I see. Okay, well, introduce yourself. Tell us who you are and who you work for.
Alex Miller [00:01:06]:
Awesome. I'm Alex Miller. I am the Science and Technology Advisor for the Chief of Staff of the Army. And that's a super overloaded set of terms. So really I have an opportunity to work for our new Chief of Staff, General George, as, really, a CTO mixed with technology evangelist mixed with a bit of an S&T instigator. I think that's probably the best word in terms of, hey, figuring out where do we need to go, what's the real operational priority? And I do fundamentally mean things that are going on, things that will be going on, and then what our long-term plans are. But that is in Headquarters, Army.
Bonnie Evangelista [00:01:39]:
So it sounds like you're kind of at a crossroads of a lot of different elements in the army. Is that a fair statement or would you describe it differently?
Alex Miller [00:01:49]:
No, that's a perfect statement. My background has been on the intelligence side since I graduated from school. But in this position, I'm really focused on, we have this concept called warfighting functions, things like intelligence or fires or maneuver, and it's really focused on how do you find the intersections where things really need to work together, especially as we think in terms of how to apply data to problems. It's not just one person's data or one group's. So getting to sit at that intersection has been really interesting for the last nine, ten months now that I've been doing this job.
Bonnie Evangelista [00:02:22]:
What have you been doing before this job? Like, what's your background? How did you kind of land in this crossroads of all things army and technology?
Alex Miller [00:02:32]:
I don't know that I could map it, but I know what happened. Background undergraduate in computer information, and then immediately started doing biometrics and forensics for the Army intel corps. And then that turned into doing a lot of network management and network modernization in terms of how do we apply, at the time, emerging 3G technologies to Afghanistan, which meant that I moved out of the ones and zeros into really waveforms and working on what we would call a signal environment. That turned into how do you do intelligence architecture for the global war on terror, which turned into how do you do intelligence architecture for the bigger fight, what we call large-scale combat. What that really means is how do we plan against a threat actor like a Russia or a China. And then that turned into how do you think about all of these things at a much broader perspective? Not necessarily trying to do point solutions, but how do you set up the Army for success as part of the joint fight in the future without getting so bogged down in our own lexicon that we sort of lose sight that we are part of something bigger.
Bonnie Evangelista [00:03:38]:
Wow. Help orient me a little bit, because you're educating me when you're talking about those architectures. And I heard you say signal, so I'm thinking network architectures, but are you going from strategic operations to tactical, or vice versa? Is that what you're describing? Because I know you also said it's IC oriented, or sorry, intel, not IC, and I'm not sure if I'm drawing the right parallel.
Alex Miller [00:04:06]:
No, it's a great question. I've had a unique opportunity to work very tactically for a long time, focused on how do you solve a specific problem in a specific area for a specific set of people. And then through probably the last two or three years it's been, hey, how do you focus more on the strategic landscape and what the fourth estate or the intelligence community or even CDAO is providing at an enterprise level? And then how do you bridge those between, hey, what's happening in, you know, places where you have infinite internet and infinite bandwidth, down to places where you have almost no bandwidth and you still have to do things with data? You still have to make decisions, and hopefully those are data-informed decisions.
Bonnie Evangelista [00:04:46]:
So I'm going to ask you, and I don't know if this is going to be a controversial question or not, but I definitely see the two sides, two schools of thought. One is bottoms-up, I'll say, architecture. I might be using the wrong technical term; you can tell me otherwise. And then enterprise, top-down architecture. What are your thoughts on that? How do we balance the two? They both have valid needs, and I think you know and understand the enterprise is always trying to push these top-down-driven platforms and whatnot. How do we weigh that against what you just said, the tactical environments that may not be oriented to the enterprise-type needs or architecture that is implemented?
Alex Miller [00:05:38]:
It's an awesome question because it's actually one that we're struggling with, or maybe struggling is too strong a word. It's one that we're trying to retackle now. And I use retackle deliberately, because about 15 years ago, what we did in the Army is we picked an echelon, we picked the Brigade Combat Team, and we said, you are the bridge between everything that exists at the enterprise level, from the intelligence community and from the other parts of the US government, down to the tactical space. And we actually forced that upon them. And what that turned into was not only a different way to make decisions, it turned into a lot of stuff, physical stuff, different servers and different processors, because it really became a filter. And sometimes the filter was people, and sometimes the filter was machines, and sometimes the filter was, I just don't have the time to do anything with it, so I'm not going to do anything with it. So as we think through, both answers are right, but neither of them are sufficient on their own.
Alex Miller [00:06:37]:
There is definitely a need from the very tactical side to move information up because at that moment in time you are the single source of truth. But then what we are trying to relearn is if something is coming from the enterprise and there's this misperception that the intelligence community will know everything all the time. And I'm saying that as a former intel person. It's not a magic eight ball, right? You don't know the truth all the time. There's not some big prediction engine in the sky that's going to tell you, hey, this is what's happening. So being able to figure out what the bottom looks like all the way up to the top takes some deliberate measure of where you want to build that bridge. And for like I said, where the army is, we are focusing on where the right amount of bandwidth exists to move data both from the tactical side up and where the right amount of human processes exist to actually move data from the enterprise.
Bonnie Evangelista [00:07:31]:
So at my organization, the CDAO, there's a lot of conversation about data meshes right now. Is that what you're describing as the bridge, or do you see it differently? What's the technically oriented mechanism to be the bridge?
Alex Miller [00:07:45]:
It's a great question. So we play a lot of buzzword bingo, especially in the National Capital Region. So when we start talking about data meshes and data fabrics and different technical implementations or technical user access patterns, my first thought is, hey, we've sort of missed the mark. So I am a big fan of the mesh concept because what it does is it says specific domains are owned by specific organizations and they have to apply rules and rules are good in data. But the problem is those rules have to be knowable and implementable and they can't be so cumbersome that somebody trying to implement them just doesn't have the time to. And I'll give you just one anecdote, we're going to end up with a lot of robots and those robots are going to be generating a lot of combat information at the edge and they're going to have to be able to push those data up. Well, if we have to figure out every time how they subscribe to a mesh architecture, how they publish information to it, and then every robot needs a different touch before we can update that, it's just not going to be tenable. And I work for General George, so they put us in here, they said I get up until eleven.
Alex Miller [00:08:51]:
In terms of anecdotes for implementing a data mesh, what we're trying to figure out is what is the right technical implementation between something like a mesh that would exist in an enterprise and something like a data fabric, which is another technical implementation where we would have to set up the rules and how you access the data and who can access the data. But right now industry is sort of showing us, and I mean private industry is sort of showing us, that different rules work in different situations. I actually read an article last night from, I think, the CTO of Chick-fil-A that described their endpoint observability architecture, and it's amazing. Every franchise manages somewhere upwards of 2,800 different containers for IoT devices at each franchise for high observability and availability. And I went, wow, that's a lot of stuff. And the whole reason they do that is because they generate so much log data, and they recognize if they try to push all of that log data back up, like from their tactical side up to the Chick-fil-A Skynet and the giant chicken in the sky, it would bog down all of their internet access. So they just don't.
Alex Miller [00:10:00]:
So we're trying to sort of work through the same types of patterns.
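The pattern Alex describes, domain-owned data where the edge keeps everything but only pushes a filtered slice across a constrained link, can be sketched with a toy example. Everything below (the class name, the priority field, the threshold) is invented for illustration and is not any Army or Chick-fil-A system:

```python
# Toy sketch of an edge node that owns its own domain's data:
# full-fidelity records stay local, and only records worth the
# bandwidth are published upstream.

class EdgeNode:
    def __init__(self, domain):
        self.domain = domain      # the domain this node owns
        self.local_store = []     # full-fidelity data stays here

    def ingest(self, record):
        """Keep everything locally; nothing is lost at the edge."""
        self.local_store.append(record)

    def publish_upstream(self, min_priority=5):
        """Push only high-priority records; the rest stay queryable locally."""
        return [r for r in self.local_store if r["priority"] >= min_priority]


node = EdgeNode("robot-sensor-logs")
node.ingest({"msg": "heartbeat ok", "priority": 1})
node.ingest({"msg": "contact report", "priority": 9})

upstream = node.publish_upstream()
print(len(node.local_store))  # 2 records retained at the edge
print(len(upstream))          # only 1 crosses the constrained link
```

The design choice mirrors the Chick-fil-A anecdote: the filter lives at the edge, so the upstream network never sees the log noise, but nothing is deleted, so the data can still be pulled if someone asks.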
Bonnie Evangelista [00:10:03]:
When you talked about the right rules for the right environment and you use as an example who has the right access. When you speak of rules, are you referring to more technical standards or is it more business process rules?
Alex Miller [00:10:23]:
Another great question. So it's both, and I hate to be a bureaucrat, but it really is both. And I'll pick on the IC, because they have just sort of the most robust and easy-to-use set. There are specific rules about who can access and handle specific intelligence data, particularly on the Title 50 side. So that's one set of rules. And then there's also rules that are softer that we implement in terms of what should somebody have access to, not because they're bad or malicious or nefarious, just because otherwise, if they have access to too much, they can't do anything with it. Right? So we did an exercise last year called Project Convergence, and on the last day I wanted to see just how much information we could generate in one eight-hour period. And in that eight-hour period, with a very well-bounded box, with very well-bounded processes, we generated somewhere north of 28,000 different object reports, just in one eight-hour period.
Alex Miller [00:11:17]:
From one very finite set of sensing capabilities. Super cool, right? Looks great on a map, lots of dots, but no one can practically do anything about it. You can't really make a decision off of that because there's just too much. So what we learned from that and what we're trying to document in some of the softer rules is how do you filter that? How do you bound that? How do you only give the data that's really necessary, but you make all of the data accessible? If a decision maker of any flavor says, I really need those data too.
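The lesson Alex draws from those 28,000 dots, surface only what a decision-maker can act on while keeping every report reachable on request, is essentially top-N triage over a fully indexed store. A minimal sketch, with invented fields and scores:

```python
# Toy triage: show only the top-N reports, but keep an index so
# any report remains retrievable if a decision-maker asks for it.
from heapq import nlargest

def triage(reports, top_n=3):
    """Return the top-N reports by confidence plus a full index."""
    index = {r["id"]: r for r in reports}   # everything stays accessible
    shown = nlargest(top_n, reports, key=lambda r: r["confidence"])
    return shown, index

reports = [{"id": i, "confidence": c}
           for i, c in enumerate([0.2, 0.9, 0.4, 0.95, 0.1])]
shown, index = triage(reports)
print([r["id"] for r in shown])   # highest-confidence reports first
print(index[4]["confidence"])     # low-confidence ones are still there
```

The point of the split return value is exactly the "softer rule" in the conversation: filtering is a presentation decision, not a deletion decision.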
Bonnie Evangelista [00:11:48]:
Given my background, or my familiarity with some of this from supporting Army defensive cyber: what is stopping us, or are you a fan of creating technical standards across the department? Like, not just within the Army or Air Force, Marine Corps. Like, why can't we just draw the line in the sand and say, this is our technical standard to connect your software, and this is how it needs to run, and these are the rules to provide capability? What are your thoughts on that? Because this is also something I get completely mixed points of view on.
Alex Miller [00:12:24]:
Yeah, I am of two minds here. So on the one hand, standards sound super awesome, because in theory, in thought, you'd say, hey everyone, here's a standard, and everybody would just comply with it. The challenge is we have tons and tons of standards, and if you were on the defensive cyber side, you saw a lot of them, because before you could even get a tool on a network, it had to comply with some type of standard. Well, the same thing happens when we're talking about future capabilities: we have tons of standards. What we don't have is common implementation guidelines. And those implementation guidelines are, here is how you implement the standard. And if you want one that everyone's super familiar with, it's USB. So, Universal Serial Bus.
Alex Miller [00:13:06]:
Intel open-sourced it 20-some-odd years ago. But if you say USB, it could be USB-A, USB-B, USB-C, Lightning. So until you actually say, hey, not only is this the standard for how you're going to implement, but here is also what you're going to implement against, I think we're going to be sort of in this Wild West. Now, that being said, almost everything that we've done in the IT space for the last five, six, seven years has been, and this is Alex's opinion, about abstracting away all of the different choices that people have made to not have to worry about the standard. So if you think about what we did with virtual machines about ten years ago, that was to totally abstract away.
Alex Miller [00:13:45]:
It doesn't matter what OS you're running or your base software; here's your VM. As we start thinking about containerization and how to do container management, that's all about, hey, I don't even care what version of software you have on top of your OS. I'm just going to give you the container with everything you need to run. So now we're trying to figure out, hey, what level of abstraction is next? And I think I'm super excited about the data integration layer and CDAO's efforts there, because then, if we can abstract away one more layer of connective tissue, that makes my life way easier, because right now we manage thousands of bespoke point-to-point system connections that I just don't want to have to manage.
Bonnie Evangelista [00:14:25]:
I think this is a good tee-up to where I think we wanted to land, in terms of, like, all this type of work is absolutely necessary when we start to think about implementing AI capabilities. Right? If you don't have some of these technological foundations with regard to where data is going and how easy it is to access, then you're really limiting your ability to optimize AI. So what are you seeing, I guess, on the Army side or not, in terms of abilities to actually capture and grab, whether it's emerging technology or just enhanced capabilities, what will enhance mission effectiveness? How is that looking on your side?
Alex Miller [00:15:08]:
Yeah, I love that you ended with the mission effectiveness piece. So in the AI space, I see two big classes of problem. And this is what I've been trying to work through as I talk to other people. Not in a pejorative way, but just sort of in a, hey, how do we implement the technology way. The first one is AI has a sex appeal problem. And the second one is AI has a blue jeans problem. So on the sex appeal side, you said it perfectly. It's, hey, you're trying to do mission effectiveness.
Alex Miller [00:15:35]:
Well, it means a couple of things. One means you understand what the mission is, so you have a problem and you have some way to rate your effectiveness, which means you have a metric. And when you start talking about the application of AI, those are the two things you need. You need something that you're predicting against and a level of decision space for what that metric or what that level of prediction is going to be used for. And right now, we as a department, we as an army, we as everything, have focused so very heavily on, oh, I want to do AI. I don't think we've put enough emphasis on, what do I want to predict? How is that decision going to be made? And then what are my risk parameters? How am I actually willing to take the output of that AI and apply it somewhere? And the one that comes to everybody's mind, because it's sort of at the forefront, is this targeting thing: how do I find things in pictures and then do something against them? And I love it, because I've been around plenty of geospatial analysts who have just been staring at pixels for 15, 16 hours a day, like when I was in Afghanistan and we had the Jayak; we had analysts whose entire life was literally just staring at pixels. And if I can make their lives easier, that's a win for everyone.
Alex Miller [00:16:44]:
But at the same time, a lot of them, because they're so well trained in how to do their job, we have not gone, hey, what are the parameters by which you are comfortable going to your boss, your leader, or your targeting chief and going, I've got a valid target? And that's because we're just so focused on the tactical, very cool, romantic side of, I've got an AI. I want to use an AI. We have not focused necessarily on the infrastructure for people and processes to say, here is how you're able to use this. Here are the decisions that are going to come out. Here's how you're going to apply those decisions, and then you can tweak it however you want within that. So that's sort of one side of the equation. The blue jeans problem for AI is a little bit more topical. I love the story of the blue jeans because it's just sort of quintessential American.
Alex Miller [00:17:33]:
The gold rush happens. People start moving west. Levi Strauss realized that all of these foremen and these miners are coming back and just absolutely destroying their clothes. So they say, hey, we're going to build a better tool in the form of the blue jean. Right? It was just an overall; it was a coverall that had double rivets, and the stitches were stronger, so that these miners could just beat their clothes up without having to replace them all the time. And then that became the gold, because they realized that, hey, one guy could find a hunk of gold, or I could sell blue jeans to every miner.
Alex Miller [00:18:07]:
Well, right now, we are trying to give tools to everyone without figuring out, what is the real base? What is the common thing that everyone needs? Let's invest in that, and then they can go figure out how to do their specific tasks on top of that. My passion right now in the army is figuring out what is that lowest common denominator for AI, for data science, pick your automated machine intelligence type of workflow, and then getting that in place. It's sort of a long-winded answer.
Bonnie Evangelista [00:18:37]:
Yeah, no, my mind's kind of going in a couple of different directions. But just on your lowest common denominator thing: isn't that the data? Or is that too obvious, or are there additional nuances and I'm just a layman?
Alex Miller [00:18:53]:
No, it's perfect, because you certainly have to have the data. But I think the lowest common denominator is how you discover, access, and share those data, because if you can figure out that, you can do anything else you want. One issue we're having right now is, I've got a program office who has actually leaned really far forward. Their entire system is actually all about how you use data for people. And then I have an organization down at Fort Liberty who's saying, hey, we would like to access those data. Perfect, right? We have a customer, we have a specific workflow and a mission, and then we have an office that is supposed to own those data and broker them. So now what we're trying to figure out is who has the authority to share those data. Because what we found out is the program office actually doesn't. And even though it's people data, it's people that we own, they're in our formations, we now don't know who the right person is to say, yes, you can share those data, or yes, they can consume those data.
Alex Miller [00:19:50]:
So the data is super important. But being able to really discover, access and share, I think is the lowest common denominator.
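That "who can say yes" problem can be made concrete with a toy broker: discovery is open to everyone, but access requires approval from a recorded release authority. All of the names below (datasets, offices, requesters) are hypothetical:

```python
class DataBroker:
    """Toy broker: datasets are discoverable by anyone, but access
    requires approval from the dataset's recorded release authority."""

    def __init__(self):
        self.catalog = {}       # name -> {"owner": ..., "authority": ...}
        self.approvals = set()  # (dataset, requester) pairs approved

    def register(self, name, owner, authority):
        self.catalog[name] = {"owner": owner, "authority": authority}

    def discover(self):
        return sorted(self.catalog)  # discovery is the easy part

    def approve(self, name, requester, approver):
        # The hard part in the conversation: only the named authority
        # can say yes, and the owner is not automatically the authority.
        if approver != self.catalog[name]["authority"]:
            raise PermissionError(f"{approver} cannot release {name}")
        self.approvals.add((name, requester))

    def access(self, name, requester):
        return (name, requester) in self.approvals


broker = DataBroker()
broker.register("personnel-readiness", owner="program-office", authority="G-1")
print(broker.discover())
broker.approve("personnel-readiness", "field-unit", approver="G-1")
print(broker.access("personnel-readiness", "field-unit"))
```

Note that if the program office (the owner) tries to approve, the broker refuses: that mismatch between owning data and having the authority to release it is exactly the situation described.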
Bonnie Evangelista [00:19:58]:
What do you think about structured data versus unstructured data? And I'm asking because we're kind of honing in on data right now, and I'm seeing some companies on the commercial side who are building capabilities that allow us to query unstructured data the same way we would structured data. And kind of their niche, or where they believe the hockey puck is going in this data world, or data labeling world, is that data labeling is a losing game, and we will never have enough hours in the day to label all the data. And so we just need to build capability that still allows us to discover and share, like you were just saying, the data in our unstructured environments and whatnot. What are your thoughts on that?
Alex Miller [00:20:47]:
I'm of mixed mind here, partially because I have seen it before. It was in 2011, I want to say, or 2012, the intelligence community came out with this thing, IC ITE, the Intelligence Community Information Technology Enterprise. And as part of that, they implemented an ICD, an Intelligence Community Directive, which said thou shalt label all data with metadata. So this was way before the computer vision labeling craze. This was really more about metadata tagging. And they came up with two things: the Trusted Data Format and the Enterprise Data Header, so TDF and EDH. And they said, hey, whatever piece of data, if you generate it or you ingest it or you get it from somewhere, your first process is you're going to metadata tag this.
Alex Miller [00:21:27]:
Some of it was super simple, like where did it come from? Some of it was much more complex, like where did it come from, what was it derived from, classification, dissemination authority, those sorts of things. So when I hear this sort of age-old conversation on structured versus unstructured, generally what I have seen, and this is not necessarily true for everyone, it's just what I've seen: the first thing that somebody or a company or a process or piece of software does when they get unstructured data is they structure it somehow, whether it's a very loose structure or a very strong structure, for their product. They say, hey, no matter where it is, I need to be able to do something with it; therefore I'm going to apply a structure and then query off of that. So that's certainly one use case. The other one is, I've seen this only work when there's really strong documentation from the data owner or data steward that says, even if you have no understanding of what's in the data, here's what the data looks like.
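That tag-on-ingest rule can be sketched as a wrapper that attaches a minimal header to every record before anything else touches it. The field names below are loosely inspired by, not copied from, the TDF/EDH schemas, and are purely illustrative:

```python
# Toy "tag on ingest" wrapper: any payload, structured or not, gets a
# metadata header as the first step, so it is discoverable and
# governable later without inspecting the payload itself.
from datetime import datetime, timezone

def tag_on_ingest(payload, source, classification, derived_from=None):
    return {
        "header": {
            "source": source,                      # where it came from
            "classification": classification,      # handling rules
            "derived_from": derived_from or [],    # provenance chain
            "ingested_at": datetime.now(timezone.utc).isoformat(),
        },
        "payload": payload,  # the data itself is untouched
    }

record = tag_on_ingest("raw sensor text", source="uav-7",
                       classification="UNCLASSIFIED")
print(record["header"]["source"])
```

The useful property is that downstream systems can filter, route, and authorize on the header alone, which is why the directive made tagging the first step of ingest rather than an afterthought.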
Alex Miller [00:22:24]:
And it's sort of a chicken and an egg, because even on the definitions, and I'll give you just one anecdote: video. So we would get companies coming in saying, I can help you process your video, and we'd say, well, video is highly structured. Like, we know the frame rate, we know what's in each frame. And they went, no, but we're going to break it down to feature analysis, so that each frame, even though it's a picture, you can do something with it without knowing what's in the picture. And I went, okay, that's interesting. But still, that was really taking highly structured data, applying a process to it, getting some more structure out of it, and then doing something with it. The one that was probably the hardest to ever work with, it's gone through several different name changes. It was originally document and media exploitation.
Alex Miller [00:23:06]:
DOMEX, and then it became captured enemy material. Now it's really pocket litter. We had this in the global war on terror: if somebody was captured and they had pieces of paper or something in their pocket, what do you do with it? Can you scan that in and do something with it? That was really the biggest unstructured data problem we faced, at least in my time in government. And the first thing we did, every single time, was apply a structure.
Bonnie Evangelista [00:23:29]:
What does technology adoption mean to you? So we've talked a lot about some fundamentals, some other constructs within infrastructure that are required for this, I would say tech modernization era that we're in. So what does it mean to you or how would you kind of define or identify if we are truly moving in the direction of technology adoption?
Alex Miller [00:23:54]:
That's a heavy question.
Bonnie Evangelista [00:23:56]:
That's the goal, right?
Alex Miller [00:23:57]:
Yeah, no, 100%, I love it. Fundamentally, it's being able to apply something against your problem, and then if it works, continuing to do that, and if it doesn't, stopping it. And I'm only slightly guarded, because the government is pretty good about adopting technology. And I know people, when they hear that, they're going to go, no, they're not. Stop it. The government's pretty good about adopting technology. That is not the same as being really good at adopting the right technology and having an exit strategy when it's no longer the right technology. And I think that's where we lack on the technology side. And actually, cyber defense is one of the unique ones where it's gotten really good at the latter.
Alex Miller [00:24:40]:
That's what everybody gets really frustrated about, right? So we buy technology all the time. We really do. We don't necessarily buy the latest technology, and we don't have good escape hatches for contracting. And Congress doesn't really give us good escape hatches in terms of our ability to use specific kinds of money in specific ways to say, hey, when this is no longer the right technology, I need to be able to adopt the next technology, or even the current technology. And I think Leo Garciga, who was my buddy in Army intel for a little while, now he's the Army CIO, he was up at the Undersecretary of Defense for Acquisition and helped write some of the software pathways documentation, which is really fundamentally about, hey, not only how do we adopt software, how do we do it in such a way where you continuously adopt it, and then, if it's no longer the right thing, you just kick it out.
Bonnie Evangelista [00:25:30]:
Mr. Garciga, excuse me. His office, I don't know if it's him, has, I guess now ASA(ALT), issued a memorandum, I don't know, a month ago, maybe it was like three to four weeks ago, saying no generative AI for the acquisition workforce unless it's approved by the CIO, which is now Mr. Garciga. What are your thoughts on that? Because I understand where that's coming from, but also, how is that helping the army do what you're talking about?
Alex Miller [00:26:02]:
So I remember he and I actually started that discussion before he moved over to the CIO shop. And we started it because, of course, we saw GPT-3 come out, and then 3.5 and then 4. At the same time, we saw DALL-E pop up, and Hugging Face, and we saw all these things, and I went, this is going to be really bad if analysts start dropping questions that are very sensitive into these generative engines, and they start storing those questions, and all of a sudden those become the training baseline. So the impetus for those policies was really, hey, don't go to a website on the internet and put very sensitive acquisition or intelligence information into it to try to make your job easier. Like, we're here for you, but also, don't do that. So in discussions with him, and he is super forward-leaning, I love having him as the CIO, he's awesome, what he's really trying to get to, and what we're really trying to get to, is let's figure out if we need a generative model. And let's take two classes: we need a generative text model and we need a generative image model.
Alex Miller [00:27:06]:
Let's figure out how to pull them into something like our cloud environment on the classified network, or otherwise train them on our lexicon, so we're actually getting some value out of it rather than just statistically generated sentences. And then let's do the right thing for everyone. We've had some really interesting use cases on the sort of non-warfighting side for things like contract boilerplate generation or personnel action boilerplate, and I'm using boilerplate very deliberately. I don't want everybody to think that we're trying to cheat the system. Yeah. And that's been pretty successful. And then our Army Artificial Intelligence Integration Center up at Carnegie Mellon in Pittsburgh has been pulling different LLMs into their ecosystem.
Alex Miller [00:27:52]:
It's a sandbox. It's protected. It's gone through all the right defensive metrics to bound it, to look at the same problem. So I know, just playing devil's advocate for Leo's team, he's really been concerned about, hey, don't throw a bunch of stuff on the internet, while at the same time helping all of these different organizations in the army say, hey, if you want to sandbox some type of generative AI structure, we're all about it. That's awesome. Just make sure that it is very well protected, and you know exactly what you're putting in and exactly what you're getting out of it.
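One way to read that sandbox guidance, "know exactly what you're putting in," is a guardrail layer that screens prompts before they ever leave the protected environment. A deliberately simple sketch; the patterns here are invented for illustration, and real filters are far more sophisticated:

```python
# Toy prompt guardrail: reject prompts matching any sensitive pattern
# before they reach a model or leave the sandbox.
import re

SENSITIVE_PATTERNS = [
    re.compile(r"\bSECRET\b", re.IGNORECASE),          # classification markings
    re.compile(r"\bcontract\s+#?\d+", re.IGNORECASE),  # specific contract numbers
]

def screen_prompt(prompt):
    """Return the prompt unchanged if clean; raise if it looks sensitive."""
    for pattern in SENSITIVE_PATTERNS:
        if pattern.search(prompt):
            raise ValueError("prompt blocked: possible sensitive content")
    return prompt

print(screen_prompt("Draft boilerplate for a personnel action"))
# screen_prompt("Summarize the SECRET report") would raise ValueError
```

A pattern list is the crudest possible filter, but it captures the policy's intent: the check happens before the data leaves, not after.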
Bonnie Evangelista [00:28:22]:
Yeah, maybe I'll have to talk to Mr. Garciga then, because, just knowing some people at the action officer level, policy, as you can imagine, or know and understand, takes on a life of its own, maybe. And so I don't know if that's the intent that carried forward with the memo. I don't know if that sense of play you're talking about, that sandbox, or this explorative or exploratory nature that I think you are, or he is, in favor of, is being fostered because of that memo.
Alex Miller [00:28:57]:
Let's just say that, no, that's totally fair, because I also know that with Task Force Lima coming online and being able to say, hey, at the department scale, how are we doing this, we're seeing it. I'll just give you an example, because this is one project I'm actually pretty excited about. I got a demonstration from the team on Friday. It's Monday; I got it on Friday. And it's part of the 75th Innovation Command, which is part of our Army Futures Command. There was a RAND team, and then it was part of the Office of Army Analytics, and some of our folks that are stationed out at the Naval Postgraduate School out in Monterey, California, and they're trying to build a doctrine bot. So, sort of reductionist.
Alex Miller [00:29:37]:
But our doctrine is crazy, and it's not legalese and it's not acquisition-ese; it's like its own language. So you end up with these things that are crazy. And I remember, before I left ROTC when I was in college, we would get these field manuals and these Army pamphlets. Then, talking to my dad, who was a career soldier (I now have his Army pamphlets and infantry field manuals and everything), I'm going, oh my God, how do you remember all this? Especially if you're a soldier and we're updating doctrine every couple of years. How do you find the time to read all that and then go, oh, I'm going to scrub that part of my memory because that's been replaced, or I'm going to update that part? So what they really did was they just trained two models.
Alex Miller [00:30:19]:
They trained a DaVinci model, and then they trained one from the Army side: here's all the doctrine; ask natural language questions, get natural language answers. And I was really excited because of what they showed me. One, anytime you can show me in real time that it's not a canned demo, and I can ask a question and get something out of it, I think that's more powerful. But what they really showed was, hey, now we have a process, a process for a sandbox. So my question to them was: take all of the NDAAs, the National Defense Authorization Acts, for the last couple of years, and then take all of our justification books, our P-forms, the procurement forms, the research and development forms, everything that you can find on the Internet, because they're published openly as part of the congressional process. Train one of these models on them.
Alex Miller [00:31:02]:
And then see what we're putting out there about ourselves that maybe we don't want to put out there about ourselves. Because everybody only looks at one piece of paper at a time, or one book at a time, they don't have the whole picture that a generative model gets by virtue of just going through all of it.
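The doctrine-bot pattern Alex describes, indexing a corpus and answering natural-language questions against it, can be sketched with a tiny retrieval step. The two sample passages and the TF-IDF scoring below are illustrative placeholders, a toy stand-in rather than the actual system the RAND and 75th Innovation Command team built; a real doctrine bot would pair retrieval like this with a generative model that writes the natural-language answer.

```python
# Minimal retrieval sketch: index a corpus of doctrine-style passages,
# then answer a natural-language question by returning the passage with
# the highest TF-IDF overlap.
import math
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def build_index(passages):
    """Per-passage term counts plus corpus document frequencies."""
    counts = [Counter(tokenize(p)) for p in passages]
    df = Counter()
    for c in counts:
        df.update(c.keys())
    return counts, df

def retrieve(question, passages, counts, df):
    """Return the passage scoring highest against the question terms."""
    n = len(passages)
    def score(c):
        # Smoothed inverse document frequency; common words score near zero.
        return sum(c[t] * math.log((n + 1) / (1 + df[t]))
                   for t in tokenize(question))
    best = max(range(n), key=lambda i: score(counts[i]))
    return passages[best]

# Invented sample passages, loosely doctrine-flavored for illustration.
passages = [
    "A movement to contact is conducted to develop the situation "
    "and establish or regain contact with the enemy.",
    "Sustainment is the provision of logistics and personnel services "
    "to maintain operations until mission completion.",
]
counts, df = build_index(passages)
answer = retrieve("How do we regain contact with the enemy?",
                  passages, counts, df)
```

The same shape scales to Alex's NDAA and justification-book idea: once everything is in one index, a question can surface connections that no single document shows on its own.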
Bonnie Evangelista [00:31:18]:
All right. Knowing all of this, and we won't belabor the point any further: if you were king for a day, or, I don't know, the Secretary of the Army, what would you do, or what would you like to see done differently? What would you like to see more of to move the needle in the right direction, whatever that means for you?
Alex Miller [00:31:38]:
There is a stigma right now, from what I've seen, around people in acquisition fields saying, I would like upskilling on technology concepts. The Defense Acquisition University has started doing some more modern technology courses, but I remember I had an opportunity to spend about five years at the night vision lab, so I was in an acquisition billet. And the day that I signed my government paperwork, hey, you work for the government now, they said, here's a memo; it says that you have two years to complete your acquisition training or you don't have a job anymore. So I went through all of that acquisition certification, and all of it was very brute force. Here is JCIDS, here's PPBE, here's the acquisition process. So you start asking these questions, like, hey, where does this technology fit, when it doesn't fit neatly into "I'm going to buy one, and then I'm going to buy 1,000 copies of the same thing"? Just having a pipeline for anybody in any career field, really focused on the hardworking acquisition folks, to go, I would like upskilling in technology: that would be number one.
Alex Miller [00:32:41]:
And I'm sort of cheating, because I know that the Army's bought some Udemy and Coursera licenses, which is awesome for data science literacy and basic data literacy, but it's really about removing the stigma there. The second one is having some more open and honest conversations about the iterative nature of technology, back to your technology adoption question, and saying it is okay for us to say that something we bought ten years ago is no longer the thing. It is okay for us to listen and go, hey, I wouldn't take this piece of equipment home and use it; why would we ask soldiers to use it? And Ukraine has been a great example. I had some great friends who were in Poland for the start of that; I have good friends that are all over now, and I hear these stories about them saying, hey, we had this piece of kit. It did exactly what it was supposed to, but what I needed it to do, it couldn't do.
Alex Miller [00:33:35]:
But we went through this process, and we haven't really communicated that the character of war has changed so greatly in the last couple of years; we really need to focus on technology that keeps pace with that. We're sort of at a weird inflection point for being able to institute what my boss would call continuous transformation, as opposed to modernization, which is really focused on kit. Right? So it's not necessarily, how do I do a better set of AirPods? It's, hey, what is the right technology for the new world that we're living in? And instead of saying, hey, I'm going to buy this for the next 30 years, how do I buy it for five years and then figure out what the next thing is?
Bonnie Evangelista [00:34:10]:
Yeah, that's a huge mindset shift. And it goes back to what you talked about earlier: identifying problems first rather than solutions. And I would liken that to requirements. We should only be defining requirements if they're useful today; for future-years stuff, just stick to problems. I feel very firmly about that, especially in the technology landscape. Great point.
Alex Miller [00:34:37]:
So I agree. And this is one area where I'm super proud of the Army, because they were shifting. You've seen a Capability Production Document or a Capability Development Document, a CPD or a CDD; they're tomes, right? They're probably longer than my dissertation most of the time. Why? It doesn't make any sense. It made sense for the Black Hawk, it made sense for the Patriot, it made sense for the Apache. But if I really just need a piece of software, especially, that should be no more than about a page and a half, and it should sort of list, at a high level: I sort of need it to do this, I sort of need it to do that. And then, I know this is a high... go ahead, Bonnie.
Bonnie Evangelista [00:35:12]:
Sorry, I'm going to steal one of your Army doctrine concepts, because this is how I teach people to write problem statements: current state, end state, gap. Can you just define what's happening today, what you need to happen, and where the gap is, and let somebody else fill in the solution or give you ideas for COAs, courses of action, for solutions?
Alex Miller [00:35:30]:
So, 100%. And it is okay. And this is like blasphemy for acquisition generally: it's okay to use Kentucky windage to sort of walk yourself into the right solution, as long as your users are the ones doing it.
Bonnie Evangelista [00:35:42]:
Well, this has been truly a pleasure. I thank you greatly for taking some time on a Monday morning to talk to me. Hopefully it was fun for you.
Alex Miller [00:35:49]:
No, this was a blast. Thank you for the questions. It got my brain going, so that was awesome.
Bonnie Evangelista [00:35:54]:
All right. Good stuff. Thank you so much.