This week, Ryan Connell sits down with Alex Martin, the CEO and co-founder of Clearspeed, to discuss how they are using AI-enabled voice analytics to revolutionize risk assessment in defense. Alex shares his personal journey from the Marine Corps to tech entrepreneur and how he’s aiming to solve the age-old problem of vetting individuals efficiently and at scale. He elaborates on the challenges of building a technology company in the defense and enterprise sectors, emphasizing the importance of trust, speed, and the human element in AI applications. Tune in for an inspiring conversation on startup success and the future of risk assessment.
TIMESTAMPS:
(1:09) Meet Alex Martin
(2:22) Why Clearspeed was born out of tragedy
(5:44) How to scale risk assessment
(9:57) Productizing a defense engine for enterprise
(12:28) How to find pockets of people with imagination
(15:27) Is this an invasion of privacy?
(19:02) How to balance speed and security
(21:47) Advice for aspiring entrepreneurs
(25:42) How to build a passionate team
(30:50) Creating a purpose-driven culture
(37:37) Why imagination is critical in VC
LINKS:
Follow Ryan: https://www.linkedin.com/in/ryan-connell-8413a03a/
Follow Alex: https://www.linkedin.com/in/alexandersmartin/
Clearspeed: https://www.clearspeed.com/
CDAO: https://www.ai.mil/
Tradewinds: https://www.tradewindai.com/
[00:00:00] Alex Martin: We're the most sophisticated military on the planet. We have incredible technology. We have the fiercest operators, and we cannot solve this ancient problem of how to vet the people that we're working with.
So I asked the question, how are we vetting these people? And the answer was, we weren't doing it very well. And it was no one's fault. We had incredible groups of counterintelligence, you know, operators, and they were doing way more work than they could, but it doesn't scale. Human experts don't scale.
And so I looked at this fundamental problem and said, how can we flip the equation? Instead of one human trying to find a needle in the haystack, using maybe lie detection technology, how do you flip the equation on its head and say, actually, what's beyond lie detection? Is there a tool where you could actually say who I don't need to use a lie detector on, or a human counterintelligence interview on, and would that have saved my buddy's life?
[00:01:09] Ryan Connell: Hey, this is Ryan Connell with the Chief Digital and Artificial Intelligence Office. Joined here today with Alex Martin. Alex, how's it going?
[00:01:15] Alex Martin: Going great. Thanks for having me, Ryan.
[00:01:17] Ryan Connell: Yeah, absolutely. Uh, let's kick off. Uh, I'll just turn it over to you for a quick, uh, opportunity to introduce yourself.
[00:01:23] Alex Martin: My name is Alex Martin. I'm the CEO and one of the co-founders of Clearspeed. We're doing risk assessment using AI-enabled voice analytics. It's a new way to approach risk, kind of from the consumer and customer experience first, all with the intent to get to trust faster. So we're building a technology company at scale for the enterprise and for our defense.
[00:01:43] Ryan Connell: So let's talk about that. What types of risk are you trying to, uh, overcome?
[00:01:48] Alex Martin: I like to think about it in terms of all-domain risk. So all-domain risk, meaning wherever humans interact and have, you know, kind of that marketplace of ideas or exchange transactions, these things we call trust events. So people are putting something out there to get something in return, and they've got to trust each other, trust the market, trust the technologies and platforms they're using. And Clearspeed finds itself at the core of any of those trust events.
[00:02:12] Ryan Connell: Got it. Okay. So, let's start to unpeel the onion, if you will. Why, like, what's the story? Like, what, uh, inspired you?
[00:02:22] Alex Martin: Yeah, how did a history-major Marine grunt get to be the CEO of a technology company? I'm still trying to figure that out myself, Ryan, but, um, I guess it was the combination of luck, timing, and being surrounded by some amazing people along the way. The origin story for me, you know, started after commissioning into the Marine Corps, going forward and working with our Marines in really tough locations and understanding, you know, a lot of the mission there, especially post the surge, was all around trusting the folks we were working with over there, over in Iraq and Afghanistan and in the CENTCOM AOR. And so that was enlightening to me, that this was a mission that was so critical to win. And the way in which we built that fundamental component of the strategy was actually the most ancient form of exchange ever, which is just trust: humans getting to know each other and saying we have this common shared objective.
And that happened, you know, on every patrol, with every cup of chai, with every shared experience, every shared engagement over there in combat that brought us closer together. But there were also experiences, unfortunately too many of them, in which those soldiers, you know, were actually not working for the Iraqi government or the Iraqi military, but were working for other terrorist affiliations, um, et cetera.
So that was heartbreaking, to lose so many of my buddies and our friends to green-on-blue attacks. I personally had an experience with one of my dear friends. Uh, it's a bit of a longer story, but I'll just pull it forward. We were dear friends, and we lost him to a green-on-blue attack. And it was particularly hard-hitting to me in that, you know, he had met his wife at my house, and he was just a beast of a man, a wonderful guy.
And did he need to go on that next deployment? And, uh, I felt some responsibility for that. And on that deployment, he was killed. And so the question that I had was, how did this happen? We're the most sophisticated military on the planet. We have incredible technology. We have the fiercest operators, and we cannot solve this ancient problem of how to vet the people that we're working with.
So I asked the question, how are we vetting these people? And the answer was, we weren't doing it very well. And it was no one's fault. We had incredible groups of counterintelligence, you know, operators, and they were doing way more work than they could, but it doesn't scale. Human experts don't scale.
And so I looked at this fundamental problem and said, how can we flip the equation? Instead of one human trying to find a needle in the haystack, using maybe lie detection technology, how do you flip the equation on its head and say, actually, what's beyond lie detection? Is there a tool where you could actually say who I don't need to use a lie detector on, or a human counterintelligence interview on, and would that have saved my buddy's life?
I think the answer to that is probably. And so that's what drove kind of the beginning of the founding of our first company, and then how we got paired with a couple of technologists working on this tool, and it just became the laser focus: to bring this to bear to screen foreign troops overseas to prevent green-on-blue attacks.
[00:05:18] Ryan Connell: Wow. So, yeah, I'm sorry for your loss, and it's certainly inspiring to have taken, you know, a tragedy and turned it into something that you're trying to build for the better. So I appreciate that. So, not to get into the proprietary side of it, but help me in terms of: are you listening to words, pitch, does language matter, kind of all the tech part of that?
I'd love to understand.
[00:05:40] Alex Martin: Yeah, sure. And I think it helps to frame it better, to put it beyond, like, what's the problem we're trying to solve, really, at scale. So it's this idea of: imagine having 100, 1,000, 10,000 troops you want to work with in a location, and those folks go through a vetting process. So let's go to Afghanistan at the height of the Afghan conflict there.
And we're screening and vetting a lot of special operations commandos. There was a very sophisticated process in place to vet and screen those folks: biometric capture, you know, cell phone exploitation. You know, they're running an incredible amount of risk assessment on those people using technology, screening all those people through, pushing them forward to the line, and doing great work.
So I want to be clear that this was the best in class on the planet for vetting people where there are no background checks, there are no databases, and a really incredible, you know, methodology was being used here. The problem was it was slow and it was incomplete. So they had a high false negative rate, meaning if a new entrant wasn't in a database, or didn't exist, or, you know, was leaving his cell phone behind, there was no way to capture that.
If there was no indication of risk, there was no way to exploit it. And we have incredible technologies and people that can exploit risk and ascertain if that risk is actionable or not. And so I just want to set the stage that you're augmenting into an existing system, and this is going to have parallels to the fraud world, whether it's financial services, banking, human capital management; there are these incredible platforms that manage human risk.
Again, can it do it quickly and at scale? And can it provide a data point that augments and enhances a decision-making process? And so that is fundamentally the problem we were trying to solve. We're not trying to replace anything, we're not trying to create some silver bullet that looks at speech and says, this person's lying, don't talk to them.
You know, I in particular don't believe you'll ever be able to leverage AI to be a human lie detector, a human counterintelligence agent. In fact, I'm going to go on a whole other spiel, you know, later about why I think we actually need to reinforce those incredible, you know, kind of high-demand, low-density resources that we have with technology that we're not bringing to bear.
And so when we were looking at that problem, there was no good way to identify risk rapidly and with great precision. That is the problem we tried to insert ourselves into. So we knew it needed to be voice, because a lot of people can't read; also, voice is very rich with data and information. We knew that it needed to be scalable.
It needed to be lightweight. You know, we had all these assumptions, and then we had the way in which we had built the technology to accomplish this automated risk assessment, which we're doing. So fundamentally, just imagine a world in which someone can answer a questionnaire with a yes or no response, and that yes or no response is in any language: na'am, la, da, nyet, sí, no.
These yes or no responses are responses to whatever question the client defines as their biggest risk. So: are you currently working with the Haqqani network? You know, have you withheld any information on your security questionnaire? Whatever these questions are that they want, deployed in that language, and then the end user is simply answering yes or no on a phone.
Okay. So you can be running tens, hundreds, thousands, tens of thousands of these simultaneously, pack all of these into a single day, a single hour, a single week of vetting, and you produce a data point that is given, you know, in real time now (back then, next day), and it's given to the analyst to then use alongside those other data points. And how that works from a process standpoint is by injecting into the workflow where there's the greatest inefficiency, but where the customer experience, or, you know, the function of time and cost, needs to be maintained. And what I mean by that is you need to inject into a workflow ahead of a decision being made of, do we trust this person or not?
And if we don't, now we have to put more cost against mitigating that risk or escalating that risk. So it's the highest-order kind of problem solving. And then what our tech is doing is discerning those patterns in speech, identifying risk in speech, calling it out as an alert. And we're just saying, here's a data point that you didn't have that can help you follow up or fast-track, again with the purpose of fast-tracking.
So it was a lot there, Ryan, but I just wanted to kind of first set the layout of how it deploys.
[00:09:55] Ryan Connell: No, that's super helpful. I get it now. So thank you. Um, so are you, are we in an operational phase or where are you in terms of, uh, the startup life?
[00:10:05] Alex Martin: Yeah. So, you know, overnight success, uh, eight years in the making here. So we are at a point right now where we have incredible product-market fit and are scaling rapidly in the enterprise, in particular in the insurance sector on our enterprise rollout, and we're just starting to get into banking.
We've been doing some really incredible stuff with human capital management. But in terms of pushing what is kind of a built-for-government engine and then productizing it for the enterprise, we have had incredible results. Now, it took a long time, for a lot of reasons, to make sure that the product didn't degrade the user experience, delivered at the right cost, um, the right speed, et cetera.
But now we're at the point where, you know, CNBC has recognized us as one of the top 100, uh, insurtechs on the planet. You know, some of our customers are the biggest insurance companies in the world, as well as the banks that are starting to adopt us now as well. So we have this excitement now around what I call all-domain risk assessment, right?
But we needed that beachhead in insurance to be able to demonstrate our metrics against current fraud detection, right? And we can talk about that. On the government side, the government defense and security side, we have had an incredible journey, a very painful journey, um, in the defense world, right? And this was something that, again, when you're an entrepreneur, and I think this will resonate with a lot of the audience, you're purpose-driven, you want to solve a problem, you build a product, you know, people get addicted to that product, they pay you for it, you grow that business.
And we all know there have been many podcasts here that have talked about agile procurement and where the valley of death is and all that, so I'm not going to get into any of that. Just know that we have been through all the chasms and all the valleys. We've worked with all the WERXes and done all the things, and we've had incredible success finding pockets of people who have that problem, who are innovators, and who were able to bring us forward to take really that engine, which got validated over in Afghanistan in that same environment that I described to you, and productize it for use in other settings.
But what was fascinating about that journey is you still have, in kind of the middle of what I would call the broader defense organization, a gap, and the gap is really on imagination. And the problem is that you have, you know, people at the end-user level that simply need to do their work. You have people at the top that see the efficiencies of scale that are needed and all the cost implications that go with throwing more humans at a screening or vetting problem.
And there's a disconnect, because we're trying to solve a screening problem with lie detectors. It doesn't work. So when I see something go out that says, hey, we need a second-order polygraph, or we need, you know, a better way to do this, I say that's great, but until you solve the top-of-the-funnel problem, which is all about, you know, cost savings, efficiencies, managing through these complexities, until you look at this problem more the way the insurance and banking world does, you're going to be stuck.
You're going to be stuck, and you're just iterating on the candlestick to get to the light bulb. It's not going to work, right? And so on this journey, as you said, where we're at right now, it's been finding pockets of people with imagination, top and bottom, and kind of seeding through that middle layer, mostly using a lot of these, you know, incredible SBIRs and other platforms that allow you to demonstrate the efficacy of your tool, to be able to demonstrate that this is something that has great and broad application.
But more interesting than that, the work there has had a reflective effect, you know, into other government agencies, other applications in security where there just isn't that tie to the polygraph, right? Like, there's such a visceral reaction when people hear voice: they think pseudoscience, or they think voice lie detector, and there's been such a painful experience over the past three or four decades with people trying to innovate in the credibility assessment world, and we just stay away from that.
We just say, look, credibility assessment is something that's needed. It's not going away. It shouldn't go away. And there are great people doing good work there. But there has to be another group of people working on something else, or we're never going to achieve that. And so all this to say, eight years later, we've been able to design, build, deploy, and validate, um, an engine, a technological engine, that's been productized for the enterprise to do the exact same thing that DOD needs, which is to process lots of transactions for trust events at scale, efficiently, not by finding the needle in the haystack, but by clearing the hay.
And that's kind of the biggest thing. And, you know, when you think about a hay-clearing tool, it's like the metal detector in the airport. Like, let's clear or alert, rather than, you know, serially look at every single person through a one-on-one experience and exchange. So the company is doing extremely well,
but we're not giving up on areas where it needs to be tied to imagination. And that's what we're working on a lot right here.
[00:14:47] Ryan Connell: That makes sense. And you kind of hit on something that was running through my head, 'cause I've seen, I'll say, other tech that kind of dances around some similar aspects, um, but a little bit more invasive, in terms of, like, I could, you know, take this podcast and send it somewhere, and it'll effectively, you know, tell us how you and I are doing from a health perspective, and things like that.
Right. And my reaction to that is, wow, that sounds so novel. But as an American, there's almost this, like, ooh, it's almost an invasion-of-privacy feeling. And it's interesting how you're focused on just yes or no, and I wonder if that's helped you overcome that, or if you have any thoughts on that.
[00:15:26] Alex Martin: It totally has. And I really appreciate you bringing that up. I mean, everyone's used to doing a questionnaire saying yes or no, and everyone is used to either filling that out or having the experience with a human. Now, if you ask them to then take an automated questionnaire where, you know, there's going to be voice analytics run, there's still, we experienced in the early days, this kind of chasm of, you know, acceptance, culture, right?
Like, forget even being an American, do I as a consumer want to take, you know, a voice-based questionnaire? Well, the answer is, my alternative is I talk to agents for ages and get nowhere, or I ultimately get shaken down, or I'm subjected to another fraud solution, which is using AI.
And maybe because I'm a certain color or age and in a certain zip code, I'm kind of unfairly off-ramped to a fraud, you know, team. Or, in the defense equivalent, if I'm a member of the wrong tribe and I'm a military-age male and I come from a certain area, then all of a sudden I'm in line to talk to, you know, a really serious group of people.
And so you flip it on its head and say, wait, what if there's a new way to do it? What if the new way is consent, disclosure, take the same questions you already would be asked, and if you, you know, clear it in a matter of minutes, you get fast-tracked, you get paid, you get moved; but if you alert, no harm, no change, like no adjudication, no detriment to your case, normal process.
So it's creating the fast-track line and then the normal thing. And people say, wow, I could opt into something that won't hurt me or harm my case. Like, let's go. Because this is incentivized for the folks that aren't committing fraud, that aren't working for Haqqani, right? This is about using a technology that has no bias, no personal information collected.
We don't know what your gender is. We don't know who you're related to. Like, this tool is agnostic to all of those things. It is simply designed, created, and built to say, we just want to move you forward, and then if we alert, normal thing. And I think that's supremely important, because the adoption rate has been insane and the consumer feedback has been exceptional, and people are really delighted that they could get paid same day or move through a process.
And when we talk about the work we're trying to do in refugee camps, you know, imagine extending credit to the unbanked, because they simply aren't creditworthy in the eyes of the banks. So what if you had a tool that could actually say, hey, trust me and give me a loan so I can be an entrepreneur, expand my business, get more fertilizer for my farm, feed my family?
Like, these are the kinds of things that we have to lean into if we're to grow and get better and deal with the challenges we have. And that's our approach, um, to actually be non-invasive, unbiased, non-determinative. And we really thought a lot about that at the beginning, because it can't be something that relies on tech alone; the stakes are too high.
I don't care if it's an insurance claim, a job, or, certainly overseas, where it's life or death. I've worked with these folks, I've been there myself, right? So every time we deploy and do something in support of anyone doing hard missions, it has to be seen as a tool and a tool only. Now, it's an important tool, but it needs to be tethered to a human and to other data points that back that up.
[00:18:36] Ryan Connell: Yeah, and I think the concept that you've come up with, in terms of, uh, kind of fast-tracking if you pass, and if there's a flag, or you don't, then just the normal process, right? It's kind of like there's only an upside. That obviously promotes the adoption; that makes a lot of sense.
You know, you just kind of dovetailed into something you said earlier that I jotted down: you said you wanted to talk about reinforcing the high-demand, low-density roles with tech. Um, let's talk about that a little bit.
[00:19:02] Alex Martin: Yeah, this is something we think about every day, not only, you know, when we're looking at the market, but when we're having meetings with our engineers. We have this framework where the current world exists in a tension between speed and security. Think about that. Everyone does it, you do in your work here, we do, but we're all managing this on a daily basis.
And so for speed, you're thinking about efficiency and productivity and user experience, and those are the things that say, I want to move fast. And for security, the security guys and gals in the room are going: protection, reliability, compliance, risk management, right? Those are all very important things.
The problem is that there's always a trade-off. And so what they have to do is manage this tension, and they manage it by assuming risk on one side or the other of that equation. So imagine a hundred units: you can have 50 units on one side or the other and have maybe this kind of harmony, but sometimes you need to over-index on security.
Oh my gosh, we're going into, you know, this country; I'm willing to move slower to get a better baseline for my risk so that I can buy it down by taking a little bit more time. Or: we have no time, we have to go. Or insurance fraud, right? Like, we have to process these claims; it's so costly if we don't.
And so we're willing to take some fraud on and kind of just have that as part of the cost of doing business. But at the same time, it's making it more expensive for policyholders. Like, everyone's ultimately impacted by the fraudsters, and they're getting more sophisticated. And so what we talk about in our meetings is, we don't say, hey, this is a paradigm that is outdated.
No. What we say is you can break the tension by using speed as your security, right? And so what we do is we say, look, we're focusing on that end of it. We're focused on the user experience and the efficiency and productivity with our tool. So if you can move so much faster through so many more people, then you can do a better job at the security, right, at the protection, reliability, compliance, and risk management, because you have fewer folks to work on; you can more efficiently use those resources.
Those resources can never go away, but they don't scale. And so, you know, that's that tension. We call it breaking that speed-security tension, and leveraging tech like ours, AI-powered tech, makes speed your security. And now there's a whole new way of doing business. We're changing operational workflows, because people putting us in are going, whoa, whoa, whoa.
I can now reimagine this. They're creative people. They want to change, but they don't have tools that allow them to do this. And so now, with the injection of Clearspeed, they can say, ah, I can rethink this. We don't do that work for them, they do, but all of a sudden it's unlocking the ability for them to put their people on those hard problems, better, faster, quicker, and then, again, the flip side.
So that's kind of what we talk about every day: managing the speed-security tension. And, you know, we can't have one undermine the other. So you need to be able to inject automation, you need to be able to inject AI, you need to be able to inject imagination right into this ancient thing.
[00:21:46] Ryan Connell: Awesome. I want to jump in a little bit to, uh, like, startup life, um, and just talk about, you know, I know you said, hey, I dealt with a whole host of challenges that everyone else deals with in working with the government and working with the DOD. I don't know if you want to highlight something specific, or if it's worth just giving advice to anyone that's thinking about taking something like this on.
[00:22:08] Alex Martin: I think the fundamental thing is, if you feel a calling, then that calling should be checked against: you know, will that calling produce the creation of a scalable enterprise that will delight shareholders and, like, make the world a better place? Because oftentimes, and again, I've had a couple of startups before this one, a calling not tied to something that has, you know, a defensible moat, going after a really big market with a really, you know, rad team behind it, is just going to lead to failure. And you're already going to have a higher probability of failure by jumping into being an entrepreneur anyway, so you might as well do that self-assessment first. The second thing is your calling, and the reason why I say calling is because, as we know, it's just so hard, right?
And so it has to be something that you never give up on, or you only give up on in the face of, you know, all other facts saying this now needs to get pivoted to something else, and then you just try again, or you listen to the market and say, I'm going to take this passion toward something else.
And so my first piece of advice would be: assess if it's a calling, or are you doing it because you want to get rich, or you don't want to work at JP Morgan anymore, or you're, you know, leaving some bad job. So, like, that's the first thing. The second thing is, when you have that calling tied to what could be a great business, you have to put it in a 10-year framework.
When I was at this certificate program at Stanford called Ignite, it was just an incredible, you know, small effort that they do for PhD candidates. They created it for vets; it's still ongoing, it's fantastic. I was in the first cohort, and I met a professor there, a guy named Chuck Holloway.
And when we had this prototype, we came to him, and he looked at this thing and he said, look, it could change markets, because no one's thinking about risk this way, no one's thinking about fraud this way, and also this tool is doing stuff that I haven't seen done. But it's going to be hard.
It's going to take 10 years. And I said, no, no, no, we'll do it in five. No, no, it really does take 10 years. Here we are at year eight, and we're, you know, cranking, right? We're cranking, but it took through year five and beyond to really get, you know, essentially the development, the buy-in; all the right pieces had to come together.
You're surviving, again, our own personal journey. We're going through the pullout in Afghanistan, we're going through, you know, the war in Ukraine like everyone, we're going through the Silicon Valley Bank collapse. You know, if you add up all these things, it's hard, right? And so on my founder journey, I've been in more combat zones than I was in when I was in the Marine Corps.
And so you're going, am I willing to risk my life for this company? The answer for me was yes. And that's weird, and I do not recommend that for anyone, but you heard why I started this, right? And so for me, I was willing to give my life. I got blown up on Airport Road going to do another validation exercise to prove this thing works.
I was willing to do that. And the reason why I went is because I didn't want anyone else to have to be willing to do that. Same with my other co-founder, Ben, who went with me, and another guy, a crazy guy named Sean, who was also there. So you have a couple of people that are maybe, you know, a little crazy, maybe willing to take a little too much risk.
I do not recommend that for anyone. But if you really do not wake up throughout the night with excitement and fear in the same night, if you don't dream about your company, if you don't wake up in the morning thinking about this even before your own family (I hate to admit that, but my wife knows it's true), then you're not, I don't think, at the level of, uh, addiction is the wrong word, because it has a negative connotation, but you're not at the right level of passion to be able to survive what is oftentimes just an insurmountable series of chasms that are crippling. And so you look at it and you step back and you go, okay: insane tech, big market, great team.
This is a no-brainer? No. And so everyone that we hire and everyone that's joined the Clearspeed team, and this isn't about me, because you're going to break on a daily basis, but you have to look at this as the team going through it. It's the same way we survived and won in combat: the strength of a team, small units doing great things,
with high imagination and a lot of courage and grit. And so, you know, I don't care what business school you went to, I don't care how smart you are. If you aren't a man or woman of high character who can white-knuckle it, like, you know, the kind that will just step up in that kind of proverbial bar fight to defend what you're working on, and listen and learn the way we do, then you're not going to make it.
So my overarching piece of advice is: find the thing that's a calling. Truly give yourself a gut check against a 10-year journey, where at the end of that 10 years, probability of success, low; probability of wealth, low; but, Providence willing, the probability of meeting amazing people, having incredible experiences, and trying to bring something to bear that doesn't exist, high. Then you'll have a shot. And I'm still the guy that's on the mountain trying to peak.
And all I think about when we hit the peak is, where's next, you know? And so, you know, for me, it's just a gut check of, like, courage, grit, imagination, and then surrounding yourself with others, because, you know, there's a mental health side to this as well, and you've got to be in it for that long run.
So, not very encouraging, but it's kind of like the Shackleton journey. If there are people listening to this going, that's me, it's like, you know, terrible risk, low return, low wages, whatever, and they go, I'm joining you: that's the person you need in your startup.
[00:27:12] Ryan Connell: That was an awesome response, so I appreciate it. You know, I'm thinking about everything you just said, and I'm curious, you know, a little bit of a, maybe, leadership-type question, but I don't know how you can get someone who works for you to have that same level of excitement and fear, and wake up in the middle of the night with ideas and passion, and be willing to risk their life for you,
not their dream, your dream, right? And I'm curious how you've been able to tackle that, 'cause obviously, you know, in order to have a successful company, you need the people.
[00:27:43] Alex Martin: Yeah. Three components to that. Great question, Ryan. The first component is your co-founder base. You know, these folks have got to be aligned with you in terms of, like, doing the hard things, and there are hard conversations that are hard to have at the beginning of a founder's journey. And so you just have to start with the right people.
I was fortunate enough to start with the right group of folks. The second component is your investors. If you pick the wrong investor for your seed term sheet or Series A, whatever, good luck. Good luck. Because VCs are going to do what VCs do, as they should: they're going to make decisions that are best for them, best for their LPs.
You know, we're not here to tug on heartstrings: oh, remember why we started this? And they shouldn't, right? They shouldn't be like, okay, let's, on, you know, heart and emotion, go longer, let the burn keep going, or whatever. So if you can align with investors that are patient, that get it, that are highly sophisticated and understand that things that change the world take time.
Um, now again, the time scale has got to be relative and communicated, but we had the benefit of our first investor, and we had a series of investors to choose from, um, being a guy named Bob King from Stanford. And if you've never heard of Bob King, you should Google him: Bob King, you know, Stanford. Bob and Dottie King, you know, have done really well.
Bob did incredibly well as a businessman, as an investor. And he started a VC that invests to fund his philanthropy. So basically, with his VC investments, the profit will fuel his philanthropy: climate change, poverty alleviation. So his impact on the world is: I did some incredible things that had impact in the world,
I invested in them, it made a ton of profit, and I want to take this profit and invest it in other companies that can make my philanthropy more scalable. Okay. Think about that. It's like an X corp; like, something like this exists and there are people doing it, some doing it better than others, but no one was doing what he was doing then.
And the point of that is, this is the second component I'm telling you about, right? Great. His thesis was essentially: let me find, like, strange people, and, you know, you'd think I would take offense to this, let me find kind of people that maybe shouldn't be there, that are maybe not the most brilliant people, smart even, but they're working on something that could do it, but it's probably going to fail.
But if it doesn't fail, it's going to be great. Like, essentially that's his thesis. And so we're one of his portfolio companies, and to me, that's a badge of honor. You know, and we have brilliant people, by the way, some of the best, you know, people in voice, and technologists and engineers working on this. But my point is that you find an investor that wants you to be highly profitable, that's a capitalist first, but right behind it, they're there for impact for the world: for profit, for purpose.
So that was that second component. So we had a great group of founders. Then we had these great investors. Now you can imagine what I can do, Ryan. No one has to hear my story or care about it, and they really shouldn't, because this is not my company, right? This is their company.
This is actually our customers' company. And so what we try to do here is we say, look who our investors are. Look, there's been success from others, highly sophisticated, incredible investors that also believe in purpose. Look who's behind you. So I don't care if you've never woken up in the morning and thought, I want to go serve my community.
I don't care if you've ever worn a uniform. I don't care if you've ever taught in, you know, Teach for America, or you've been a part of USAID, or whatever. I've never cared. But if it's in your heart, it's in your DNA, and if you feel a calling, know that you can go and be a great marketer or a great salesperson or a great operations person or a great engineer and know that you're going to help climate change and poverty alleviation. That should get you up in the morning.
Right? And so now, attach. What I'm saying is, the third component is to create a culture in which people attach what we're trying to do to a personal experience, or to an experience they want, so they can tell their grandkids, or if they never have kids, they can just tell a great story to their mates as they're getting old.
My point is, when we're all 80 or 70, if we're fortunate to live to those ages, don't we want to be able to tell a story about the good we did in the world? I think we do. I don't think you and I care, if we talk in 20 years, whether I've got a million and you've got 10 million in the bank. I don't think we care.
Or I got a hundred thousand, or 10,000, or one. If we're having a beer and I'm going, Ryan, what's the coolest thing you've done in your professional career? You're going to cast your eyes back and you're going to tell a great story, you know. And if you don't have that story, man, what a sad thing in life, right?
And so what we're trying to do here is create an environment where people come in and say, if I roll up my sleeves, if I jump in, I will create the story that drives me and my calling: to have the dreams, to have the nightmares, to wake up ahead of my alarm clock, to have to force myself to block out time on the weekend for my family because I just want to get back to work.
And I do want people to block out time for their family, and we have to take care of each other, and mental health is a real thing, but we also have to want to be, you know, in this elite-performance culture where we cannot not do it. And so I guess that's the long answer to your question. You've got to start with it.
You've got to find the right investors who believe in it. Then you've got to do the thing and produce the results, so really you've just got to execute. And then create a culture in which people come in and tie in and make this a part of their story, because I would love nothing more magical than, at our next offsite, to have 10 people stand up that had no form of formal service and say, you know what? Now I'm teaching on my weekends for refugee kids, or I'm volunteering some time to do this, or Clearspeed created a new program where, if I'm the best performer in the last quarter, I can go off and work on a social impact project in Africa, and that's something I'm really excited to do.
Like, these are the kinds of things we can do. And I think it's an extreme differentiator, because our tech is incredible. It's incredible. Our team is amazing. The team is the real thing; the people really are the real thing. Great tech is table stakes. You have to have great people that can connect to other great people, again with imagination, and come together and say, we've got to do this thing, or else there's no change.
That whole frozen middle, I think it's never going to get solved. Never going to get solved. And so that's kind of how I look at that. Long, long answer, but, you know, those are the three parts of it.
[00:33:43] Ryan Connell: Yeah, it was great. One question based on something you said, and it might be a shorter answer, but, uh, I'll ask it anyway. The comments about the investors got me thinking, 'cause I'm kind of connecting some dots, where you also had the advice about the 10-year plan and all that.
So, like, uh, I assume having that 10-year plan helped the communication with, like, the VCs and the investors, so that there's no expectation that we're doubling our money in a year, or tripling our money, or 5x-ing our money in a year. Everyone is going in under the same understanding that this is a long-term play?
[00:34:17] Alex Martin: Let's be clear, I did not do that up front, okay? All right. And so that's a good question, and that's advice for future people: align with people. If you come in and your first slide says this is going to take 10 years, people are going to be, all right, next. You know what I mean? So my point is you break it up.
You personally, and your investors, say it's going to take 10 years for this thing to go from zero to hero, but then along the way, you create milestones that make investors go, this is working. No investor wants to throw good money after bad, and you shouldn't throw your precious units of time against something that's not working.
You know, when I see, like, all the tech that's being created and everything, I always think to myself, like, there's kind of bad tech in this, right? And there are probably bad teams, and you've got to wash those out. You've got to fail those fast. Like, that's noise, right?
But communication to the right investor is: look, what milestones do we need to hit to get to the next point where we get the step-up in valuation and we can expand? And it's all around de-risking, right? And so, if we can de-risk: at first, for us, it was de-risking the technology. Then it's de-risking kind of around the deployment of the technology vis-a-vis the product.
Then it's de-risking delivery and scale, and then it was de-risking sales, you know, marketing, sales, operations. So you're kind of building this, and it's not in serial, it's happening in a series. And so you're looking at this very complicated, ever-changing function.
But what you said there was right, which is communication. So when things change: like, maybe on day one I come in saying, hey, you know, all we're going to do is military screening, and that's the next three years, we're going to have a laser focus, we're going to do it. Well, when we start to have headwinds against people that think polygraph and lie detection is the only way to think about human risk management, we say, ah, we're good.
We're good. We're good. We're not giving up; we're going to keep working on SBIRs and do cool things, but we're going to take this now kind of purpose-built tool and we're going to pivot. I call it tacking because I'm a sailor. So I'm going, we're going to tack, uh, into insurance. And by golly, now those investors are going, okay, boom, boom, boom.
And the milestones are being achieved, we're de-risking along the way. And so I guess, to kind of put it in play, the advice I would give in terms of the investor is this: you have to solve for character first, because the only thing that matters is that when you get in a gunfight, they're going to be with you.
That's it. 'Cause if they're not, if they run at the first gunshot, it's done, right? It's done. So we're fortunate to have investors that not only get in the trenches with you, but supply the ammo and then help you overcome the next objective. But then you give them your steadfast execution, you give them your commitment to getting to the next series of milestones, and then they will reward you with further investment, because their returns are higher
and you've taken care of a lot of the risk that was present in the last round. And I'm not sure Sand Hill Road and the traditional VC is right, especially for folks listening to this podcast. There's a group of people probably listening to this, but the wrong people are listening to this podcast.
There are incredible VCs who are doing incredible things, bringing defense tech to the front. There's, you know, Joe Musselman; I mean, I could go on and on about all the guys that are doing incredible things in the Silicon Valley community. Know your audience. And if you can steer toward folks that think differently and are aligned with who you are and what you want to do, your probability of success increases.
And I'm not saying we figured it out. I'm just saying there needs to be an asymmetry. There needs to be a change, too, in thinking about where capital comes from. I actually think Silicon Valley capital is too risk-off right now. I mean, since 1964, he's seen every change in innovation and every, you know, change in markets and in technology for the past 60 years.
And so with six decades of experience, you're looking at it going: has the bulk of, you know, in probably a normal curve, has the bulk of investors become money managers, following an algorithm to place money where return comes first, and it's risk-adjusted so much I might as well have it in a bond?
You know, or just keep my money in my pillow. But I don't know; I'm just saying, like, sometimes, you know, you can look at some of the people looking at it, and they're running a series of math equations against what's worked in the past, and guys, gals, I'm here to tell you that's antithetical to change.
You've got to look at data as a trailing indicator. If you're looking at data and past models to predict a future outcome, you're only doing half the math. The other half of the math needs to be solving for the team, the size of the market, and disrupting to produce value for the end user. That's what we're trying to do here.
That's why Silicon Valley is so powerful: disruption against the norm. So if capital is flowing through a normative process of math equations from just really smart math people, we're losing, especially in defense, when China doesn't care, right? China doesn't care. So, you know, then we have this kind of moral imperative.
So it means you've got to look to family offices. You've got to look to VCs that have a different LP base. You've got to look to folks that are trying to make an outsized return but are willing to think differently, and willing to assume more risk and kind of throw back to older-school venture thinking of disruption, disrupting the norm, as opposed to managing money against risk because of the threat of an LP in some fund that, you know... I'll stop there, but you can see where I'm going with this.
This disconnect is actually existential. The lack of imagination in most of the VC community is actually a national security risk, not even counting the fact that some of the capital that comes in working through some of those VCs heightens that risk to a true national security level.
[00:39:45] Ryan Connell: That was profound. Um, we are at time, so I'm going to wrap, but I want to say thank you so much for being on today. Uh, this was a really fun conversation.
[00:39:54] Alex Martin: Ryan, I appreciate it. Thank you so much.
[00:39:56] Ryan Connell: Absolutely. Talk soon.