WEBVTT
00:00:00.140 --> 00:00:01.122
Hey, what is up?
00:00:01.122 --> 00:00:04.471
Welcome to this episode of the Wantrepreneur to Entrepreneur podcast.
00:00:04.471 --> 00:00:20.596
As always, I'm your host, Brian LoFermento, and I am so excited to have a Boston-based entrepreneur here in today's episode, because this is someone who is always committed to refining his skills, teaching others, and putting great solutions out into the world.
00:00:20.596 --> 00:00:23.963
This is someone who's juggling working in academia as a professor.
00:00:23.963 --> 00:00:24.946
He has his PhD.
00:00:24.946 --> 00:00:36.585
This is someone who's super accomplished from the academic side of the coin, as well as starting and running and growing a business in the tech space that brings really important solutions to so many different businesses.
00:00:36.585 --> 00:00:38.409
So let me tell you about today's guest.
00:00:38.409 --> 00:00:40.122
His name is Ajay Joshi.
00:00:40.444 --> 00:00:55.527
Ajay is the co-founder and CEO of Cyphersonic Labs, which is a startup offering a game-changing cloud-based cybersecurity solution designed for enterprises to securely share and collaboratively process data with other enterprises.
00:00:55.527 --> 00:01:01.326
In his other life, he is a professor in the electrical and computer engineering department at Boston University.
00:01:01.326 --> 00:01:06.963
Before joining BU, he was at MIT, so still in the Boston area and before that at Georgia Tech.
00:01:06.963 --> 00:01:10.921
He has had industry stints at Google, Intel and Lightmatter.
00:01:10.921 --> 00:01:13.668
So yes, he intrigued me when I came across his LinkedIn.
00:01:13.668 --> 00:01:19.210
I've never seen ex-Googlers put their name as literally X and then Google without the G.
00:01:19.210 --> 00:01:26.944
So I love the fact that Ajay has been in the tech space refining those skills and then has taken the plunge while still being a professor.
00:01:26.944 --> 00:01:31.224
There's so much to his journey that I'm excited to hear about, so I'm not going to say anything else.
00:01:31.504 --> 00:01:34.352
Let's dive straight into my interview with Ajay Joshi.
00:01:34.352 --> 00:01:40.649
All right, Ajay, I'm so excited that you're here with us today.
00:01:40.649 --> 00:01:41.450
First things first.
00:01:41.450 --> 00:01:43.807
Welcome to the show.
00:01:43.807 --> 00:01:45.260
Thank you. Thank you very much.
00:01:45.260 --> 00:01:47.790
Heck yes, you do so many cool things, Ajay.
00:01:47.790 --> 00:01:52.489
You're very accomplished in so many different regards, but you've got to take us beyond the bio.
00:01:52.489 --> 00:01:53.293
Who is Ajay?
00:01:53.293 --> 00:01:55.320
How'd you start doing all these awesome things?
00:01:58.286 --> 00:02:01.792
I basically focus on things that get me excited to come to work.
00:02:01.792 --> 00:02:05.509
This is a question that I often get like how do I choose my field?
00:02:05.509 --> 00:02:06.162
How do I choose?
00:02:06.162 --> 00:02:07.126
What do I work on?
00:02:07.126 --> 00:02:12.468
So my typical answer is work on something that gets you excited to come to work on Monday mornings.
00:02:14.867 --> 00:02:18.586
That is the most challenging day of the week, and so that's what I do.
00:02:18.586 --> 00:02:29.024
So lately I have been excited about cybersecurity, and I've done some good work with my students at BU, and one of those projects was related to homomorphic encryption.
00:02:29.024 --> 00:02:38.641
That led to the startup: my former student, now co-founder, and I founded this company, and we are chugging along, trying to save the world one day at a time.
00:02:39.245 --> 00:02:44.843
Yes, I love that overview, especially because right away you went right to that purpose and that passion.
00:02:44.843 --> 00:02:51.139
So I've got to ask you about that, because I know that your journey into this field, of course, didn't happen when you started your business.
00:02:51.139 --> 00:02:57.466
It happened way before that, not even when you're a professor, but all the way back to your days as a student and, I would imagine, even before that.
00:02:57.466 --> 00:02:59.311
So where did that passion come from?
00:02:59.311 --> 00:03:05.640
What was your first exposure into the world of not just IT but cybersecurity and going as deep into it as you are today?
00:03:07.103 --> 00:03:14.480
So if we really go back into my past, the first inspiration maybe it came from my parents.
00:03:14.480 --> 00:03:22.806
So both of them are doctors by profession and they run their own clinics and, if you think about it in some way, they are running a small business.
00:03:22.806 --> 00:03:30.281
They have their employees, they are dealing with patients who are the customers, and so I guess that thread was always there at the back of my mind.
00:03:30.281 --> 00:03:37.421
But I got my PhD, went down the academic track, always thought I might have a startup at some point.
00:03:37.421 --> 00:03:39.406
Unfortunately, it didn't come along.
00:03:40.388 --> 00:03:45.388
Cybersecurity, you can say it's something that has been at the back of my mind.
00:03:45.388 --> 00:03:56.370
I myself had an identity theft issue back when I was in grad school which was challenging to deal with, but I dealt with it and then I was like, OK, let's get on with life.
00:03:56.370 --> 00:04:00.120
That's fine, happens to everyone, can happen to everyone.
00:04:00.120 --> 00:04:01.743
It doesn't necessarily happen to everyone.
00:04:01.743 --> 00:04:05.832
But then I started my research group.
00:04:06.199 --> 00:04:10.706
I started looking at cybersecurity with my student Rashmi, who's the co-founder.
00:04:10.706 --> 00:04:14.967
She went through the same challenge as a student.
00:04:14.967 --> 00:04:24.374
Her identity got hacked, credit cards were basically taken out in her name, and that created a problem.
00:04:24.374 --> 00:04:36.072
And then we started wondering: maybe we should do something about this, maybe we should tackle the grand cybersecurity challenges. We started looking around, and homomorphic encryption was one of the grand challenges.
00:04:36.072 --> 00:04:37.584
Rashmi did excellent work.
00:04:37.584 --> 00:04:45.713
She solved the problem and we realized wait, why limit all this just to these papers that we publish or presentations that we give?
00:04:45.713 --> 00:04:47.286
Why don't we bring it to the market?
00:04:47.286 --> 00:04:49.367
Why don't we make this accessible to everyone?
00:04:49.367 --> 00:04:54.899
And so that's where the idea of a startup took root, and then we started the company.
00:04:54.899 --> 00:05:00.233
We are chugging along, talking to customers, raising capital, developing the IP.
00:05:00.233 --> 00:05:04.310
Hopefully we'll be one of the go-to companies for cybersecurity in the future.
00:05:04.959 --> 00:05:06.480
Yes, I love that, Ajay.
00:05:06.480 --> 00:05:06.922
Honestly.
00:05:06.922 --> 00:05:13.634
I love the fact that you really simplify it for our listeners about taking it step by step because, as you said, it literally changes the world.
00:05:13.634 --> 00:05:16.545
I mean, this is a big problem and it's only going to get more serious.
00:05:16.545 --> 00:05:24.309
You already alluded to the fact that we face it on the consumer level when we talk about identity theft. I mean, that's a real risk factor for so many of us.
00:05:24.309 --> 00:05:28.595
And then, on the flip side, on a way bigger scale, is obviously that enterprise level.
00:05:28.595 --> 00:05:30.545
So talk to us about Cyphersonic Labs.
00:05:30.545 --> 00:05:33.740
Obviously, you've identified that enterprise side of the equation.
00:05:33.740 --> 00:05:40.584
What was your thought process there in deploying it in that way or really targeting it that way to make that bigger change?
00:05:42.247 --> 00:05:45.894
So it was almost a case of a perfect storm, as they call it.
00:05:45.894 --> 00:05:54.187
So these days everybody, all companies, are using machine learning, but they don't necessarily have the resources to run machine learning workloads.
00:05:54.187 --> 00:06:01.163
So the thinking is, okay, I'll go to the cloud. AWS, Azure, they offer these fantastic cloud services, so let's run things there.
00:06:01.163 --> 00:06:15.870
The problem is that when you run things in the cloud, to date, data gets processed in non-encrypted form. In the sense that, let's say, I'm looking for restaurant recommendations: I open the app, I go to Yelp and then I look at restaurants.
00:06:15.870 --> 00:06:19.947
So all my personal data is with Yelp and will be sent to the cloud.
00:06:19.947 --> 00:06:29.132
In the cloud, there'll be some processing done on my information and then I'll get a response back saying hey, you're in Boston, go to Restaurant X or go to Restaurant Y.
00:06:29.779 --> 00:06:36.704
Now, while my data is sent to the cloud, or while it's being stored in the cloud, you're fine, data is encrypted, no problem.
00:06:36.704 --> 00:06:45.411
But while processing the data, while Yelp is running, say, sophisticated machine learning models on my data, my data cannot be encrypted.
00:06:45.411 --> 00:06:47.201
And that's where things can be challenging.
00:06:47.201 --> 00:06:48.687
And I'm not picking on Yelp.
00:06:48.687 --> 00:07:02.708
This is just the case for all enterprise companies out there: when they send the data to the cloud, the data has to be processed in non-encrypted form, and that makes it very attractive to attackers, who think, hey, here is this whole bunch of data to be harvested.
00:07:03.120 --> 00:07:04.846
We can do bad things out of this.
00:07:04.846 --> 00:07:08.088
And so we started wondering what to do with this.
00:07:08.088 --> 00:07:16.887
So we said, okay, homomorphic encryption is the solution where we can keep data protected in the cloud on the enterprise side by keeping it encrypted at all times.
00:07:16.887 --> 00:07:24.451
So during the entire life cycle, data will stay encrypted: in transit, while it is at rest, and while it's being processed.
00:07:24.451 --> 00:07:44.886
And so the thinking was that if we can bring the solution to the enterprise side, then the clients, the companies that are using this cloud, they can have peace of mind that, okay, I'm going to the cloud, but my data is protected, my end users are happy, I don't have to worry about paying extensive fines or I don't have to worry about my users getting upset that data got compromised.
00:07:44.886 --> 00:07:48.894
So that was the thinking behind having this solution for enterprise clients.
00:07:53.040 --> 00:07:54.425
Yeah, I love hearing that, especially in a world like ours.
00:07:54.425 --> 00:07:55.648
I mean, you obviously talk about machine learning.
00:07:55.648 --> 00:08:07.959
Where my head goes as an end consumer is obviously that plays a lot into the world of AI, which leads me to really I mean, the simple conclusion about all of our AI usage is that we're putting even more of our data unfiltered into all of these AI tools.
00:08:07.959 --> 00:08:15.454
I personally, with the AI tools that I use, I mean I'm uploading episode data, information, show notes, transcripts, all of that.
00:08:15.454 --> 00:08:21.052
At this point, AI knows more about my show than I do, because we give all of our transcripts to it, so it has all this data.
00:08:21.139 --> 00:08:26.369
Now, fortunately, none of the stuff that we're talking about here today is sensitive, but for a lot of companies, it is.
00:08:26.369 --> 00:08:29.353
Bigger companies are uploading financial data, customer data.
00:08:29.353 --> 00:08:40.436
Talk to us about, I guess, the increasing data-exposure risks of the AI tools that we're now all leaning on even more within the business world.
00:08:40.436 --> 00:08:42.727
This is only going to become a bigger problem.
00:08:42.727 --> 00:08:47.972
I'd love to hear your perspective on the way that AI is actually encouraging us to give more of our data away.
00:08:49.660 --> 00:08:52.245
So I guess there are two sides of the coin.
00:08:52.245 --> 00:09:00.048
That's how I look at it. In one case, AI has access to all this data, and it is as you said.
00:09:00.048 --> 00:09:03.543
Maybe the AI tools know more about the podcast than you do.
00:09:03.543 --> 00:09:17.988
But if you were able to get that information, if you were to go to an AI model, one of the LLMs, and say, can you extract some novel insight from all this data that you have collected about my podcast over the years, that would create a lot of value for you.
00:09:17.988 --> 00:09:24.624
Then you will know exactly what worked, what didn't work, how can I position myself moving forward, what would be the strategy to adopt.
00:09:24.624 --> 00:09:38.993
But that means that your data has to be shared with the cloud, of course. And on the flip side, as I mentioned, the issue is that not everybody's comfortable sharing their data, so they do lose out on AI being able to provide them with good-quality service.
00:09:40.019 --> 00:09:46.493
So using homomorphic encryption gives you that sweet spot, right, where you say: okay, you want access to my data?
00:09:46.493 --> 00:09:51.871
Fine, you are this nice AI model who's going to tell me interesting things based on my data.
00:09:51.871 --> 00:09:54.589
But I'm going to use homomorphic encryption.
00:09:54.589 --> 00:09:56.947
I'm going to keep my data encrypted all the time.
00:09:56.947 --> 00:09:59.226
So can AI now still help me?
00:09:59.226 --> 00:10:01.989
And AI models can say, yes, we can still help you.
00:10:01.989 --> 00:10:10.868
We can still run our models on your encrypted data, generate novel insight, and give you much better quality of service than what you could get before.
00:10:10.868 --> 00:10:14.529
So AI, it's a double-edged sword.
00:10:14.529 --> 00:10:24.504
I've heard a lot about that, but if you use it the right way, I feel that it can totally change the game, totally open up more doors than we would have thought before.
00:10:25.024 --> 00:10:27.309
Yeah, Ajay, I love hearing your perspective on all of this.
00:10:27.309 --> 00:10:36.451
The way that you love cybersecurity, obviously, I feel that way about business and entrepreneurship, and so I want to ask you this next question.
00:10:36.451 --> 00:10:37.936
It's a bit of a business nerd type of question, but here it is.
00:10:37.936 --> 00:10:48.572
It sounds like everybody benefits from your encryption and cybersecurity solution, and so my question to you from a business perspective is: where's that demand driven from?
00:10:48.572 --> 00:10:50.619
Because I really see two sides of it.
00:10:50.678 --> 00:11:01.312
Is that, as a consumer, if I choose to use a large language model, I want to demand that they are responsible with my data and, of course, if that's something that I value, I get to vote with my wallet.
00:11:01.312 --> 00:11:16.421
Now, on the flip side, the marketing departments of all of these tools also want to brag about how important encryption is to them, and so if the market values that, then these companies will be further incentivized to invest in good cybersecurity.
00:11:16.421 --> 00:11:19.929
So the question is: is this something that you envision?
00:11:19.929 --> 00:11:36.462
You know, when we talk about cybersecurity changing the world in positive ways, do you think that demand will and should be driven by consumers, or is it going to come from the providers, the enterprise-level companies themselves, saying this is something that we're prioritizing?
00:11:36.462 --> 00:11:39.572
We're putting money behind it, whether the market values it or not.
00:11:41.660 --> 00:11:52.778
So I believe that the consumer today is a lot more aware of the challenges of their own data than maybe 10, 15 years ago, and so consumers do demand that their data be protected.
00:11:52.778 --> 00:11:56.145
Consumers do demand that their data is used in ethical ways.
00:11:56.145 --> 00:12:10.481
So I feel that there is, and continues to be, a big push from the consumer side: whenever I use some service from an enterprise client, I will demand that you keep my data protected and tell me exactly how you're going to use my data.
00:12:10.481 --> 00:12:17.524
And then the onus is on the cloud provider and the enterprise client to say we understand, we want to protect your data.
00:12:17.524 --> 00:12:24.629
So here is an interesting technology that we are going to bring to the market which can keep your data protected.
00:12:25.340 --> 00:12:29.163
And the good thing is, and I'm talking about homomorphic encryption here:
00:12:29.163 --> 00:12:33.004
Because we are using homomorphic encryption, your data is encrypted.
00:12:33.004 --> 00:12:35.520
You don't have to worry about data breaches.
00:12:35.520 --> 00:12:39.910
Feel free to share more data so we can give you better service.
00:12:39.910 --> 00:12:45.211
So you can almost look at this as a positive feedback mechanism where consumers demand something.
00:12:45.211 --> 00:12:48.509
The companies say, great, we have this fantastic solution.
00:12:48.509 --> 00:12:51.467
Consumers are like great, this is going to work for me.
00:12:51.467 --> 00:12:52.410
Here is some more data.
00:12:52.410 --> 00:12:55.730
Help me or give me better service.
00:12:56.320 --> 00:13:05.187
Then, if you go back to the enterprise clients: more data, better models. The AI models are going to do a better job, and then the service to the customer improves.
00:13:05.187 --> 00:13:09.405
So you can look at it as a nice positive feedback loop that can benefit everyone.
00:13:10.061 --> 00:13:11.222
Yeah, for sure, Ajay.
00:13:11.222 --> 00:13:17.428
I'm trusting you with this next question because I also know that you've got a professor hat where you can explain very complex things very nicely.
00:13:17.428 --> 00:13:21.567
And that is about the actual encryption, because obviously this is not a technical show.
00:13:21.567 --> 00:13:27.779
People may not fully understand the intricacies and the complexities and beauties of the solution that you're bringing to the marketplace.
00:13:27.779 --> 00:13:35.470
But you put such a strong emphasis (and I know it's part of what makes Cyphersonic Labs what it is) on the homomorphic encryption-based approach.
00:13:35.470 --> 00:13:38.525
What the heck does that mean and what are some of the alternatives?
00:13:38.525 --> 00:13:59.597
For those of us who have seen different encryption approaches (I'm thinking about all the work that I previously did with SQL databases), you know you have to decrypt certain things, but talk to us about what's different about your encryption approach and why that's the best way.
So, today we use something called AES, the Advanced Encryption Standard.
00:13:59.658 --> 00:14:01.462
That has been around for a long time.
00:14:01.462 --> 00:14:06.221
It's pretty much baked into all the computing systems that we use today.
00:14:06.221 --> 00:14:13.884
The new encryption technology is called homomorphic encryption, and the core math behind it has been around for some time.
00:14:13.884 --> 00:14:21.662
It was invented back in the 80s, and it's based on a different type of math: lattice-based cryptography.
00:14:21.662 --> 00:14:26.615
The challenge with that technology is that, well, actually let me take a step back.
00:14:26.615 --> 00:14:35.302
The way that technology works is that if I want to do A plus B on non-encrypted data, I can instead take encrypted data A and encrypted data B.
00:14:35.302 --> 00:14:43.719
I can add those two encrypted values, and then the result that I get is also in encrypted form, and if I decrypt that result, that will be the correct result.
00:14:43.719 --> 00:14:53.515
So doing A plus B in non-encrypted form is equivalent to doing encrypted A plus encrypted B, which gives me the encrypted result, and that's how things match up.
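The addition property just described (combining two ciphertexts yields the encryption of the sum, without ever decrypting the inputs) can be sketched in code. To be clear about assumptions: this is not Cyphersonic's scheme, which is lattice-based; the sketch uses the older Paillier cryptosystem, which is additively homomorphic, with deliberately tiny, insecure parameters, purely to show the property.

```python
# Toy demo of additive homomorphic encryption via the Paillier
# cryptosystem. WARNING: tiny parameters, no padding, not secure --
# real systems use lattice-based schemes (e.g. BGV/CKKS) in hardened
# libraries, never hand-rolled code like this.
from math import gcd

p, q = 293, 433                 # toy primes, far too small for real use
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
g = n + 1                       # standard simple generator choice
mu = pow(lam, -1, n)            # modular inverse of lambda mod n

def encrypt(m, r):
    # c = g^m * r^n mod n^2, where r is a randomizer coprime to n
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) // n, then m = L(c^lam mod n^2) * mu mod n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

a, b = 20, 22
ca = encrypt(a, 17)
cb = encrypt(b, 31)

# Multiplying the ciphertexts is the homomorphic "addition":
c_sum = (ca * cb) % n2
print(decrypt(c_sum))           # prints 42, i.e. a + b
```

The cloud-side party only ever sees `ca`, `cb`, and `c_sum`; the data owner, who holds the decryption key, recovers the correct sum.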
00:14:53.515 --> 00:15:00.856
Now, the idea behind this, the core math behind homomorphic encryption, has been around for some time.
00:15:02.412 --> 00:15:05.778
We didn't invent that, but the problem was it was super slow.
00:15:05.778 --> 00:15:12.631
I mean, there was no way this was going to be practical, and so it basically stayed in academic research, where it was:
00:15:12.631 --> 00:15:14.960
Here's a cute idea, you can keep data encrypted.
00:15:14.960 --> 00:15:19.113
But it's not practical, never going to see the light of day, so let's not bother about it.
00:15:24.006 --> 00:15:24.830
But it didn't die.
00:15:24.830 --> 00:15:42.760
People kept working on it, and over the last five, ten years or so, there has been this big jump in the improvements that we have seen, partly because the cybersecurity challenges have grown and there is this need for something more concrete, something more robust, something that gives strong data security and privacy guarantees.
00:15:42.760 --> 00:15:47.321
And so what we did was we said this math is great, not practical.
00:15:47.321 --> 00:15:48.936
How can we make it practical?
00:15:48.936 --> 00:15:51.438
How can we bring this to the whole world?
00:15:52.291 --> 00:15:54.799
And we realized that, okay, we are computer engineers.
00:15:54.799 --> 00:16:02.938
The first thing we do is try to understand the problem, try to understand the algorithm, try to understand the workload that is going to run on our computing system.
00:16:02.938 --> 00:16:13.557
So we analyzed what exactly is happening in that homomorphic encryption and we said, okay, do we need to have more memory, do we need to have more compute, or is communication the problem?
00:16:13.557 --> 00:16:18.296
We realized, oh, memory is the problem, communication is the problem, compute not so much.
00:16:18.296 --> 00:16:26.293
So we started attacking the problem from that side and then said, okay, what do we need to do to really solve the communication problem and memory problem?
00:16:26.990 --> 00:16:33.096
And then we said, okay, that's fine, but it shouldn't be a case where there is a solution that will take 10 years to come to the market.
00:16:33.096 --> 00:16:34.234
That doesn't make sense.
00:16:34.234 --> 00:16:39.956
So can we use the technology, the hardware that we have today to deploy this homomorphic encryption?
00:16:39.956 --> 00:16:43.342
And that's where FPGAs, field-programmable gate arrays, came into the picture; they are already there today.
00:16:43.342 --> 00:16:44.474
They're there in the cloud.
00:16:44.474 --> 00:16:48.600
Our solution uses those FPGAs that can be deployed in the cloud right away.
00:16:48.600 --> 00:16:52.380
So the core technology is homomorphic encryption.
00:16:52.380 --> 00:16:55.052
The idea is to keep data encrypted at all times.
00:16:55.052 --> 00:17:04.196
And then our company, basically, has figured out a way to make that technology practical using a variety of algorithmic ideas, hardware ideas and software ideas.
00:17:05.058 --> 00:17:15.160
Yeah, Ajay, I'm going to call this out publicly because part of what I think is your unfair advantage is you have that rare combination of the academic world, but also with the practical applications.
00:17:15.200 --> 00:17:16.349
We could talk about your past career.
00:17:16.349 --> 00:17:19.496
Obviously, you've worked for some of the most impressive tech companies out there.
00:17:19.496 --> 00:17:23.804
You're an ex-Googler, You've worked at Intel, and so we have that practical side of the equation.
00:17:23.804 --> 00:17:33.571
But now you're talking about, and you actually revealed it right there in that answer, how the academic world is pushing the envelope when it comes to research and finding new ways to do things.
00:17:33.571 --> 00:17:52.771
You're uniquely situated in that you've been deep into both of those worlds. A lot of times those solutions never come to light, because it takes private industry, it takes entrepreneurship and business and profit incentives and a lot of the other positive sides of how, I argue, entrepreneurship changes the world.
00:17:52.771 --> 00:17:58.451
It takes that for these solutions to see the light of day and to actually make a difference in the world.
00:17:58.451 --> 00:18:08.961
Talk to us about that unique setting for you, because I'm sure that that is different in your industry, that you bring that research-backed approach but also the practical experience you've picked up.
00:18:10.892 --> 00:18:14.065
So I would actually like to give a shout out to my mentors.
00:18:14.065 --> 00:18:23.265
When I was a grad student and when I was at MIT, they emphasized that whatever you do, make sure there is practicality to it.
00:18:23.265 --> 00:18:26.597
Make sure that whatever you're doing can actually come to the market.
00:18:26.597 --> 00:18:30.394
Don't work on something obscure which nobody cares about.
00:18:30.394 --> 00:18:36.811
So that motivation or that aspect of thinking has always been there.
00:18:37.534 --> 00:18:42.003
And I was lucky enough to work at Intel, work at Google, and also work at Lightmatter.
00:18:42.003 --> 00:18:48.714
So while working there, talking to engineers, understanding their line of thinking: they do also care about new ideas.
00:18:48.714 --> 00:18:58.063
They also do care about coming up with novel circuits or novel architectures or novel systems, but they are grounded in reality, and they ask:
00:18:58.063 --> 00:18:59.733
Is this really going to work?
00:18:59.733 --> 00:19:02.991
Is this something that I can deploy in the next three years or four years?
00:19:02.991 --> 00:19:15.193
And so that helped me set my mindset that it's okay to think about blue sky ideas, it's okay to think I'm going to design the next best piece of hardware, but is it practical?
00:19:15.193 --> 00:19:16.076
Can it really be done?
00:19:17.112 --> 00:19:22.237
And so, in fact, a shout out to my postdoc advisor who always says we can't fool Mother Nature.
00:19:22.237 --> 00:19:26.701
So whatever you do, make sure that it's actually practical and can actually be prototyped.
00:19:26.701 --> 00:19:46.342
So that's how the mindset developed, and that helped me stay grounded, helped me keep one leg in academia, where I can work with these fantastic grad students and colleagues and think about ideas, and the other leg on the industry side, where my industry colleagues keep me grounded by saying, yeah, this is interesting, but it will not fly.
00:19:46.342 --> 00:19:48.462
Can you come up with something, some other ideas?
00:19:48.462 --> 00:19:53.026
So it helps me keep grounded, helps me come up with this balance of things.
00:19:53.606 --> 00:20:01.334
Yeah, I love to hear that real life advice, especially since it's happening on a beautiful campus sitting right on the Charles River, outside of Boston and Cambridge.
00:20:01.334 --> 00:20:07.079
It's very cool to hear how your mentor's influence has really shaped your perspective and your entrepreneurial beginning.
00:20:07.079 --> 00:20:09.813
So huge shout out to everyone who's been involved in your journey.
00:20:09.813 --> 00:20:18.173
Ajay, I want to ask you about your professional career, because when I list those companies, people are inherently going to be impressed by the fact that you've worked at these tech giants.
00:20:18.173 --> 00:20:20.137
But I want to hear the inside scoop.
00:20:20.137 --> 00:20:25.571
I'm sure you picked up things along the way where you thought that's best practice and I want to bring that with me throughout my career.
00:20:25.571 --> 00:20:34.970
But also as a tech co-founder and CEO, you have the chance to be nimble and to do things a little bit differently, so I'd love to hear both sides of that coin from you.
00:20:35.872 --> 00:20:36.692
Yeah.
00:20:36.692 --> 00:20:54.520
So one interesting thing that I heard during my time in industry is that there are large companies, companies like Google, Meta, Amazon, Microsoft, that we can call elephants: if they move, it completely changes the scenery, completely changes how things work.
00:20:54.520 --> 00:21:04.663
Then there are smaller companies, at the scale of a gazelle or a deer, who are very nimble, can change directions relatively quickly, and can carve out a niche.
00:21:04.663 --> 00:21:11.881
And there are very small companies, which are the size of a mouse maybe, who are good but can get squashed easily.
00:21:11.881 --> 00:21:17.439
So the challenge is to make sure that you fit in the right spot, depending on what stage you are at.
00:21:17.759 --> 00:21:40.994
If you're the size of a mouse, then watch out: be careful about what other people are doing, move faster, and then hopefully you'll grow into a gazelle, where you do influence the industry but are able to change directions relatively easily. And then eventually you can become a giant like Google, Microsoft, Amazon, the size of an elephant, which can completely change the game with a single decision.
00:21:40.994 --> 00:21:48.223
So there are these interesting perspectives that I've heard during my time and that did influence my way of thinking.
00:21:48.223 --> 00:21:54.462
The main thing is: be brave, go out there, try to do something challenging.
00:21:54.462 --> 00:22:00.882
You'll have regrets about the shots that you don't take rather than the shots that you took and failed.
00:22:00.882 --> 00:22:07.093
So that's what my outlook has been.
00:22:07.093 --> 00:22:08.056
I'm super excited about this company.
00:22:08.056 --> 00:22:09.578
I'm cautiously optimistic that our company will succeed.
00:22:09.578 --> 00:22:13.017
We will be there for the long haul, and we'll see how it goes.
00:22:13.719 --> 00:22:15.444
Yeah, Ajay, I love that perspective.
00:22:15.444 --> 00:22:16.648
I love analogies.
00:22:16.648 --> 00:22:26.071
So, hearing you use the elephant, gazelle and mouse analogy, it resonates deeply with me, because I think it also speaks to the fact that there are advantages and disadvantages to each.
00:22:26.071 --> 00:22:29.402
An elephant can't turn nearly as quickly as a mouse can.
00:22:29.402 --> 00:22:33.855
However, every single step is felt much farther and much greater.
00:22:33.855 --> 00:22:41.041
So I love that analogy, and I'm going to put you on the spot, Ajay: where's Cyphersonic Labs going to be when you look at your own growth trajectory?
00:22:43.929 --> 00:22:48.961
Obviously, taking on the cybersecurity industry is a huge thing and I know that funding is part of it and your growth path.
00:22:48.961 --> 00:22:57.653
It is a step-by-step process, but I'd love to hear, when you think about where Cyphersonic Labs is going, one year out, five years out, 10 years out, what does the future of your company look like?
00:22:58.855 --> 00:23:01.142
So we are right now the size of a mouse.
00:23:01.142 --> 00:23:13.325
I'll be honest, we are not that big yet, but in five years' time we hope to be the size of a gazelle, where we can more quickly change directions if required, but have an influence on the cybersecurity industry.
00:23:13.325 --> 00:23:19.583
And maybe 10, 15 years down the line, we are as big as an elephant, fingers crossed.
00:23:21.090 --> 00:23:21.953
Yeah, I love that.
00:23:21.953 --> 00:23:30.660
Let's talk about the needs of that industry, because obviously, when you jump into an industry as big and far-reaching and important as yours, especially with enterprise-level solutions.
00:23:30.660 --> 00:23:34.653
You've talked about funding before and obviously you and your past you know.