Co-founder of Brainhub, Matt describes himself as a “serial entrepreneur”. Throughout his career, Matt has developed several startups in Germany, wearing many hats, from marketer to IT engineer to customer support specialist. As a host of the Better Tech Leadership podcast, Matt talks about growing successful businesses and the challenges of being a startup founder and investor.
Jochen Issing is a seasoned software engineering leader with a strong focus on performance, scalability, and agile methodologies. He is currently the Director of Engineering at FERNRIDE, where he oversees engineering and verification ecosystems. Previously, Jochen held key roles at Argo AI and Autonomous Intelligent Driving GmbH, contributing to the development of autonomous systems and scalable infrastructure. His extensive experience includes positions at Fraunhofer IIS, Skype, and the University of Erlangen-Nuremberg, where he worked on cutting-edge technologies in media streaming and IP telephony. Jochen’s expertise spans various engineering disciplines, always with an emphasis on lean and agile development practices.
00:00:09,465 --> 00:00:15,565
Speaker 1: My name is Matt, and I will be talking to Jochen Issing about leadership insights, team dynamics, and continuous learning.
00:00:17,930 --> 00:00:20,830
So we are at the point where we are talking about education.
00:00:21,369 --> 00:00:26,705
And what interests me a lot, you have a PhD, a doctor of philosophy. Right?
00:00:27,645 --> 00:00:31,699
So I'm just wondering, like, why?
00:00:32,640 --> 00:00:34,239
Speaker 2: Why did I do it, you mean?
00:00:34,239 --> 00:00:38,160
So I was working at Fraunhofer for quite a while.
00:00:38,160 --> 00:00:40,445
Fraunhofer is a research institute in Germany.
00:00:40,984 --> 00:00:48,905
It's probably best known for things like video coding and audio coding, but it has all kinds of disciplines, actually.
00:00:48,905 --> 00:00:55,339
So it's a really, really broad institute, probably the biggest research institute in Europe, roughly like that.
00:00:55,339 --> 00:01:03,655
And I was in the institute for MP3 audio coding, even though MP3 was long done before I joined.
00:01:04,034 --> 00:01:07,580
And, I was working there for a couple of years.
00:01:07,580 --> 00:01:10,060
It actually follows the tariff system.
00:01:10,060 --> 00:01:13,680
So eventually, like, you only get limited contracts.
00:01:13,935 --> 00:01:15,875
Eventually you have to change your contract.
00:01:17,455 --> 00:01:20,275
And doing a PhD was actually, like, a possibility.
00:01:20,575 --> 00:01:26,320
So my institute actually offered it to me because they had an interest in keeping
00:01:26,320 --> 00:01:30,399
me, and they wanted to offer me, like, another job after the PhD as well.
00:01:30,399 --> 00:01:34,505
So they said, hey, don't you wanna do a PhD?
00:01:34,505 --> 00:01:42,080
And I always wanted to do a PhD because I kind of look out for challenges, in a way.
00:01:42,860 --> 00:01:50,475
And the only thing that actually kept me thinking was that, financially, it's,
00:01:50,475 --> 00:01:53,375
like, even less than the tariff system at Fraunhofer.
00:01:53,515 --> 00:01:56,875
But I said, well, it's a chance, and I'll take it.
00:01:56,875 --> 00:01:58,940
And let's see if I can make it.
00:01:59,740 --> 00:02:04,640
And that was the thing that brought me into the PhD in the first place, I would say.
00:02:05,500 --> 00:02:12,115
And it was a challenge, and I actually didn't finish in the original time frame.
00:02:12,335 --> 00:02:18,550
I actually took it to Sweden with me, and I finished it in the evenings.
00:02:19,410 --> 00:02:25,815
And I think two and a half years or so after the original time frame, I managed to close it as well.
00:02:25,815 --> 00:02:31,194
So, yeah, it was a tough one, but I'm happy I did it.
00:02:31,255 --> 00:02:38,970
And ever since I finished it, I don't really use the title anyway, in that sense.
00:02:40,310 --> 00:02:48,075
But also in the job, the title itself or the education itself, I wouldn't say it made much of a difference.
00:02:49,840 --> 00:02:57,605
But what did make a difference is what I learned in terms of, like, academic work ethics, kind of. Right?
00:02:57,605 --> 00:03:00,025
So what does it mean to do research?
00:03:00,405 --> 00:03:02,105
What does it mean to write papers?
00:03:02,724 --> 00:03:06,105
It's, like, a certain diligence that you apply to your work.
00:03:06,410 --> 00:03:11,630
And it's, like, good practice in technical reasoning as well.
00:03:11,770 --> 00:03:19,114
And I think I drew from that experience quite a bit, and I still do today, even though,
00:03:19,114 --> 00:03:25,410
like, the day-to-day reasoning in management and leadership is sometimes not necessarily the
00:03:25,410 --> 00:03:28,550
same as in a scientific paper,
00:03:29,090 --> 00:03:31,510
Speaker 1: But still, like, a similar principle.
00:03:32,584 --> 00:03:37,864
And you said something which is not popular, especially here in Germany, to be honest. Yeah.
00:03:37,864 --> 00:03:44,319
You said, like, you have the title, you have a PhD, you don't care so much about the title,
00:03:44,319 --> 00:03:48,495
but maybe about what you did because of this title, like the research and so on.
00:03:48,655 --> 00:03:54,254
But in my experience in Germany especially, I worked for BMW for a while, and everybody
00:03:54,254 --> 00:03:58,790
who had a PhD, they were really, like, emphasizing that.
00:03:58,790 --> 00:04:01,770
And this was like a huge part of their career.
00:04:02,070 --> 00:04:04,115
So this is what I understood.
00:04:04,175 --> 00:04:09,075
So if you want to make a career on the engineering side, you need to have a PhD.
00:04:09,135 --> 00:04:11,455
And this was what I heard here in Germany.
00:04:11,455 --> 00:04:18,080
So you are the first German guy saying you haven't done it for the title, that it's not so important.
00:04:18,460 --> 00:04:24,115
Speaker 2: I'm very happy that you have me here, that I can also show a different perspective on what
00:04:24,115 --> 00:04:28,435
Germans can do, or how Germans can live it. Yeah.
00:04:28,435 --> 00:04:32,840
It's probably, like, a special kind of ecosystem.
00:04:34,420 --> 00:04:43,325
To be frank, like, I joined the automotive world without ever feeling like, yeah, I'm an automotive guy.
00:04:44,345 --> 00:04:54,705
And I must admittedly say that I kind of despised, like, the style in which the
00:04:54,705 --> 00:05:04,770
industry was sometimes representing itself: very, very proud, very, very, yeah, sometimes a bit arrogant, I would say.
00:05:06,509 --> 00:05:12,205
I don't think it's necessarily still the case, and I think I probably exaggerated my
00:05:12,205 --> 00:05:18,205
perspective there, back in the day, because now I know a lot of guys that actually were
00:05:18,205 --> 00:05:28,275
in the automotive industry, but they're anything but arrogant, so I had to learn that as well. But, yeah.
00:05:28,275 --> 00:05:32,055
I think there are actually many people like I described.
00:05:32,355 --> 00:05:37,015
I could totally tell from all the friends and the colleagues that I had at university.
00:05:37,409 --> 00:05:42,129
Many do it mostly for the challenge, I would say. Yeah.
00:05:42,129 --> 00:05:43,590
Maybe not all of them.
00:05:43,810 --> 00:05:50,795
We had a couple of folks from the automotive industry, and maybe they were following that path. Yeah.
00:05:50,795 --> 00:05:53,775
Now I know what to do.
00:05:54,449 --> 00:06:01,410
Speaker 1: Yep. And you work in the automotive industry, especially regarding autonomous cars, for the
00:06:01,410 --> 00:06:07,665
last 6, 7 years, more or less. Right? That's a good question. I think so. Yeah.
00:06:08,525 --> 00:06:16,570
So I'm just wondering, because I recently read an article about the Titan product from Apple.
00:06:16,570 --> 00:06:19,690
So they wanted to create an autonomous car, but they killed the product.
00:06:19,690 --> 00:06:21,630
So I assume it was not so easy.
00:06:22,995 --> 00:06:25,635
It was quite challenging to create a car.
00:06:25,635 --> 00:06:32,060
With Tesla, you already have those systems to help you with autonomous driving and support you while driving.
00:06:32,060 --> 00:06:38,000
But I'm just wondering, like, what do you think: here in Europe or in Germany, when will we finally see,
00:06:38,060 --> 00:06:40,285
like, fully autonomous cars?
00:06:40,664 --> 00:06:44,044
Or is this still hard to say?
00:06:44,585 --> 00:06:49,970
Speaker 2: It's definitely hard, and I had no clue when I joined AID back in the day.
00:06:49,970 --> 00:06:54,050
I was even surprised, like, what do you guys want from me?
00:06:54,050 --> 00:07:02,915
Like, I don't do autonomy and autonomous cars, and I don't even know, like, much about cars. I actually hate cars.
00:07:03,855 --> 00:07:10,430
So I'm a bicycle rider at heart, like, big time, actually.
00:07:11,610 --> 00:07:16,865
But, yeah, they didn't care about that, and I'm actually happy that they didn't.
00:07:17,805 --> 00:07:19,085
So I just looked it up.
00:07:19,324 --> 00:07:26,720
I started in 2018 at AID. So 6 years. Pretty good. Yeah. Good estimate. Thank you.
00:07:27,580 --> 00:07:32,205
Just yesterday evening, I wondered which year we have now.
00:07:32,205 --> 00:07:39,164
So now you can imagine how good I am with knowing all the dates. But, yeah.
00:07:39,164 --> 00:07:49,175
So when I started at AID, I learned pretty quickly that, like, this problem that the
00:07:49,175 --> 00:07:52,235
industry is trying to tackle is pretty hard. And,
00:07:54,759 --> 00:08:00,780
like, we have to use all kinds of supporting technologies to make it happen.
00:08:01,479 --> 00:08:07,535
It's not that. Well, that's what I originally thought it would be: that you build
00:08:07,535 --> 00:08:13,860
a car, you add a couple of sensors, and then the car can see the world and then can make decisions. Right?
00:08:14,000 --> 00:08:16,580
It's, unfortunately not that easy.
00:08:17,535 --> 00:08:23,295
Even though Tesla is trying to approach it maybe a bit like that, there is actually a
00:08:23,295 --> 00:08:25,395
lot more information that is coming in.
00:08:25,550 --> 00:08:27,310
And, actually, Tesla is also using that.
00:08:27,310 --> 00:08:37,775
So, like, the collection of data, the creation of HD maps, all of
00:08:37,775 --> 00:08:45,250
these things, they need to be slowly built up. And there is always this, it's actually,
00:08:45,250 --> 00:08:49,810
I think, a classic nowadays, the long tail in autonomous driving. Right?
00:08:49,810 --> 00:08:59,134
So whatever technology you build, there might still be edge cases that you don't fully cover.
00:08:59,470 --> 00:09:03,149
And I think that kind of alludes to, or, yeah.
00:09:03,149 --> 00:09:10,385
That leads to the conclusion that we will have to wait for actual autonomous cars, how
00:09:10,385 --> 00:09:13,825
we imagine them today, for a long, long time, I would say.
00:09:13,825 --> 00:09:20,300
But I could totally imagine that within certain ODDs, operational design domains, like certain limited, maybe sometimes
00:09:20,360 --> 00:09:25,639
fenced areas, maybe like slower speeds and so on, which is also happening, right?
00:09:25,639 --> 00:09:32,535
You can see that there, I think, in the special-purpose areas; that is where this, in high quotes,
00:09:32,535 --> 00:09:35,275
revolution is going to take off.
00:09:35,890 --> 00:09:43,190
And then for, like, this full self-driving capability stuff, where you don't need anything, the car just drives itself.
00:09:44,235 --> 00:09:49,695
Right now, I see it rather as, like, an asymptotic goal, reached eventually in eternity.
00:09:51,240 --> 00:09:54,040
But, yeah, that's the tough thing.
00:09:54,040 --> 00:09:56,519
So we have to address it.
00:09:56,519 --> 00:10:04,175
And it's actually not that hard to imagine, because even as humans, you cannot drive everywhere, right?
00:10:04,875 --> 00:10:10,270
Go to, I don't know, Poland is maybe not that different from Germany, but go to China, and will
00:10:10,270 --> 00:10:18,590
you just sit in the car and drive safely in China as a German or Polish guy? Probably not, I guess. I wouldn't dare, actually.
00:10:18,590 --> 00:10:24,355
I was actually scared to death when I first came to China and sat in a taxi.
00:10:25,295 --> 00:10:29,240
So I wouldn't dare to drive there, let alone cycle.
00:10:30,660 --> 00:10:36,714
So it's actually a tough problem, and it has a lot of edge cases.
00:10:36,714 --> 00:10:39,455
And so I think we'll have to be patient.
00:10:40,690 --> 00:10:46,370
But, yeah, I think the industry also shifted a bit away from this big bang solution, right,
00:10:46,370 --> 00:10:51,585
where we just build this robotaxi, and off you go and everything's fine. Right?
00:10:51,585 --> 00:10:57,380
Waymo is still, like, on that track, but I think they can do it because, first of all,
00:10:57,380 --> 00:10:58,740
they also limit their ODD.
00:10:58,740 --> 00:11:00,540
It's visible also from the videos.
00:11:00,540 --> 00:11:07,915
And, well, they just sit on a seemingly infinite amount of money. Right?
00:11:07,915 --> 00:11:14,890
So that is also a good driver for building stuff, technical stuff, but not everybody has that.
00:11:14,890 --> 00:11:20,170
And the others, they have to build intermediate products, and that means, like, driver assistance
00:11:20,170 --> 00:11:28,504
systems, and, like, going through the different levels one by one, building something profitable
00:11:28,504 --> 00:11:34,839
that you can sell so that you can keep going, building stuff for the company. Yeah.
00:11:34,899 --> 00:11:39,095
Speaker 1: And you mentioned an interesting thing, that this is a lot about the data.
00:11:39,095 --> 00:11:40,214
So you're collecting a lot
00:11:40,214 --> 00:11:48,590
of data, and you kind of need to interpret those data to, kind of, adapt to the external circumstances
00:11:48,970 --> 00:11:51,070
and to drive the autonomous car. But
00:11:53,850 --> 00:11:59,265
how do you see AI and, like, current large language models?
00:11:59,265 --> 00:12:01,605
Like, does it, like, somehow help?
00:12:01,745 --> 00:12:06,920
Do you see some kinds of implementations that might be used in the case of autonomous driving?
00:12:09,460 --> 00:12:16,135
Speaker 2: So definitely, I'm super impressed by all the things, all the progress that is happening, like,
00:12:16,135 --> 00:12:21,069
all the generative AI products that are hitting the ground these days.
00:12:21,449 --> 00:12:28,145
I'm using them all as much as I can, because I'm really, really thrilled by it.
00:12:29,585 --> 00:12:37,400
I think, well, of course, large language models probably won't drive a car, but the artificial
00:12:37,400 --> 00:12:42,220
intelligence models, they are really closing the gap, I think, in many technologies,
00:12:42,280 --> 00:12:44,460
and they're also pushing the boundaries.
00:12:46,324 --> 00:12:53,660
The only thing that we'll have to tackle there is, for things like vehicles or cars in general:
00:12:53,660 --> 00:13:02,235
Like, we're talking about vehicles that weigh a ton, and if they smash into anything, they,
00:13:02,235 --> 00:13:04,154
like, create a big disaster. Right?
00:13:04,154 --> 00:13:13,850
So, you have to prove that whatever they do or, like, however they behave is safe, and they don't harm anything.
00:13:13,990 --> 00:13:17,589
Like, that's really tough.
00:13:17,589 --> 00:13:26,185
And especially with AI models, where, like, I'm not an AI expert, but, with the hallucinations
00:13:26,645 --> 00:13:33,500
that you see with the large language models, there are certain unpredictable behaviors
00:13:33,560 --> 00:13:35,740
that you see with some networks. Right?
00:13:36,040 --> 00:13:39,205
Like, for human perception, they are still pretty good. Right?
00:13:39,525 --> 00:13:44,825
But to actually prove that the system is safe is a very, very different story.
00:13:45,365 --> 00:13:55,095
And, it's also different in a way because, if you sell a car, the responsibility is with the driver. Right?
00:13:55,975 --> 00:14:00,714
And you cannot prove that every person on the planet is driving safely. Right?
00:14:00,855 --> 00:14:07,290
You probably cannot even prove that every person that has a driver's license is driving safely. Right?
00:14:07,290 --> 00:14:12,910
But the responsibility, anyway, is not with the driving school, I guess, but with the driver.
00:14:13,245 --> 00:14:19,565
But when you sell a product that has artificial intelligence on it, then, like, the
00:14:19,565 --> 00:14:22,699
responsibility is with the vendor, right, with the OEM.
00:14:23,240 --> 00:14:30,965
And then, yeah, like, they want to really prove that the product is safe, and
00:14:31,365 --> 00:14:35,625
it only behaves as desired in all possible circumstances.
00:14:36,649 --> 00:14:40,029
And that's a pretty tough thing.
00:14:40,329 --> 00:14:43,630
Also, like, cars, they don't see
00:14:46,255 --> 00:14:54,540
like we do. Alright? Even the AI networks, at least the ones that I saw.
00:14:55,800 --> 00:15:02,135
So as a driver, when I drive my car, I see, like, moving objects in a way.
00:15:02,135 --> 00:15:08,715
I can, like, immediately classify them as humans, as trucks.
00:15:09,575 --> 00:15:15,580
For me, often it's enough that, oh yeah, it's a big moving object or it's a small moving object. Right?
00:15:16,520 --> 00:15:22,415
But it's not necessarily working like that with the networks.
00:15:22,795 --> 00:15:33,880
And, moreover, I can actually tell from the pose of a person and the way they're walking: Is that person drunk?
00:15:34,020 --> 00:15:35,560
Is it like a child?
00:15:35,700 --> 00:15:39,545
Is it like unstable in its walking style?
00:15:40,565 --> 00:15:46,630
Is it likely that the person will actually enter the street, or will they actually go the other way?
00:15:47,190 --> 00:15:54,790
It's like there are so many subtle pieces of information that are attached to these
00:15:54,790 --> 00:16:00,935
interactors that you can actually perceive as a human, but the car cannot do that yet.
00:16:00,935 --> 00:16:03,435
So it has to actually make a couple of assumptions.
00:16:04,060 --> 00:16:10,825
And so the actual building of that product is very different from the way we actually drive our cars.
00:16:11,385 --> 00:16:17,405
Speaker 1: During the discussion between us to discuss, like, potential topics for today's interview,
00:16:18,500 --> 00:16:25,220
I got the impression that infrastructure is something kind of interesting for you.
00:16:25,220 --> 00:16:30,165
But, like, when I hear "infrastructure" and imagine the infrastructure in my
00:16:30,165 --> 00:16:37,250
mind, I have something like the completely opposite feeling; I don't see it as, like, a super sexy or interesting topic.
00:16:37,790 --> 00:16:42,510
Speaker 2: Yeah. Well, welcome to my world. Yeah.
00:16:42,510 --> 00:16:43,790
I know what you mean.
00:16:43,790 --> 00:16:48,255
And I'm super happy that we can talk about infrastructure because this is where I'm coming from
00:16:48,255 --> 00:16:51,774
or what I've been doing for the past 6 years. Oh, woah. Okay.
00:16:51,774 --> 00:16:54,675
Elia is trying to eat your microphone. That's not good.
00:16:55,089 --> 00:16:56,290
We have a dog with us.
00:16:56,290 --> 00:16:58,529
So, we have to fight it while we're talking.
00:16:58,529 --> 00:17:02,800
It's just like an additional challenge here. So yeah.
00:17:03,904 --> 00:17:11,610
When I started at AID, I mentioned it earlier in a private conversation, that I was
00:17:11,610 --> 00:17:14,270
supposed to actually help with sensor interfacing.
00:17:15,690 --> 00:17:22,635
But in the build system team itself, like, we call it engineering process, actually,
00:17:24,855 --> 00:17:28,030
there were a couple of people missing, and I said, yeah.
00:17:28,030 --> 00:17:31,710
Well, maybe I'll help out a bit, but not for long. And, yeah.
00:17:31,710 --> 00:17:40,365
Well, that was my last word about that, I guess. So I stayed in the infrastructure business
00:17:40,365 --> 00:17:42,705
ever since, and I'm still in there today.
00:17:44,660 --> 00:17:51,380
So for me, infrastructure is really, like, all the tooling, all the ground on which everything is built.
00:17:51,380 --> 00:17:59,615
And not only the dumb tools, but also, like, the integration of AI models. Right?
00:17:59,755 --> 00:18:07,250
Like, if you want to have this inference of the network, so, like, you apply the network in
00:18:07,250 --> 00:18:11,510
your product, there is so much infrastructure that needs to happen.
00:18:11,945 --> 00:18:18,505
And the actual, like, model tweaking and model training and so on, even for the experts that
00:18:18,505 --> 00:18:25,770
work on the ML integrations, it's not, like, the only work that they do. Right?
00:18:25,770 --> 00:18:31,715
They don't only do network tweaking; there is already a lot of infrastructure there that needs to work.
00:18:31,715 --> 00:18:38,375
Like, all the sensor data needs to be merged highly efficiently, needs to find its way to the actual network.
00:18:39,140 --> 00:18:42,340
It also has to be super reliable, the sensor data.
00:18:43,460 --> 00:18:49,765
You also have to make sure that, like, the failure modes are captured in the sensors and so on.
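As a rough illustration of the sensor-data plumbing being described here: a minimal Python sketch of time-aligning two sensor streams before handing merged samples to a network. This is not the actual FERNRIDE or Argo pipeline; the function names and the tolerance value are invented for illustration.

```python
# Illustrative sketch only -- not the actual pipeline from the interview.
# Assumes each sensor emits (timestamp, payload) tuples and shows one simple
# way to time-align two streams before feeding a merged sample to a network.
from bisect import bisect_left
from typing import Any, List, Tuple

Msg = Tuple[float, Any]  # (timestamp in seconds, sensor payload)

def nearest(msgs: List[Msg], t: float) -> Msg:
    """Return the message whose timestamp is closest to t (msgs sorted by time)."""
    i = bisect_left([m[0] for m in msgs], t)
    candidates = msgs[max(0, i - 1):i + 1]
    return min(candidates, key=lambda m: abs(m[0] - t))

def fuse(camera: List[Msg], lidar: List[Msg], tolerance: float = 0.05) -> List[dict]:
    """Pair each camera frame with the nearest lidar sweep within `tolerance` seconds.

    Frames without a close-enough partner are dropped -- a crude stand-in for
    the failure-mode handling mentioned above (missing or late sensor data).
    """
    fused = []
    for t, frame in camera:
        ts, sweep = nearest(lidar, t)
        if abs(ts - t) <= tolerance:
            fused.append({"t": t, "camera": frame, "lidar": sweep})
    return fused

# Toy usage: 10 Hz camera, slightly offset 10 Hz lidar.
cam = [(i * 0.1, f"img{i}") for i in range(5)]
lid = [(i * 0.1 + 0.01, f"pc{i}") for i in range(5)]
for sample in fuse(cam, lid):
    pass  # each `sample` would be handed to the perception network here
```

A real pipeline would add the reliability concerns the conversation mentions (dropped frames, clock skew, sensor self-diagnostics); the sketch only shows the basic time-alignment idea.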
00:18:49,765 --> 00:18:57,040
So there is so much infrastructure out there that, well, in my world, infrastructure is everything,
00:18:57,100 --> 00:19:04,815
but I think in a product like an autonomous car or truck, like we do it here, it's easily
00:19:04,815 --> 00:19:09,375
90% in my opinion. So yeah.
00:19:09,375 --> 00:19:13,630
Well, maybe not as much, but it's a huge part.
00:19:14,190 --> 00:19:21,685
And I think that is something that many people don't see or actually miss, and something that
00:19:21,685 --> 00:19:24,745
is also, as you say, not necessarily sexy. Right?
00:19:25,525 --> 00:19:31,510
So infrastructure sounds boring, but it's extremely powerful.
00:19:31,650 --> 00:19:35,590
And, it's also exciting, and it has its own challenges.
00:19:36,985 --> 00:19:44,045
And, yeah, you can even do it with a PhD in computer science.
00:19:44,200 --> 00:19:47,260
And it's still interesting, it's still exciting.
00:19:47,800 --> 00:19:52,380
Speaker 1: So what you're saying is, like, really good infrastructure is important.
00:19:52,845 --> 00:19:55,825
Probably, it's super important to reach the company goals.
00:19:56,205 --> 00:19:58,145
As you said, that's like a backbone.
00:19:58,684 --> 00:20:05,250
I assume that this might be a bit difficult to explain, maybe, to the business stakeholders, that
00:20:05,250 --> 00:20:11,112
this is such a crucial part of the business, and especially here, right, like, in
00:20:11,112 --> 00:20:11,705
a company in which you are now or previously worked. Yeah. That's true.
00:20:11,784 --> 00:20:14,230
I mean,
00:20:14,230 --> 00:20:16,445
Speaker 2: here in this company, I have the benefit that I already knew the people
00:20:21,630 --> 00:20:30,830
that are in, like, the positions that have the power to decide,
00:20:30,830 --> 00:20:37,075
like, where do we move people and who gets hold of headcount for, like, all the issues that we need to solve.
00:20:38,095 --> 00:20:43,350
In previous companies, yeah, we had to learn it the hard way, because the thing
00:20:43,350 --> 00:20:45,850
with infrastructure is that it's actually invisible.
00:20:45,910 --> 00:20:49,485
You don't see it, as long as it works.
00:20:51,085 --> 00:20:57,325
And that's also one of the tough things as an infrastructure person, that you have to actually
00:20:57,325 --> 00:21:01,530
show your progress very actively.
00:21:04,950 --> 00:21:10,885
And when I said we learned it the hard way, what I meant is that the build system that we have
00:21:10,885 --> 00:21:14,985
been working on, which was, like, the most critical part of the infrastructure,
00:21:16,085 --> 00:21:18,500
it was slowing down a lot. Right?
00:21:18,560 --> 00:21:23,440
And if the build system is slowing down and you have something like a monorepo, where everything
00:21:23,440 --> 00:21:29,675
is in one large repository, right, and every single developer depends on that build system to
00:21:29,675 --> 00:21:37,590
be fast, then suddenly, like, everybody in the company realizes there is some problem there.
00:21:37,810 --> 00:21:39,730
And, like, can't we fix this somehow?
00:21:39,730 --> 00:21:44,675
And then, like, the obvious solution is: yeah, well, we have to actually build the tooling that
00:21:44,675 --> 00:21:50,935
supports the load that our build system and our overall ecosystem, like, the engineering
00:21:51,075 --> 00:21:57,040
system that we build, needs to handle. And that requires, like, really, really high-quality
00:21:57,100 --> 00:22:00,140
tooling, like scaling tooling as well.
00:22:00,140 --> 00:22:06,249
If you can imagine that you need to handle, like, millions of lines of code, you need to handle,
00:22:06,249 --> 00:22:12,870
like, thousands of builds a day, like, very diverse builds as well, like, simulations in addition
00:22:12,870 --> 00:22:18,009
to sanitizers, in addition to static analysis, in addition to, like, the plain compilation,
00:22:18,215 --> 00:22:20,475
and all the different languages that we're talking about.
00:22:20,775 --> 00:22:30,210
So it's a highly complex system, and it requires, like, really good engineering. And,
00:22:30,210 --> 00:22:37,475
lucky for us, Google has actually paved the way for us with a lot of tooling, and we can reuse
00:22:37,475 --> 00:22:43,929
that tooling, learn from it, and apply these principles also to all of these things that
00:22:43,929 --> 00:22:50,429
we add to the build system and the infrastructure in general, especially the scaling part.
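The Google tooling referenced here is presumably Bazel, which grew out of Google's internal monorepo build system; the conversation doesn't name it. As a hedged sketch of the kind of setup being described, a fine-grained monorepo target plus a shared remote cache might look like this. All target names and endpoints are invented; the Starlark is a Python dialect.

```python
# BUILD.bazel -- hypothetical monorepo target (Starlark; all names invented).
cc_library(
    name = "sensor_fusion",
    srcs = ["sensor_fusion.cc"],
    hdrs = ["sensor_fusion.h"],
    deps = ["//platform/time:clock"],  # fine-grained deps keep incremental rebuilds small
)

cc_test(
    name = "sensor_fusion_test",
    srcs = ["sensor_fusion_test.cc"],
    deps = [":sensor_fusion"],
)

# .bazelrc -- a shared remote cache means a target is rebuilt once per change,
# not once per developer; a named config covers the sanitizer builds mentioned
# above (the cache endpoint is invented):
#
#   build --remote_cache=grpcs://cache.example.internal
#   build:asan --copt=-fsanitize=address --linkopt=-fsanitize=address
```

That caching property is what keeps thousands of diverse daily builds (plain compilation, sanitizers, static analysis, simulation) tractable on a monorepo with millions of lines of code.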
00:22:51,735 --> 00:22:58,855
And, yeah, you learn it as soon as the company has once gone through such downtimes. So we had, like, a severe
00:22:58,855 --> 00:23:05,750
situation at Argo where, actually, when the build system is down and every developer stares at
00:23:05,750 --> 00:23:11,245
the wall, like, it's a huge fire, and everybody is running around and trying to solve things,
00:23:11,245 --> 00:23:13,404
and it's a really, really stressful situation.
00:23:13,404 --> 00:23:16,945
And, that usually sticks with people.
00:23:18,090 --> 00:23:24,190
And then you also make sure that you build the right stuff, with the right quality.
00:23:24,995 --> 00:23:30,515
So, yeah, I'm in a lucky position now that I have the people around me, and the people in the
00:23:30,515 --> 00:23:39,640
decision-maker positions, such that we get the support that we need, actually.
00:23:40,020 --> 00:23:45,985
Speaker 1: And, talking about the complex solutions, I think, like, for complex solutions, you need to
00:23:45,985 --> 00:23:49,365
build, like, a really high-performing team.
00:23:49,770 --> 00:23:54,090
So, like, I'm just wondering, what is your approach to building those teams?
00:23:54,090 --> 00:23:55,710
What is important for you?
00:23:56,170 --> 00:23:57,770
Speaker 2: I used to play handball, actually.
00:23:57,770 --> 00:23:59,705
So I'm, like, a team sports person.
00:24:00,585 --> 00:24:05,965
And for me, it's, you know the saying, like, if your only tool is a hammer, everything looks like a nail?
00:24:06,825 --> 00:24:10,810
So for me, every team is like a sports team. Right.
00:24:10,970 --> 00:24:18,935
I can recognize that maybe not everybody agrees with me on that, but I totally see it like that.
00:24:18,935 --> 00:24:23,275
And so far, I think it actually is a really, really good analogy.
00:24:24,935 --> 00:24:28,230
And there are a couple of things there. So,
00:24:31,810 --> 00:24:42,305
if you have a really, really well-oiled team, it feels like a super developer to me.
00:24:42,365 --> 00:24:49,550
So it's almost like a person that you can interact with, and it doesn't matter what you throw at it.
00:24:50,090 --> 00:24:58,085
It always creates, like, a high-quality, super stable, and exactly right-sized outcome.
00:24:59,505 --> 00:25:05,480
And that has some resemblance to a team, I would say. Right?
00:25:05,480 --> 00:25:09,195
Like, with football or handball, what matters is the ball.
00:25:09,274 --> 00:25:13,534
How does the ball get to the front, and how do you get it into the goal? Right?
00:25:14,154 --> 00:25:20,930
And there is so much work that is done around, like, that goal, that starts with defense, that
00:25:20,930 --> 00:25:26,914
starts with the interactions between the individuals, that starts with understanding all the
00:25:26,914 --> 00:25:33,174
skills that everyone has in your team, and playing with those skills, and making your decisions
00:25:33,235 --> 00:25:39,110
based on those skills, and actually also pushing each other a bit, like, in a positive
00:25:39,110 --> 00:25:45,255
way, and supporting each other: whenever there is a mistake being made, like, somebody else is helping out. Right?
00:25:45,255 --> 00:25:52,235
And it's not about the individual, but about the ball, getting the ball as close to the goal as possible.
00:25:52,950 --> 00:26:01,350
And with everybody working together, it's really hard to actually win against a team that
00:26:01,350 --> 00:26:03,325
is playing really well together.
00:26:04,425 --> 00:26:11,950
And, like, if you look at that picture, then there are so many parallels to, like, what
00:26:11,950 --> 00:26:17,330
does it look like if a team is high performing. And I draw from this a lot.
00:26:17,630 --> 00:26:23,644
And so I look a lot at the interactions that the team has, how people are working with each
00:26:23,644 --> 00:26:31,950
other, and also, how do people react when maybe a certain person is not in the room versus
00:26:32,169 --> 00:26:33,389
when they are in the room?
00:26:34,250 --> 00:26:36,330
Do we have psychological safety?
00:26:36,330 --> 00:26:43,815
Can we say what we think, and how do people actually react when we say what we think, right,
00:26:43,815 --> 00:26:52,380
and how well do we actually know each other, in a way that we can also make decisions for the other.
00:26:52,760 --> 00:26:55,740
I will maybe, like, come back to that in a moment.
00:26:56,975 --> 00:27:04,735
But it's such a powerful analogy that I really come back to it all the time.
00:27:04,735 --> 00:27:12,440
And so far, at least from what I heard as feedback from the teams, it works really, really well.
00:27:12,740 --> 00:27:16,085
And people really enjoy, like, good teamwork.
00:27:17,345 --> 00:27:22,885
And, yeah, that's my basic principle, how I do it.
00:27:24,580 --> 00:27:28,120
Speaker 1: And as a leader, I'm wondering who is your partner in crime?
00:27:28,179 --> 00:27:34,335
So you work as, like, the guy responsible for the engineering side.
00:27:34,475 --> 00:27:40,909
So regarding strategy, your ideas, or, like, new initiatives, with whom do you usually
00:27:40,909 --> 00:27:44,990
work or test your ideas, to see if they make sense?
00:27:44,990 --> 00:27:48,769
Speaker 2: That's a good question, actually. Yeah, a partner in crime.
00:27:49,245 --> 00:27:54,125
I don't necessarily have, like, one individual that is my partner in crime.
00:27:54,125 --> 00:28:05,809
I actually try to use, like, almost everyone to bounce ideas with, to also express
00:28:05,809 --> 00:28:15,390
my worries or my concerns, and then think about, like, how should we do things instead, and
00:28:15,390 --> 00:28:16,750
what should we do instead?
00:28:16,750 --> 00:28:22,850
And even then, like, even if I have a clear picture, I don't necessarily execute, but I
00:28:22,965 --> 00:28:25,125
try it out on a couple of people. Like, hey.
00:28:25,125 --> 00:28:26,005
Like, what do you think?
00:28:26,005 --> 00:28:27,945
Should we do it like this or that? Right?
00:28:29,135 --> 00:28:39,085
So I really try to, like, outsource it as much as possible, or crowdsource it, almost.
00:28:39,085 --> 00:28:45,325
So there are very few things that actually don't go through the team, that
00:28:45,325 --> 00:28:50,700
I decide. Or, like, well, the decision is then actually no longer really my decision. Right?
00:28:50,700 --> 00:28:53,600
If I let it go through the team, then, like, everybody decides.
00:28:54,780 --> 00:29:01,275
But I really try to be super transparent, actually, with everything that I do, as long as
00:29:01,275 --> 00:29:02,815
it's not damaging to anyone.
00:29:03,230 --> 00:29:04,990
Of course, it's a slippery slope. Right?
00:29:04,990 --> 00:29:09,790
You can never really tell, like, that somebody doesn't feel, like, offended by it.
00:29:09,790 --> 00:29:17,185
But I rather try to err on the side of being more transparent than not, and then I
00:29:17,245 --> 00:29:20,450
sometimes need to tackle the problems that I created.
00:29:21,630 --> 00:29:28,795
But overall, it's everybody around me. And everybody also has, it's similar to what I said in
00:29:28,795 --> 00:29:35,870
the beginning, or, like, what you asked me about the high-performing teams: like, everybody has skills. Right?
00:29:35,870 --> 00:29:41,650
And the more you know people, the more you also know what you can ask them and what their experiences are.
00:29:42,590 --> 00:29:50,095
And then you also build an intuition in terms of what things you can discuss and what could
00:29:50,095 --> 00:29:51,315
be, like, a great outcome.
00:29:52,740 --> 00:29:57,140
Sometimes I just try something; I even just ask a stupid question sometimes.
00:29:57,140 --> 00:30:03,755
Even though I know it's stupid, just to, like, get people talking. Right?
00:30:03,815 --> 00:30:07,434
It's very easy for them to tell me, yeah, that's completely wrong. Right?
00:30:07,575 --> 00:30:14,010
Like, your idea is shit. And I really enjoy that, if people actually open up and tell
00:30:14,010 --> 00:30:18,669
me that I'm wrong, especially when I knew that I was wrong, of course.
00:30:19,054 --> 00:30:21,535
So maybe the other way around is not as pleasant.
00:30:21,535 --> 00:30:25,695
But, anyways, I have to survive this.
00:30:25,695 --> 00:30:27,934
It's part of the game, I think.
00:30:28,095 --> 00:30:33,620
We have to have open exchanges, and very often, I don't see things that the team sees.
00:30:33,840 --> 00:30:41,485
And then, yeah, we are not going into, like, analysis paralysis or discussing things forever.
00:30:41,485 --> 00:30:45,265
We use time boxes. We say, hey, we need to decide. Right?
00:30:45,750 --> 00:30:51,510
One time I even had a situation where the team and I, we couldn't decide, and then I said,
00:30:51,510 --> 00:30:55,665
hey, let's roll a die, right, and it was fine. Right?
00:30:55,665 --> 00:31:02,420
We just moved on because we didn't know which decision was better, so any decision was as good as the other.
00:31:03,380 --> 00:31:08,040
And, actually, yeah, it also turned out to be fine. Right?
00:31:08,420 --> 00:31:14,135
And so I try to not make a big deal out of decisions when it's not necessary.
00:31:14,835 --> 00:31:21,370
And when we have time. And we usually have time, because when we're well prepared, like, we
00:31:21,370 --> 00:31:24,990
gather the data, and then we can also make decisions together, usually.
00:31:26,125 --> 00:31:33,005
I know that a lot of people are a bit worried about, like, having big debates with the whole
00:31:33,005 --> 00:31:40,860
team and so on. But it's really also about how you approach such decisions. Right?
00:31:40,860 --> 00:31:46,795
It's not that I make a big brainstorming session, inviting 16 team members, and then ask
00:31:46,795 --> 00:31:47,835
them, hey, what do you think?
00:31:47,835 --> 00:31:49,275
Like, how should we approach it?
00:31:49,275 --> 00:31:54,450
Of course, I do my homework, I think about, like, what should we do?
00:31:55,070 --> 00:32:02,485
And then I make proposals, and, like, then you pretty quickly learn if your proposal was good or not.
00:32:02,485 --> 00:32:06,665
Because they will take it apart as quickly as possible.
00:32:07,045 --> 00:32:17,225
And if they don't take it apart, then you at least have the assurance that, okay, it's not completely off.
00:32:17,304 --> 00:32:19,145
So we can actually iterate over this.
00:32:19,145 --> 00:32:27,559
And then you iterate over it as long as you need to get enough confidence to say, alright. Let's try this. Let's do this.
00:32:28,340 --> 00:32:32,520
Even then, it's not necessarily set in stone. Right? You can still adjust.
00:32:33,059 --> 00:32:36,555
It's like what we always do. We do software anyways. We can adjust.
00:32:38,375 --> 00:32:42,455
But, like, this is my partner in crime, I think.
00:32:42,455 --> 00:32:45,600
Like, it's the people around me.
00:32:46,059 --> 00:32:54,875
And I have, like, truly good people around me, and I'm very happy to have them. And, yeah.
00:32:54,875 --> 00:32:57,215
Actually, that's a good point.
00:32:58,555 --> 00:33:04,940
I once heard the proverb that you are the average of the 5 people that you surround yourself with.
00:33:05,240 --> 00:33:11,340
So I try to surround myself with people smarter than me, and then actually, like, catch up to them.
00:33:12,105 --> 00:33:19,325
And so far, I've been pretty successful in terms of surrounding myself with people that are smarter than me.
00:33:19,900 --> 00:33:22,480
And, yeah, I draw from that, I would say.
00:33:23,179 --> 00:33:27,980
Speaker 1: So what you said brings me to the question that I wanted to ask you.
00:33:27,980 --> 00:33:35,745
So you talked about crowdsourcing or outsourcing the decision making, and, like, being
00:33:35,745 --> 00:33:40,490
transparent with the people, having, like, bright minds around, which is, like, I think,
00:33:40,490 --> 00:33:45,310
a huge lesson learned as a leader: that you don't have to know everything,
00:33:45,610 --> 00:33:51,275
that you can ask other people, you can put your ego in your pocket, you can ask for help. Right?
00:33:51,275 --> 00:33:53,915
Because this is not the obvious thing.
00:33:53,915 --> 00:34:02,570
And I'm just wondering, like, about your lessons learned, like, your moments in your career as an engineering leader.
00:34:02,865 --> 00:34:10,545
Speaker 2: So I think, like, a very, very small thing, actually, where I felt a lot of the time
00:34:10,865 --> 00:34:17,670
that it was a good lesson, actually, is when I thought, oh, I don't have to, like, run this through the team.
00:34:17,890 --> 00:34:19,589
I think this is a safe bet.
00:34:19,730 --> 00:34:21,829
Let me just, like, send this out.
00:34:22,455 --> 00:34:24,475
It just never worked.
00:34:25,655 --> 00:34:28,695
Like, every time there was something super stupid in it.
00:34:28,695 --> 00:34:34,610
And I just thought, like, why didn't I just let someone read through it? Right? Just a review.
00:34:34,610 --> 00:34:36,770
Like, we do this all the time with code. Right?
00:34:36,770 --> 00:34:38,945
And it doesn't take that long.
00:34:38,945 --> 00:34:43,585
Just show it to someone, and that's what I actually do a lot.
00:34:43,985 --> 00:34:50,970
When I say I outsource the decision, it's maybe also a bit of an exaggeration, I would say.
00:34:50,970 --> 00:34:56,395
But it's more like, I try to come up with something sensible, and then I share it.
00:34:56,395 --> 00:34:57,675
And then I ask, okay, folks.
00:34:57,675 --> 00:34:59,375
Do you see anything wrong with this?
00:34:59,600 --> 00:35:03,860
And then usually, I get a couple of those pointed out. Well, I hope.
00:35:04,640 --> 00:35:06,535
And then someone says maybe, hey.
00:35:06,535 --> 00:35:07,494
What do you mean by this?
00:35:07,494 --> 00:35:10,555
This is unclear, or I don't think this is actually true.
00:35:10,615 --> 00:35:12,954
You should probably change it in that way.
00:35:13,095 --> 00:35:19,590
And, like, these things are usually done within a few minutes, and then you can just forward it.
00:35:19,590 --> 00:35:27,035
So whenever I thought, I don't need to run this through the team, I actually learned that I should have.
00:35:28,855 --> 00:35:37,730
And especially, like, decisions around, like, personnel changes, in the sense when you, like,
00:35:37,730 --> 00:35:39,270
make someone a manager.
00:35:40,005 --> 00:35:44,885
It actually doesn't hurt to ask everyone in the team, when you share your thoughts and say, hey,
00:35:44,885 --> 00:35:51,190
I'm actually thinking about, like, making that person a manager. What do you think? And please be honest. Right?
00:35:51,250 --> 00:35:54,450
It's like, what happens in the 1 on 1 stays in the 1 on 1.
00:35:54,450 --> 00:35:58,390
And that is super important to me as well.
00:35:58,995 --> 00:36:00,615
We have those honest conversations.
00:36:00,755 --> 00:36:05,955
And usually, like, for the most part, what I heard was actually that people say, yeah.
00:36:05,955 --> 00:36:08,430
Well, he is anyways already acting like a manager.
00:36:08,430 --> 00:36:12,050
So you can also make him officially the manager.
00:36:12,430 --> 00:36:20,724
And this is, like, the easiest way ever, I think, to make a manager of a team. Doesn't always work. Right?
00:36:20,724 --> 00:36:25,030
Like, you don't always have, in high quotes, manager material in the team.
00:36:25,990 --> 00:36:32,010
It's not only that the people need to be able to do it, but they also need to want to do it. And,
00:36:35,065 --> 00:36:39,565
so it doesn't always apply, but when it does, I think it's really useful.
00:36:39,625 --> 00:36:46,230
And I always wanted to actually interview my manager before they joined.
00:36:46,609 --> 00:36:55,285
I actually never had the chance to do it, and, if my team listens now, they will probably quote
00:36:55,285 --> 00:37:03,770
me on that, but I think I would like to hire a manager only if the team approves.
00:37:03,770 --> 00:37:10,775
Because the team, like, the team and manager interactions are, like, what is building
00:37:10,775 --> 00:37:13,835
the foundation of trust in a company.
00:37:13,895 --> 00:37:18,250
And that has to be really well oiled, in my opinion.
00:37:18,950 --> 00:37:24,815
And so I should do that as well, like, even drive that through the team and not
00:37:24,815 --> 00:37:26,675
make that decision on my own.
00:37:27,615 --> 00:37:33,875
I think that is maybe the biggest learning, even though they were all small pieces.
00:37:34,040 --> 00:37:40,300
But adding them all together, I think they form the foundation for what I'm doing, mostly.
00:37:41,224 --> 00:37:49,670
What I also learned, and maybe that's obvious for everyone, but I still wanted to mention it:
00:37:50,230 --> 00:37:59,450
So, like, this psychological safety thing, or being able to say whatever you want, and it's safe.
00:38:00,515 --> 00:38:04,435
It doesn't start with saying everything that you think. Right?
00:38:04,435 --> 00:38:09,160
It just, like, it has a certain order to it.
00:38:09,160 --> 00:38:17,705
So you have to first lay a foundation of trust, and being able to say what you want
00:38:17,705 --> 00:38:23,625
or being able to speak your mind completely freely is only a symptom of, like, really, really,
00:38:23,625 --> 00:38:26,125
like, well-established psychological safety.
00:38:27,599 --> 00:38:32,819
I must say, I sometimes remind myself of that, painfully, at times.
00:38:33,680 --> 00:38:39,724
But that's also something that, like, we always have to keep in mind when we communicate.
00:38:39,865 --> 00:38:46,230
We are moving, especially as managers, or leaders in general, we are moving between different
00:38:46,230 --> 00:38:53,289
teams and we have to actually be able to play the role appropriate to the situation in which we are.
00:38:53,655 --> 00:38:58,775
Especially, it's hard when you switch between different teams, when you switch between different
00:38:58,775 --> 00:39:01,115
groups of people that you have to communicate with.
00:39:01,280 --> 00:39:08,640
If you also switch between different cultural areas; at Argo, for instance, we had a team
00:39:08,640 --> 00:39:15,135
in Palo Alto, a team in Pittsburgh, a team in Munich, and they all had very different dynamics.
00:39:15,435 --> 00:39:23,930
And it took a while until you could actually speak the same way in the different teams.
00:39:25,510 --> 00:39:30,985
So, yeah, that is also, I think, a skill that is super, super important to me.
00:39:31,125 --> 00:39:36,470
And I think that is something that I would recommend to every person, every leader, every manager:
00:39:42,790 --> 00:39:51,130
there is an audience, and the audience should drive what you say, or at least how you say it.
00:39:52,089 --> 00:39:58,029
You still wanna get a message across, but it can be perceived very differently. Yeah.
00:39:58,694 --> 00:40:06,474
Recently, a colleague of mine said that they read that Germans don't embellish.
00:40:07,410 --> 00:40:11,569
So in Germany, you don't meet someone and say, hey. How are you doing?
00:40:11,569 --> 00:40:14,245
And then the other person, great. How are you doing? Right?
00:40:14,405 --> 00:40:17,605
It's more like, hi, and you come to the point. Right? Yeah.
00:40:17,605 --> 00:40:22,105
Like, no, nothing around it, no beating around the bush. Bam.
00:40:23,890 --> 00:40:33,715
And if, like, you say hi and then, like, make a statement, and you do that in the US, it's perceived very, very differently. Right?
00:40:34,335 --> 00:40:36,255
Also, of course, depending on the other person.
00:40:36,255 --> 00:40:39,455
If the other person in the US comes from Europe, it's again different.
00:40:39,455 --> 00:40:48,320
But, so, like, having some sense of the other person in the room is, like, super, super important.
00:40:49,275 --> 00:40:55,435
And it's also something that I probably will never stop learning how to do.
00:40:55,435 --> 00:41:05,434
Speaker 1: And the last question, but not least: I'm wondering, any books, resources, I don't know,
00:41:05,434 --> 00:41:12,095
conferences that were particularly influential on you, on your journey as a leader?
00:41:12,790 --> 00:41:19,589
Speaker 2: Yeah. I read, like, all the books that I found around agile development in the early
00:41:19,589 --> 00:41:23,125
days of, well, the early days.
00:41:23,125 --> 00:41:26,164
It's the early days in which I learned about it.
00:41:26,164 --> 00:41:32,870
So that was already far after, you know. Like, the classic there, what
00:41:32,870 --> 00:41:36,730
I really, really loved, is Coaching Agile Teams.
00:41:39,395 --> 00:41:45,575
It's one of the best books, I think, for, well, actually, it's for agile coaches.
00:41:46,460 --> 00:41:49,680
But I think it's just as applicable also to managers.
00:41:51,980 --> 00:41:55,895
Something that is very different from that.
00:41:55,895 --> 00:42:02,154
But what I also really enjoyed is especially everything around, like, The Lean Startup by Eric Ries.
00:42:04,500 --> 00:42:06,599
It really struck a chord with me.
00:42:06,819 --> 00:42:14,315
And, actually, the book that is, in my opinion, like, moving the needle even further in that
00:42:14,315 --> 00:42:17,855
direction is The Principles of Product Development Flow.
00:42:18,049 --> 00:42:18,549
Speaker 1: Yeah.
00:42:19,089 --> 00:42:27,914
Speaker 2: By Donald Reinertsen. The Principles of Product Development Flow is more about assessing, like, product development situations. Right?
00:42:27,914 --> 00:42:35,055
Making proper decisions, trying to establish proper procedures, processes, practices in particular.
00:42:36,599 --> 00:42:48,954
Whereas Coaching Agile Teams is much more about how you behave as a coach, as a manager, as a scrum master.
00:42:50,460 --> 00:42:53,760
And, actually, there is another book.
00:42:54,940 --> 00:42:59,440
Actually, my favorite book is The Culture Code by Daniel Coyle.
00:43:00,525 --> 00:43:11,630
And that is actually about teams and how they interact, and how high-performing teams are composed.
00:43:12,970 --> 00:43:20,895
And I always had, like, an argument with my previous boss about, like, command and control.
00:43:21,115 --> 00:43:27,550
I totally hate command and control, as you can maybe imagine now after our conversation. And he said, yeah, yeah.
00:43:27,550 --> 00:43:33,730
But there are situations where you totally need command and control. Like, with, like, firefighters. Right?
00:43:34,035 --> 00:43:35,315
You can't say, like, hey.
00:43:35,315 --> 00:43:36,595
There is a fire over there.
00:43:36,595 --> 00:43:38,835
And then everybody starts thinking about, yeah.
00:43:38,835 --> 00:43:42,615
But what do we do? Right? And I totally disagreed.
00:43:42,755 --> 00:43:49,470
And it took me really long to actually be able to assess the situation and to form, like,
00:43:49,470 --> 00:43:52,695
to phrase my opinion about it.
00:43:53,655 --> 00:43:56,475
Because I also saw that, yeah, he has a point. Right?
00:43:57,415 --> 00:44:03,640
It's not a good idea to just stand around and then scratch your head and then wondering, what should we do? Right?
00:44:04,040 --> 00:44:07,980
But then I also read the book, The Culture Code.
00:44:08,280 --> 00:44:13,974
And the book goes through several team compositions.
00:44:15,315 --> 00:44:20,660
It starts with, like, kindergarteners, like, teachers in the kindergarten, how they interact.
00:44:21,280 --> 00:44:28,385
Then it went through, like, pilots that interact in the cockpit of a crashing airplane, and then,
00:44:28,385 --> 00:44:30,305
like, a group of peaks. Alright.
00:44:30,305 --> 00:44:39,599
So very diverse, very, like, extremely different areas of, like, teams in particular.
00:44:39,819 --> 00:44:42,705
Also, I think sports teams are also part of it.
00:44:42,785 --> 00:44:50,885
And then one of the team formations, and I was super happy about that, was actually, like, a group of soldiers.
00:44:51,430 --> 00:44:54,730
I think it was Navy SEALs or Marines. I'm honestly not sure.
00:44:55,589 --> 00:45:04,545
But there is probably no higher definition of, like, a command and control team than,
00:45:04,545 --> 00:45:06,145
like, a team of soldiers. Right?
00:45:06,145 --> 00:45:10,220
Like, this is my stereotype of, like, this is command and control.
00:45:10,760 --> 00:45:20,115
And then the author described, like, how that team actually practiced and how they trained,
00:45:20,115 --> 00:45:26,460
and how they pick, I think Simon Sinek is actually using their principles, of how do
00:45:26,460 --> 00:45:27,839
you pick a new team member?
00:45:28,460 --> 00:45:32,319
How much is it about the performance and the skills?
00:45:32,994 --> 00:45:35,555
And how much is it about the trust? Mhmm.
00:45:35,555 --> 00:45:41,395
And, like, they also have these retro sessions where they optimize, and they seem
00:45:41,395 --> 00:45:43,540
to be really, really rough, those retro sessions.
00:45:43,540 --> 00:45:45,000
So they're really, really honest.
00:45:45,060 --> 00:45:47,320
I actually wouldn't wanna be there when it happens.
00:45:48,420 --> 00:45:51,800
But, like, there was nothing command and control about it.
00:45:52,705 --> 00:45:56,645
It was really, really, like, satisfying for me. Of course,
00:45:56,785 --> 00:46:00,780
I immediately bought the book, and I gifted it to my boss.
00:46:00,860 --> 00:46:02,460
Like, hey, read this, read this.
00:46:02,460 --> 00:46:11,385
I don't know if he read it in the meantime, but they did it all through, like, training,
00:46:11,925 --> 00:46:15,125
direct feedback, and it didn't matter who gave the feedback. Right?
00:46:15,125 --> 00:46:20,060
If you're, like, a soldier and then you do a practice session, right. They actually
00:46:20,060 --> 00:46:22,380
were, I think they were Navy SEALs.
00:46:22,380 --> 00:46:24,620
They were the team that actually captured
00:46:24,780 --> 00:46:25,280
Speaker 1: Bin Laden.
00:46:25,420 --> 00:46:32,645
Speaker 2: Oh, well. And they practiced that situation over and over, like, with all kinds of constellations.
00:46:32,945 --> 00:46:36,640
And, like, after each practice, they came together and they discussed it again.
00:46:36,960 --> 00:46:41,599
And, like, everybody had the same voice in there. So yeah.
00:46:41,599 --> 00:46:45,700
That is my example of, well, what I think works.
00:46:46,145 --> 00:46:50,705
And, so far, I am working really well with it.
00:46:51,025 --> 00:46:56,860
So I think the teams, like, if you treat them as adults, they also behave like adults.
00:46:56,860 --> 00:47:00,720
That is one of my guiding principles as well. Yeah.
00:47:01,234 --> 00:47:07,895
Speaker 1: Awesome. Thank you. I think the last sentence is like a good full stop at the end of our interview.
00:47:07,954 --> 00:47:13,059
So thanks, Jochen, for the thoughts, lessons learned, and
00:47:13,220 --> 00:47:16,200
Speaker 2: Thanks. Thanks for having me. It was my pleasure.
00:47:16,795 --> 00:47:19,455
Better Tech Leadership powered by Brainhub.
00:47:20,715 --> 00:47:25,935
Follow Leszek Knoll on LinkedIn and subscribe to the Better Tech Leadership newsletter.