Lex Fridman had Sundar Pichai on his podcast and discussed AI and software engineering. Is it all doom and gloom coming from the Big Bossman of Google? Or is there hope for software engineers?
Let's see!
📄 Auto-Generated Transcript
Transcript is auto-generated and may contain errors.
What's up, folks? I'm going to head over to YouTube to talk about an interview between Lex Fridman and Sundar Pichai, the CEO of Google. I wanted to talk about this because a while back I did a sort of review or perspective video on Code Commute about an interview with Mark Zuckerberg. People were losing their minds over that interview. At the time I did the first video, I hadn't even seen the Mark Zuckerberg clip, but I gave my perspective on what people were saying. Then I went back, watched it, and did a follow-up video, because my takeaway from Mark Zuckerberg was not that developer jobs are going to be eradicated or anything like that. He was purely talking about the volume of code being turned out by AI.
So what I thought was really interesting in this interview between Lex and Sundar was that he comes right out and says what I believe is the reality. Confirmation bias for me, sure. But it was really cool to hear a tech leader saying, no, I think there are going to be more jobs. From his perspective, he's like, "We're hiring more engineers." So I just wanted to talk through some of these concepts. I'll probably do a more polished video on my main YouTube channel, where I'll share the clips and go through them. But this is Code Commute: I don't edit anything, and I just do a stream of consciousness.
So again, the framing here: if you're a software developer, you're probably not living under a rock, because you're watching YouTube and you've been hearing everyone talk about how AI is going to be the end of programmers. You see that the job market is bad. You have companies trying to get their employees to use AI more and more. You have some companies that are like, "Oh, well, we don't need developers. We'll just use AI." And I think there are a lot more examples showing up now of companies going back and going, "Oops, we shouldn't have done that." So I think there's still just a lot of noise, a lot of hype around this stuff, and the end result is that it has a lot of developers very scared.
What doesn't help is when we have media and executives talking about these concepts. The media is basically trying to make people afraid, because it's not a secret that when people are afraid, they click things. You get pulled into viewing and reading when your emotions are evoked. If you turn on the news, doesn't matter what channel, it's extremely rare that you're watching good things happening, right? It's the same with article titles and YouTube titles: it's always something designed to make you click. So you have the media driving this.
And then you have executives of companies who are very motivated to tell the news, and consumers in general, that AI is the thing, because they have a lot of money, time, and resources invested in AI tooling, AI products, and AI services. They want to tell the masses: this is the future, this is the only path forward. And the side effect of that is that the sales pitch becomes very exaggerated or misconstrued, right? Now, for some of the folks I've heard talking about AI and their services, especially at the big tech companies, I often don't get that doom-and-gloom impression. I work at Microsoft, right? I have never gotten the impression working at Microsoft that the message is, "Hey developers, your days are numbered, because once the AI is good enough, we don't need you." It's just never come up.
When I talked about the Mark Zuckerberg interview, that wasn't even the impression I got from Mark Zuckerberg. And in the Sundar Pichai interview I'm talking about here, he's literally saying it's going to enable developers to do more, and they're going to hire. So, with all of this coming together: I don't blame people for being afraid, because that's what's being put in front of them. But the more of this we see, the less afraid I am, or rather, the more confident I am in my own positioning on this. I haven't really been afraid, I guess, but there just seems to be more evidence mounting, in my opinion, right?
Of course, this is Code Commute. I welcome other opinions; leave them in the comments, right? But here's what I liked about how Sundar talked about this. Reflecting on it, when I hear other people talk about AI and developers, they seem to treat engineering work as if there's a ceiling, as if it's bounded, like there's only so much work. So if we introduce AI, by definition, it must be taking away developer jobs. I think the word I was looking for is zero-sum: it's not quite pessimistic, but it assumes there's a fixed amount of work to be done, so as you introduce something that can do the work, it must therefore replace the engineers.
What I really liked about how Sundar talked about this is that it's quite the opposite. And I've said this in other videos: I have never worked anywhere where there was a limited amount of work, where we just finished and the software or the service was done. There is always far too much work to do. Far too much. It's always a matter of trying to prioritize the most important thing. And personally, a majority of the time, I shouldn't say always, it feels like we're trading our highest priority thing for the next highest priority thing.
From the perspective of a startup that grew into a scale-up, and from being at Microsoft on different teams, it has always felt like we need to chase the P0, highest priority thing, and as we're chasing it, the next P0 thing is coming up, and there's all this other stuff we can't even look at. There is seemingly an unlimited amount of work to get done, and it's all about making sure we're getting the next highest priority thing done. When Sundar talked about this, he framed it in a way I agreed with: AI will allow developers to do more creative work. You'll have more people able to do more creative work. You'll have people who can use AI and agents to do the work that we don't want to do.
He described it like this: in software engineering, there's going to be coding, but there's also going to be stuff that feels like grunt work, where you go, "Sucks, man. It's part of the job." We have to go refactor things, or fix up tests; there's just stuff where you know it has to get done and you don't love it. For some people, maybe you do love those things, but there are going to be aspects of programming, never mind software engineering in general, where you go, I don't like doing that, but we've got to do it, because that's how the software gets built. You'd rather focus on the more interesting, creative things. He's saying in this interview, from his perspective: use AI for that stuff. Get it out of the way. That frees people up.
It frees those people up to do more creative work, you create more opportunity through that creative work, and then you bring in more people to do even more creative work. So he sees this as an unbounded kind of thing, if that's the right wording. If we can keep multiplying the productivity of people, then get more people and multiply their productivity even more, you get more done. That's versus the opposite framing, where there's a fixed bound on how much work there is, therefore you introduce AI, therefore you reduce the number of developers, because there's only so much work getting chipped away at. So I really like his framing. He literally says this in the interview, and I was trying to see if I could play it and hold the mic up to it.
Instead, I'm just going to make a Dev Leader review video on this. So if you haven't checked out my main channel, it's just Dev Leader on YouTube. Those videos are edited; they're not me just talking in this chair holding a tiny mic. I use the good mic for the main channel. Check that out when it's ready. The other thing I wanted to touch on, and I'm specifically looking at Lex Clips on YouTube so there are clipped segments of the two-plus-hour interview, is that near the beginning of this clip, Sundar talks about how a lot of companies cite metrics like how much code is written by AI, and he says that's not really the right metric.
And I agree. I feel like it's a misleading metric, and it leaves a lot open to interpretation. When I've talked about this type of metric before, the reason I feel it's misleading is this: even before AI, when we talk about how much code is written, or how much code there is in a codebase, there are plenty of things to consider. I'm a C# developer; I'm an engineering manager, but I develop in C# on the side, and that's what I've been using for years and years. Even in C#, I can think of plenty of examples. Going as far back as WinForms, if you're making desktop applications with buttons and text boxes and stuff, you've got WinForms, and you've got WPF, the more modern version of WinForms. Even in those two things alone, and remember, we're not talking about AI at this point.
The amount of code generated by a computer is very high, because you use a visual designer to lay things out how you'd like, and it generates the code for you. So if you were to run an analysis on a codebase and ask how much code was not written by a human, depending on the codebase, there might be a disproportionate amount written by a machine. Have people been freaking out about that? Okay, now fast forward, and I'm just talking about .NET land; there are plenty of other examples of this kind of stuff. We have source generators in .NET. The idea is that there are different patterns you might have in your software development.
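To make the designer point concrete, here's a hedged sketch of the kind of machine-generated code a WinForms visual designer emits into a `*.Designer.cs` file. The form and control names here are illustrative, not from the video; the shape of the `InitializeComponent` method is the typical pattern.

```csharp
// Sketch of designer-generated WinForms code. The visual designer writes
// all of this when you drag controls onto a form; no human types it out.
partial class MyForm
{
    private System.Windows.Forms.TextBox nameTextBox;
    private System.Windows.Forms.Button submitButton;

    private void InitializeComponent()
    {
        this.nameTextBox = new System.Windows.Forms.TextBox();
        this.submitButton = new System.Windows.Forms.Button();

        this.nameTextBox.Location = new System.Drawing.Point(12, 12);
        this.nameTextBox.Width = 200;

        this.submitButton.Location = new System.Drawing.Point(12, 44);
        this.submitButton.Text = "Submit";

        this.Controls.Add(this.nameTextBox);
        this.Controls.Add(this.submitButton);
    }
}
```

A "lines of code not written by a human" analysis would count every line of that method as machine-generated, which is exactly the point: that's been normal for decades.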
Sometimes, in order to scale a pattern, we'd make things more generic, or we could put metadata on the types and dynamically, at runtime, infer things; that's called reflection. But there's a lot we can do now in .NET with what are called source generators. I don't even know when these were introduced; it's already years back. Basically, you can sort of template some code, and the source generator will go blast out all of the code-behind for you. A quick example that comes to mind is a library I really like using called StronglyTypedId. What you're able to do with it is create an identifier type. So say you have an e-commerce product database and you want a product ID.
Instead of having that be a string or a GUID or an integer, you can have a ProductId type and annotate it as a strongly-typed ID. That annotation ends up invoking a source generator. You write literally two lines of code: one is the name of the type, so you'd say this is called ProductId, and one line above it is the annotation that says this is a strongly-typed ID. Then the source generator runs and makes all of the source for you; it uses a template to extrapolate all the code that needs to be generated. The end result is that you don't have to dynamically infer type information at runtime and do reflection sorcery; the code is generated up front.
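Here's a sketch of that two-line pattern, assuming the current shape of the StronglyTypedId NuGet package (the attribute and namespace names below are from that library as I understand it; check its docs for your version):

```csharp
// The StronglyTypedId source generator picks up this attribute at compile time.
using StronglyTypedIds;

// Line 1: the annotation that triggers the generator.
[StronglyTypedId]
// Line 2: the type itself; "ProductId" is the e-commerce example from the video.
public partial struct ProductId { }

// The generator then emits the rest of the partial struct as generated source:
// a backing value, equality, ToString, parsing helpers, and so on. None of
// that boilerplate is hand-written, and none of it needs runtime reflection.
```

Because ProductId is now its own type, the compiler also stops you from accidentally passing, say, an order ID where a product ID is expected, which a bare string or GUID wouldn't catch.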
So the more of that we include, the more code is generated by machines and not people, right? You have this metric that, I don't want to say is used to trick people, but it's just misleading, I feel. When we say how much is generated by AI, that doesn't mean there's no human involvement anywhere. It doesn't mean a human wasn't looking at it, or that no humans were needed in the entire process, like it's just a bunch of AIs talking to each other and building out a product. So it's almost like, who cares how much is actually written by AI, right? Let me give you another example, completely outside of code. You go write an essay, okay?
You write a 50-page essay, and then you put it into AI and tell the LLM, hey, go format this, or go clean up the sentence structure, whatever. Does that mean all 50 pages you wrote are now zero pages, zero sentences, because AI had to touch something along the way, so it's no longer your writing? These are just misleading metrics. Really, who cares? What Sundar talks about in this interview, which I think is the more interesting thing, is how much productivity gain we're getting. That's traditionally a very tricky thing to measure; I think he actually calls it engineering velocity, and then productivity. Depending on who you ask and how these things are measured, you'll get wildly different answers. But he estimates that for Google it's roughly 10% currently, at the time of recording, which is the middle of June 2025: a 10% increase in productivity from AI. Now, 10% may not sound like a lot. We just passed single digits; we're into the double digits. Cool. But Google is a huge company. Getting a 10% boost there is actually pretty freaking awesome. So he's focused on measuring productivity, and the lines of code written by AI are kind of like, whatever. I just wanted to call that out because I agree with it. We see this come up a lot; I'm pretty sure even Satya at Microsoft has made claims about this, and when I see it, I go, interesting, but what is it?
What does that actually tell us? I'll give you a personal anecdote, and I'm going to do a full video on this on Dev Leader as well. I've been using GitHub Copilot in GitHub to do pull requests, and I've already made a couple of videos chatting through this. It's now been three weeks of me using it. When I look at my commit history for BrandGhost, which is a product and service I'm building on the side for posting to social media, the number of commits that have landed is disproportionately more Copilot than me over the past three weeks.
So, I don't have it pulled up in front of me, but just to make up something that's probably pretty close: suppose I told you that 80% of the commits landing on main are Copilot and 20% are from me. Does that mean that Nick's not really important? No. Because what you don't get to see from that is how much back and forth I'm doing on some of the reviews, or how much wasted code it generated that I just had to scrap. You don't see from that metric that I was pulling down the pull request, fixing up a bunch of stuff, or going, "Hmm, we need to change direction," recoding parts of it, pushing it back up, and having it continue. You don't see that.
You just hear the number, 80 to 20, and you're like, "Oh, that's crazy; we should be afraid, we're never going to work again." No. If I were to let GitHub Copilot in pull-request mode do whatever it wanted and I wasn't reviewing it, the result would be totally busted. Totally. Now, that doesn't mean it's been bad. It's actually been amazing, and I'm extremely happy with it. I was telling the guys I work with on BrandGhost: from a development perspective, this is the best dollar-per-month I've ever spent on development, ever. And that's even with it making stupid mistakes, or me having to hold its hand sometimes. It has been awesome. But my point is that the metric can be very misleading. You have to think about what the metric actually means.
Don't take it at face value just because the numbers sound dramatic; ask what it actually means. And I encourage people, especially the more junior folks: I know there's so much of this stuff put in front of you that seems scary as hell, because we haven't been in a time like this before, at least not to this degree, and people are freaking out. You have the media doing it, you have tech execs, and then you have these different metrics put in front of you. I totally get why people would be losing their minds over it. So think a little bit more about what those things mean. Try to interpret them. Try to understand what they actually mean in practice, because at face value, I don't think the scary story holds up.
Anyway, if you've watched this, I'd definitely recommend at least watching the clip of Sundar Pichai and Lex Fridman on Lex Clips. It's a really awesome clip, about six minutes long. I'm looking at the time, and this video is already over 20 minutes; if you can make it through mine, you can make it through that one. I'll also do a follow-up video on Dev Leader that'll be edited down, so watch that one, too; it'll give a little more refined perspective. I like doing the Code Commute ones first so I can think through it, and then the Dev Leader one comes out even better. Thanks for watching. Don't freak out too much. Make sure you're trying AI tools out. Work on your prompts. See you in the next one.
Frequently Asked Questions
These Q&A summaries are AI-generated from the video transcript and may not reflect my exact wording. Watch the video for the full context.
- What is Sundar Pichai's perspective on AI's impact on software engineering jobs?
- Sundar Pichai believes that AI will not reduce the number of software engineering jobs but instead enable developers to do more creative work. He mentions that his company is hiring more engineers and that AI helps remove grunt work, freeing people to focus on higher priority and more interesting tasks.
- Why does the amount of code generated by AI not accurately reflect developer productivity?
- The amount of code generated by AI is a misleading metric because it doesn't account for human involvement in reviewing and refining the code. For example, even before AI, a lot of code was generated by tools like visual designers or source generators. Productivity gains are better measured by engineering velocity rather than just lines of code written by AI.
- How has using GitHub Copilot affected the author's software development process?
- The author has been using GitHub Copilot for three weeks and noticed that a large percentage of commits come from Copilot, but this doesn't mean the author's role is diminished. They still spend time reviewing, fixing, and directing the code generated by Copilot. Overall, the author finds Copilot extremely valuable and considers it the best development investment they've made, despite its occasional mistakes.