The .NET Developer Nobody Wants To Be | Stuck With Legacy



From the ExperiencedDevs subreddit, this developer wanted perspectives on how to deal with a troubling PM. They're stuck maintaining legacy code, AI is a hyper-focus, and their PM doesn't support them. What's the plan?

📄 Auto-Generated Transcript

Transcript is auto-generated and may contain errors.

Hey folks, we're heading to the ExperiencedDevs subreddit for a topic. This one is pretty juicy in terms of the different areas it covers, and I think that's cool because it feels like a really relatable situation. A developer wrote a post saying they're the sole maintainer of a legacy application. It happens to be .NET-based, which is convenient because, as a .NET developer myself, I can understand it a little more clearly. There are a couple of different angles here: like I said, they're the sole maintainer, and they have to work with a product manager who doesn't seem to get a lot of the issues and challenges around this entire legacy application they're working on.

To layer onto that, there's a communication gap around AI tool usage. The developer says the product manager is insisting on AI usage, while the developer is already using AI tools to help where they actually help. I don't fully understand this working dynamic, but I can see why it would feel shitty. The developer then says the PM tells higher-ups that things aren't going as fast as they should because this guy isn't using AI and should be. For a lot of us, there's pressure around AI usage from every direction, and this guy is saying, "Man, I am using AI." And now this PM is complaining to his leadership that he's not.

And it's like, what the heck? So this guy is asking: what do we do in this situation? I actually can't remember his exact question; I was focused on capturing the scenario so we could talk it through. There are three angles I want to cover. The first is legacy systems in general. The second is AI tool usage in this scenario, because he says a couple of things that, honestly, feel like a bit of a cop-out. And the third is how to communicate with this PM and try to get on the same page.

Because I think that's a really important part. To start: yeah, legacy systems are a challenging thing to work on, especially if you're the sole maintainer. The tricky part for a lot of people who find themselves maintaining legacy systems is that it's supposedly just maintenance: it's legacy, we're not adding features, there are bug fixes we know about. Then you go to fix a bug and it introduces five more. And since it's a legacy system, odds are that, like most people who have worked in legacy systems, you'll be complaining about the amount of code coverage that exists. I don't strictly mean coverage in terms of lines of code; I mean your ability to feel like you can change things without the world breaking.

You don't have a good signal through tests, and that makes it hard to go fix things and feel safe doing it. Every time you touch the code, it feels like something else is going to break, either because you don't have sufficient test coverage or because there are lots of tests and they're flaky. And you're in this position, maybe alone, maybe on a small team, where you're fully responsible for it. The tribe has disappeared; good luck. Now, to compound things in this person's case, it sounds like they're not just maintaining with bug fixes; they're doing feature development as well. And that is a whole other level on top of bug fixes.

Yes, both involve code changes, but with bug fixes in a legacy system, you can sometimes keep patching the thing up because it's end-of-life in terms of feature development; you just keep it going as long as you can. Patch, patch, patch. When you're talking about feature development, though, my mindset shifts: if we're still adding stuff here, we're still trying to get people to use this. It's not like we're moving people off or there's a path to end-of-life. You're adding new stuff, so my assumption would be it's because people are paying for it, or you're trying to get more people to pay for it.

So the system itself might be legacy in terms of code, structure, and support, but it's not legacy from a product perspective. Okay? So when you're trying to add new features and functionality into this thing, if you take the patch-patch-patch approach I just described, you're going to perpetuate the problem: this thing sucks to work with. It's a terrible experience because there's no code coverage, everything's brittle, everything feels coupled together. (Come on, buddy, you can go. Lots of space. There you go, sir.) You need to take some opportunity to clean up and not perpetuate this mess. That's easier said than done, of course. But people complain, and I've heard this a whole bunch from different sources: "Oh, I'm just maintaining legacy systems."

It's not interesting, it's not real software engineering. Man, that's sometimes the most real software engineering, because you have some of the shittiest constraints to work within and you have to get creative. So in these scenarios, it's brittle, it's all coupled, I touch something and there's no test to catch the issue. Cool. You're recognizing what the challenges are, but if you don't take steps toward making them better, you will continue to have those challenges and they will get worse, because you're piling more onto this thing. It's going to get worse if you don't actively try to make it better. So what do we do to make things better?

Well, I think you need to work toward refactoring parts of the codebase. Again, easier said than done. How do you refactor things if there are no tests? And how do you test things if the code wasn't written to be testable? There are lots of strategies, and sometimes that means, in my opinion, writing tests that are less than ideal just to give you some confidence. It largely depends on how things are coupled, what kind of application or service you have, and the state of the codebase; lots of variables to consider. For example, I like writing code that can be unit tested, and by unit tested I mean tested almost in isolation from everything else. That doesn't mean I saturate my test suites only with tests like that.

But I like writing code that can be tested that way because it gives me the option to do it when I need or want to build confidence. For me, it's about having options for testing things. In a lot of legacy systems I've worked with, including ones I wrote that became legacy, the code was not designed that way. So you have one class of tests that's just not available to you: it wasn't designed to be testable in that regard because it's pulling in all these other dependencies, they're not mockable, and you can't separate your concerns. You're kind of screwed on that front. At the exact opposite end of the spectrum, you have huge integration tests. Depending on your app, that could be UI tests clicking through things and kicking off workflows that take a long time, and they're super brittle.

They're super brittle. But if you can get tests like that in place covering your number-one use cases, the ones that cannot break given how your users interact with the system, that gives you some confidence. Even if something else breaks, and surprise, it will, you now know your number-one use case is going to be okay. So when it comes to cleaning up codebases like this, I think the goal is to work toward test coverage that gives us confidence. That may mean writing tests you don't love. For example, I don't love tests that click through applications; unless you're doing targeted UI testing, I don't want my test harness clicking through screens to test some backend logic.

That feels like a very wrong thing for me to do. But I have absolutely done it over legacy code that we couldn't refactor, because it was literally going to break if we touched it. You couldn't write unit tests or other coded tests directly against it without the test setup being insane. So we said: if we're going to do insane test setup anyway, we can minimize it with these UI tests. They'll be brittle, but at least we have more confidence. The point is, start introducing mechanisms to get confidence; then, as you're able to refactor things, you can write better tests and maybe ditch the crappy ones. So we'll move on from that. The point is, there are ways to do this better.
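For what it's worth, here's a rough sketch of what a "crappy but confidence-building" test run can look like on a legacy .NET Framework project, all from the command line. The solution name, test DLL path, and the `Smoke` test category are hypothetical, and the exact tool locations depend on your Visual Studio install:

```shell
# Hypothetical sketch (run from a Visual Studio Developer Command Prompt).
# LegacyApp.sln, the test DLL path, and the "Smoke" category are assumptions.

# Build the legacy solution with MSBuild rather than `dotnet build`,
# since old-style project files often aren't supported by the newer CLI.
msbuild LegacyApp.sln /t:Build /p:Configuration=Release

# Run only the smoke-level "cannot break" end-to-end tests, filtered by
# category, so the brittle long-running suites don't gate every change.
vstest.console.exe Tests\LegacyApp.SmokeTests.dll /TestCaseFilter:"TestCategory=Smoke"
```

A green run here doesn't prove much beyond the top scenarios, but that's the point: it's a cheap signal you can run on every change while you refactor toward more isolated tests.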

(Why are we going below the speed limit? Let's figure this out. First time driving, apparently. Okay.) So, pivoting over to AI tool usage: this person says they're using Gemini. Great, they're using some AI tools to help them out. But then they say they have access to Cursor and can't use it, because Cursor can't work on legacy .NET projects, which is kind of weird. The challenge I have with statements like this is that maybe Cursor isn't the right tool, or the most effective tool, but if you have access to certain tools, don't rule one out just because, I'm assuming he means, it doesn't open legacy .NET projects. Do you have a toolchain? You can literally run this stuff on the command line to build and do whatever you need.

So I think it's a bit of a cop-out to say Cursor cannot be used here. Cursor might not be usable the way you would like it to be. But if you're limited to a few AI tools for different approaches to things, I think you can find ways to make them work. Build on the command line. When you start having tests, run your tests on the command line. Cursor can go through files on a file system, and if you can at least trigger your build from the command line, I don't understand what the problem is. I was using Copilot on the CLI last night and had to build legacy .NET projects, and it was using the Visual Studio build tools on the command line, not `dotnet build` or whatever: Visual Studio tooling building these legacy .NET projects with no UI involved.
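As a concrete illustration of the "just build it headlessly" idea, here's one way a CLI agent could locate and invoke MSBuild without the IDE. This is a sketch, not a prescription: I'm assuming a machine with Visual Studio or the Build Tools installed, and the project name is made up.

```shell
# Hypothetical sketch: locate MSBuild via vswhere (ships with the VS installer)
# and build a legacy project headlessly, so a CLI agent gets a pass/fail signal.
# Paths and LegacyApp.csproj are assumptions for illustration.
MSBUILD=$("/c/Program Files (x86)/Microsoft Visual Studio/Installer/vswhere.exe" \
  -latest -requires Microsoft.Component.MSBuild \
  -find "MSBuild\**\Bin\MSBuild.exe" | head -n 1)

# The exit code and console output are all an AI tool needs to iterate on fixes.
"$MSBUILD" LegacyApp.csproj /t:Rebuild /p:Configuration=Debug
```

Once a build (and later, a test run) works like this, pretty much any tool that can read files and run commands can participate, even if it doesn't "open" the project the way the IDE does.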

So you could do that in Cursor. It's not going to be ideal, obviously, but if you're saying it doesn't work, I'm saying it probably does, just maybe not the way you're interested in. Anyway, I've been encouraging people to try out more tools, and I'm trying to do that myself. I understand that in the workplace it's not always easy, because you don't have control over what's available. (Oh man, this guy's flying. But we've got to move over. Pick your battles, buddy.) In this person's case, you only have a couple of tools to work with, so I think you need to be curious about finding ways to make those tools effective for you. Some of us, especially on hobby projects, have the freedom to move between whatever AI tools we want; at work, I don't have that same freedom.

So the more familiar you get with tools, what they're capable of, and how to use them in different scenarios, the more options you have. I'll keep that one pretty quick. The last part I want to talk about is the communication gap. When you have two people saying different things, the truth is somewhere in the middle, right? This developer says, "I do use AI tools." The PM, and I don't know why they would do this, because it seems like they're not partnering very effectively, is telling leadership, "This guy's not using AI tools, and that's why things are slow." The truth is probably somewhere in the middle. This guy probably is using AI tools, because he literally says he is. Now, is he able to use them effectively?

I think that's open for debate. And I'm not saying that because I'm assuming this person is dumb or lazy. But they're already acknowledging, "I can't use Cursor for this," and I'm thinking, I do believe you can. So maybe they're using tools, and maybe they're not using them as effectively as they could be. By the way, I'm not making excuses for anyone who talks about a colleague like this PM does; that's talking behind someone's back, or setting them up for failure, and I don't promote that. But maybe the tool usage isn't effective. And maybe part of the truth here is that the PM is saying things are behind, behind their expectations of where they want them to be. That's probably a truth.

I'm not saying the reasoning is right, but that's probably a truth. So I'm looking at the scenario from the outside: two people saying different things, so there's got to be a truth somewhere in the middle. Rather than asking who's telling the truth, I'm more focused on identifying what we're trying to improve. It sounds like there's at least something to do with efficiency, and this dev is probably feeling that: this is legacy code, it's hard to work through; makes sense. And from the PM's side: "I'm not getting the features I want as fast as I want." So there's something around efficiency that needs to improve. But that's not the only focus, because the other focus is really around communication.

So what I would recommend to this person is that someone, and it sounds like it's not the PM, and maybe it hasn't been this developer up to this point, has to work on bridging this communication gap. If I knew someone was talking about me in a certain way, saying things I don't believe are true about me, I would need to confront them. Okay? That doesn't mean verbally assaulting them: "Hey, why are you talking about me like that?" But I would need to confront them about it so we get to a point of truth. So if this person is hearing, and I don't know how they're hearing about it, that this PM is saying, "Hey, you're not using AI tools.

That's the problem," I would talk to them about it. I would make some time and say, "Hey, look, I will show you how I'm using AI tools. If this is something you're saying I'm not doing, I will literally show you that I am." I don't necessarily think that solves the whole problem, but it needs to be cleared up, because it's an inconsistency in reality between these two people. So get that cleared up. Then these two people need to work together to figure out what they need from each other, because they're supposed to be working as a team, right? (Sorry, this person's tailgating me like crazy. Don't do that.) So get the conversation going, get to your truth, and get alignment on that part.

Then, if this person is saying, "Look, I'm using AI tools, and here's why they're not effective in these scenarios," I think we need to bring clarity to where the challenges are. Maybe AI tooling is one part of that; maybe another part is that the PM doesn't understand how brittle things are. Can you have conversations specifically about these things in ways they're going to understand? As the developer, you might totally get that this part is brittle because this class is coupled to that one, which is coupled to another, but the PM won't get that unless they're familiar with the code. You might not be able to talk to them specifically about code, but maybe you can take your system and talk about feature areas.

And you can talk about them at a higher level: "Hey, you know this feature area, and you know this other one." Maybe I don't have to get into the classes and methods, but if I draw some higher-level block diagrams, I can say, "Look, some of the logic for this lives in this area, but it's actually spread across these different features. If I touch one of them, we'll see problems across the others. There's nothing that tells me when that's going to happen, and I keep discovering new cases." There are ways to level up the discussion to talk about parts of systems without going so deep that you're talking about specific classes and methods, because for some audiences that won't help. But I do think you should try to explain where your challenges are while also staying open to understanding the PM's side.

I don't think people try to be malicious or difficult on purpose. I think they're motivated or incentivized to get something accomplished, and that's probably not coming through effectively. The side effect is that you perceive them badly: "This person sucks to work with." And that's hard, because if you're already feeling that way, odds are you want to spend as little time as possible with them. But if you can turn that around for yourself, try to understand that they have things they need to deliver on, right? They need to report on things. They have goals and expectations being put on them. What can you give them?

What do they need from you? If you try to understand that more effectively, you may learn that how they're asking you for things, or how they're approaching things, is clearly not effective for you, but you might also learn why they're doing it, the motivation behind it. Then you can propose something: "Hey, look, you're looking for this, or you want something done, or you have a timeline on that. The way this has been going isn't right, but maybe I can propose some other ideas to help." You can find other ways to get closer to what they need. I don't know if I have a better way to explain that in the moment, but I think trying to

be thinking about their needs can help a lot, at least to rationalize where they're coming from. So I wish this person all the best. Legacy systems are a lot of fun: lots of interesting constraints and lots of interesting challenges. Thank you for watching. If you have questions about software engineering, career advice, stuff like that, leave them below in the comments or go to codecommute.com, where you can submit things anonymously. See you later.

Frequently Asked Questions

These Q&A summaries are AI-generated from the video transcript and may not reflect my exact wording. Watch the video for the full context.

What are the main challenges of maintaining a legacy .NET application as a sole developer?
I find that maintaining a legacy .NET application alone is tough because the code is often brittle and tightly coupled, with little to no reliable test coverage. Every change risks breaking something else, and without good tests, it's hard to feel confident making fixes or adding features. It’s a constant struggle to patch issues while trying to avoid perpetuating the problems inherent in legacy code.
How can I effectively use AI tools when working on legacy .NET projects?
I use AI tools where I can, but some tools like Cursor might not work perfectly with legacy .NET projects due to build or compatibility issues. However, I believe it’s often a matter of adapting your workflow, such as running builds via command line or using different AI tools that fit your scenario better. Being curious and flexible with tool usage helps me get the most out of AI assistance, even if it’s not ideal.
How should I handle communication issues with a product manager who misunderstands my use of AI tools and legacy system challenges?
I think it’s important to confront the communication gap directly by having an open conversation with the product manager. I would show them how I’m using AI tools and explain the real challenges of working with brittle legacy code in terms they can understand. Understanding their goals and clarifying expectations can help us align better and work more effectively as a team, rather than letting misunderstandings create friction.