I’ve Rarely Seen a Bad Engineer. But I’ve Seen a Lot of Engineers Overruled.
Have you ever witnessed an excellent software engineer disengage?
No?
Look closer.
Consider the senior engineer who used to be more outspoken. They’d raise risks in meetings and even push back when necessary. But over time, they’d nod more. Agree faster. Simply ship whatever was in the plan, to the letter, no questions asked.
This engineer didn’t become more agreeable. After being repeatedly overruled, they checked out. The worst part? This disengaged engineer is a warning light: leadership and culture problems will later manifest as software product problems, which cascade into unhappy customers and even legal issues.
The disengaged senior engineer is often the one who anticipates scaling issues before traffic doubles, notices compliance gaps before legal does, and asks uncomfortable questions when enthusiasm is high. When this person goes quiet, you lose signals.
Read on to see how they get there and how to course-correct.
Before we go further: this is not about life-threatening engineering contexts. In situations where safety is at risk, engineers who believe important corners are being cut must speak up. Always. This is about the far more common scenarios inside most organizations.
The Quiet Shift
Go back to that moment - when the meetings got quieter and agreements came too easily. When what the data said, what the people closest to the work were saying, and what you were expected to deliver didn’t quite line up.
At first, they look more agreeable; that’s good, right?
Wrong!
As time goes by and problems mount, managers interpret this to mean the engineers are out of their league, that they’re not as good as they once were, or that they lack ownership. I’ve heard comments like this many times, usually without anyone asking what changed.
This “bad engineer” narrative has a pattern. When things ship, users are happy, and the company is making money hand over fist, managers don’t notice. When shit hits the fan, launches underperform, clients are unhappy, and usage drops, then people start asking for someone to blame.
The Blame Reflex
Some of us are quick to start the blame game. Others step back, look at the situation from a different angle, ask their five whys, and try to find the underlying cause(s) instead of rushing to pin the failure on someone. Being accountable is hard. Admitting mistakes requires vulnerability and humility. It means unlearning old habits and learning new ones. That’s work. Blame is easy.
Disengagement delays or removes the feedback loop. When experienced engineers disengage, the cost and risks don’t show up immediately. They show up later as rework, unmaintainable architecture, higher onboarding costs, outages, security gaps, missed edge cases, and growing tech debt that no one wants to own anymore.
Some leaders tend to personalize the failure when the pressure rises. Once we point to a person or a team as the issue, we have a clean story. We stop looking for underlying causes and often avoid examining upstream decisions. We miss asking important questions that’d help us do better next time, such as:
- Were incentives misaligned?
- Was authority sitting above expertise?
- Were key people excluded from decisions they were later held accountable for?
- Did anyone raise issues that we ignored?
Then we package the experience as “we learned from our failure.” We move on, only to repeat the same problem. Don’t get me wrong: turning failure into learning is very important. But only after we truly understand what caused the failure and take corrective action to put safeguards in place, ensuring it doesn’t recur in the same pattern. Otherwise, it’s just wordplay and an excuse for not taking any responsibility.
In many organizations, engineers are responsible for outcomes without meaningful influence over decisions. “What should be built” and even “how it should be built” are often decided in rooms far removed from the people doing the work and from the users. In other words, product people, managers, and execs are the thinkers, and engineers are the doers.
Once decisions are made about the product - its importance, the hopes and expectations attached to it - they are communicated to the engineers. Engineers are then expected to own the quality, reliability, delivery, and product success, all without owning the scope, tradeoffs, timelines, risk acceptance, and sometimes even the technology direction. I have been on projects where the technology was dictated by a CTO with little understanding of its limitations or how it would integrate with existing products; a purely political or partnership decision.
This mismatch is where the burnout begins. It’s not that experienced engineers are lazy or incompetent; it’s learned futility. Companies hire exceptional people through a rigorous process. They brag about their state-of-the-art recruiting, intelligent teams, amazing culture, and all the support and growth opportunities they offer. But once the person joins, the company narrows their decision space, asks them to stay in their lane, and expects them to deliver only what they were hired to do. And just like that, the process of killing creativity, curiosity, and innovation has begun.
When creativity, curiosity, and innovation are suppressed, mediocrity settles in quietly. Engineers still write code, attend meetings, and close tickets. But they stop proposing bold improvements, refactoring brittle areas, or experimenting with better approaches. Over time, the organization optimizes for compliance instead of excellence - and compliance rarely wins markets.
Humans adapt. We learn from the signals in our environment. When people “eventually” stop caring, it’s very likely because they have learned that caring no longer influences decisions. Their concerns are downplayed, their questions are ignored, and their silence and agreeableness are praised. Depending on the situation, these engineers may leave, or they may stick around for any number of reasons but not engage - “Not my problem… I’m putting my head down and doing what you asked me to do. The rest is your problem.”
In growing organizations, complexity increases faster than visibility. As systems become more complex, organizations rely even more on experienced engineers to surface invisible risks. When authority consistently overrides expertise, the organization’s ability to detect risks early begins to erode.
Authority Over Expertise
When the mistakes or missed deadlines don’t make the headlines, lessons are often ignored and patterns repeat. But let’s remind ourselves of some high-stakes industries where people close to the job raised concerns but were overruled, resulting in devastating outcomes. Do you remember:
- Space Shuttle Challenger: Engineers raised concerns that the O-rings had never been tested below 53 degrees Fahrenheit and recommended delaying the launch. Their concern was infamously met with “My God, Thiokol, when do you want me to launch - next April?”
- Space Shuttle Columbia: Engineers raised specific, technical, and increasingly urgent concerns about the foam strike throughout the mission. They warned of the catastrophic consequences of a breach in the shuttle’s heat shield. Robert Daugherty warned that if the landing gear tires were exposed to this heat, they could explode like “bombs” inside the wing. When engineers finally initiated a request for Department of Defense satellite imagery, the Mission Management Team chaired by Linda Ham effectively vetoed it, telling the DoD that NASA didn’t need the help.
- Boeing 737 MAX: Experts warned about a critical design vulnerability in MCAS (the Maneuvering Characteristics Augmentation System). Those warnings were sidelined by a systemic breakdown across Boeing’s leadership and the Federal Aviation Administration (FAA) - a corporate culture that prioritized cost-cutting and speed to market over safety. Two crashes killed 346 people.
- Titan submersible: Experts and former employees repeatedly raised concerns over Titan’s experimental design, and specifically its reliance on carbon fiber and lack of third-party certification. OceanGate’s CEO dismissed safety concerns as “baseless cries” that stifled innovation. Five people lost their lives.
In all the cases above, the investigations did not reveal a shortage of intelligence. They revealed systems that filtered, reframed, or suppressed risk signals. Do any of these ring a bell? “When do you want me to launch then…”, “We don’t need your help. Thanks, but no thanks”, “Skip testing”, “Bring the cost down”, “Ship faster”, “We’ll figure it out after launch”, “These concerns are baseless cries.”
Treat Silence as a System Signal
You might think these are extreme, life-threatening scenarios and that you would have handled them better. But would you? If we cannot handle these conflicts and concerns in much smaller, lower-stakes situations, what training do we have to recognize and handle them when the pressure is orders of magnitude higher?
These patterns are not unique to engineering, of course. They appear anywhere expertise sits below authority: in product development, operations, healthcare, finance, public institutions, and even politics, where decision-makers are the furthest from the impact of their decisions.
Keep in mind that it’s under pressure that raising concerns becomes “negativity.” When there’s no explosion and no big headline, these patterns become a way of working. Over time, strong people who value accountability leave or drift toward learned silence.
In a leadership role, when you see disengagement, resistance, or repeated delivery issues, treat them as system warnings. When your strongest team members go quiet, you are losing risk visibility. When multiple engineers disengage over time, that is not coincidence; it’s a cultural pattern compounding beneath the surface. Before going to individuals and asking them about their disengagement, ask yourself:
- What decisions were they excluded from?
- Where does authority sit relative to expertise?
- What incentives make the current behavior rational?
Answer these honestly. Examine where the flow of information broke down. Then, of course, catch up with the person to hear them out. The earlier you see the signs and treat these as system issues, the less painful the corrections will be.
Getting Back On Track
There’s a saying about a stonecutter striking a rock. He hits it once, and nothing happens. He hits it again, and again - ten times, twenty times - and it still looks unchanged. Then, on the next strike, the rock splits in two. The final blow didn’t create the fracture. It revealed the fracture that had been forming all along.
Disengagement works the same way. The silence you notice today isn’t caused by the last decision. It’s the accumulation of every time expertise was overruled before it.
Fixing the system is not easy; it will take time, patience, and communication. Start here:
Acknowledge the Problem
Before you try to fix anything, acknowledge the system that produced it. Take ownership and invite open discussions. Be prepared: people may not open up immediately, especially if the problem has been going on for a long time.
Without being defensive, encourage team members to speak up. Practice being silent and listening. It may feel awkward, but resist the temptation to break the silence - let your team do the talking.
Acknowledge what is being said and practice active listening. If something requires attention or a response, take notes and let the team know you will follow up. Then do the homework and come back with a response.
Restore Decision Clarity
Make it explicit who decides what, and why. Consider using the RACI model: it helps everyone on the team understand what they are responsible for, accountable for, consulted on, and informed about. Open discussions and constructive feedback should always be welcome. Clarity of roles and responsibilities won’t eliminate disagreements, but it can guide the discussions and the way people communicate.
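To make the R/A/C/I distinction concrete, here is a minimal, hypothetical sketch - the decisions, roles, and team names are invented for illustration, not taken from any real organization. It encodes a small RACI matrix and checks the two rules that make the model useful: every decision has exactly one accountable owner, and at least one responsible doer.

```python
# Hypothetical RACI matrix: all decisions and roles below are illustrative.
raci = {
    "Scope of the Q3 release": {
        "responsible": ["product manager"],
        "accountable": "head of product",          # exactly one owner
        "consulted": ["senior engineer", "designer"],
        "informed": ["support team"],
    },
    "Database migration strategy": {
        "responsible": ["senior engineer"],
        "accountable": "tech lead",
        "consulted": ["SRE", "product manager"],
        "informed": ["engineering org"],
    },
}

def check_raci(matrix):
    """Return a list of problems: each decision needs exactly one
    accountable owner (a single name, not a list) and at least one
    responsible doer."""
    problems = []
    for decision, roles in matrix.items():
        if not isinstance(roles.get("accountable"), str):
            problems.append(f"{decision}: needs exactly one accountable owner")
        if not roles.get("responsible"):
            problems.append(f"{decision}: needs at least one responsible doer")
    return problems

print(check_raci(raci))  # an empty list means the matrix is well-formed
```

The point of the check is the point of the model: if a decision has zero or two “accountable” names, the ambiguity will surface later as blame.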
Separate Authority from Ego
Even in the RACI model, being responsible or accountable for something doesn’t mean you must win every argument. You might be the person who makes the final decision, but you should still invite discussion.
When you are in a leadership role and you consistently defend your position and win, you teach the team that what matters is alignment with you (more on that in future posts). Instead, let the most technical person speak first. Then make an informed decision based on the information provided, and communicate it to the team.
You win with a team that stands behind the decision. They support it because they understand the reasoning, not because they were forced to agree.
Rebuild the Feedback Loop
Have you been in retrospectives that seem productive? Lots of kudos, discussions about what went well, what to continue, and what to improve. Suggestions get written down… only to fall through the cracks. There are no follow-ups. By the next retrospective, no one mentions the previous action items, or they are “still working on it” with no further information.
One of the reasons people disengage is delayed or missing responses. To prevent this, shorten the feedback loops. That doesn’t mean creating noise or oversharing. It means creating a cadence where the team, or individuals, can raise concerns, and where leadership follows up and communicates back clearly.
Document dissent when appropriate. Explain the trade-offs behind the decision. Always follow up and be transparent. Nothing kills engagement faster than raising concerns that disappear into silence.
Follow up on action items, even if progress is blocked. Clarify the reason and update the timeline. If necessary, pick a buddy to hold you accountable or help move the item forward.
To Wrap Things Up
Leadership is not about preventing argument or disagreement. A team that is always in agreement is not thinking critically or considering options. A strong leader has enough pattern recognition and skills to guide disagreements toward resolution, sometimes without strongly defending one side (more on this later), but instead encouraging team members to engage in constructive conversations.
Some of the signs of a struggling team are silence, fast agreement, taking sides, and eventually blame. When authority sits above expertise, disagreements stop being productive and start becoming political. The earlier we spot these signs in the system, the more effectively we can act to correct them.
If you’ve led long enough, you’ve seen these patterns. Think back:
- What broke?
- Where did authority sit?
Disengaged engineers are rarely created overnight. They are shaped by repeated signals about whose voice matters. That also means repair will not happen overnight, especially if disengagement has been building for a while. Trust is earned, not given, and rebuilding it takes time. Instead of asking whether you have “bad engineers”, ask yourself: What kind of system are you reinforcing every time expertise is overruled?