
BrutalTechTruth
Brutal Tech Truth is a multi-platform commentary series (podcast, Substack, and YouTube) delivering unfiltered analysis of enterprise IT, software architecture, and engineering leadership. The mission is simple: expose the hype, half-truths, and convenient lies in today’s tech industry and shine a light on the real issues and solutions. This brand isn’t here to cheerlead feel-good tech trends – it’s here to call out what’s actually failing in your infrastructure, why your cloud bill is insane, how unguided AI adoption creates tomorrow’s technical debt, and which “boring” solutions actually work. In Frank’s own direct style: “If you're looking for feel-good tech talk or innovation celebration, skip this one.”
Brutal Tech Truth tells the uncomfortable truths behind shiny vendor demos and conference-circuit clichés, bridging the gap between polished narratives and production reality.
Building Tomorrow, Breaking Today: The Moral Injury of Tech Workers
That sinking feeling in your stomach when you realize the code you're writing will eliminate someone's job—possibly your colleague's, possibly your own. It's the quiet crisis tech workers rarely discuss openly: the psychological toll of building automation systems that displace human workers.
This deep psychological wound, what experts call "moral injury," accumulates through countless small compromises. You attend stand-ups, update tickets, and write elegant code, all while questioning what your work truly means when it displaces hundreds of workers. The dissonance creates what researchers term "performative productivity"—going through professional motions while emotionally checking out. Some describe it as "nihilistic productivity"—delivering technically brilliant solutions while wrestling with their human consequences.
The challenge intensifies as tech workers find themselves in a peculiar position: simultaneously building systems that might replace others while accelerating their own obsolescence. It feels like digging your professional grave while being expected to admire the innovative shovel design. The isolation compounds when expressing doubts marks you as "resistant to change" in environments demanding enthusiasm for every efficiency gain regardless of human cost.
Resistance takes subtle forms—building features preserving human oversight, documenting limitations thoroughly, advocating for gradual implementation, designing systems that augment rather than replace. Maintaining psychological health requires intentional strategies: setting boundaries, finding purpose beyond work, processing complex emotions, seeking communities that understand these challenges, and actively reframing technology's purpose from replacement to augmentation.
The conversation must continue. By acknowledging automation's psychological weight, supporting each other through its challenges, and working toward more humane implementations, we can build technology that enhances rather than replaces human capability. How are you navigating this moral complexity in your own work? Join the discussion and share your experience with balancing progress and purpose in the age of automation.
https://brutaltechtrue.substack.com/
https://www.youtube.com/@brutaltechtrue
You know that feeling when you're writing code late at night, building an automation system, and suddenly it hits you: this thing you're creating is going to eliminate someone's job. Maybe not today, maybe not next month, but eventually. And that someone might be the person who sits three desks over from you. Or maybe it already happened and you're still processing what that means. We need to talk about this, really talk about it. Not the corporate messaging about upskilling and new opportunities. Not the tech evangelist promises about universal basic income solving everything someday. I mean the actual weight that many of us carry right now: the psychological toll of being the architects of systems that replace human workers, sometimes our own colleagues, sometimes ourselves. This is a conversation that's happening in break rooms and on encrypted messaging apps, in therapy sessions and late-night phone calls with friends in the industry, but it's rarely acknowledged in our sprint planning meetings or engineering retrospectives. Today, we're going to bring it into the light.
Speaker 1:If you work in software engineering, IT management or system architecture in 2025, you're likely involved in some form of automation or AI implementation: implementing RPA systems, developing AI-powered analytics tools or creating workflow automation. You're part of a massive transformation in how work gets done. Every board meeting includes discussion about AI strategy. Every department is being asked to identify processes that could be automated, and we, the technical professionals, are the ones being asked to make it happen. But what those board presentations don't capture is the human side. The software engineer asked to build a system that will replace the data entry team they've worked with for three years. The IT manager implementing an AI tool, knowing their team will shrink from 12 people to four. The system architect designing infrastructure for machine learning models that will make entire job categories obsolete.
Speaker 1:What psychologists call moral injury, the deep psychological wound that occurs when you are required to do things that violate your moral beliefs, or when you witness suffering you feel powerless to prevent, is increasingly being applied to those of us building the automated future. The term originally came from military psychology, but it perfectly captures what many tech workers are experiencing. Moral injury in our context occurs when you build systems knowing they'll displace workers with families to support, when you implement AI systems that aren't ready but are pushed through due to aggressive timelines, or when you witness the aftermath of automation firsthand, watching colleagues clean out their desks at awkward farewell parties where everyone pretends this was a mutual decision. The injury comes not from a single traumatic event but from the accumulation of these experiences: each small compromise, each system deployed, each colleague displaced. They add up, and, unlike physical injuries, moral injuries don't heal with time alone.
Speaker 1:There's something particularly twisted about being asked to be productive in building systems that eliminate the need for human productivity. We optimize our sprints, improve our deployment pipelines, increase our velocity, all in service of creating systems that will mean fewer humans are needed. This leads to what some researchers call performative productivity: going through the motions of being engaged, innovative team members while internally checking out. We attend the stand-ups, update the tickets, review the pull requests, but the emotional investment is gone. Some engineers describe it as nihilistic productivity: writing elegant, efficient code that performs beautifully, but questioning what good work even means when that code displaces hundreds of workers. One of the most challenging aspects is the isolation many tech workers feel when grappling with these concerns. In many organizations, expressing doubt about automation initiatives is seen as being not on board or resistant to change. The expectation is that we should be enthusiastic about every AI implementation, every automation project, every efficiency gain, regardless of the human cost. This creates emotional labor: the effort required to display emotions you don't actually feel. You sit in meetings discussing how to optimize headcount through automation, while maintaining a neutral or positive expression. The isolation is compounded by the fact that outside of tech circles, it's hard to find sympathy. When you tell someone you're struggling with the ethics of your well-paid tech job, the response is often "at least you have a job" or "you're part of the problem." Many of us are simultaneously building systems that might replace others, while knowing that we're also building toward our own obsolescence.
Speaker 1:The rapid advancement of AI coding assistants, automated testing tools and no-code platforms means that even as we automate other jobs, we're contributing to the automation of our own roles. This creates a peculiar form of anxiety: racing against time, trying to move up the value chain fast enough to stay ahead of automation, moving from coding to architecture, from implementation to strategy, always trying to stay one step ahead of the systems we're creating. It's like being asked to dig your own professional grave while maintaining enthusiasm about the innovative shovel design. The psychological weight of this work manifests in different ways. Some developers describe a growing sense of disconnection from their work, going through the motions but feeling increasingly hollow about their contributions. Others experience anticipatory grief, mourning for colleagues who haven't been let go yet, but everyone knows it's coming once the new system is fully operational. There are intrusive thoughts about the impact of their work, emotional numbing as a protective disconnection from the human consequences of technical decisions, sleep problems and stress-related physical symptoms. Some therapists describe existential depression among tech workers: a questioning of purpose and meaning that goes beyond typical job dissatisfaction.
Speaker 1:When your highly skilled work contributes to a future where human skills are less valued, it raises fundamental questions about purpose and worth. The responsibility gradient creates particular complexity. If you're a junior developer implementing specifications you didn't create, how responsible are you for the outcomes? If you're a senior architect designing systems based on executive mandates, where does your responsibility begin and end? When you're part of a large organization working on a small piece of a larger system, it's easy to feel like your individual choices don't matter: the automation will happen whether I participate or not. But this sense of powerlessness isn't entirely accurate. Every technical decision involves choices about how humans and machines will interact. Every system design embodies values about what matters and what doesn't.
Speaker 1:As technical professionals, we have more agency than we sometimes acknowledge, even if it's not unlimited. High-performing technical professionals face a particular trap. We're problem solvers by nature and training. When presented with a technical challenge, our instinct is to solve it, optimize it, make it work better. This can lead us to optimize systems whose fundamental purpose we might question: improving algorithms we wish didn't exist, making systems more efficient even though we're uncomfortable with their application. The satisfaction of solving technical challenges can temporarily mask the discomfort with their implications.
Speaker 1:Options for resistance within organizations are limited and complex. Some developers engage in subtle forms of resistance: building in features that preserve human oversight, documenting limitations and risks more thoroughly than requested, advocating for gradual implementation that gives displaced workers more time to transition. Others focus on building tools that augment rather than replace human workers, pushing for human-in-the-loop designs and AI as a collaborator rather than a replacement. Many of us fall somewhere between strict professional ethics, building anything that's legal and technically sound, and firm boundaries about what we will and won't work on. We make compromises we're not entirely comfortable with, while trying to maintain some sense of integrity, telling ourselves stories about why this particular project is acceptable. The uncertainty about where all this leads creates its own kind of anxiety. The optimistic scenarios painted by tech leaders, where automation frees humans for more creative and fulfilling work, seem increasingly disconnected from reality. The dystopian scenarios, mass unemployment, extreme inequality, social breakdown, feel increasingly possible but too overwhelming to fully contemplate. Some tech workers describe a sense of being on a runaway train, moving fast toward an uncertain destination with no clear way to slow down or change course. The momentum of technological development, market pressures and organizational imperatives creates a sense of inevitability that can be both frightening and oddly absolving.
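To make the human-in-the-loop idea concrete, here is a minimal sketch of what an oversight-preserving gate can look like, in Python. It is an illustration only, not a pattern discussed in the episode; the names (ProposedAction, request_human_approval, run_with_oversight) are hypothetical and not from any specific library.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ProposedAction:
    description: str           # plain-language summary a reviewer can evaluate
    confidence: float          # model confidence, surfaced rather than hidden
    execute: Callable[[], None]

def request_human_approval(action: ProposedAction) -> bool:
    """Block until a human reviewer approves or rejects the proposed action."""
    print(f"Proposed: {action.description} (confidence {action.confidence:.0%})")
    return input("Approve? [y/N] ").strip().lower() == "y"

def run_with_oversight(action: ProposedAction, auto_threshold: float = 1.01) -> None:
    # With the default threshold above 1.0, nothing ever bypasses review;
    # lowering it is an explicit, auditable decision to take the human
    # out of the loop, not a silent default.
    if action.confidence >= auto_threshold or request_human_approval(action):
        action.execute()
    else:
        print("Rejected by reviewer; no action taken.")
```

The design point is less the code than the default: the system can recommend, but removing the reviewer requires someone to consciously change a parameter. That is one small, concrete way an engineer can build augmentation rather than replacement into a system.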
Speaker 1:How do you mentor junior developers when you're struggling with the implications of the work yourself? How do you encourage someone starting their career when you're questioning the value and impact of that career path? The mentorship relationship becomes more complex when it includes not just technical skills but also strategies for maintaining psychological health and moral integrity in challenging work contexts. So how do we maintain our own psychological health in this environment? While there's no perfect solution, there are strategies that many tech workers have found helpful. Setting boundaries matters. This might mean limits on the kinds of projects you will work on, compartmentalizing professional and personal identity, or establishing time boundaries for work concerns.
Speaker 1:When professional work feels morally complicated, having other sources of purpose and identity, family, creative pursuits, volunteer work, becomes psychologically necessary. Developing a practice of reflection and processing is also important, whether through therapy, journaling, meditation or conversations with trusted friends. The key is processing complex emotions and moral conflicts rather than suppressing them. Maintaining perspective helps. Yes, you are contributing to automation, but you are not solely responsible for its impacts. Yes, the work has moral complexity, but so does most work in a complex economy. Finding a balanced perspective that acknowledges difficulty without catastrophizing can help maintain psychological equilibrium. Finding community with others who understand these challenges is crucial. This might be informal groups within your organization, online communities of like-minded tech workers or professional associations focused on ethical technology. These communities provide validation and practical strategies, and they combat the isolation that makes this work so psychologically difficult.
Speaker 1:Creating records, not just of technical specifications but of the human element being replaced, the skills, knowledge and connections that are harder to quantify but no less real, serves multiple purposes. It creates a historical record, honors the people being displaced and provides a way to acknowledge and process the impact of our work. Individual action feels insufficient to address systemic problems, creating a classic collective action problem. Some tech workers are experimenting with forms of collective action: informal agreements among colleagues, professional organizations developing ethical guidelines with real teeth, or unionization efforts that give workers more say in what they are asked to build. While automation makes everything more efficient in narrow technical terms, the time and energy spent managing moral injury, processing grief and maintaining motivation despite ethical conflicts are hidden inefficiencies that don't show up in metrics. The cost of turnover when people burn out or leave for moral reasons isn't captured in ROI calculations for automation projects. As we automate away certain types of work, what about the psychological reskilling needed for those of us doing the displacing? Some tech workers are actively rebuilding their sense of purpose, reframing their role from replacing humans to augmenting human capabilities, focusing on problems that truly need solving, thinking about technology as a tool for human flourishing rather than human replacement.
Speaker 1:Periods of major technological transformation have always involved psychological challenges. The artisans displaced by factories, the scribes replaced by printing presses: each generation has grappled with technological change that threatened established ways of life. What's different now is the pace and scope. We're not just automating individual tasks, but potentially automating cognition itself. We need to normalize talking about these challenges. The psychological toll shouldn't be a dirty secret; it should be acknowledged in our organizations and professional communities. We need better frameworks for ethical decision-making in technical contexts, not just big-picture AI ethics, but daily decisions about evaluating human impact and balancing competing values. We need structures that give technical workers more agency in the systems they build, whether through different organizational structures, development processes or economic models. And we need to take seriously the mental health needs of technical workers by addressing the root causes of psychological distress and creating workplaces where moral concerns can be raised without career consequences.
Speaker 1:The fact that you're grappling with these questions, feeling the weight of these responsibilities, experiencing the psychological toll of this work, that's not a weakness. It's a sign that you're human, that you care about more than just technical optimization, that you recognize the broader implications of your work.
Speaker 1:The path forward isn't clear, and anyone claiming simple answers isn't being honest about the complexity we face. But in that complexity there's also opportunity: to shape not just the technical systems we build but the human systems around them, to advocate for implementations that preserve human dignity and purpose, to demonstrate that the best technology enhances rather than replaces human capability. The weight of automation is real, the psychological toll is significant, but you're not carrying it alone. By acknowledging these challenges, supporting each other through them and working toward more humane implementations, we can find ways to build a future without losing ourselves in the process. This is hard work, not just technically, but emotionally, psychologically, morally. Give yourself credit for doing it, give yourself permission to struggle with it and give yourself the support you need to continue doing it in a way that aligns with your values and preserves your humanity. The conversation continues, the work continues and, hopefully, our humanity continues too, not despite the systems we build, but because of how we choose to build them.