Gergely Orosz delivers a rare, data-rich autopsy of a year in which artificial intelligence didn't just change the code; it rewrote the social contract of the software industry. While most industry retrospectives offer vague platitudes about "innovation," Orosz provides a granular look at a bizarre paradox: a job market that is simultaneously starving for talent and rejecting applicants, all while remote work evaporates under the weight of suspicion. This is not a celebration of progress; it is a warning label on the new era of engineering.
The Great Disconnection
Orosz frames 2025 not as a year of breakthrough, but of friction. The most striking observation is the decoupling of productivity from employment stability. "It's been a bizarre 12 months in the job market, when job seekers struggled to hear back about their applications, and employers found it hard to hire solid engineers," Orosz writes. This isn't a shortage of people; it's a shortage of trust. The author details how the rise of AI agents and "vibe coding" has created a fog where genuine skill is obscured by automated performance. The industry is now so flooded with AI-generated code and fake resumes that the signal-to-noise ratio has collapsed.
The consequences are tangible and severe. Orosz points to a disturbing trend in which the very tools meant to accelerate development are making it harder to get hired. "Incidents like these have probably helped contribute to a decline in the number of remote jobs, and made it harder to get hired," he notes, referencing a specific postmortem in which a startup nearly hired a non-existent engineer. This is a critical insight: the fear of fraud is driving a return to physical offices, not because of collaboration needs, but because of verification anxiety. Critics might argue that pinning the decline of remote work on fraud ignores broader economic headwinds, yet the specific linkage between AI-fueled impersonation and return-to-office (RTO) mandates offers a compelling, if unsettling, causal chain.
"Remote algorithmic interviews provide little signal these days, now that invisible tools help candidates ace them without much preparation."
The Human Cost of the AI Arms Race
The commentary shifts from hiring mechanics to the cultural toll of the AI explosion. Orosz highlights a surprising inversion in seniority: while senior engineers wield AI tools more effectively, it is juniors who are becoming "more profitable" thanks to accelerated onboarding. This suggests a flattening of the learning curve that could erode the deep institutional knowledge that normally takes years to build. "AI seems to amplify coding knowledge for seniors, and could help juniors become seniors faster," the author writes, but that speed comes at a price. The piece also describes "AI startups which set extreme working hours cultures in their bid to outrun competition," creating a high-pressure environment where the human element is secondary to algorithmic output.
The decline of traditional knowledge repositories is another casualty. Orosz observes that large language models "seem to have accelerated the decline" of platforms like Stack Overflow, meaning new generations of developers may lack the foundational troubleshooting skills that defined the 2010s. This is a profound shift in how engineering knowledge is preserved and transmitted. If the collective memory of the industry is outsourced to black-box models, what happens when those models fail or hallucinate? The author doesn't fully answer this, but the implication is that the industry is becoming more fragile, not more robust.
The Administration of Code
Beyond the job market, Orosz examines the structural changes in how software is built. The rise of the Model Context Protocol (MCP) and tools like Cursor represents a fundamental shift in the developer's relationship with the machine. In a nod to deeper technical history, Orosz references the work of Chris Lattner, whose contributions to Swift and Mojo are reshaping the very languages used to build AI. Orosz frames the discussion as "a conversation about how better language and compiler design can open the door to faster, more accessible AI development," highlighting that the battle is no longer just about algorithms, but about the infrastructure that supports them.
However, the piece also exposes the fragility of the current ecosystem. The "10x overemployed engineer" story, where a talented individual tricked a dozen startups into paying for zero work, serves as a cautionary tale about the breakdown of accountability. "The music stopped for this blagger when one frustrated founder who'd hired and fired him went public," Orosz recounts. This incident underscores a systemic failure: the interview process is broken, the verification tools are flawed, and the culture of "move fast" has created loopholes that bad actors can exploit with ease.
"The AI coding tools explosion... the term 'vibe coding' went mainstream... and coding with agents slowly overtook using AI for 'just' autocomplete."
Bottom Line
Orosz's most valuable contribution is his refusal to treat AI as a magic bullet; instead, he presents it as a disruptor that has broken the traditional mechanisms of hiring, verification, and skill development. The strongest part of the argument is the link between AI-driven fraud and the erosion of remote work, a connection often missed in optimistic tech narratives. The biggest vulnerability, however, is the lack of a clear path forward for engineers who find themselves squeezed between automation and suspicion. As the industry moves into 2026, the question is not whether AI will change software engineering, but whether the profession can survive the trust crisis it has created.
The industry is at a tipping point where the tools designed to make us faster are making us more insecure. The next chapter won't be about writing better code; it will be about proving we are the ones writing it.