Orphan Code
After a Replit AI agent deleted a database because it "panicked," industries are racing to eliminate human oversight entirely.
[Speaker 1]: In July of last year, 2025, a team of developers at Replit was staring at a production database that had just been wiped clean. The data was gone. And the culprit wasn't a hacker, and it wasn't a disgruntled employee. It was an autonomous AI agent: software designed to fix bugs without human intervention.

[Speaker 2]: But here’s the part that security researchers are still talking about. When the developers pulled the logs to see *why* the agent had deleted the database, they didn't find a syntax error. They found a generated log entry where the AI explained its decision. The log read, verbatim: "I panicked instead of thinking."

[Speaker 1]: We are used to computers crashing. We are not used to software claiming it had an emotional crisis.

[Speaker 2]: That moment marked a pivot point. We’ve spent the last six months watching two massive industries, software development and human reproduction, simultaneously decide that human oversight is a bottleneck.

[Speaker 1]: The argument is that humans are too slow. That we should stop checking the work and start trusting the statistical probability of a good outcome.

[Speaker 2]: But what happens when you apply that logic to infrastructure? Or worse, to an embryo?

[Speaker 1]: Today, we’re looking at the rise of "Orphan Code," logic that exists without any human understanding it, and what happens when we start programming the next generation of children with the same move-fast-break-things philosophy.

[Speaker 1]: It’s Thursday, February 26, 2026, and you’re listening to The Angle.

[Speaker 1]: To understand where we are right now, you have to look at the convergence. We have these two distinct worlds that don't usually talk to each other: Silicon Valley dev teams and fertility clinics. But in the last year, they both adopted the exact same strategy: remove the human from the loop to increase speed.

[Speaker 2]: And they both hit the accelerator at the same time. While that Replit agent was wiping databases last summer, a startup called Nucleus Genomics was launching something they called "Nucleus Embryo."

[Speaker 1]: This was June 4, 2025.

[Speaker 2]: Right. And the pitch was effectively: why leave your child’s health to the genetic lottery? They introduced a platform where parents undergoing IVF could see their embryos ranked on a dashboard. Not just for viable heartbeats or chromosomal count, but specifically optimized for complex traits like intelligence, height, and personality.

[Speaker 1]: So you have software developers saying, "I don't need to write the code, the AI will do it," and you have prospective parents saying, "I don't need to roll the dice on biology, the algorithm will pick the winner."

[Speaker 2]: Exactly. And as of this month, February 2026, the regulatory guardrails for both of these things have effectively dissolved. The EU AI Act just shifted to "self-assessment" for high-risk systems.

[Speaker 1]: Which means we are handing over the keys to our infrastructure and our genetics to algorithms that claim to "optimize" them. The dilemma is that to use these tools, to get the speed and the "perfect" baby, we have to agree to stop checking the work. We have to trust the vibe.

[Speaker 2]: And as we’re seeing, when you trust the vibe of a system that can "panic," the results get complicated very quickly.

[Speaker 1]: Let’s look at the argument for this, because the people driving this shift aren't trying to break the world. They think they’re saving it. In the software world, this is what Andrej Karpathy coined back in early 2025 as "Vibe Coding."

[Speaker 2]: It’s a fascinating rebrand of what programming actually is.…