Don't push AI to 100%: keep your hands dirty

But getting from 85% to 100% without programming by hand gets exponentially harder. The effort required to prompt, tweak, and steer the model to write that last piece often outweighs the cost of just writing it yourself.
The actual percentages do not matter; the core principle does: it is far better to have 80% of the code generated in a way that requires zero rework than to have 99% generated in a way that requires constant fixing. That 80% must be code developers actually want to work in. The focus should not be on maximizing the AI's output volume, but on ensuring you never have to rework the output it does produce.
Over time, as models and tooling improve, this baseline will naturally grow towards 100%, and eventually, the starting point of our work might be a product that is already done. Theoretically, this means the programmer is no longer needed.
But as the saying goes: In theory, theory and practice are the same. In practice, however, they differ.
When an AI model messes up in a production environment, a human is still responsible. This means the developer needs to know exactly how to navigate, debug, and fix what was produced. And while AI will eventually become capable enough to fix its own bugs, we are living in the present.
Is the role of a developer changing?
To understand how we should deal with this right now, we need to take a small side-step and ask if the role of a developer is fundamentally changing.
Currently, yes. The world is asking us to transfer our knowledge, context, and experience to these models. In a very real economic sense, the reason we are "orchestrating" AI right now is to train our future replacements.
Some developers argue that modern development is simply shifting toward this "orchestrating" role. I disagree. I don't see orchestration as development. I am a programmer; I am happy writing code. Orchestrating is a completely different job.
You might ask: Isn't this just the next logical abstraction step? Like moving from punch cards to binary, from binary to assembly, and from assembly to compiled languages?
No. Those historical steps were about finding easier, more expressive ways to feed concrete instructions to a dumb machine. A compiler simply translates exactly what you tell it. Using a model to program is fundamentally different. It is roughly the same as asking a (very knowledgeable) junior colleague to program for you. It is delegation, not abstraction.
A programmer's job is to turn a vague, abstract idea into something extremely concrete. A model can do this as well, but managing a model is management, not programming.
The main thing to get from this side-step is that we must not confuse the skill of programming with the skill of orchestrating.
Why are we still working in text editors?
Let's take another brief side-step into our history. Why, after more than 50 years of software engineering, are programmers still typing freely in text editors?
Over the decades, we have seen many "obvious" improvements that promised to end plain-text programming. A sizable number of smart people worked on "structural editors": environments where it was literally impossible to create invalid syntax. Similarly, we saw the rise of UML (Unified Modeling Language) for specifying software visually, and a focus on type theory to completely describe the what of a program before it runs.
But a kind of natural selection took place in our industry. These strict environments and "obvious" improvements never entirely took over. It turned out that specifying the what is not always sufficient. In many cases, writing out the how in a procedural, step-by-step way (like a cooking recipe) is the most efficient way to make a vague idea concrete.
To write that how, developers need freedom. The history of our tools shows us that going from a vague idea to a concrete execution in its purest form requires the flexibility of plain text. Encoding ideas freely as text is programming.
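To make the what/how distinction concrete, here is a small hypothetical sketch (the function and its name are illustrative, not from the original text): a type signature only specifies the what, while the body spells out the how as an explicit, step-by-step recipe.

```python
# Hypothetical illustration of "what" vs "how".
# The signature (the "what") only promises: numbers in, a sorted list out.
def sort_numbers(xs: list[int]) -> list[int]:
    # The body (the "how") is a concrete, step-by-step recipe:
    # a simple insertion sort written out explicitly.
    result: list[int] = []
    for x in xs:
        i = 0
        # Walk forward until we find the first element not smaller than x.
        while i < len(result) and result[i] < x:
            i += 1
        result.insert(i, x)
    return result

print(sort_numbers([3, 1, 2]))  # → [1, 2, 3]
```

Many different bodies satisfy the same signature; choosing and writing one of them, freely, in text, is the act the article calls programming.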
I am quite convinced that AI will eventually master this text-based translation. But even when the machines can flawlessly turn our thoughts into code, we won't run out of work. Humanity will never run out of the vague, abstract ideas that need to be made real. I think the demand for this will always be greater than AI alone can supply.
The main takeaways for this side-step are:
- Using code to express certain ideas seems to be an evolutionary peak: the best way to do it
- The demand for this skill will always be greater than the supply, even with AI
The risk of losing the craft
We are approaching an interesting point in history where humans may no longer have any capabilities that are entirely unique to us. But let's tie this back to our two side-steps to see where that leaves us today.
In our first side-step, we noted that developers are currently transferring their knowledge and experience to models. While this helps move the technology forward, it naturally takes away time they would otherwise spend actually programming. If developers stop programming and only orchestrate, their skills will slowly fade away. Losing this capability is a real danger.
In our second side-step, we established that programming in its purest form, turning a vague idea into a concrete textual procedure, is a highly specific specialty. Eventually, AI will be able to fully replicate what programmers are doing here. However, this specialty will remain important.
Clearly, it is important in the short run: we need to retain the skill in our teams because we are still the safety net right now. But it will likely remain important in the long run, too, even after we reach the point where AI can program, release, debug, and maintain software securely and autonomously.
Why? Because for a specific group of people, the craft itself is the point.
Conclusion & advice
There are different kinds of developers. Some deeply love the craft of programming. Others love the act of creating, regardless of the medium. Recognizing this subtle difference will help you figure out who in your team needs to remain a programmer, and who will naturally transition into an orchestrator.
This extends beyond developers. Other "creator" types will likely make great orchestrators. But people whose passion lies purely in communicating with other human beings will probably dislike it. Much like the craft programmers, they prefer the real thing over managing a machine.
If you are navigating this transition right now, whether in software, design, writing, or any other discipline, here is my advice:
- Make the AI produce a foundation you actually want to work with. Don't try to force the model to deliver 100% of the final product if it requires endless prompting. Get it to a solid 80%, but ensure that 80% is clean, coherent, and requires no rework. Don't waste your time wrestling with the tool; the models will get better on their own over time.
- Keep your specialists practicing their craft. We need to retain human skill for the coming years. Ensure your team is still doing the manual work, solving hard problems, and understanding the underlying fundamentals. We are still the safety net.
- Don't panic about the sci-fi future. The idea of a future where humans have no edge over AI is only scary because we cannot imagine it. It will not happen overnight, and no one can accurately predict what it will look like. If it turns out to be a 'not-fun world', you will have plenty of time to be sad about it when it actually arrives. For now, focus on building good software for the ones who experience biological sensations.

