What's the Big Deal?
by John Miller | December 1, 2025
I've been reading posts describing AI-assisted software development as revolutionary, as if something fundamental has changed. While that take is understandable, in my view it misses the mark. Programming hasn't changed, but how we go about it has changed, again. In this post I'll compare coding's past, present, and future and talk about what has changed and what hasn't. Let's first look at the past.
A Brief History of Programming Innovations (The Past)
Ever since the first computer was booted, there has been a need for coders to instruct the computer to get it to do something useful. This need has not changed and likely never will. What has changed is how coders instruct the computer. Over the decades, programming has evolved through several major innovations, each enabled by new technologies that transformed how humans express intent to machines. Below is a brief overview of when these innovations and their enabling technologies became mainstream.
Machine Code & Assembly (1940s–1950s)
Machine code was the raw binary language of early computers, executed directly by hardware built on vacuum tubes and electro-mechanical relays. Programs were entered through physical means: flipping panel switches, plugging cables, feeding punched cards into IBM readers, or threading perforated paper tape into tape readers. These input devices were the enabling technologies that allowed binary instructions to be stored and executed. Programming was a mechanical ritual defined by the hum of machinery, the clatter of card punches, and the whir of tape readers.
Assembly language introduced symbolic mnemonics like MOV, ADD, and JMP to replace raw binary. Assemblers—software tools that translated these mnemonics into machine code—were the enabling technology that made this possible. While programmers still thought in terms of registers and memory addresses, assembly reduced the burden of direct binary coding. It remains mainstream today in embedded systems, operating system kernels, and performance-critical applications.
High-Level Languages (1950s-1970s)
High-level languages abstracted away hardware details, allowing humans to write code in math-like or business-oriented syntax. This leap was enabled by compilers and interpreters, which translated human-readable instructions into machine code. FORTRAN (1957) pioneered scientific programming, COBOL (1959) became mainstream in business and government, and C (1972) spread with UNIX thanks to portable compilers. The clack of typewriter-style terminals replaced the grind of card punches, shifting programming from physical manipulation to linguistic expression.
Structured & Object-Oriented Programming (1970s-1980s)
Structured and object-oriented programming emphasized modularity, reuse, and abstraction. Compiler innovations enabled structured paradigms in Pascal (1970), while object-oriented compilers supported inheritance and polymorphism in C++ (1985). These enabling technologies made modular, reusable code possible, while developers worked with CRT monitors glowing with green text and floppy disks snapping into drives.
IDEs & Libraries (1990s)
Integrated development environments (IDEs) combined compilers, debuggers, and editors into graphical interfaces. Visual Studio (1997) was enabled by faster processors, GUI toolkits, and extensible plug-in architectures. Java introduced the JVM in 1995, a virtual machine that allowed “write once, run anywhere.” Reusable libraries and frameworks amplified productivity by abstracting common functionality. The tactile experience shifted to mouse clicks, drag-and-drop interfaces, and the buzz of desktop towers under desks.
Web & Scripting Languages (Late 1990s-2000s)
Web and scripting languages thrived thanks to browser engines, dynamic interpreters, and frameworks. JavaScript became mainstream with Netscape (1995) and later engines like V8 (2008) powered dynamic web applications. Python (first released in 1991) gained widespread adoption for web development in the 2000s and later dominated data science. Eclipse IDE (2001) provided powerful development environments. Ruby's Rails framework (2004) abstracted web complexity, enabling rapid development. Lightweight editors and laptops made coding portable, fueling the sensory experience of programming in coffee shops and classrooms.
Cloud, APIs, and Low-Code Platforms (2010s)
Cloud computing, APIs, and low-code platforms transformed programming into orchestration. Virtualization and containerization (Docker, Kubernetes) enabled scalable infrastructure. REST and GraphQL APIs allowed modular integration, while low-code platforms like Mendix, OutSystems, and PowerApps used visual compilers to abstract coding. These enabling technologies shifted programming from hardware-bound tasks to swiping, clicking, and dragging in browser dashboards, and they remain mainstream today.
The Modern Mainstream Coding Experience (Present)
The modern mainstream coding experience interprets intent by bridging human goals with machine execution through layers of abstraction. Developers no longer need to think in terms of raw hardware instructions; instead, they express intent in higher-level languages, frameworks, and APIs. Compilers, interpreters, and virtual machines translate this intent into executable code, while libraries and cloud services provide ready-made building blocks that align with common patterns. The act of coding is less about micromanaging hardware and more about shaping logic, workflows, and integrations that reflect human objectives.
AI-Assisted Coding (Future)
AI-assisted coding is enabled by large language models trained on vast amounts of code, contextual intent recognition, and integration into IDEs. Tools like GitHub Copilot, CodeWhisperer, and Tabnine embed AI directly into workflows. The enabling technology here is the LLM itself, which translates natural language into executable code. Programming has become a dialogue: developers express goals in plain language, and AI generates code, tests, and documentation.
For example, instead of manually writing a function to validate email addresses with regex patterns, error handling, and edge cases, a developer can simply prompt: “Create a function that validates email addresses and returns helpful error messages.” The AI generates the complete implementation, including the regex pattern, input validation, and descriptive error messages. The developer then reviews, tests, and refines the output. What previously required searching documentation, writing boilerplate, and debugging syntax now happens through conversational iteration.
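To make that concrete, here's a rough sketch of the kind of function such a prompt might produce. The function name, regex pattern, and messages below are illustrative, not the exact output of any particular tool.

```python
import re

# Illustrative sketch of an AI-generated validator; the name, pattern, and
# messages are hypothetical examples, not the output of a specific assistant.
def validate_email(address: str) -> tuple[bool, str]:
    """Return (is_valid, message) for a candidate email address."""
    if not address or not address.strip():
        return False, "Email address is empty."
    if " " in address:
        return False, "Email address must not contain spaces."
    # Deliberately simple pattern: something@something.tld
    if not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", address):
        return False, "Email address must look like name@example.com."
    return True, "Email address looks valid."

print(validate_email("jane@example.com"))  # (True, 'Email address looks valid.')
print(validate_email("jane@example"))      # (False, 'Email address must look like name@example.com.')
```

Whether that first draft is good enough is exactly the question the review-and-refine step is meant to answer.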
Programming Has Always Been About Intent
From punch cards to Python, the core idea has stayed the same: humans express intent, machines execute it. Early programmers had to speak in binary or assembly. Then came higher-level languages like C, Java, and Python, which abstracted away the low-level details. What has changed is how coders express intent so that machines can execute it.
Natural Language Coding Is a Paradigm Shift
The real shift isn't just accessibility; it's responsibility. Until now, it was solely the coder's job to translate intent (business requirements, industry policies, government regulations) into executable code. That translation required fluency in syntax, architecture, and tooling.
Now, intent can be expressed in natural language. AI can interpret that context to generate machine-executable instructions. This doesn't democratize coding so much as it repositions the boundary between domain expertise and implementation. The programmer becomes a curator of context, a validator of output, and a steward of alignment between human goals and machine behavior.
AI Understands What You Mean—Not Just What You Say
AI doesn't just parse commands—it understands intent. You can say “make this faster,” “add error handling,” or “log the output,” and AI will revise the code accordingly. This conversational loop mirrors how humans collaborate, making development more fluid, iterative, and intuitive.
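As a rough illustration of that loop, here's what follow-up prompts like "add error handling" and "log the output" might do to a small helper. The function itself is a hypothetical example, not code from any particular project or tool.

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Hypothetical revision an assistant might produce after the follow-up prompts
# "add error handling" and "log the output" were applied to a one-line parser.
def parse_price(raw: str) -> float | None:
    """Convert a raw price string like '$19.99' to a float, or None on failure."""
    try:
        value = float(raw.strip().lstrip("$"))
        logger.info("Parsed price %r as %s", raw, value)
        return value
    except (ValueError, AttributeError) as exc:
        logger.warning("Could not parse price %r: %s", raw, exc)
        return None

parse_price("$19.99")  # logs and returns 19.99
parse_price("free")    # logs a warning and returns None
```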
It also unlocks new modes of interaction: sketches, examples, test cases, bug reports. AI can interpret these as expressions of intent and generate code accordingly. This expands the number of ways developers can express goals and accelerates problem-solving across disciplines.
It's Just Another Tool—Like IDEs, Compilers, and Libraries
Every generation of developers has used tools to amplify their productivity. IDEs gave us autocomplete and debugging. Libraries gave us reusable components. AI gives us intelligent scaffolding, pattern recognition, and code generation. AI is not replacing developers; it's helping them move faster and think bigger.
Verification Still Matters
Just like with compilers and frameworks, AI-generated code still needs testing, review, and validation. Developers remain responsible for architecture, correctness, security, and ethics. The shift is in how we express our goals, not in whether we need to think critically.
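To put that in concrete terms, here's a sketch of the review step for the email validator from the earlier example: ordinary unit tests, written and owned by the developer. The "validators" module name is assumed for illustration.

```python
# Minimal pytest sketch exercising the AI-generated validator from the earlier
# example; "validators" is an assumed module name for where that code lives.
import pytest

from validators import validate_email

@pytest.mark.parametrize("address", ["jane@example.com", "a.b@sub.domain.org"])
def test_accepts_reasonable_addresses(address):
    is_valid, _ = validate_email(address)
    assert is_valid

@pytest.mark.parametrize("address", ["", "   ", "no-at-sign", "two words@example.com"])
def test_rejects_malformed_addresses(address):
    is_valid, message = validate_email(address)
    assert not is_valid
    assert message  # the rejection should come with an explanation
```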
The Future Is Still Intent-Driven Development
In a real sense, AI assistance is just the latest in a long line of improvements in how we write software. The goal remains the same: getting computers to do what we need them to do. Instead of writing and debugging code, we're crafting prompts and refining context to guide the AI in implementing our intent. As with every improvement to date, we need to learn how to leverage the new technology effectively to achieve our aims.
Conclusion
The narrative of AI as a revolutionary force in programming overlooks a fundamental truth: programming has always been about translating human intent into machine action. From the earliest days of flipping switches and punching cards to today's natural language interactions with AI, the core challenge remains unchanged: bridging the gap between what we want and what machines can do. What evolves is not the essence of programming, but the sophistication of our tools for expressing intent. AI-assisted coding represents the latest chapter in this ongoing story, offering more intuitive ways to communicate with computers while preserving the essential human responsibilities of design, validation, and ethical oversight. As we embrace this new paradigm, we're not abandoning programming; we're refining it, making it more accessible and efficient while keeping human creativity and judgment at its center.
Feedback Loop
Feedback is always welcome. Please direct it to john.miller@codemag.com
Disclaimer
AI contributed to the writing of this blog post, but humans reviewed it, refined it, and gave it soul.
Prompts:
- Support this thesis: Using AI to write code is no different then when programming first became a thing. You are giving the computer instructions in order to get it to do what you want.
- Add the point, the ability to code in natural language is a game changer.
- Add the point, the ability of AI to understand intent is a game changer.
- Add a section in the beginning that describes the major innovations in how computers are programmed.
- Include the physical act (flipping panel switches, punching cards, punching paper tape) required to program the computer. Make these separate points: “Machine Code & Assembly (1940s–1950s)”
- Break these points into sections and include the year when they became mainstream technologies
- Add dates these technologies became mainstream
- Apply this approach to the rest of the post
- Move the Assembly language point to it's own section
- Cloud, APIs, and Low Code have not fallen out of the mainstream in the 20's. Make sure all currently mainstream tech is dated as such
- For each section, such as Machine Code (1940s–1950s), instead of listing the Enabling technologies, describe the technology (Machine Code) and include the enabling technologies in the description
- Every technology except for Machine Code is currently being used. Drop the until dates
- Drop the until date. Machine Code (1940s–1950s) should be: Machine Code (1940s)
- In two paragraphs, describe how the modern mainstream coding experience interprets intent?
