From no-code to know-nothing

The rise of AI-assisted development is giving no-coders bad vibes, says Jason Walsh

22 August 2025

The ‘no-code’ revolution was supposed to democratise software development, putting programming power into the hands of accountants, marketers, and domain experts who knew what they needed but couldn’t write a line of code. Then ChatGPT arrived, and suddenly everyone could just ask an AI to build their app instead.

What we’re witnessing may be the death of a movement before it truly lived: the rise of AI-assisted development giving no-coders distinctly bad vibes as their nascent field gets leapfrogged by large language models that can generate professional-grade code on demand.

The idea is that so-called ‘domain experts’ would be given the ability to program their computers, typically to meet specific business needs, without having to become developers. In plain English, that means an accountant or bank analyst might want to automate some important but repetitive tasks, and do it using a graphical interface without having to even look at a brace or pipe.

Full-blown application development is also possible, but somewhat less common. Whether this would have continued to be the case is an open question, though. So has a promising new technology – or at least techne – been strangled shortly after birth? Possibly, and it wouldn’t be the first time.

After all, why waste your time with no-code development when you can just tell an AI to churn out professional-grade code in any programming language under the sun?

There is some important historical context to understand here: the truth is that most of us never actually do much in the way of ‘computing’, at least not in the sense meant by computer scientists. Yet the computer was originally envisioned as a kind of universal or generic machine that ordinary people would program to solve their own problems.

In the early days, high-level computer languages were designed specifically for end users to, well, use. BASIC was explicitly designed at Dartmouth College in 1964 to allow students and faculty from any discipline to write simple programs without becoming computer scientists.

Later, applications like Apple’s HyperCard and Macromedia’s Director and Authorware allowed users, often in education and the graphic arts, to develop their own applications in a largely graphical fashion with only minimal programming required, relatively speaking. These died out in the face of the Web, but rapid application development (RAD) tools do still exist, such as LiveCode, which can even deploy to mobile platforms.

In fact, this idea that users would all be programmers is written throughout tech history like the writing on a stick of rock. From the 8-bit computers of the 1980s, all of which demanded basic programming skills, to the original vision of the World Wide Web as a read-write medium where anyone could create as well as consume, it was simply assumed that we would all ‘code’.

Further back, there was COBOL. Hands up if anyone out there knows of a non-developer who, in the course of their work, does their own programming. Anyone? And yet COBOL (Common Business-Oriented Language) was intended specifically for that, which goes a long way toward explaining its verbose, English-like syntax.

No-code tools were the latest version of the promise to eliminate the friction between human intent and software reality, but AI has arguably achieved this more elegantly. Why drag and drop components in a visual interface when you can simply describe what you want in plain English and watch as the AI generates working code?
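To make that concrete, here is the sort of script an LLM will readily produce from a one-line request such as ‘total the expenses in a folder of CSV files by category’. It is purely illustrative – the folder name and the column headings (‘category’, ‘amount’) are invented for the example – but it is representative of the plain, working Python that comes back from such a prompt:

    # Illustrative sketch: the kind of code an LLM might return for the prompt
    # "total the expenses in a folder of CSV files by category".
    # The folder name and column headings below are invented for this example.
    import csv
    from collections import defaultdict
    from pathlib import Path

    def total_by_category(folder: str) -> dict[str, float]:
        """Sum the 'amount' column of every CSV file in `folder`, grouped by 'category'."""
        totals: dict[str, float] = defaultdict(float)
        for csv_file in Path(folder).glob("*.csv"):
            with csv_file.open(newline="") as f:
                for row in csv.DictReader(f):
                    totals[row["category"]] += float(row["amount"])
        return dict(totals)

    if __name__ == "__main__":
        for category, amount in sorted(total_by_category("expenses").items()):
            print(f"{category}: £{amount:,.2f}")

Point it at a folder of exported statements and it does roughly what the accountant in the earlier example wanted – and all it took was a sentence typed into a chat box.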

But this is not just a case of one methodology displacing another; it is a fundamental change in how we think about the democratisation of programming and, hence, actual computing.

For instance, no-code platforms require users to think in pseudo-programming terms, understanding concepts like loops, conditionals, and data flows even if they never actually write any code. By comparison, AI-assisted development offers the ultimate high-level interface: natural language.

And yet this transition raises uncomfortable questions about dependency and understanding. No-code tools taught users something about how computers work: when you build a workflow in Zapier or create a database in Airtable, you develop an understanding of logical structures. ‘Vibe coding’, by contrast, produces sophisticated applications that end users could never debug or modify independently.

Perhaps the real casualty isn’t no-code development itself, but another of those recurring, always brief, moments when it seemed we might all become programmers and learn what these damn machines were actually doing – and how they did it.

Does understanding how computers actually work matter? Well, yes: as information technology burrows ever deeper into our lives it is no bad thing to understand our tools well enough to shape them ourselves, rather than simply typing into a text field to ask the machine to do it for us.
