Low-Code, No-Code, AI-Code… Coding without Coders?

Low-code and no-code have been with us since the 1950s, trying to make programming easy and turn business users into “citizen developers.” Code generation using AI takes this approach even further, promising to reduce the formality of the “input” to the level of a human conversation. So, if we accept this starting date for the long run towards AI coding bots, how come we all laughed when Scotty tried to talk to a Macintosh? Even stronger: why do we get fresh announcements of high-productivity coding environments every few years, promising that finally, this time for real, we don’t need expensive programmers anymore to build our business applications? And why does each iteration then slowly fade into the toolboxes of those same developers, if it isn’t replaced altogether by something else?

TLDR

  • Trying to replace developers with business users is a goal that has been with us ever since computers became widely available for business use. Programmer scarcity and productivity have always been the driving forces behind “low-code” and “no-code” products.
  • Generally, low-code and no-code solutions provide fast first-time delivery, at the cost of long-term maintenance and vendor lock-in.
  • Software development is a skill that can only partially be captured in automation. Therefore, all code-generating solutions, including generative AI, realize their full potential when applied as tools for developers.

When “Coding” was still a thing

If you’re not old enough to have had first-hand experience with them, read the Wikipedia articles on COBOL and its predecessor FLOW-MATIC: you’ll note that the code examples almost read like common English. A statement such as “MULTIPLY PRICE BY QUANTITY GIVING TOTAL” is valid COBOL. This was definitely by design, as, until that time, programmers were even more like mathematicians than they are today. Even though there were already programming languages that were easier to use than assembly, they still did not shine in terms of usability. Computers were, for the most part, seen as glorified calculators, used largely for counting and tabulating things.

Developers are quite capable of delivering magnificent pieces of code, but their minds work differently from those of accountants and business administrators. This means the business needs must first be analyzed by someone more versed in that world, who figures out how those needs can be supported by a computer. From this “functional design,” we can move towards something a dumb (but fast) computer can do; let’s call that the “technical design.” Finally, this technical design can be put into computer code, which is the work of the “coders.”

In this day and age, coding is done in languages that still require that specific skill set, but at the same time reduce (if not remove) the need for a technical design. I’ll grant that this separation can still come in handy in some fields, but those are fields where verification and correctness make “slow and thorough” worthwhile. If coding errors or a failing deployment can result in catastrophic failures, “move fast and break things” is simply not an option.

So, when you need to send your requirements in a barrel over the waterfall, and you still think that computers and software are best managed on budgets in a cost center, you’ll start to look for options to not just speed up delivery but also cut out as many of the middlemen as you can. “Don’t do what I say, do what I want!” Right?

Let’s just draw a picture

A lot of what a computer needs to do for us can be drawn in pictures, using arrows to show progress towards the goal or the flow of the data itself. This way, business users can describe what they do in a form they understand, while those pictures can easily be translated into something a computer can understand. Flow charts were used a lot in designs, but when they are detailed enough to capture all possibilities, they quickly become unreadable.

You can also use them to describe a business process. The Business Process Model and Notation (BPMN) provides a standardized way to do this, and using it to support the business has been booming. In practice, however, it just gives you electronic dossiers that are handed on from one outbox to the next inbox. If you want any intelligence applied to make specific decisions, you still need a user to actually look in the dossier, or else some piece of (often non-trivial) code to do that. Also, any competent manager will quickly try to streamline overly formal and complicated processes; doing that in such a way that the process remains correct from a business perspective and the computer won’t actually lose any (part of the) dossier is what keeps the vendors and consultants of BPMN tooling occupied.

But now for the long-term cost: we want a constantly evolving collection of modeling components to build our models with, running on the latest (and most secure!) infrastructure, and supporting whatever devices we currently use to run our business departments. John Smith from Procurement may have built a wonderful application in Microsoft Access that keeps his manager happy, but when his desktop finally needs to be upgraded and he has no idea how to support iPads, we suddenly have a costly problem. Luckily, we can find a vendor who will happily take this burden off our hands, but now we have replaced a dependency on John Smith with one on a vendor, and that dependency keeps growing for as long as we stay on the platform.

No, it’s all about screens… Yes, definitely screens!

A very attractive way of developing applications is by using a screen builder. This is again a proven approach, one that worked for mainframes just as well as it works for web pages. Now the business user can simply focus on translating dossiers into screen designs, while the tool does all the difficult bits. However, even if we restrict ourselves to data entry, we’ll soon want to add validations, entry support with lookup tables, and calls to external services. These bits of functionality can be filled in with components provided by the tool vendor, or, if we’re lucky, we can get IT to provide them customized for our own situation, as in the sketch below.
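To make that last option concrete, here is a minimal sketch, in plain Python rather than any particular low-code product, of the kind of custom validation component IT might hand to a screen builder. The function name and the rules are hypothetical, chosen only for illustration.

    # A hypothetical custom validation component for a screen builder:
    # the tool calls it on field change, and the business user just picks
    # it from a component list. Name and rules are made up for illustration.
    def validate_iban_field(value: str) -> str | None:
        """Return an error message, or None when the field looks acceptable."""
        candidate = value.replace(" ", "").upper()
        if not (15 <= len(candidate) <= 34):
            return "An IBAN is between 15 and 34 characters long."
        if not (candidate[:2].isalpha() and candidate[2:4].isdigit()):
            return "An IBAN starts with a country code and two check digits."
        return None  # good enough for the screen; the back end re-validates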

Screen designs can be combined with flow charts and BPMN models, and life seems to take a turn for the better. That is, until someone on the Board shows up with a mobile device that completely ruins our best design efforts. Designing screens that are easy to use is actually pretty hard: different devices have different dimensions, and suddenly what worked well on a laptop is terrible on a phone. Also, senior front-end developers can tell you how to ensure some designs will work and look great on specific devices, but most senior back-end developers turn out to be merely mid-level front-enders, and neither group may turn out to be good at visual design.

Lastly, the most demanding people to satisfy will not be the developers, but the marketing department that came up with the visuals for the campaign. Now our project gets delayed by several weeks due to the absolute requirement for rounded corners on some form components, when the targeted devices don’t support those out of the box.

Developers, developers, developers!

The conclusion, love it or hate it, is that we will eventually be forced to acknowledge that we need developers, either internal or external. Now we need to switch perspectives: seen from the tool vendor’s side, when computers and software become commodities, just selling a great tool doesn’t do it anymore. So, if you want companies to use your digital products and services, you need to target developers, so that they’ll “bring along” your products. There is no better example of this than Steve Ballmer trying to energize a crowd with his “Developers, developers, developers” chant, which you can even find on YouTube as a dance remix. So, what is it with developers that we see this love/hate relationship?

Basically, we need developers to realize our digital goals, while we have trouble accepting that they are fundamentally different from “business people.” Why can they realize our wildest dreams at one moment, while flatly refusing to bring those dreams to production as soon as we say they are exactly what we need? The thing is that, while being budgeted as a cost center, IT has been asked to ensure everything just keeps working. While we accept without question that we need to buy a new smartphone every three years just to keep performing everyday tasks with it, we forget that business applications gather dust too. So developers have learned to think about what can go wrong and to plan ahead for future change, because their “keep the bank running” budget is continuously being downsized. Any small app “Business” thinks up or buys externally adds to the burden of preventing “IT trouble.”

No wonder internal IT is slower in delivering new software than any external vendor: any future incidents need to be covered by a predictable and tight budget, while external developers and consultants just love the guaranteed future income. Citizen developers seem like a good idea, but a whole department in a panic over some Access application on a forgotten desktop under a desk is the definite flip side of that coin.

But… it’s not about technology, it’s about productivity!

Yes, low-code and no-code environments are definitely able to deliver solutions fast, often with relatively low requirements for the user’s background. But at an insurance company where I worked, the highest productivity was not achieved by developers with wonder tools, but by a team with a large collection of ready-to-use building blocks. If they had had a visual development environment with those components as building blocks, you would have seen the power of visual programming. Unfortunately for this story, they did not: they were some of the brightest developers in the company, using a wonderful programming language named Smalltalk. Citizen developers would not have flourished in that environment, but this particular business department was tightly involved with the team, so there was no need for them.

When you talk about productivity, you are thinking about the time from ideation to launch. In that case, it doesn’t help to have a wall between business and IT, either organizationally or culturally. In a world where good developers are hard to get, you cannot aim to employ only the “best of the best.” Neither can you raise a junior to senior level by taking the coding out of their hands, because you would just separate the work into difficult and easy “bits.” Better to strive for productive teams, employ pair programming to reduce coding errors and enhance learning, and use the DORA metrics to track the result, as sketched below. Also, make sure you don’t fall into the trap of using any sensible-sounding metric as a KPI. Most metrics should be seen as qualitative rather than quantitative: a bad value is a signal, or as we now often say, a “smell.” It tells us there may be something wrong, but always striving for specific values is like treating the pain without fixing the disease.
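As a minimal sketch of what “tracking the result” could look like: one of the DORA metrics is lead time for changes, the time from commit to deployment. The data and field names below are hypothetical; the point is the signal, not the tooling.

    # A toy calculation of one DORA metric, "lead time for changes":
    # the median time from commit to successful deployment.
    # Data and field names are hypothetical.
    from datetime import datetime
    from statistics import median

    changes = [
        {"committed": datetime(2024, 5, 1, 9, 0), "deployed": datetime(2024, 5, 1, 15, 0)},
        {"committed": datetime(2024, 5, 2, 10, 0), "deployed": datetime(2024, 5, 4, 11, 0)},
        {"committed": datetime(2024, 5, 3, 8, 0), "deployed": datetime(2024, 5, 3, 9, 30)},
    ]

    lead_hours = [(c["deployed"] - c["committed"]).total_seconds() / 3600 for c in changes]
    # Treat the outcome as a smell to investigate, not a KPI to optimize.
    print(f"median lead time: {median(lead_hours):.1f} hours")

A rising median here is a reason to go and look at the delivery pipeline, not a number to push down by gaming deployments.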

How about AI?

Yeah, how about AI? Isn’t that going to solve the issue?

Well, let’s start by being honest: it’s not AI but rather “generative AI,” that is, AI technology used to generate code. It does not understand our problem; it just generates some code that may turn out to be what we want. It has been trained on existing code, so don’t expect it to come up with something completely new. It may be new to us, but it is not the answer to an “unknown unknown.” No current AI solution is able to actually understand what it does, in the sense in which we would use the word “understand.” It is often amazingly good at coming up with so-called boilerplate code tailored to the task at hand, but if we start to write something completely new, it can just as happily come up with total nonsense.

So, an experienced developer will love the way it takes the drudgery out of coding, while being very careful about accepting large blocks of code without going over them in detail. A junior developer must treat it as “magic”: great when it works, but inexplicably useless when not. Better to make sure you start by specifying what you expect the result to do, as in the sketch below. TDD and BDD, anyone?
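A minimal sketch of that “specify first” workflow: the human writes the test that pins down the expected behavior, and the AI-generated implementation has to pass it. The function split_invoice and its behavior are hypothetical, chosen only to illustrate the idea.

    # Test-first with an AI assistant: the human writes the specification
    # as a test, and the generated code has to satisfy it. The function
    # and its behavior are hypothetical.
    def split_invoice(total_cents: int, parties: int) -> list[int]:
        """Split an amount into near-equal parts without losing a cent."""
        base, remainder = divmod(total_cents, parties)
        return [base + 1 if i < remainder else base for i in range(parties)]

    def test_split_invoice_loses_no_cents():
        parts = split_invoice(100, 3)
        assert sum(parts) == 100      # the invariant we actually care about
        assert parts == [34, 33, 33]  # remainder goes to the first parties

    test_split_invoice_loses_no_cents()  # runs without pytest; silence means pass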

No hope for the future?

When discussing new ways of working, the most important phrase is “Culture eats strategy for breakfast!” What many companies have realized is that how their business processes work has more impact on the result than any tool they employ along the way. Conway’s law states that a system’s design will reflect the organization’s structure, but many IT professionals would love to “pull a reverse Conway maneuver” and reduce an application’s complexity by adjusting the organization instead.

From a developer’s perspective, the saying is “Real programmers can write FORTRAN in any language,” meaning that skill is ultimately more important than tools. A good carpenter can do wonders on a shoestring, but their toolbox will just as happily include a battery-driven screwdriver: you want a manual driver when you need to “feel” the screw, but if you just need to fix a plate with 20 screws, you go electric. The same holds for any solution that generates code: it will work in some situations and not in others, but you need a professional to tell you which is which. Generative AI is great, but the drive toward high-tech one-upmanship makes us forget (temporarily, I hope) that it is still a dumb tool trying to be smart. We gave it treats when it did well, spanked it when it didn’t, and now we hope the result is what we wanted.
