
The crux of digitally transforming a business is making it more agile in anticipating and responding to customer wants and needs, in the context of a much more networked business ecosystem where every player brings incremental value to the customer or falls out of reckoning. In such a setting, assuming a solid digital business strategy is in place, every business is looking for ways to increase its agility in achieving its goals, with a growing need to experiment, fail fast, and continuously innovate, or face the downhill slide into commoditization and the ultimate extinction of the business. Consequently, a business's use of technology is not about wading through the latest and greatest technology available, but about using the right technology to achieve business goals quickly. Software application development sits directly in the critical path between the business goals and the realization of business outcomes in the shortest possible time. What does this mean for the future of application development?

Generative Pre-trained Transformer 3 (GPT-3) from OpenAI has been receiving a lot of press of late. GPT-3 is an AI language model trained on a vast amount of text from the Internet, able to generate text that is largely indistinguishable from that written by a human writer. It has created a lot of buzz among research and non-research folks alike, who are exploring many interesting use cases. What else can innovations in AI do in the near future that will impact the future of application development?

This set me thinking about how the world of software programming has evolved over the past decades, where the state of the art is today, and where it is very likely headed; a question of particular importance given businesses' ever-increasing need for speed.

From Machine Language to Machine Learning to Machine Programming

If you trace the history of software development, what strikes you is the increasing human aspiration to get the computer to do more and more of the detailed grunt work: an adoption of the beautiful concept of encapsulation, essentially hiding complexity behind layers and working at progressively higher levels of abstraction. Without getting into too much detail, let us take a quick look at how this has evolved over the years.

The Past

If you started your career in technology in the late 80s or even the early 90s, you were considered an elite tech geek if you had mastered what was then called machine language or assembler language. This was one level of abstraction above machine code. Machine code is, of course, the instructions and data coded in 0s and 1s, the only language the digital circuits in the computer understand (interestingly, this is still the case). Assembler language, comprising English mnemonics and numerals, was the set of instructions programmers used to make the CPU do all the number crunching. This then gave way to the higher levels of abstraction provided by the so-called third-generation languages (3GLs) such as COBOL, C, Pascal, FORTRAN, BASIC, PL/1, et al. A majority of the "legacy" world of business IT applications still uses these languages, a testament to their popularity with entire generations of developers. Then came the 4GLs such as Focus and Mark IV, which aimed at increasing developer productivity but never really became mainstream. As newer technologies arrived, with client-server architectures, distributed computing and graphical user interfaces, various types of application development paraphernalia were created, many of which are still in use today. The advent of the Internet and the browser, and the interoperability and openness provided by Java, was a major inflection point. Later 3GLs such as Java, C++, and C#, using the object-oriented paradigm, proved extremely popular. Even today, everything from the most mission-critical to the simplest of applications can be found written in these languages; they did their job pretty well, and kept programmers happily arguing over whether programming was an art or a science.

The Present

Model-driven development has been around for at least a couple of decades now. It came from the motivation to model business requirements in diagrammatic notations and carry them through to design, software construction, and testing. Early approaches focused on code generation from the models. In my experience, the tools that enabled this never won sustained patronage from the developer community. I have seen the approach have reasonable success in narrow areas, such as a tool used to generate another tool that then did specific things like application remediation or data conversion, rather than in the development of mainstream business applications. The approach was also beset with complications arising from the fact that developers were always looking for ways to keep custom-coding, to actually change the generated code. This was because the generated code fell short in several critical respects when it came to developing complex applications with stringent non-functional requirements such as high performance and scalability, or simply complex functionality. On top of that, drawing all those UML artifacts and then maintaining them when the generated code was manually changed simply wasn't appetizing for most developers. Trying to modify generated code (or even wanting to see the generated code) in effect defeated the whole purpose of going model-driven.

Luckily, the situation drastically improved with the current crop of Low-code / No-code platforms, which do model-driven development the right way: instead of generating code from the models, they generate definitions which the platform then automatically takes to execution on the target compute engine. In effect, the model is the application (not the model plus generated code, or worse, the generated code with manual tweaks). This has also meant that the ability to develop applications goes beyond the purview of professional developers to business users, who after adequate training on the tools can themselves model the applications, with the platform doing the rest and the user never needing to see a line of code. If Cloud made it easier for the smallest startup to quickly develop an application with the latest digital technologies, these Low-code / No-code platforms make it easier still. For example, many of them let users very quickly build a prototype with a great-looking user interface assembled in a jiffy by drag-and-drop, connected to a simple spreadsheet backend. While the Low-code / No-code platforms are still improving, the leading ones are already being used for serious, large, mission-critical business application development, in a way that naturally fits the Agile methodology of delivering incremental functionality iteratively and really brings business and IT folks together to collaborate closely in creating the models. The leading platforms have also reduced concerns about vendor lock-in by providing the facility to export the model in industry-standard notations such as BPMN flows, UML, XSDs, CSS3, and HTML5. The platforms are still in early stages on interoperability, though.
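To make the "model is the application" idea concrete, here is a minimal sketch in Python of a declarative model being interpreted directly by a generic runtime engine rather than being turned into generated code. The model structure, field names, and engine are hypothetical simplifications of my own, not the inner workings of any particular platform.

# A minimal sketch of "the model is the application": a declarative model
# (here a plain Python dict standing in for what a platform would store) is
# interpreted directly by a generic engine, so there is no generated code
# for anyone to tweak. All names here are hypothetical.

order_form_model = {
    "entity": "Order",
    "fields": [
        {"name": "customer", "type": "text", "required": True},
        {"name": "quantity", "type": "int", "required": True},
        {"name": "notes", "type": "text", "required": False},
    ],
}

def validate(model: dict, submission: dict) -> list[str]:
    """Generic engine: checks a submission against whatever model it is given."""
    errors = []
    for field in model["fields"]:
        value = submission.get(field["name"])
        if field["required"] and value in (None, ""):
            errors.append(f"{field['name']} is required")
        elif field["type"] == "int" and value is not None:
            try:
                int(value)
            except (TypeError, ValueError):
                errors.append(f"{field['name']} must be a whole number")
    return errors

# Changing the application means changing the model, not regenerating code.
print(validate(order_form_model, {"customer": "Acme", "quantity": "three"}))
# -> ['quantity must be a whole number']

The point of the sketch is the division of labour: the engine is written once by the platform vendor, while the "development" work is confined to editing the model, which is also what makes export to standard notations feasible.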

Component-based development as a concept has been around for many years, moving from one avatar to another (SOA, microservices, etc.). The paradigm of composing applications rather than building everything yourself, integrating functionality from outside the organisation, open source or paid, through APIs, is very much mainstream today. Many application development platforms allow you to drag-and-drop components, whether GUI, business-logic, or technical components, into what you are building and integrate them automatically, as the sketch below illustrates.
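As a flavour of what this composition looks like in code, here is a small, hedged Python sketch of an application delegating tax calculation and currency conversion to two external components over HTTP. The service URLs and response fields are purely illustrative placeholders I have invented, not real APIs.

# A minimal sketch of API-based composition, assuming two hypothetical
# external services (the URLs below are placeholders, not real endpoints):
# the application "composes" functionality it did not build itself.
import requests

TAX_SERVICE_URL = "https://tax.example.com/v1/rate"      # hypothetical
FX_SERVICE_URL = "https://fx.example.com/v1/convert"     # hypothetical

def price_with_tax(amount: float, country: str) -> float:
    """Delegate tax calculation to an external component via its API."""
    resp = requests.get(TAX_SERVICE_URL, params={"country": country}, timeout=5)
    resp.raise_for_status()
    rate = resp.json()["rate"]          # assumed response shape
    return round(amount * (1 + rate), 2)

def price_in_currency(amount: float, currency: str) -> float:
    """Delegate currency conversion to another external component."""
    resp = requests.get(FX_SERVICE_URL, params={"to": currency, "amount": amount}, timeout=5)
    resp.raise_for_status()
    return resp.json()["converted"]     # assumed response shape

# The application itself is little more than the composition of the two calls:
# total = price_in_currency(price_with_tax(100.0, "DE"), "USD")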

Machine learning has completely changed the way a business uses its data in decision-making and in automating business processes. While the traditional approach focused on visualizing data taken from a data warehouse or data mart and having a human interpret what the analytics said, machine learning has enabled a new paradigm. The focus is on building a model and training it on data; the model then takes over, looking at new data and either driving downstream decisions straight through where that makes sense, or giving insights to the human as an assistant would, drastically increasing productivity at far lower error rates. And not just on the analytics side: machine learning and other AI domains such as NLP and NLU are now being used in mainstream business transaction processing, as well as changing the paradigm of how users interact with applications (e.g. conversational apps, chatbots, etc.).
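Here is a minimal sketch of that train-then-predict paradigm using scikit-learn, with made-up illustrative data: rather than hand-coding rules, a model is fitted to historical examples and then applied to new data, where its output can drive a decision straight through or be surfaced to a human.

# A minimal train-then-predict sketch with scikit-learn and invented data.
from sklearn.tree import DecisionTreeClassifier

# Historical data: [order_value, days_since_last_purchase] -> churn label
X_train = [[120, 5], [40, 60], [300, 2], [15, 90], [200, 10], [25, 75]]
y_train = [0, 1, 0, 1, 0, 1]   # 1 = customer churned, 0 = customer stayed

model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X_train, y_train)

# New data flows through the trained model; the prediction can drive a
# downstream decision automatically, or be shown to a human as a hint.
new_customers = [[180, 7], [30, 80]]
print(model.predict(new_customers))    # e.g. [0 1]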

The Future

If an AI technology such as GPT-3 can write articles like a human, it is not a stretch to imagine it creating complete applications by itself. Like AI, the exciting field of Machine Programming, software that creates its own software, has been around for decades but is now at an inflection point. Some of the best research groups are at it, and I will not be surprised if it picks up pace and becomes mainstream very soon. It promises to bring together machine learning and the traditional programming paradigm to automate the development and maintenance of software.

Gazing into the crystal ball, I see the future of software development heading towards what I call Smart Application Creation Platforms (SACPs). Imagine this scenario. You open your SACP canvas and either drag and drop a business requirements document or simply tell the canvas in plain English (or whatever human language) what you need the SACP to create. The SACP walks you through the steps on the same canvas, automatically builds the model and whatever artifacts it deems necessary, plays everything back to you for confirmation, and voila, the "application" is created and ready to use. Under the hood is advanced intelligence that "knows" your business domains, has a vast learning of best-practice frameworks and patterns, and has the ability to take unstructured input from humans and create the structure needed internally to build the application. Building software is, after all, largely a matter of using the right patterns (and knowing the anti-patterns). That's for building new applications. I know what you are thinking now: what about existing applications? The SACP has already trawled through all your existing code and application artifacts and subsumed them into its own internal representation of what it calls an application. What is left for the human to do is to make sure what was created is what was required, and to focus solely on the business outcomes. Now, who is in the best position to build such platforms? My bet is that such an innovation will come from the top five leading Cloud providers, who are already churning out innovations every day; okay, maybe along with some of their niche technology partners, some of whom will in any case become acquisition targets for those Cloud providers.

A final word

With a plethora of new technologies, new paradigms for creating software, and new devices to integrate, what does all of this mean for the kind of skillset employees of the future business should have? If creating applications will no longer need hardcore technology skills, will there still be a distinction between the business unit and the IT department, or will roles merge and new roles emerge? I would say very interesting times are ahead.

Take a look at this week's Challenge of the Week; I will be keen to hear your views.

Ramki Sethuraman