The Future Revolution of Software Development


The way we develop and deploy software has been changing incredibly fast over the past decade. In 2010 many groups had already adopted practices such as Agile, Test-Driven Development (TDD), and Continuous Integration (CI), but the vast majority of development teams were still doing things the conservative way: big Product Requirement Documents, and year- or multi-year-long development cycles with a multi-month code freeze and QA interval at the end.

A shockingly large number of software shops did not even have any kind of comprehensive, regularly run regression testing. Five years later, just about everyone uses some form of Agile with a decent test suite, and teams that release on a yearly or quarterly cadence are the laggards, firmly in the minority; most shops release or deploy new versions multiple times a month, and some even a few times per day. By moving from the old waterfall style to Continuous Integration (CI), Continuous Deployment (CD), and DevOps, teams introduce new features more quickly, delivering competitive advantage and reacting faster to customer needs.

The wider digital market has been undergoing huge changes at the same time. Software is ever more vital to the world we live in. Waymo’s self-driving cars recently passed eight million miles travelled. Microsoft’s translation engine, even if not fluent in six million forms of communication, can match human levels of accuracy on Chinese-to-English translation tasks.

In this new world, developers no longer need to design a unique algorithm for each problem. Most work focuses, instead, on generating datasets that reflect the desired behaviour and managing the training process. Pete Warden from Google’s TensorFlow team pointed this out as far back as 2014: “I used to be a coder,” he wrote. “Now I teach computers to write their own programs.”
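The shift Warden describes can be made concrete with a toy sketch: instead of hand-coding a rule, the developer supplies labeled examples and lets a learning procedure pick the rule. Everything below is hypothetical illustration (the spam-length task, the data, the function names), not any particular framework's API:

```python
# Toy illustration of "teaching by example": rather than hard-coding the
# rule "messages longer than N characters are spam", we learn the
# threshold N from labeled data. Task, data, and names are hypothetical.

def learn_threshold(examples):
    """Pick the length threshold that best separates the labeled examples.

    examples: list of (length, is_spam) pairs.
    """
    candidates = sorted({length for length, _ in examples})
    best_t, best_correct = None, -1
    for t in candidates:
        # Count how many examples this threshold classifies correctly.
        correct = sum((length >= t) == is_spam for length, is_spam in examples)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

# The dataset encodes the desired behaviour; no explicit rule is written.
training_data = [(12, False), (20, False), (35, False),
                 (80, True), (95, True), (120, True)]
threshold = learn_threshold(training_data)

def classify(length):
    return length >= threshold
```

The developer's job here is curating `training_data` and running `learn_threshold`, not writing the classification rule itself, which is the point Warden is making, scaled down to a few lines.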

The Coming Revolution in Software Development

Programming and data science will converge

Most applications will not be “end-to-end” learning systems for the foreseeable future. They will rely on learned models to provide core cognitive capacities, and on explicit logic to interface with users and interpret results.  
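That hybrid pattern can be sketched in a few lines: a learned component supplies a raw score, and hand-written logic interprets it for the user. The sentiment task, the word lists, and the thresholds below are all assumptions for illustration; in a real system the scoring function would be a trained model, not a lookup:

```python
# Hedged sketch of the hybrid pattern: a learned core produces a score,
# explicit logic turns it into user-facing behaviour.

def learned_sentiment_score(text):
    """Stand-in for a trained model: returns a score in [0, 1]."""
    positive, negative = {"good", "great", "love"}, {"bad", "awful", "hate"}
    words = text.lower().split()
    pos = sum(w in positive for w in words)
    neg = sum(w in negative for w in words)
    total = pos + neg
    return 0.5 if total == 0 else pos / total

def describe_sentiment(text):
    """Explicit logic around the learned core: the thresholds and the
    wording shown to the user are hand-coded, not learned."""
    score = learned_sentiment_score(text)
    if score >= 0.7:
        return "positive"
    if score <= 0.3:
        return "negative"
    return "neutral"
```

Swapping the stand-in for a real model leaves `describe_sentiment` untouched, which is why the article expects most applications to keep this split between learned cognition and explicit interface logic.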

AI professionals will be rock stars

Doing AI is hard. Rank-and-file AI programmers – not just brilliant academics and researchers – will probably be one of the most valuable resources for software companies in the years ahead.  

The AI toolchain needs to be constructed

“Machine learning is at the primordial-soup phase. You really have to be a world expert to get these things to work.” Studies also show that many AI models are hard to explain, trivial to fool, and susceptible to bias. Tools to tackle these problems, among others, will be essential to unlocking the potential of AI programmers. 
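One of the tooling gaps named above, detecting bias, lends itself to a minimal sketch: audit a model's predictions per subgroup and flag large accuracy gaps. The record fields and the gap threshold are assumptions, not any established library's API:

```python
# Hypothetical bias-audit sketch: compare a model's accuracy across
# subgroups and flag disparities above a chosen threshold.

def accuracy_by_group(records):
    """records: list of dicts with 'group', 'predicted', 'actual' keys."""
    totals, correct = {}, {}
    for r in records:
        g = r["group"]
        totals[g] = totals.get(g, 0) + 1
        correct[g] = correct.get(g, 0) + (r["predicted"] == r["actual"])
    return {g: correct[g] / totals[g] for g in totals}

def flag_bias(records, max_gap=0.1):
    """True if any two groups' accuracies differ by more than max_gap."""
    accuracies = accuracy_by_group(records).values()
    return max(accuracies) - min(accuracies) > max_gap
```

Real fairness tooling goes far beyond an accuracy gap (calibration, false-positive parity, and so on), but even this much is more auditing than many deployed models get today, which is the article's point about the immaturity of the toolchain.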

The core breakthrough supporting all of these advances is deep learning, an artificial intelligence technique inspired by the structure of the human brain. What started as a relatively narrow data-analysis tool now functions as something close to a general computing platform. It outperforms traditional software across a broad range of tasks and may finally deliver the intelligent systems that have long eluded computer scientists – feats the press sometimes blows out of proportion. 

As software controls more and more of the world around us, programs no longer function in isolation; they interact with other applications and run on multiple devices in an increasingly complex ecosystem. Modern practices haven’t made life easier for long; rather, they let us do more, and so complexity has increased.