Enabling technologies for software development


4 technologies that changed software development

In his blog post The Great Works of Software, Paul Ford enumerates five applications that excel in longevity, popularity and usefulness. Pac-Man made the list, so you know it’s a good one. Ford got me thinking – less so about software itself and more about the technologies that shaped the way it’s made.

In my view, four technologies have revolutionized software development in the last decade. The technologies themselves are less important than the indelible impact they continue to have on the IT industry. Understanding their effect will help developers discover new ways to attack their problem spaces and build better software faster.

Without further ado, here is my list in no particular order. Though I warn you now: Hadoop didn’t make the list.

JUnit
The agile movement advocated practices that seemed counterintuitive at best and bizarre at worst. Most notable among these were unit testing and test-driven development. The notion that developers should have the discipline to test their code bit by bit – and even let tests drive their design – sounded like fantasy. To have any chance of success, unit testing needed to be simple and automated, or developers would never do it.

Kent Beck, a signatory to the Agile Manifesto, co-wrote JUnit, a unit-testing framework for Java, with Erich Gamma. It was easy to use and was soon integrated into development tools. To me, what made JUnit so popular was its compelling interface: a progress bar that gradually fills green from left to right as tests pass, or sadly turns red when a test fails.

Most important, JUnit improved quality, and consequently other languages developed JUnit equivalents. Libraries now have testing facilities built in, and unit testing features prominently in job postings. Even the lead on the HBO comedy Silicon Valley ran a suite of unit tests after a mishap. Once unorthodox, unit testing is now standard practice, and that is because of JUnit.

Ruby on Rails
While Web applications vary widely, at a high level they all do pretty much the same thing: provide an interface for creating, querying, updating and deleting data, with those interactions dictating the next step in the workflow. Yet for all this commonality, it once took an unnecessarily long time to get a Web application running.

Rails changed that.

While legacy Web frameworks required manual setup and configuration, Rails used smart defaults and conventions to deliver common functionality right out of the box. Rails also sparked a revolution in productivity. A developer could easily have a basic Web application connected to a database running in a day.

The themes of Rails – convention over configuration, the Model-View-Controller pattern, the Active Record pattern, etc. – influenced every Web application framework that followed. Because of Rails, Web developers are now empowered to produce real applications fast.
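To make the Active Record pattern concrete: the model object wraps a database row and carries its own persistence methods, so no separate mapping layer is written by hand. A toy in-memory sketch – the `Post` class and its Map-backed "table" are illustrative, not Rails API:

```javascript
// Toy Active Record: the model knows how to save, find and delete
// itself. Backed by an in-memory Map standing in for a database table.
const table = new Map(); // id -> row
let nextId = 1;

class Post {
  constructor(attrs) {
    this.id = null;
    this.title = attrs.title;
    this.body = attrs.body;
  }
  save() { // INSERT on first call, UPDATE afterwards
    if (this.id === null) this.id = nextId++;
    table.set(this.id, { id: this.id, title: this.title, body: this.body });
    return this;
  }
  static find(id) { // SELECT by primary key
    const row = table.get(id);
    return row ? Object.assign(new Post(row), { id: row.id }) : null;
  }
  destroy() { // DELETE
    table.delete(this.id);
  }
}
```

In Rails, declaring `class Post < ActiveRecord::Base` gets all of this – plus migrations, validations and associations – by convention.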

Lucene
Making sense of unstructured text is a very hard problem, but it is critical to sophisticated analytics like sentiment analysis or intelligence gathering. Lucene has long been at the core of these efforts.

Written in Java, Lucene parses and analyzes free text with profound efficiency. All of the common text searching capabilities we take for granted – e.g. wildcard and N-gram searches, faceting, text highlighting – are part of Lucene. In one form or another, Lucene has become the text mining tool for all languages and platforms.
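To make the N-gram idea concrete: an analyzer slides a fixed-size window over the text so that a substring query becomes an exact index lookup instead of a scan. A minimal sketch of the tokenization step – not Lucene's implementation:

```javascript
// Character N-grams: every substring of length n. Indexing these lets
// a query like "uce" match "lucene" without wildcard scanning.
function ngrams(text, n) {
  const grams = [];
  for (let i = 0; i + n <= text.length; i++) {
    grams.push(text.slice(i, i + n));
  }
  return grams;
}

// ngrams('lucene', 3) -> ['luc', 'uce', 'cen', 'ene']
```

Indexing every gram trades index space for query speed – one of many such trade-offs Lucene's analyzers package up for you.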

Meanwhile, every leading text analytics project builds on Lucene. Apache Solr provides a Web interface to text analytics with clustering, geospatial search and numerous other capabilities. Elasticsearch extends Lucene to distributed environments. With text analytics so critical to so many missions – even for teams using Solr or Elasticsearch rather than Lucene itself – Lucene’s influence cannot be overstated.

Node.js
Disseminating more data to more users connected in more ways demands a shift in the way Web applications are architected. The old paradigm – throwing more machines and threads at the problem to compensate for time spent waiting on operations like database reads – isn’t viable anymore. Single-threaded, event-driven architectures are leaner and more scalable, and Node.js pioneered this approach.

While we think of JavaScript as living in the browser, Node.js runs it on the server to perform actions in an asynchronous, non-blocking way, which basically means no waiting. A Node.js application is always ready to service new requests: high-latency operations like database queries are handed off, and a callback fires when they finish so the application can process the results. The result is a highly scalable, blazingly fast application.
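The hand-off-and-continue flow fits in a few lines. `queryDatabase` below is a hypothetical stand-in for a real async driver call, with `setTimeout` simulating the latency:

```javascript
// Non-blocking I/O sketch: the "query" is handed off, and the single
// thread keeps servicing other work until the callback fires.
function queryDatabase(sql, callback) {
  // Stand-in for a real async driver call; setTimeout simulates latency.
  setTimeout(() => callback(null, [{ id: 1, name: 'Ada' }]), 10);
}

queryDatabase('SELECT * FROM users', (err, rows) => {
  if (err) throw err;
  console.log('got', rows.length, 'row(s)'); // runs later, when the "query" completes
});

console.log('still serving requests'); // runs immediately, before the callback
```

`still serving requests` prints before the callback's line: the thread never blocks on the query.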

Node.js was the first technology to make so-called “reactive” applications a reality with real-world deployments at Walmart and elsewhere, and reactive alternatives like Play Framework and vert.x have followed suit.

For all its celebrity, Hadoop’s absence from my list may be disappointing, but I believe too few have actual big data problems for Hadoop to be as influential as the others. Of course, I welcome your feedback.

