
Dave McQueeney
Chief Technology Officer, IBM Federal

As I reflect on what is coming in 2009, both government and the commercial sector will be focused on creating a far more assured path to value creation. This is clearly true for commercial companies, which keep score with profit and loss and stock price, and which are being pressured by a weak economy. But it is no less urgent for government, where the outcomes cannot inherently be monetized: national defense, securing the homeland, taking care of our veterans, exploration of earth and space, and the list goes on.

Outcomes in these governmental areas are hard or impossible to monetize, and arguably more important than a simple dollar amount. But I am confident that with some deeper, perhaps multidimensional thinking, we can make inroads toward appropriate metrics for these non-monetary outcomes. That would be a great topic for some of the think tank organizations to pursue, if they are not already doing so.

Let me expand a bit on this idea of assured value creation. I am talking about developing a deeper confidence that the people, processes and technologies we employ to meet our mission needs are deployed in a smart way. I see tremendous promise in the visions coming out of the incoming administration; perhaps for the first time we are seeing many "digital natives" woven into the teams, as opposed to the "digital immigrants" who mostly constituted prior administrations.

The basic idea is not just to make systems that are vast in data-scale (which many in our U.S. federal world surely need to be, given the size of the problems), fast in processing, and powerful analytically, but systems that "know" more about the actual human-tangible outcomes. A good way to look at this is to consider the word "smart".

Smart can mean a lot of things to different people; here is what I am thinking:

1) Smart means insightful -- we have a deep understanding of the value that the mission creates, and how to measure it.

2) Smart means optimized -- a just-right level of effort and investment to create the required outcome.

3) Smart means innovative -- new approaches are developed, tested and adopted, and people's creativity is accelerated.

4) But perhaps most of all, smart means enabling people -- with techniques for making people and organizations smarter, by facilitating collaboration and opening up innovation to all stakeholders.

So, while this idea of missions, processes and industries becoming smarter is not itself a technology, it is the more important step: taking advantage of the fact that our IT systems are no longer confined to a raised floor or a departmental LAN. They actually reach all the way to the edge of the process in question. We all know the examples: RFID bringing a supply chain into the real-time computing environment, the highly instrumented battlefield we now have, or the ability to instrument low-level data flows to detect patterns of fraud or security risks.

Once the basic flow of a process is digitized, we can take human insight and creativity, capture it with software and hardware, and apply that insight to the very essence of a mission process at a scale and depth that would never be achievable by a human alone. Think of the computing infrastructure as a giant amplifier for the human creativity of our people.
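To make that concrete, here is a minimal sketch of the amplifier idea. The event shape, the threshold and the rule itself are invented for illustration and are not drawn from any particular system. A human analyst's insight -- "an unusual burst of small transactions on one account deserves a look" -- is captured once in code, and can then be applied to every digitized event that flows past, at a volume no person could review by hand.

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    /**
     * Illustrative only: a human-authored rule ("many small transactions in a
     * short window looks suspicious") captured in code so it can be applied
     * to every digitized event at the edge of a process.
     */
    public class BurstRule {

        /** A digitized event from the edge of the process (hypothetical shape). */
        record Txn(String accountId, double amount, long epochSeconds) {}

        private final double maxAmount;    // ceiling for a "small" transaction
        private final int burstCount;      // how many in the window is "unusual"
        private final long windowSeconds;  // length of the lookback window

        BurstRule(double maxAmount, int burstCount, long windowSeconds) {
            this.maxAmount = maxAmount;
            this.burstCount = burstCount;
            this.windowSeconds = windowSeconds;
        }

        /** Returns account IDs whose recent small-transaction count exceeds the threshold. */
        List<String> flag(List<Txn> events) {
            Map<String, Integer> counts = new HashMap<>();
            long now = events.isEmpty() ? 0 : events.get(events.size() - 1).epochSeconds();
            for (Txn t : events) {
                boolean small = t.amount() <= maxAmount;
                boolean recent = now - t.epochSeconds() <= windowSeconds;
                if (small && recent) {
                    counts.merge(t.accountId(), 1, Integer::sum);
                }
            }
            return counts.entrySet().stream()
                    .filter(e -> e.getValue() >= burstCount)
                    .map(Map.Entry::getKey)
                    .toList();
        }

        public static void main(String[] args) {
            BurstRule rule = new BurstRule(50.0, 3, 3600);
            List<Txn> sample = List.of(
                    new Txn("A-100", 20.0, 1000),
                    new Txn("A-100", 15.0, 1500),
                    new Txn("A-100", 30.0, 2000),
                    new Txn("B-200", 900.0, 2100));
            System.out.println(rule.flag(sample));   // prints [A-100]
        }
    }

The rule is trivial; the point is where it runs. Once the process is digitized, the same few lines of captured insight are applied everywhere, all the time.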

If I focus on more specific technologies, a few do come to mind:

1) Cloud computing. The idea here is simple: Aggregate computing power -- including hardware, the software infrastructure and applications -- into a virtualized, manageable whole, and then automatically dole it out as demands are made by end users. It extends the ideas of grid computing and virtualization all the way up the software stack, so that teams can define, request and get an entire computing environment, tailor-made for a task, automatically and rapidly.
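A small sketch may help show what "define, request and get an entire computing environment" looks like from the team's point of view. The specification fields and the provisioner here are hypothetical stand-ins, not any particular cloud product or API.

    import java.util.Map;
    import java.util.UUID;

    /**
     * Illustrative only: the cloud idea of "describe the environment you need,
     * and the infrastructure assembles it for you." Everything here is a
     * stand-in, not a real provisioning interface.
     */
    public class CloudSketch {

        /** A tailor-made environment request: what the team needs, not how to build it. */
        record EnvironmentSpec(int virtualCpus, int memoryGb, Map<String, String> middleware) {}

        /** Whatever fulfills the request: in a real cloud, a virtualized resource pool. */
        interface Provisioner {
            String provision(EnvironmentSpec spec);
        }

        /** A stand-in provisioner that just pretends to allocate from a pool. */
        static class InMemoryProvisioner implements Provisioner {
            @Override
            public String provision(EnvironmentSpec spec) {
                String environmentId = "env-" + UUID.randomUUID();
                System.out.printf("Allocated %d vCPUs, %d GB, middleware=%s -> %s%n",
                        spec.virtualCpus(), spec.memoryGb(), spec.middleware(), environmentId);
                return environmentId;
            }
        }

        public static void main(String[] args) {
            // The team states the task's needs; the "cloud" returns a ready environment.
            EnvironmentSpec spec = new EnvironmentSpec(
                    8, 32, Map.of("appServer", "standard", "database", "relational"));
            String id = new InMemoryProvisioner().provision(spec);
            System.out.println("Environment ready: " + id);
        }
    }

The point of the sketch is the division of labor: the team describes the outcome it needs, and the infrastructure does the assembling.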

One thing to be careful about: Cloud is not the same as service-oriented architecture! In fact, they are really quite different, and complementary, ideas. SOA is about morphing IT resources into a componentized model that lines up with mission and business needs, letting mission experts think in terms of the mission first rather than having to start with technology. Cloud uses a lot of the base technology of SOA, and is easily implemented on a well-designed SOA infrastructure, but it does not tackle the idea of entire end-to-end enterprise processes. It is nonetheless incredibly powerful and useful.

2) SOA governance. You could argue that this is not a technology, and that would certainly be correct. But, to make SOA real, there are significant challenges in how an enterprise operates: how resources are allocated, how budgets are assigned and measured, how local vs. global value creation is evaluated, and what incentives balance local vs. global optimization.

It has become clear to anyone implementing SOA-style business and mission processes and computing that the governance models need to change. I like to think of it as an example of Conway's Law. [Programmer Melvin Conway, in 1968] observed that the deliverables an organization produces -- the "artifacts," as the software folks like to call them -- have a structure that mirrors the organizational model of the engineering team, in particular its communication pathways.

When we worked in silos created by decomposing problems -- which we were compelled to do to make tasks manageable with the technology available to assist us -- we developed models for personnel reporting, budgeting and success metrics that were very local. That made sense. Now we want to combine significant capabilities across organizations for greater impact or greater efficiency, and we have many of the technical assists we need, but we still have the old organizational model. If you assume that culture and organizational design move relatively slowly, then Conway would tell you that the new SOA style of thinking is being "unwound" by the properties of the old organization and culture. Governance is the body of work that addresses this, and the software and services industries are innovating rapidly around tools and processes to better align organizations and outcomes.

In the end, it's really about realigning how the enterprise works, more than it is specifically about SOA.

3) Modeling and simulation. Once the digital boundary moves out to the edge of our systems and we can sense the actual physical state of our mission elements in real time, it becomes possible to use large-scale computational resources to simulate future states and to conduct what-if analyses that help us make smart decisions as we control the mission systems. Weather forecasting is probably the most familiar example, but the same approach applies to logistics and transportation systems, and to staffing and skill provisioning for human-centric activities (e.g., analysts). The list grows longer every day, and the computing power that would have qualified as a Top 500 supercomputer several years ago is now affordable at essentially departmental-level budgets.
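As a small, hypothetical illustration of that what-if style of analysis -- the arrival rates, clearance rates and backlog limit below are invented numbers, not data from any mission system -- consider asking how many analysts are needed to keep a case backlog from blowing past a limit over a 90-day period.

    import java.util.Random;

    /**
     * Illustrative only: a tiny Monte Carlo "what-if." If cases arrive at a
     * given rate and each analyst clears a given number per day, how often
     * does the backlog exceed a limit over 90 days?
     */
    public class StaffingWhatIf {

        static double probabilityBacklogExceeds(int analysts, double casesPerDay,
                                                double casesPerAnalystPerDay,
                                                int backlogLimit, int days,
                                                int trials, Random rng) {
            int exceeded = 0;
            for (int t = 0; t < trials; t++) {
                double backlog = 0;
                boolean blewLimit = false;
                for (int d = 0; d < days; d++) {
                    // Arrivals fluctuate around the average (simple +/-20 percent noise).
                    double arrivals = casesPerDay * (0.8 + 0.4 * rng.nextDouble());
                    double cleared = analysts * casesPerAnalystPerDay;
                    backlog = Math.max(0, backlog + arrivals - cleared);
                    if (backlog > backlogLimit) { blewLimit = true; break; }
                }
                if (blewLimit) exceeded++;
            }
            return (double) exceeded / trials;
        }

        public static void main(String[] args) {
            Random rng = new Random(42);
            for (int analysts = 8; analysts <= 12; analysts++) {
                double risk = probabilityBacklogExceeds(
                        analysts, 100.0, 10.0, 250, 90, 10_000, rng);
                System.out.printf("%d analysts -> %.1f%% chance backlog exceeds limit%n",
                        analysts, 100 * risk);
            }
        }
    }

Running thousands of simulated quarters per staffing level turns a judgment call into a quantified risk curve, which is exactly the kind of decision support a digitized, instrumented mission makes possible.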

I would consider the relatively new work of using simulated 3-D environments a part of this topic. We have done a lot of interesting work using Second Life to animate the operations of complex software systems in a way that is comprehensible and intuitive to most people. The Defense Department is experimenting with these techniques for training, and they show great promise.

4) Real-time and responsive systems. Again, once we have sensors at the edge, many things change. We are seeing more and more demands to make critical elements of the middleware stack run in real time. Just remember, real fast is not real time. Mostly we are talking about making systems deterministic with guaranteed worst-case response times. Usually, you have to actually give up a little bit of throughput to build a system with a guaranteed maximum response time. We have been shipping a deterministic real-time Java runtime system for about a year now, and it is changing the thinking of many system designers about how they can use a simple and powerful language like Java once the inherent non-determinism associated with its simple (to the programmer!) memory management process is removed.
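A small experiment makes the "real fast is not real time" distinction tangible. The sketch below is not our real-time runtime; it simply measures an ordinary allocation-heavy loop on a conventional Java virtual machine, where the average iteration is quick but the worst case, stretched by garbage-collection pauses, is typically far slower.

    import java.util.ArrayList;
    import java.util.List;

    /**
     * Illustrative only: fast on average is not the same as bounded in the
     * worst case. The loop churns short-lived allocations so the garbage
     * collector occasionally stalls an iteration.
     */
    public class WorstCaseLatency {

        public static void main(String[] args) {
            final int iterations = 50_000;
            long worstNanos = 0;
            long totalNanos = 0;
            List<byte[]> churn = new ArrayList<>();

            for (int i = 0; i < iterations; i++) {
                long start = System.nanoTime();

                // The "work": one allocation-heavy step, standing in for a unit
                // of processing at the edge of a mission system.
                churn.add(new byte[4 * 1024]);
                if (churn.size() > 10_000) {
                    churn.clear();   // drop references so the collector has work to do
                }

                long elapsed = System.nanoTime() - start;
                totalNanos += elapsed;
                worstNanos = Math.max(worstNanos, elapsed);
            }

            System.out.printf("average:    %.1f microseconds%n",
                    totalNanos / 1_000.0 / iterations);
            System.out.printf("worst case: %.1f microseconds%n", worstNanos / 1_000.0);
            // On a conventional JVM the worst case typically lands far above the
            // average; a deterministic runtime trades a little throughput to put
            // a ceiling on that maximum.
        }
    }

That ceiling on the maximum, rather than a lower average, is the property mission-critical control loops actually need.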

