GCN INTERVIEW—Lt. Gen. Jeffrey Sorenson

Army CIO sets sights on improved data sharing

Sorenson discusses life without portable drives and his plans for improving data sharing, data quality and governance

Lt. Gen. Jeffrey Sorenson, the Army’s chief information officer since late 2007, has spent more than 20 years in military information technology acquisition and has directed numerous science and technology integration programs for the Army. He spoke recently about how the Army is coping without flash drives and how he intends to improve data sharing, data quality and governance. -- Barry Rosenberg

GCN: How has the Pentagon’s ban of flash drives and other removable media, issued in November 2008, affected the way the Army shares data?

Lt. Gen. Jeffrey Sorenson: Clearly it has had some operational impacts, of which we didn’t have full cognizance at the outset. When the dictate was put out that thumb drives were no longer going to be allowed, it did have some operational implications because this was how different orders, missions and organizational information were transmitted from headquarters to headquarters. Over time, we’ve had to go back and look at how we transfer data, and, clearly, the use of the thumb drive was one of these expedient methods by which information was passed between computers because we didn’t have a system set up properly to transfer the data. So, I don’t want to say it was a blessing in disguise, but it has helped us go back and look at precisely how we transmit data, what data is required, and how we can have that data transferred via the system versus a manual thumb drive.

Can Army Knowledge Online play a role in data sharing?

In some cases the answer is yes, but it still poses a considerable challenge because we have not built our systems to essentially effect the data-to-data transmission. It is now system to system. And until we get systems like the [Future Combat Systems'] Battle Command system, otherwise known as [the System-of-Systems Common Operating Environment], out to the field, where they are built on a service-oriented architecture [and] are essentially drawing data and using modules, we’re going to continue to be pressed with the need to transfer data through either AKO or hosted on a shared drive.

Can you estimate when that might happen?

I think for the entire Army to do that will be a number of years in the future. We are at this time developing battle command capabilities like the [FCS] that will begin to mitigate that. We also have capabilities like the Command Post of the Future, which is beginning to provide for collaboration and use of data transmitted via the system. And there is the whole concept of the network service center, by which data can be forward-staged and transmitted via the network as opposed to people picking up their hard drives, or, in this case, what used to be thumb drives or servers, and moving them. We’re still a number of years away from a net-centric or net-enabled capability that can be used to share data.

What are your thoughts on the Obama administration's plans to create a Pentagon command to coordinate security of military computer networks?

I think it’s a good idea. In many cases, as we’ve learned through the most recent Army “Rampart Yankee” and [Defense Department] “Buckshot Yankee” exercises — where we had to go off and remediate computer systems because of some infected thumb drives — that was a rather laborious, manually intensive effort to achieve a capability that we would like to have, which would be machine-to-machine. Today, you sign on, and if you’ve got the Microsoft operating system, you typically will get a notice in the machine that notifies you of the update. And whether you have a legitimate copy of that software or an illegitimate copy, Microsoft knows of every machine that has a Microsoft operating system and, as a consequence, can continue to update the software to prevent malicious code from being propagated. That ability to do machine-to-machine updates, machine-to-machine visibility of the network and machine-to-machine control of the network will be necessary in the future. Today, we don’t have that capability.

You’ve expressed your desire for better data quality and governance. Where are you at this point?

We’ve been very successful in this respect. We have at least got the organization put together, and it's beginning to go off and do some great things. We’ve built an organization here that is now defining and working with what we call “data stewards,” the proponents within the functional domains as well as within some of the major commands that have the responsibility for validating authoritative data sources. We have expanded the team to include Dr. Richard Wang from [the Massachusetts Institute of Technology], who is a national asset in terms of understanding data quality and making sure the data is of sufficient quality to establish that these are authoritative data sources. We also have recently acquired an individual named Dan Jensen, who did a lot of the same type of work for the Navy at the Fairfield Data Center in California. And we’ve brought on the data center of excellence folks from the Fort Monmouth organization run by Judy Pinsky.

They are the stewards?

They each have some individual responsibilities to work. What we’ve done is to begin to dissect their areas of responsibility. Judy Pinsky has got the lead on data services and tech support. Dan Jensen is going to be the lead on the data framework. Professor Wang has the lead on quality. We’ve got our group working on policy guidance. So we’re off on two use cases that we are building now — collecting the authoritative data sources to satisfy these use cases.

One you probably read about in the newspaper, with respect to suicides by Army members. Clearly, we had to go back in and dig through a lot of different data sources to begin to pull out the systemic issues that are resulting in what has been an increase in Army suicides over the last year or so. And clearly, we had to get into data that the Army surgeon had, some of the data that the G1 had, and some of the data that the Human Resources Command had. So we had all these different data sources, none of which had the same data, and began to pull those pieces of data together to do some analysis to help try to figure out why we were having this issue. So that’s going on right now with the G1 and the vice chief of staff.

The second use case is what we call the executive management system, by which the G3 of the Army has tasked us to assess and help provide some authoritative data sources for measuring the readiness and operational capabilities of the Army on a daily basis. … So our organization has gone to talk to the Army Materiel Command, Forces Command, [Army’s Training and Doctrine Command], and a number of the different functional areas such as battle commands to begin to identify those authoritative data sources by which we do different functions within the Army. It’s a monumental effort at this point to try to clean up what data we use — but also, more to the point, to find those single sources of truth that we need to operate and propagate, as opposed to finding data that is not authoritative but derived or enhanced. We want to get to the data source and make it available, accessible and discoverable by anybody who needs it.

What are your fiscal 2010 priorities?

First, we want to make sure we can fund the entire transition to the Global Network Enterprise Construct. We’re talking about funding to field the follow-on fixed regional hub nodes, and our ability to establish the area processing centers. Right now, we have a request in to the Army staff about standardizing the toolsets we use to evaluate the network across the different theater network operations security centers.

We have some other improvements from the tactical side: clearly the Warfighter Information Network-Tactical program, certainly the rifleman radio, and trying to make sure we continue to forge forward with a new tactical radio system. Those are the primary focus areas.


Reader Comments

Sat, Sep 19, 2009 SpectateSwamp Canada

Data sharing must follow great data access; otherwise it's just data dumping. If you don't have great access to your own data, what are you sharing? Without allowing users to keep a copy of everything they share, there won't be sharing from the most important group: the staff.

Fri, Aug 21, 2009 Jim Ruth Fort Leavenworth

Data sharing, data quality and data governance are a long way apart from removable media in the IT domain. Removable media allowed local file sharing for immediate consumption. Data sharing, quality and governance are processes that support the enterprise and decision-makers at all levels. It seems a bit of a stretch that these two disparate things are being addressed in the same article as though they had a critical relationship.
