
The great evolving, dissolving government data center

The modern government data center sprang from the earliest forms of computing hardware and software, but its future is all virtual. Here’s a timeline of key events over the last 30 years.

1981
IBM introduces the Model 5150 personal computer, spurring widespread deployment of desktop IT throughout government agencies.

1984
Fast, inexpensive magnetic tape cartridges replace the standard circular tape reel that had been a feature of big computer systems since the 1950s.

1986
MIPS and IBM release the first reduced instruction set computer (RISC) systems, building on research showing that 80 percent of a computer's operations use only 20 percent of its instruction set.

1988
IBM introduces the Application System/400, a midrange computer designed for small and intermediate-sized organizations that will become one of the world's most popular departmental computers.

1991
IBM introduces the 3390 Direct Access Storage Device (DASD), which is 40 percent faster than its predecessor and stores as much data in one-third the space.

1999
VMware Workstation becomes the first product to allow users to run multiple instances of x86 operating systems on a single PC.

2002
Data centers account for 1.5 percent of total U.S. power consumption, growing 10 percent annually.

2005
The Agriculture Department recommends moving the National Finance Center out of New Orleans because of hurricane risk.

2006
Texas awards an $863 million, 10-year contract to IBM to build a data center in the state capital of Austin. The state projects savings of $159 million over the first seven years.

2007
Cisco unveils its Unified Computing System, fusing computing, networking, storage and virtualization into a single system.

The Environmental Protection Agency estimates that the energy consumption of servers and data centers doubled over the previous five years and would double again to more than 100 billion kilowatt-hours, at an annual cost of $7.4 billion. Federal servers and data centers account for 10 percent of that electricity use.

2008
The Navy consolidates 2,700 x86 servers that power the Navy and Marine Corps Intranet to reduce energy consumption, space requirements and costs. The new servers will host multiple VMware ESX virtual machines.

2009
Microsoft builds a data center in Northlake, Ill., with capacity for as many as 440,000 Windows servers on the first floor alone, 10 times the roughly 40,000 servers a conventional data center can hold.

2010
Federal CIO Vivek Kundra directs agencies to inventory their IT assets in preparation for the largest data center consolidation effort in history.

“Converged infrastructure” enters the data center lexicon, referring to systems in which the complete stack – computing, networking, storage and I/O – is configured in software.

2011
Data center research organization Uptime Institute reports that 36 percent of the large companies it surveyed expect to exhaust IT capacity within the next 18 months.

The Homeland Security Department lays out a strategy for using nine private-cloud services and three public-cloud services.

2012
San Francisco announces a plan to consolidate and upgrade its data center, with a minimum goal of virtualizing 450 servers, or 29 percent of its total infrastructure.

The Open Compute Project is launched at Facebook to produce the most efficient server, storage and data center hardware designs for scalable computing at the lowest possible cost.


About the Author

Paul McCloskey is editor-in-chief of GCN. Follow him on Twitter: @Paul_GCN.
