The great evolving, dissolving government data center
- By Paul McCloskey
- Jun 10, 2013
The modern government data center sprang from the earliest forms of computing hardware and software, but its future is all virtual. Here’s a timeline of key events over the last 30 years.
IBM introduces the model 5150 personal computer, pushing widespread deployment of desktop IT throughout government agencies.
Fast, inexpensive magnetic tape cartridges replace the standard circular tape reel that had been a feature of big computer systems since the 1950s.
MIPS and IBM release the first reduced instruction set computers (RISC), showing that 80 percent of a computer's operations run on 20 percent of its instruction set.
IBM introduces the Application System/400, a midrange computer designed for small and intermediate organizations that will become one of the world's most popular departmental computers.
IBM intros the 3390 Direct Access Storage Device (DASD), 40 percent faster than its predecessor and capable of storing as much data in one-third the space.
VMware Workstation becomes the first product to allow users to run multiple instances of x86 operating systems on a single PC.
Data centers account for 1.5 percent of total U.S. power consumption, growing 10 percent annually.
The Agriculture Department recommends moving National Finance Center from New Orleans because of hurricane risk.
Texas awards an $863 million, 10-year contract to IBM to build a data center in the state capital of Austin. The state projects savings of $159 million over the first seven years.
Cisco unveils its unified computing system, fusing computing, network, storage and virtualization into a single system.
The Environmental Protection Agency estimates that the energy consumption of servers and data centers has doubled in the past five years and will double again to more than 100 billion kilowatt hours, at a cost of $7.4 billion annually. Federal servers and data centers account for 10 percent of this electricity use.
The Navy consolidates 2,700 x86 servers that power the Navy and Marine Corps Intranet to reduce energy consumption, space requirements and costs. The new servers will host multiple VMware ESX virtual machines.
Microsoft builds a data center in Northlake, Ill., providing the capacity for as many as 440,000 Windows servers on the first floor alone, 10 times the 40,000 servers conventional data centers can hold.
Federal CIO Vivek Kundra directs agencies to make an inventory of their IT assets in preparation for the largest data center consolidation project in history.
“Converged infrastructure” enters the data center lexicon, referring to technologies in which the complete systems enchilada – I/O, networking, storage and computing – is configured in software.
Data center research organization Uptime Institute reports that 36 percent of the large companies it surveyed expect to exhaust IT capacity within the next 18 months.
The Homeland Security Department lays out a strategy for using nine private-cloud services and three public-cloud services.
San Francisco announces a plan to consolidate and upgrade its data center, with a minimum goal of virtualizing 450 servers, or 29 percent of its total infrastructure.
The Open Compute Project is started at Facebook to produce the most efficient server, storage and data center hardware designs for scalable computing at the lowest possible cost.
Paul McCloskey is editor-in-chief of GCN. Follow him on Twitter: @Paul_GCN.