Federal IT managers are moving toward a software-defined approach, which they say can eliminate routine tasks, spur innovation and save money, according to a recent survey.
As GCN marks its 30th year, we're taking stock of how far computing has progressed in three decades. Here's a look at the most powerful supercomputers of 1982 and 2012.
A 30-year timeline of key developments in hardware, software and virtualization.
The agency gets ready to fire up the world's third-largest data center, with zettabytes of capacity, as it begins work on another new high-performance center.
The joint operation cuts off more than 1,400 botnets used in the theft of $500 million worldwide, but it also shows how nimble cyber criminals are in distributing their malware.
At a conference focusing on management of federal data centers, Schneider Electric's Bob Massie spelled out a dozen areas to target.
Agencies are seeing benefits from consolidation, though most can't quantify cost savings, according to a MeriTalk survey. And 56 percent of IT pros grade agencies' efforts at "C" or below.
Entirely virtualized, software-defined data centers will change service delivery for everyone from the citizen to the warfighter, experts say.
Facility in Richmond, Va., will speed up development and deployment of secure cloud computing services.
Iron Mountain offers a secure, multitenant underground facility with a portfolio of services for data migration, networking, tape handling and recycling of data center assets.
Government sysadmins are capable of innovation, but it's tough when they're busy maintaining the status quo through manual processes.
Discovery/Alert 7.0 installs nodes at each data location, allowing analysts to query structured and unstructured data simultaneously.