NASCIO's new plan: CIOs take the lead

Top state IT leaders have adopted a new strategic plan aimed at putting a stronger emphasis on CIO leadership.

Mass. health authority uses ID software to prevent fraud

The Massachusetts Health Insurance Connector Authority is using LexisNexis identity management software to verify the residency information of people participating in the exchange.

41 percent of public sector can't handle data deluge, survey finds

Oracle's survey also hands out plenty of C, D and F grades to public- and private-sector organizations for how well they handle their data.

Advanced radar would 'see' through jungle to catch drug smugglers

The project is one of many at the U.S. Army's SouthCom that combine IT expertise from multiple agencies, including foreign ones, to support forces in the field.

Big data system ready to handle fed intel, surveillance sensor data

The system integrates DataDirect Networks’ Web Object Scaler cloud storage appliance with YottaStor’s mobile computing and big data storage system called YottaDrive.

Smooth data-center move sets template for disaster recovery

Alaska’s Enterprise Technology Services team used a unified computing system to quickly move existing services to a new data center, and validated its disaster recovery approach and failover capabilities in the process.

Mixing supercomputing, social networking for a better view of Earth

The NASA Earth Exchange lets scientists build in hours Landsat-based Earth models that used to take months.

First step to the cloud – virtualization – can be a doozy

For most agencies, server virtualization is a first step toward cloud computing. If you do it right, a move to the cloud is that much easier. However, if you make a mess of it, your journey to the cloud could be tougher.

FBI mulls how to build database of tattoos

The FBI is asking experts for suggestions on how to add tattoos to its databases of fingerprints, DNA, voice signatures and iris scans.

Is NSA's Accumulo open source or Google knock-off?

A bill would bar the Defense Department from using the NSA's Accumulo open source software unless DOD can show no viable commercial open source alternatives exist.

DOE wish list: Exascale computing at a price it can afford

The Department of Energy has enlisted a seven-lab consortium to develop hardware and software technologies capable of performing one quintillion calculations per second.

Big data just got a little smaller

A new way of using hard drives could help government agencies wrestle with the problems associated with big data.

Topic Resources

  • Moving to a Private Cloud? Infrastructure Really Matters!

    This Clipper Group paper outlines the importance of the underlying IT infrastructure for private cloud environments and lays the groundwork for clients navigating the cloud, highlighting many key aspects of a private cloud infrastructure. Download this informative whitepaper to gain further insight into cloud computing.

  • Store Less, Spend Less: Managing Data According to Its Economic Value

    The term “big data” is now passing into the common vernacular. Although the term initially referred as much to the power of business analytics and data mining, its meaning has now swung to encapsulate data that is simply big and, of course, resource-hungry, whether in terms of management, sheer storage capacity, or budgets. In that respect, the numbers are getting increasingly alarming: Estimates suggest that 90% of the world’s information was created in the last two years, and 2.5 quintillion bytes of data are created daily. Download this informative whitepaper to gain further insight into the conflicting demands of storage management.

  • Big Five in Overdrive: Are State and Local Networks Ready?

    As state and local government organizations grapple with tight budgets, demanding constituents, growing data volumes, and cross-government coordination challenges, the "Big Five of IT" – data center consolidation, mobility, security, big data, and cloud computing – promise to improve agency performance, productivity, and service. But are state and local networks ready? This research report examines potential network impact, state and local readiness, and how organizations can prepare for the onslaught.

  • Research Report: The Virtualization Playbook

    The increasing complexity of the government enterprise—and the increasing volume and complexity of the data being managed—is forcing agency IT managers to rethink how they manage the infrastructure. Virtualization is at the center of that. By virtualizing key elements of the data center, agencies gain flexibility, scalability and manageability. And those gains are magnified when agencies integrate those systems into a cohesive whole. At the same time, IT managers are looking to bolster their infrastructures with emerging storage and storage management options. The data center of the future is taking shape now.

  • Deciding on the Right Flash Storage for High Density Virtual Infrastructures

    Read this whitepaper to discover how IT planners in the federal space are looking to flash-based storage to continue the quest for scale and density.