Big data system ready to handle fed intel, surveillance sensor data

The system integrates DataDirect Networks’ Web Object Scaler cloud storage appliance with YottaStor’s mobile computing and big data storage system called YottaDrive.

Smooth data-center move sets template for disaster recovery

Alaska’s Enterprise Technology Services team used a unified computing system to quickly move existing services to a new data center, and validated its disaster recovery approach and failover capabilities in the process.

Mixing supercomputing, social networking for a better view of Earth

The NASA Earth Exchange lets scientists build in hours Landsat-based Earth models that used to take months.

First step to the cloud – virtualization – can be a doozy

For most agencies, server virtualization is a first step toward cloud computing. If you do it right, a move to the cloud is that much easier. However, if you make a mess of it, your journey to the cloud could be tougher.

FBI mulls how to build database of tattoos

The FBI asks experts for suggestions on how to add tattoos to its databases of fingerprints, DNA, voice signatures and iris scans.

Is NSA's Accumulo open source or Google knock-off?

A bill would bar the Defense Department from using the NSA's Accumulo open source software unless DOD can show no viable commercial open source alternatives exist.

DOE wish list: Exascale computing at a price it can afford

The Department of Energy has enlisted a seven-lab consortium to develop hardware and software technologies capable of one quintillion calculations per second.

Big data just got a little smaller

A new way of using hard drives could help government agencies wrestle with the problems associated with big data.

Sensors, cloud drive geospatial agency push for data-center space

Data-storage requirements driven by the growth of remote sensors and the move to the cloud forced the National Geospatial-Intelligence Agency to expand its data center footprint.

4 steps to the right data-sharing architecture for your agency

The IT market is muddied with confusing claims about information sharing technology, so here are steps to an architecture that should clear things up.

City planners probe downtown Philadelphia with 3D GIS

City planners turned to optical remote sensing technology for a pilot project to test a complete 3D GIS solution.

Data center project could make DHS what it was meant to be

The department's multiyear consolidation lays the foundation of its future as a data-driven department.

Topic Resources

  • Everything Storage: Learn How to Store All Your Data Underneath a Single, Extremely Scalable, Software-Defined Architecture

    Join us as we discuss scalable, software-defined storage architecture and integration, along with the hardware and software technologies that power Everything Storage. You will also be introduced to the leading massive-scale, open-platform solution for object storage. Be sure to tune in to learn and ask questions about how open-platform solutions can bring you enterprise-class reliability, performance and simplicity without vendor lock-in and the associated price tag.

  • Data Center Micro-Segmentation

    The software-defined data center is beginning to reveal some benefits beyond agility, speed, and efficiency to government agencies. One critical area is security. Read this brief to find out how VMware is making micro-segmentation operationally feasible in the data center for the first time.

  • Moving to a Private Cloud? Infrastructure Really Matters!

    This Clipper Group paper outlines the importance of the underlying IT infrastructure for private cloud environments and lays the groundwork for clients navigating the cloud, highlighting many key aspects of a private cloud infrastructure. Download this informative whitepaper to gain further insight into cloud computing.

  • Store Less, Spend Less: Managing Data According to Its Economic Value

    The term “big data” is now passing into the common vernacular. Although it initially referred as much to the power of business analytics and data mining, the term has since swung to encompass data that is simply big and, of course, resource hungry, whether in terms of management, sheer storage capacity, or budgets. In that respect, the numbers are getting increasingly alarming: estimates suggest that 90% of the world’s information was created in the last two years, and that 2.5 quintillion bytes of data are created daily. Download this informative whitepaper to gain further insight into the conflicting demands of storage management.

  • Big Five in Overdrive: Are State and Local Networks Ready?

    As state and local government organizations grapple with tight budgets, demanding constituents, growing data volumes, and cross-government coordination challenges, the "Big Five of IT" – data center consolidation, mobility, security, big data, and cloud computing – promise to improve agency performance, productivity, and service. But are state and local networks ready? This research report examines potential network impact, state and local readiness, and how organizations can prepare for the onslaught.