New techniques behind Energy's plan for exascale computing

Scientists at the Energy Department's Los Alamos National Lab are working toward building an exascale computer that by 2020 could be powerful enough to model the human brain, cell by cell.

Emerging protocol can help manage the Internet of Things

MQTT, a lightweight publish/subscribe messaging protocol proposed as an OASIS standard, can help agencies manage the data generated by sensors, mobile devices and other machine-to-machine networks.
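
As a rough illustration of the publish/subscribe model behind MQTT, here is a minimal sketch using the open-source Eclipse Paho Python client (paho-mqtt 1.x API); the broker address and topic names below are hypothetical:

import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    # Subscribe to every device's temperature feed; '+' matches one topic level
    client.subscribe("agency/sensors/+/temperature")

def on_message(client, userdata, msg):
    # Each sensor publishes a small payload; the broker fans it out to subscribers
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.example.gov", 1883, keepalive=60)  # hypothetical broker
client.loop_forever()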

Is big data big trouble for state, local governments?

A MeriTalk survey finds that many agencies lack the storage capacity, computing power and personnel to handle the data they have.

Internet2 backbone serving national supercomputer network

Operators of the XSEDE research network say bandwidth and networking tools are key to moving large files between high-performance systems.

How to use Excel for on-the-spot analytics

New add-ons for the spreadsheet program can combine Census, weather and demographic data to aid evacuations in an emergency.
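
The add-ins do this work inside the spreadsheet itself; as a rough stand-in, here is a hedged pandas sketch of the same kind of join, ranking tracts for evacuation priority (the file names and columns are hypothetical):

import pandas as pd

# Hypothetical extracts; the Excel add-ins pull similar tables from
# Census and weather feeds directly into the spreadsheet.
population = pd.read_csv("census_tracts.csv")   # tract, population, pct_over_65
forecast   = pd.read_csv("storm_forecast.csv")  # tract, surge_risk

# Join the demographic and weather data on the shared tract ID,
# then rank the tracts where evacuation help is most needed.
merged = population.merge(forecast, on="tract")
merged["priority"] = merged["surge_risk"] * merged["pct_over_65"]
print(merged.sort_values("priority", ascending=False).head(10))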

Service helps agencies pinpoint sources of improper payments

Fiserv's payment accuracy and fraud solution helps agencies reduce losses from improper disbursements.

Genome research creating data that's too big for IT

NCI is looking for ways cloud computing could help researchers tap its petabyte-scale human cancer genome database, which is "breaking the standard model" of research.

Two states overhaul IT to boost social services

Arkansas and South Carolina scrap older systems for cloud-based technologies and big data analytics software.

DISA plans for exabytes of drone, satellite data

The agency is planning a secure cloud to hold all the imagery and data collected by the military's drones, satellites and other sources.

Service lets agencies beam big data to and from Amazon cloud

Attunity's CloudBeam is a software-as-a-service platform integrated with Amazon Web Services' S3 storage, providing file replication, synchronization and managed file transfer.
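
CloudBeam's own interface isn't shown here; purely as a hedged sketch of the kind of transfer it manages, the following uses the standard boto3 client to push a large file into S3 with parallel multipart uploads (the bucket, key and file names are hypothetical):

import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Split the upload into 100 MB parts and send several in parallel,
# the usual way large files are moved into S3 efficiently.
config = TransferConfig(multipart_chunksize=100 * 1024 * 1024,
                        max_concurrency=8)

s3.upload_file("drone_imagery.tar", "agency-ingest-bucket",
               "imagery/drone_imagery.tar", Config=config)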

Police tap layman-friendly analytics to track gang activity

The Crime Analysis Unit in Alexandria, Va., adopts uReveal, which lets analysts fuse and extract knowledge from any type of data, structured or unstructured.

Chiliad takes virtual consolidation route to big data analytics

Discovery/Alert 7.0 installs nodes at each data location, allowing analysts to query structured and unstructured data simultaneously.
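
Chiliad's node software is proprietary, but the virtual-consolidation pattern itself can be sketched: fan one query out to per-location nodes and merge the ranked hits, leaving the source data in place. The endpoints and response fields below are hypothetical:

from concurrent.futures import ThreadPoolExecutor

import requests

# Hypothetical per-location query endpoints; the nodes stay with the
# data rather than copying everything into a central warehouse.
NODES = [
    "https://records.example.gov/query",
    "https://casefiles.example.gov/query",
]

def query_node(url, term):
    # Each node searches its own structured and unstructured holdings;
    # only the ranked hits travel back over the wire, not the raw data.
    resp = requests.get(url, params={"q": term}, timeout=30)
    resp.raise_for_status()
    return resp.json()["hits"]

def federated_search(term):
    # Fan the query out to every node in parallel, then merge the answers
    with ThreadPoolExecutor(max_workers=len(NODES)) as pool:
        results = pool.map(lambda url: query_node(url, term), NODES)
    merged = [hit for hits in results for hit in hits]
    return sorted(merged, key=lambda hit: hit["score"], reverse=True)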

Topic Resources

  • Everything Storage: Learn How to Store All Your Data Underneath a Single, Extremely Scalable, Software-Defined Architecture

    Join us as we discuss scalable, software-defined storage architecture and integration, along with the hardware and software technologies that make Everything Storage possible. You will also be introduced to the leading massive-scale, open-platform solution for object storage. Be sure to tune in to learn and ask questions about how open-platform solutions can deliver enterprise-class reliability, performance and simplicity without vendor lock-in and the associated price tag.

  • Moving to a Private Cloud? Infrastructure Really Matters!

    This Clipper Group paper outlines the importance of the underlying IT infrastructure in private cloud environments and lays the groundwork for clients planning a move to the cloud, highlighting key aspects of private cloud infrastructure. Download this informative whitepaper to gain further insight into cloud computing.

  • Store Less, Spend Less: Managing Data According to Its Economic Value

    The term “big data” is now passing into the common vernacular. Although it initially referred as much to the power of business analytics and data mining, the term has since swung to encompass data that is simply big and, of course, resource-hungry, whether in terms of management, sheer storage capacity or budgets. In that respect, the numbers are increasingly alarming: estimates suggest that 90 percent of the world’s information was created in the last two years, and that 2.5 quintillion bytes of data are created daily. Download this informative whitepaper to gain further insight into the conflicting demands of storage management.

  • Big Five in Overdrive: Are State and Local Networks Ready?

    As state and local government organizations grapple with tight budgets, demanding constituents, growing data volumes, and cross-government coordination challenges, the "Big Five of IT" – data center consolidation, mobility, security, big data, and cloud computing – promise to improve agency performance, productivity, and service. But are state and local networks ready? This research report examines potential network impact, state and local readiness, and how organizations can prepare for the onslaught.

  • Research Report: The Virtualization Playbook

    The increasing complexity of the government enterprise—and the increasing volume and complexity of the data being managed—is forcing agency IT managers to rethink how they manage the infrastructure. Virtualization is at the center of that. By virtualizing key elements of the data center, agencies gain flexibility, scalability and manageability. And those gains are magnified when agencies integrate those systems into a cohesive whole. At the same time, IT managers are looking to bolster their infrastructures with emerging storage and storage management options. The data center of the future is taking shape now.