big data
 
  • Maritime test bed lets Navy validate ISR systems

    Lockheed Martin has developed a new software test platform designed to mimic naval environments at sea and ashore, which will allow the company to validate sophisticated intelligence, communications and sensor systems before they are introduced in an operational environment.

  • Benchmark compares Hadoop systems

    The Transaction Processing Performance Council's TPCx-HS benchmark was developed to provide performance, availability and energy consumption metrics of big data systems.

  • Dousing wildfires with big data

    Researchers at the San Diego Supercomputer Center are using sensor data and satellite information for situational wildfire surveillance.

  • DARPA challenges teams to predict virus spread

    Teams will build models that predict the spread of the Chikungunya virus in the Americas; the tools could be applied to other diseases and inform responses to emergencies.

  • 2014 GCN Award Winners

    10 public sector projects win GCN Awards for IT excellence

    The 2014 GCN Award winning projects range from a system that streamlined a process for victims of physical abuse to obtain protective orders to a cost-saving mobile app by a self-taught Air Force dev team.

  • How CMS takes on and beats back Medicare fraud

    Integrated with several Medicare databases, the Fraud Prevention System uses predictive algorithms and analytics to compare billing patterns against Medicare fee-for-service claims prior to payment.

  • The National Water Center will run state-of-the-art water management models

    Tech takes on water resource challenges

    The new National Water Center, the first U.S. center for water forecast operations, research and collaboration across federal agencies, is opening for business on the campus of the University of Alabama, Tuscaloosa.

  • Data lakes may hold promise for big data analytics

    Data lakes: Don’t dive in just yet

    While some claim data lakes are essential to capitalizing on big data analytics, there is no common view about what a data lake is or how it can provide value, said Gartner researchers.

  • Wire data analytics: Toward a ‘single pane of glass’ for IT operations analytics

    Wire data analytics uses packet data to monitor activity across the network stack and may be the final step in the development of a single monitoring and management architecture for enterprise IT operations.

  • Using analytics to reduce child abuse risk

    Florida's Department of Children and Families is studying the use of predictive analytics to reduce child fatalities.

Topic Resources

  • Store Less, Spend Less: Managing Data According to Its Economic Value

    The term “big data” is now passing into the common vernacular. Although it initially referred as much to the power of business analytics and data mining, its meaning has now swung to encompass data that is simply big and, of course, resource hungry, whether in terms of management, sheer storage capacity, or budgets. In that respect, the numbers are getting increasingly alarming: Estimates suggest that 90% of the world’s information was created in the last two years, and 2.5 quintillion bytes of data are created daily. Download this informative whitepaper to gain further insights into the conflicting demands of storage management.

  • Big Five in Overdrive: Are State and Local Networks Ready?

    As state and local government organizations grapple with tight budgets, demanding constituents, growing data volumes, and cross-government coordination challenges, the "Big Five of IT" – data center consolidation, mobility, security, big data, and cloud computing – promise to improve agency performance, productivity, and service. But are state and local networks ready? This research report examines potential network impact, state and local readiness, and how organizations can prepare for the onslaught.

  • The New Rules for Storage in the Era of Big Data

    The big data era presents new opportunities and advantages for agencies that are able to exploit its capabilities. However, the velocity at which data must be processed poses major challenges for IT infrastructure. This whitepaper examines what all-flash storage systems can do to help agencies leverage big data technologies and optimize the benefits they can attain.

  • An Executive Guide to Analytics Infrastructure: Imperatives for Advanced Analytics

    Get the most information from your data. Business has never been faster and is increasingly dependent on the timely analysis of large volumes of data for strategic and operational decisions. As executives come to rely on advanced analytics, the importance of a robust IT infrastructure cannot be overstated. Data comes from everywhere; make sure you get it in time, every time.

  • Research Report: The Virtualization Playbook

    The increasing complexity of the government enterprise—and the increasing volume and complexity of the data being managed—is forcing agency IT managers to rethink how they manage the infrastructure. Virtualization is at the center of that. By virtualizing key elements of the data center, agencies gain flexibility, scalability and manageability. And those gains are magnified when agencies integrate those systems into a cohesive whole. At the same time, IT managers are looking to bolster their infrastructures with emerging storage and storage management options. The data center of the future is taking shape now.