While 'big data' was once a useful shorthand for the idea of exploiting huge volumes of structured and unstructured information, many feel the term is past its sell-by date.
While geographic information systems have become well established in the federal government, the current challenge for agencies is to integrate their data with that from their counterparts in other jurisdictions.
Government agencies are making strides testing uses of big data to predict risks of disease or the path of a killer virus, but hurdles remain, including linking legacy datasets and setting up common vocabularies.
Researchers at Oak Ridge National Lab used three diverse high-performance computing architectures to analyze publicly available health-related datasets.
The Office of Federal Student Aid is looking to new and unconventional data models to determine the creditworthiness of its loan applicants.
The elevation data will better enable scientists to monitor the impact of sea-level rise, conduct environmental monitoring activities and support local decision making.
The Case Advice app pulls data from a variety of back-office systems and makes it available to case workers while they’re in the field.
The economic development agency plans to expand its use of data analytics and visualization technologies to convert more data into actionable information.
Novetta’s Entity Analytics software connects structured, semi-structured and unstructured data within Hadoop, enabling powerful analytic queries.
With its data-driven automation and orchestration, Swimlane manages alerts, increases situational awareness and remediates threats.
The Australian government’s guide aims to help agencies leverage their data assets by adopting big data and analytics tools.
College students compete to develop the best apps using Watson’s supercomputing powers to solve city problems.