Big data best practices from Australia

The Australian government’s guide aims to help agencies leverage their data assets by adopting big data and analytics tools.

CUNY students turn Watson loose on NYC challenges

College students compete to develop the best apps using Watson’s supercomputing powers to solve city problems.

Nowcasting: Disease monitoring at Internet speed

New research adds to growing evidence that disease monitoring techniques built on social media data are beginning to replace costlier and slower traditional surveillance tools.
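
The idea behind nowcasting is simple: use a signal that is available right away, such as the volume of symptom-related posts, to estimate a figure that official surveillance reports only with a lag. Below is a minimal sketch of that idea in Python; every number is invented and none of it is drawn from the research above.

    # Toy nowcasting sketch (illustrative only; all numbers are invented).
    # A linear model fitted on past weeks relates social media volume to
    # confirmed cases, then estimates the current week's cases before
    # official surveillance figures are published.
    import numpy as np

    posts = np.array([120, 340, 560, 800, 950, 700, 400], dtype=float)  # weekly symptom-related posts
    cases = np.array([15, 42, 70, 98, 115, 88, 51], dtype=float)        # confirmed cases, same weeks

    # Least-squares fit of cases ~ a * posts + b
    a, b = np.polyfit(posts, cases, deg=1)

    # "Nowcast" the current week from today's post volume
    current_posts = 620
    print(f"Estimated current-week cases: {a * current_posts + b:.0f}")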

Chicago builds ETL toolkit for open data

Data officials in Chicago built an automated extract, transform and load (ETL) framework to publish city data more quickly and easily.
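
A pipeline of this kind follows a simple pattern: pull raw records out of a source system, normalize them to a stable schema, and write the result to the file that feeds the open data portal. The sketch below illustrates that pattern with a hypothetical permits dataset and made-up column names; it is not the city's actual toolkit.

    # Minimal extract-transform-load (ETL) sketch for publishing a city dataset.
    # File names, column names and schema here are hypothetical.
    import csv

    def extract(path):
        """Read raw rows from a source system's CSV export."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        """Map inconsistent source headers to a stable schema and drop incomplete records."""
        cleaned = []
        for row in rows:
            if not row.get("permit_id"):
                continue
            cleaned.append({
                "permit_id": row["permit_id"].strip(),
                "issued_date": row.get("issue date", "").strip(),
                "ward": row.get("WARD", "").strip(),
            })
        return cleaned

    def load(rows, path):
        """Write the cleaned rows to the file published on the open data portal."""
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["permit_id", "issued_date", "ward"])
            writer.writeheader()
            writer.writerows(rows)

    if __name__ == "__main__":
        load(transform(extract("permits_raw.csv")), "permits_open.csv")

The point of automating steps like these is that a dataset can then be refreshed on a schedule rather than by hand.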

Science community building standard data formats

Scientific fields that lack standard data formats are building new platforms to ease the information sharing that advances research.

IBM extends Smarter Cities Challenge

Building on four years of helping cities improve services, IBM's Smarter Cities Challenge invites local governments to apply for assistance from problem-solving teams.

How analytics tools can help agencies sharpen e-discovery

Advanced analytics tools can often streamline legal e-discovery workflows, eliminating the multiple review passes that characterize manual document review.
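
One common approach is to rank unreviewed documents by predicted responsiveness so reviewers read the likeliest matches first instead of making repeated passes over the entire collection. The toy sketch below shows that ranking step with a handful of invented documents; it does not describe any particular product.

    # Toy relevance-ranking sketch for document review (all text is invented).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # A small seed set already reviewed by attorneys: 1 = responsive, 0 = not
    seed_docs = [
        "contract amendment pricing terms for the 2014 fiscal year",
        "lunch menu for the cafeteria next week",
        "email thread discussing the disputed invoice and payment schedule",
        "holiday party planning and parking reminders",
    ]
    seed_labels = [1, 0, 1, 0]

    # Unreviewed collection to be prioritized rather than read end to end
    collection = [
        "follow-up on the invoice dispute and revised payment terms",
        "building maintenance notice for elevator repairs",
    ]

    vectorizer = TfidfVectorizer()
    model = LogisticRegression().fit(vectorizer.fit_transform(seed_docs), seed_labels)

    # Score the collection; reviewers start with the highest-scoring documents
    scores = model.predict_proba(vectorizer.transform(collection))[:, 1]
    for doc, score in sorted(zip(collection, scores), key=lambda pair: -pair[1]):
        print(f"{score:.2f}  {doc}")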

Uber to open its data to Boston transit planners

Uber will provide Boston with trip data from its quarterly logs to help the city improve transportation planning.

5 trends that will drive IT management in 2015

2015 is shaping up to be a year in which data analytics, ubiquitous video, and cyber forensics will force government IT managers to decide how to deploy their resources.

Cloud ‘Commons’ would accelerate biomedical research

The National Institutes of Health is supporting the launch of an electronic “Commons,” a shared cloud and high-performance computing ecosystem to support the biomedical research community.

Argonne sets new marks for high-speed data transfer

Argonne National Laboratory researchers moved 65 terabytes of data between storage centers in Ottawa and New Orleans in under 100 minutes.
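
For context, a back-of-the-envelope calculation from the reported figures (assuming decimal terabytes and the full 100 minutes) puts the average rate at roughly 87 gigabits per second.

    # Rough throughput check from the figures reported above (decimal units assumed).
    terabytes = 65
    minutes = 100

    bits = terabytes * 1e12 * 8      # 65 TB expressed in bits
    seconds = minutes * 60
    gbps = bits / seconds / 1e9

    print(f"Average throughput: about {gbps:.0f} Gbps")  # roughly 87 Gbps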

How will you manage big (and bigger) data in 2015?

Government IT departments will increasingly turn to data warehouse augmentation tools and tactics in 2015 to address their big data management challenges.

Topic Resources

  • Mining Gold from Machine Data

    Holistic visibility is more important than ever because most IT services, including email and web services, consist of a series of connected applications and infrastructure components. Download this whitepaper to learn the best ways to enhance your IT operations management strategy and gain insightful understanding and visibility into your organization’s data center.

  • County Enhances Critical Services for Citizens

    Miami-Dade County needed to streamline database administration and development to ensure that critical government services are constantly available and aligned to the needs of local citizens. Read this whitepaper to learn how the county accelerated database management and development tasks.

  • Finding the Right Storage Balance

    New virtualization techniques and emerging flash technologies are giving data center operators more control over how they manage storage systems, speed backups and reduce costs. Download this article to learn how storage management can improve the economy of your data centers.

  • Infographic: Software-Defined Enterprise

    Moving to a fully software-defined enterprise takes time, and is typically done in steps: virtualization, cloud, software-defined storage, software-defined networking, the software-defined data center, and then full SDE. Slowly but surely, organizations are making their way. Download this infographic to learn how to overcome challenges and lower costs through SDE.

  • GameChanger: Microsegmentation

    In less than a decade, more than 87 million records with sensitive or private information have been exposed due to cyber-incidents on federal networks alone. Organizations are increasingly realizing that perimeter-based security just isn’t enough anymore. Download this report to learn why, for many data centers, the solution is microsegmentation.