Cloud ‘Commons’ would accelerate biomedical research

The National Institutes of Health is supporting the launch of an electronic “Commons,” a shared cloud and high-performance computing ecosystem for the biomedical research community.

Argonne sets new marks for high-speed data transfer

Argonne National Laboratory researchers moved 65 terabytes of data between storage centers in Ottawa and New Orleans in under 100 minutes.
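
That works out to a sustained rate of roughly 87 gigabits per second. A quick back-of-the-envelope check (a minimal sketch, assuming decimal terabytes and the full 100-minute window, since the article gives only rounded figures):

    # Back-of-the-envelope check of the sustained transfer rate.
    # Assumes decimal terabytes and a full 100-minute window.
    data_bytes = 65 * 10**12          # 65 TB
    duration_s = 100 * 60             # 100 minutes

    rate_gbps = data_bytes * 8 / duration_s / 10**9
    print(f"Sustained rate: ~{rate_gbps:.0f} Gb/s")   # ~87 Gb/s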

How will you manage big (and bigger) data in 2015?

Government IT departments will increasingly turn to data warehouse augmentation tools and tactics in 2015 to address their big data management challenges.

How big science is cutting big data down to size

Researchers are developing a set of data management tools that could be used across the scientific community.

VA recruits Watson analytics, cloud to fight PTSD

Veterans Affairs launches a pilot project to test how IBM Watson analytics can help doctors sift electronic medical records and support clinical decision-making in treating PTSD.

Federal health IT wish list: more mobile, telehealth, analytics, open data

The Office of the National Coordinator for Health IT's draft five-year plan outlines expanded health information sharing through mobile, sensor and analytics technologies.

Apple, IBM debut mobile analytics apps for iOS

Apple and IBM unveiled IBM MobileFirst for iOS, the first apps from a joint venture designed to offer business-specific, cloud-supported data and analytics capabilities on Apple devices.

GSA's wish list for leading edge tech

The General Services Administration has created a list of developing technologies that could be incorporated into GSA's Alliant I and Alliant II government-wide acquisition contracts.

Smart city platform aggregates, maps open data

Plenario lets users assemble information from open data portals and analyze it via a single spatial and temporal index, making it possible to do complex analysis with one query.
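
As a rough illustration of the single-index idea, a query that filters several datasets at once on shared location and time fields might look like the sketch below (the records, field names and helpers are hypothetical, not Plenario's actual API):

    # Conceptual sketch of one spatio-temporal query across several
    # open datasets. The records, field names and helpers here are
    # hypothetical illustrations, not Plenario's actual API.
    from datetime import datetime

    def in_bbox(lat, lon, bbox):
        """True if (lat, lon) falls inside bbox = (south, west, north, east)."""
        south, west, north, east = bbox
        return south <= lat <= north and west <= lon <= east

    def query(records, bbox, start, end):
        """Filter any dataset on the shared location/time index fields."""
        return [r for r in records
                if in_bbox(r["lat"], r["lon"], bbox)
                and start <= r["timestamp"] <= end]

    # Crime reports and 311 calls answered in one pass, because every
    # record carries the same (lat, lon, timestamp) index fields.
    records = [
        {"dataset": "crimes", "lat": 41.88, "lon": -87.63,
         "timestamp": datetime(2014, 7, 4)},
        {"dataset": "311_calls", "lat": 41.90, "lon": -87.62,
         "timestamp": datetime(2014, 7, 5)},
    ]
    print(len(query(records, bbox=(41.8, -87.7, 42.0, -87.6),
                    start=datetime(2014, 7, 1),
                    end=datetime(2014, 7, 31))))   # 2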

Tableau puts real-time data modeling on the dashboard

Tableau’s business intelligence tools enable near-real-time “what if” data visualizations and allow users to share analytics in a collaborative environment.

NSA releases open source tool for high-volume data flows

The National Security Agency released an open source software product that automates data flows among multiple computer networks, even when data formats and protocols differ.
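
Conceptually, automating flows across networks with mismatched formats comes down to per-route translation. A minimal Python sketch of that general pattern (the converter registry and function names below are illustrative assumptions, not the released tool's API):

    # Conceptual sketch of routing data between systems that speak
    # different formats. The registry and names are hypothetical
    # illustrations of the pattern, not the released tool's API.
    import csv, io, json

    def csv_to_json(payload):
        """Decode a CSV payload and re-encode it as JSON."""
        rows = list(csv.DictReader(io.StringIO(payload)))
        return json.dumps(rows)

    # Routing table: (source format, destination format) -> converter.
    ROUTES = {("csv", "json"): csv_to_json}

    def route(payload, src_fmt, dst_fmt):
        """Translate a payload so the destination network can consume it."""
        return ROUTES[(src_fmt, dst_fmt)](payload)

    print(route("id,value\n1,a\n", "csv", "json"))
    # -> [{"id": "1", "value": "a"}]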

After 25 years, Census TIGER data still in demand

Developed for the 1990 population count, the Census Bureau’s TIGER data sets are still in demand as geospatial base data for new government and commercial mapping programs and applications.
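
For example, TIGER/Line shapefiles load directly into common GIS tooling. A minimal sketch using geopandas (one package chosen for illustration; the data works with many GIS tools, and the file name below varies by vintage):

    # Minimal sketch of using TIGER/Line data as a geospatial base layer.
    # Assumes geopandas is installed and a county shapefile has been
    # downloaded and unzipped from census.gov; the exact file name
    # varies by vintage.
    import geopandas as gpd

    counties = gpd.read_file("tl_2014_us_county.shp")
    print(counties.crs)                          # coordinate reference system
    print(counties[["NAME", "STATEFP"]].head())  # county names, state FIPS codes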

Topic Resources

  • Mining Gold from Machine Data

    Holistic visibility is more important than ever because most IT services, including email and web services, consist of a series of connected applications and infrastructure components. Download this whitepaper to learn the best ways to enhance your IT operations management strategy and gain insight and visibility into your organization's data center.

  • Finding the Right Storage Balance

    New virtualization techniques and emerging flash technologies are giving data center operators more control over how they manage storage systems, speed backups and reduce costs. Download this article to learn how storage management can improve the economics of your data centers.

  • GameChanger: Microsegmentation

    In less than a decade, more than 87 million records with sensitive or private information have been exposed due to cyber incidents on federal networks alone. Increasingly, organizations are beginning to realize that perimeter-based security just isn't enough anymore. Download to learn why, for many data centers, the solution is microsegmentation.

  • Infographic: Software-Defined Enterprise

    Moving to a fully software-defined enterprise takes time, and is typically done in steps: virtualization, cloud, software-defined storage, software-defined networking, the software-defined data center, and then full SDE. Slowly but surely, organizations are making their way. Download this infographic to learn how to overcome challenges and lower costs through SDE.

