FAQ: How to pick a secure cloud provider

There are several essential questions to consider in picking a cloud provider, including what level of security is needed for the services your agency is moving to the cloud.

4 stages of the big data process

The phrase "garbage in, garbage out" is appearing with increasing frequency in discussions of big data – and with good reason.

What OS X Mountain Lion can do for the enterprise

As more agencies adopt Apple's mobile devices, the new OS's security, messaging and syncing capabilities could make the workplace more efficient and enjoyable.

'Private cloud' just a phase? IBM would like you to think so

The zEnterprise EC12 adds power, capacity and other features that could appeal to government.

Survey: People fake understanding the cloud (even on first dates)

Agencies are going to the cloud and practically everybody is using it. But understanding it is another matter.

Remembering IBM's first mainframe, the 701

IBM's new mainframe is practically a new species compared with the company's original mainframe, the 701.

The 3 secrets to unlocking big data

What sets big data apart is the ability of emerging cloud environments to process virtually unlimited amounts of data.

NARA gets OMB directive: oversee revamping of fed records management

Records management framework will be based on cloud architecture, secure storage and analytical tools, according to an Office of Management and Budget directive.

How to use the cloud as a developer sandbox

Using the Centers for Disease Control and Prevention research cloud, health IT developers can test apps before taking them live.

Is it data or deception? US-VISIT needs to know.

About 825,000 fingerprint records at US-VISIT, collected from immigrants entering the United States, are associated with multiple names and inconsistent birth dates.

NASCIO: Big data is a big deal

State governments should be preparing now to make use of the vast amounts of data their agencies collect, a new report says.

Flash memory arrays built for large data stores, fast transfers

The Nimbus Gemini can handle 1 petabyte of weekly data writes without a loss of performance, along with transfer speeds of 12 gigabits/sec and more than 1 million input/output operations per second.

Topic Resources

  • Understanding the Impact of Ultra-Dense Hyper-Scale Servers

    For years, high performance computing (HPC) was largely associated with large-scale scientific workloads and applied technical computing. However, new workloads such as digital media, analytics and various other applications have made their way onto HPC servers. Read this whitepaper to learn how and why the arrival of ultra-dense, hyper-scale designs is poised to drive the latest shift in the HPC market.

  • Technical Computing for a New Era

    Today, high performance systems are used to solve technical computing problems across a range of sectors. This is because technical computing is more accessible than ever thanks to a wave of recent innovations. Read this whitepaper to learn about the high-value challenges that supercomputing allows agencies to address and the advanced capabilities it offers.

  • i2 Intelligence Analysis Platform

    The IBM® i2® Intelligence Analysis Platform is a scalable, extensible, service-oriented intelligence analysis environment designed to provide organizations with access to intelligence when and where they need it, enabling faster, more informed decision making.

  • Consolidated Security Management for Mainframe Clouds

    IBM Security and IBM Information Management solutions work with the IBM System z platform to let the mainframe serve as an enterprise security hub, providing comprehensive, centralized security capabilities for organizations with distributed, multiplatform IT environments.

  • Integrating Security into Development, No Pain Required

    In this whitepaper, senior SANS Analyst Dave Shackleford discusses new ways developers and security teams can work together to create more efficient quality control processes to improve application functionality and security while minimizing bugs.