Pulse

By GCN Staff


Spirit supercomputer at the Air Force Research Laboratory

SGI's ICE heats up DOD's Spirit supercomputer

SGI has completed installation of the ICE X high-performance computing system that powers the Defense Department's Spirit supercomputer, the 14th fastest supercomputer in the world and the fastest dedicated system within DOD.

The SGI ICE X has been deployed as part of DOD's High Performance Computing Modernization Program (HPCMP), which provides compute resources for the Air Force Research Laboratory at the DOD Supercomputing Research Center.

Named after the B-2 stealth bomber, Spirit is already being used for research such as quantum mechanical simulations with computational time that, SGI says in an announcement, “scales linearly with respect to the number of atoms.”

"Spirit is significantly faster than our previously available platform for running these linear-scaling calculations, which are becoming viable for production level work," said Gary Kedziora, an HPCMP computational materials scientist. This lets scientists “model larger and more complex materials using predictive quantum mechanical methods on thousands of SGI ICE X processor cores."

The ICE X system powers Spirit with 144 TB of memory and one of the largest and fastest pure-compute InfiniBand clusters, SGI said. Running the standard Red Hat Enterprise Linux operating system, Spirit is housed in 32 racks and includes 2,304 compute blades with cold-sink technology. It has 73,728 cores in 9,216 sockets, powered by Intel Xeon E5 processors running at 2.6 GHz.

It can achieve a peak performance of over 1.5 petaflops (quadrillion floating-point operations per second). Spirit also has 6.72 petabytes of SGI InfiniteStorage 5500 storage.
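The quoted peak figure is consistent with the core count and clock rate above. A quick back-of-the-envelope check, assuming 8 double-precision floating-point operations per core per cycle (typical of AVX-capable Xeon E5 chips of that era; the per-cycle rate is our assumption, not stated in the article):

```python
# Sanity-check Spirit's quoted peak performance from its published specs.
cores = 73_728            # total cores, per the article
clock_hz = 2.6e9          # 2.6 GHz Intel Xeon E5
flops_per_cycle = 8       # assumption: 4-wide AVX multiply + add per cycle

peak_petaflops = cores * clock_hz * flops_per_cycle / 1e15
print(f"Theoretical peak: {peak_petaflops:.2f} petaflops")  # ~1.53
```

The result, about 1.53 petaflops, matches the "over 1.5 petaflops" peak the article cites.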

The system is already seeing a lot of use. "Our customers are flocking to the fastest system in the Department of Defense, finding that their applications are performing significantly better on the new system," stated Jeff Graham, the director of the Air Force Research Lab, who added that Spirit has boosted performance on DOD applications by more than 27 percent on average.

Posted on Jul 09, 2013 at 10:49 AM


Screenshot from VirtualUSA

Partnership boosts geospatial data sharing across jurisdictions

A partnership between the National Information Sharing Consortium and Esri could improve the sharing of geospatial information among federal, state and local government agencies during natural disasters or emergencies.

Esri’s ArcGIS Online for Organizations (AGOL) will be deployed as a GIS platform to support the Homeland Security Department’s Virtual USA Program (vUSA), which provides interactive maps that display the location and status of critical assets, including helicopter landing sites, evacuation routes, shelters, gas supplies, water lines and power grids, according to a release.

AGOL is a cloud-based mapping portal that lets emergency management personnel share maps and data with each other and the general public from any device, Web browser or desktop application. The deployment of an AGOL portal that is compliant with vUSA will significantly enhance state and local agencies’ ability to participate in the vUSA program, since AGOL has been widely adopted by these agencies, NISC officials said.

Esri officials endorsed the vUSA initiative in 2010, expressing their commitment to make certain the geospatial company’s technologies supported the goals of the program. The partnership with NISC will accelerate the establishment of shared situational awareness and information sharing capabilities, Esri officials said.

Six pilot projects involving over 35 states have been conducted since 2009 to demonstrate vUSA’s ability to support near real-time information sharing. U.S. federal, state and local governments; Canadian provincial and federal governments; non-governmental organizations; and private-sector partners have participated in vUSA since the first pilot was conducted, NISC officials said.

NISC, in collaboration with DHS' Science and Technology Directorate's First Responder Group and the first responder community, is using the pilot programs as the basis for transitioning vUSA technologies from prototypes to platform components that will be interoperable both with existing systems and with information sharing efforts already in progress, officials said.

Other GIS-based public-sector IT sharing projects are in development. The National Geospatial-Intelligence Agency and the geospatial community are creating a cloud infrastructure to demonstrate how a coalition of organizations can share geospatial information as they respond to natural disasters around the world.

NGA officials want to see how industry can deliver open, standards-based geospatial data to first responders via multiple, interoperable cloud infrastructures, Todd Myers, NGA's lead global compute architect, told GCN during a recent interview.

Posted on Jul 02, 2013 at 11:12 AM


DARPA Unattended Ground Sensor

DARPA takes a smartphone approach to Android ground sensors

The military is taking a page from the smartphone industry in an effort to speed up development of intelligence, surveillance and reconnaissance (ISR) ground sensors.

Researchers for the Defense Advanced Research Projects Agency’s Adaptable Sensor System, which goes by the name ADAPT, are developing a hardware and software package with a customized Android operating system for the unattended ground sensors. The sensors -- which are small, self-powered devices that sense ground activity, including acoustic, seismic, magnetic and weather events -- can communicate wirelessly with other sensors and devices, according to DOD’s Armed With Science website.

It typically takes three to eight years to develop military sensor systems using contract manufacturers, a long cycle that can mean devices are outdated by the time they're introduced, DARPA said. By following the design processes of smartphone makers, which are always creating new or updated models, DARPA hopes to develop new devices in a year or so.

The ADAPT program focuses on three core elements: reusable hardware, reusable software and sensor applications, DARPA said.

“We believe that the ADAPT building block approach — where you take the ADAPT core and easily plug it into any number of ISR sensor reference designs — will transform how the military services and the defense industry approach ISR sensor research and development,” said DARPA program manager Mark Rich. “This method has the promise of being much more cost-effective, faster to the warfighter and easier to refresh with technology upgrades.”

DARPA plans to test new sensors based on the ADAPT reference design this summer, he said.

The agency also could develop other reference designs for air and sea vehicles. In one recent test, for example, researchers replaced the control interface of a small quad-copter UAV with the ADAPT core, which then took over flight control.

Posted on Jun 20, 2013 at 6:07 AM


Network security firewall

Gartner: Mobile, big data, advanced attacks shape threat landscape

Gartner analysts see three main trends framing the security discussion moving forward: mobile security, big data and advanced targeted attacks. The company presented its take on these high-level trends and more at its recent three-day security and risk management summit in National Harbor, Md.

  • Mobile: As focus shifts from the device to the app/data, understanding the device types and how users are using them is just as important as the user identities.
  • Big data: Delivering risk-prioritized actionable insight will require security analytics as well as changes in information security technologies, integration methods and processes.
  • Advanced targeted attacks: The latest attack strategies use custom or dynamically generated malware for the initial breach and data-gathering phase. Enterprises should employ a defense-in-depth, layered approach model.

Reporters and attendees also shared insights and factoids from the conference.

InfoSecurity magazine covered the keynote by Paul Proctor, Gartner vice president and senior analyst, who described four security scenarios that organizations will experience over the next decade:

  • Regulated risk, where a government organization leverages regulation to protect enterprises and itself.
  • Coalition rule, where barriers to entry for malicious actors are low, and government intervention is absent or ineffective.
  • The controlling parent, where the government will step in to protect the individual.
  • Neighborhood watch or anarchy, where decreasing regulation signals that government intervention will not materially impact the targeting of individuals.

Ray Wagner, managing vice president of Gartner’s secure business enablement group, spoke on trends affecting IT security managers, according to Network World.

  • The use of cloud services, especially those outside the control of the IT department, means antivirus and perimeter firewalls are increasingly ineffective.
  • All packets across the network are suspect, so monitoring should be considered a basic means to detect attacks.
  • By 2020, 75 percent of IT budgets will be set aside for rapid detection and response approaches, up from less than 10 percent in 2012.
  • Identity management and context-aware security will be key to supporting mobile devices in the enterprise.
  • Identity and access management may need to recognize social-network identities.

Steve Piper at the CyberEdge Group listed his top five takeaways from the conference in a blog post:

  • The exhibit hall was chock full of vendors touting their abilities to detect advanced threats: FireEye, Palo Alto Networks, Damballa, Sourcefire, Trend Micro, AhnLab, Blue Coat, Zscaler, Proofpoint and many more.
  • The second-biggest theme this year was around BYOD and securing mobile devices. In a recent Gartner survey on 2012-2014 security spending priorities, mobile device management came in first place.
  • The concept of big data worked its way into virtually every session that talked about security information and event management (SIEM) technology and tactics for uncovering advanced threats.
  • The industry is so hot and heavy for advanced threat protection products (and rightfully so) that it seems to have forgotten about the critical importance of good old-fashioned vulnerability management and patch management solutions.
  • Everyone — analysts, attendees and even vendors — agrees that it’s no longer a matter of “if” your network will be compromised. It's a matter of “when.”

Gartner’s Jay Heiser spoke on security myths — the misconceptions and exaggerations about threats and the technologies to combat those threats. Among those myths, reported by Security Week, are:

  • Information security budgets are 10 percent of IT spending. Recent Gartner research shows that information security spending is closer to 5 percent of the total IT budget.
  • Password expiration and complexity reduce risk: Cracking is just not the major failure mode. Passwords are not cracked; they're sniffed.

Other tweet-worthy insights from the conference included:

  • By 2019, 90 percent of organizations will have personal data on IT systems that they don't own or control. (Hostreview.com)
  • Monitoring employee behavior in digital environments is on the rise, with 60 percent of corporations expected to implement formal programs for monitoring external social media for security breaches and incidents by 2015. (Gartner)
  • Only 8 percent of organizations are running next-generation firewalls, and those that have purchased them are not properly configuring them or using them to their fullest extent. (CRN)

Posted on Jun 18, 2013 at 10:37 AM


Checkbook NYC financial transparency website

NYC opens the books, and the source code, on Checkbook 2.0

Last week, New York City Comptroller John C. Liu unveiled the Checkbook NYC 2.0 website and announced that the source code for the financial transparency website would be available to developers on GitHub, which will allow other government organizations to use Checkbook to build similar sites.

Checkbook NYC illustrates how the city government spends its nearly $70 billion annual budget. Using a dashboard that combines graphs and user-friendly tables, the site displays up-to-date information about the city's revenues, expenditures, contracts, payroll and budget. It also offers that information programmatically via APIs.
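The article doesn't describe the API's schema, so the sketch below is purely illustrative: it assumes a hypothetical JSON feed of spending records (the field names and sample values are invented for the example) and shows the kind of aggregation a consumer of such an API might perform.

```python
import json
from collections import defaultdict

# Hypothetical sample of the kind of record a spending API might return;
# the field names here are invented for illustration, not Checkbook NYC's schema.
sample_response = json.loads("""
[
  {"agency": "Dept. of Education", "amount": 1250000.00},
  {"agency": "Dept. of Sanitation", "amount": 430000.00},
  {"agency": "Dept. of Education", "amount": 875000.00}
]
""")

def spending_by_agency(records):
    """Sum transaction amounts per agency."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["agency"]] += rec["amount"]
    return dict(totals)

totals = spending_by_agency(sample_response)
print(totals["Dept. of Education"])  # 2125000.0
```

A real client would fetch the records over HTTP instead of parsing a literal string, but the aggregation step would look the same.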

Built on the Drupal open source content management platform, Checkbook NYC's data warehouse contains more than 50 million financial transactions, according to REI Systems, which worked with the city to develop the system. The data warehouse is updated daily and is growing at a rate of approximately 2 million transactions per month. REI was selected to lead the project, the comptroller’s office said, because of its experience with government transparency websites, including USASpending.gov, Data.gov, and ITDashboard.gov.

Other partners, centralized accounting software vendors Oracle and CGI, worked to develop "adapters," or automated data feeds, between their financial management systems and Checkbook NYC. These feeds will enable other state and local governments that use Oracle and CGI solutions to easily share their financial data with the public.

Oracle, CGI and REI Systems have collectively committed an estimated $1 million-plus in resources to make Checkbook NYC readily adoptable by other governments, city officials said.

Checkbook NYC is significant because it makes a vast storehouse of information available online in a timely, structured and human-readable form, according to the Sunlight Foundation. Additionally, it marks a shift to proactive civic application-sharing, Foundation officials added.

“Checkbook NYC is an outstanding example of local government adoption of the open source software model, and with this project New York City has truly stepped up and into the open IT ecosystem,” said Deborah Bryant, Open Source for America co-chair and director of the Open Source Initiative. “NYC’s highly evolved approach also increases the benefit of collaboration beyond software code – such as sharing related investments like training, knowledge base and business rules – exponentially increasing its value to the city and anyone else joining the project.”

Posted on Jun 17, 2013 at 9:39 AM