Pulse

By GCN Staff


Virginia Tech launches network exchange in Atlanta

Virginia Tech has created a new high-performance data network exchange in Atlanta, Ga., which it says is dramatically improving access to national and international research networks. The facility, named the Mid-Atlantic Research and Education Exchange Atlanta, will also streamline access to major research centers throughout the southeastern United States.

The new facility complements another data network exchange already operated by Virginia Tech in the Washington, D.C.-metropolitan area, which was formerly known as the NatCap Aggregation Facility.

Modern science and engineering research depends on specialized high-performance networks, such as Internet2, to link distributed computing, storage, visualization, sensor and laboratory resources with research scientists and students collaborating over a global cyber-infrastructure.

Since 1998, Virginia Tech has operated statewide network and aggregation systems that link Virginia’s major research institutions to national research networks and provide a regional hub for data transfer.

In 2012, the primary network aggregation facility in the Washington, D.C.-area was rebuilt using the latest technology, raising transfer speeds to 100 gigabit/sec.

The establishment of a second data network exchange in Atlanta provides geographic diversity, backup connectivity, and direct peering connections with major research institutions such as Oak Ridge National Laboratory and the Centers for Disease Control and Prevention, according to Virginia Tech.

The university partnered with the Georgia Institute of Technology to house the new facility and to establish the regional peering connections.

Both data exchange facilities are linked to Virginia Tech via extremely fast fiber optic connections established through the university’s participation in the Broadband Technology Opportunities Program, which expanded the regional fiber network.

Having two connections, one going north and one south, greatly enhances reliability and uptime, according to Virginia Tech; the university will remain connected even in the face of a fiber cut on one of the paths or a problem at one of the facilities.

The university’s Network Infrastructure and Services unit designed both facilities and operates them under contract to the Mid-Atlantic Research Infrastructure Alliance. Executive Director William Dougherty said he considers the data exchange to be a critical program for the university.

“Providing the best possible access to national research networks is vital to our mission to enable the competitiveness of Virginia Tech research,” said Dougherty. “By allowing other regional universities to use these facilities, we create economies of scale and support Virginia Tech’s commitment to engagement and leadership.” Participating institutions all contribute to the cost of operating the data network exchange facilities.

Posted on Mar 25, 2014 at 10:04 AM


DARPA to mine 'big code' to improve software reliability

During the past decade information technologies have driven productivity gains that are essential to U.S. economic competitiveness, and computing systems now control a significant portion of the critical infrastructure.

As a result, tremendous public and commercial resources are devoted to ensuring that programs are correct, especially at scale. Yet, in spite of sizeable efforts by developers, software defects remain at the root of most system errors and security vulnerabilities.

To address this predicament, the Defense Advanced Research Projects Agency wants to advance the way software is built, debugged, verified, maintained and understood by combining principles of big data analytics with software analysis.

In its announcement, DARPA said the Mining and Understanding Software Enclaves (MUSE) program would facilitate new ways to dramatically improve software correctness and help develop radically different approaches for automatically constructing and repairing complex software.

“Our goal is to apply the principles of big data analytics to identify and understand deep commonalities among the constantly evolving corpus of software drawn from the hundreds of billions of lines of open source code available today,” said Suresh Jagannathan, DARPA program manager, in the announcement.

“We’re aiming to treat programs—more precisely, facts about programs—as data, discovering new relationships (enclaves) among this ‘big code’ to build better, more robust software.”

Central to MUSE’s approach is the creation of a community infrastructure that would incorporate a continuously operating specification-mining engine, the agency said. This engine would use “deep program analyses and big data analytics to create a public database containing … inferences about salient properties, behaviors and vulnerabilities of software drawn from the hundreds of billions of lines of open source code available today.”
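To make the idea of mining program facts a bit more concrete, here is a toy Python sketch in the general flavor of specification mining: it walks a corpus of source files and counts which calls co-occur in functions that also call open(). The "corpus" directory and the mined pattern are invented for illustration; MUSE's analyses would be far deeper and run at a vastly larger scale.

# Toy "specification mining" sketch: walk a corpus of Python files and
# record which calls co-occur in functions that call open(), a crude
# stand-in for the richer program facts MUSE would extract at scale.
import ast
import collections
import pathlib

def calls_in_function(func_node):
    """Names of all simple function calls inside one function definition."""
    names = []
    for node in ast.walk(func_node):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            names.append(node.func.id)
    return names

def mine_corpus(root):
    """Count calls that co-occur with open() across every .py file under root."""
    co_occurring = collections.Counter()
    for path in pathlib.Path(root).rglob("*.py"):
        try:
            tree = ast.parse(path.read_text(encoding="utf-8"))
        except (SyntaxError, ValueError, UnicodeDecodeError):
            continue  # skip unparseable files in the corpus
        for node in ast.walk(tree):
            if isinstance(node, ast.FunctionDef):
                names = calls_in_function(node)
                if "open" in names:
                    co_occurring.update(n for n in names if n != "open")
    return co_occurring

if __name__ == "__main__":
    # "corpus" is a hypothetical directory of open source code.
    for name, count in mine_corpus("corpus").most_common(10):
        print(f"{name}: {count}")

Scaled up to billions of lines of code and far richer program facts, counts like these become the kind of statistical evidence a specification-mining engine could draw inferences from.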

“The collective knowledge gleaned from this effort would facilitate new mechanisms for dramatically improving software reliability and help develop radically different approaches for automatically constructing and repairing complex software,” DARPA said in describing the project.

Posted on Mar 24, 2014 at 9:20 AM


What's next for predictive analytics?

Predictive analytics, a statistical and data mining approach that forecasts outcomes by applying algorithms to both structured and unstructured data, has become a technology in high demand. Although not a new method, its ease of use, inexpensive computing power and organizations’ growing volumes of data have driven its adoption.

The technology is being used for retention analysis, fraud detection, medical diagnosis and risk assessment, to name a few applications. Fern Halper, director of research for advanced analytics at TDWI, highlighted four trends in predictive analytics in a blog post on the TDWI website.

Techniques: Although the top three predictive analytics techniques are linear regression, decision trees and clustering, others have become more widely used. These include time series data analysis, which can be applied to weather observations, stock market prices, and machine-generated data; machine learning, which can uncover previously unknown patterns in data; and ensemble modeling, in which predictions from a group of models are combined to generate more accurate results (a brief code sketch of this approach follows the list below).

Open source: Open source solutions enable a wide community to engage in innovation. The R language, a free software environment for data manipulation, statistics and graphics, has become one of the most popular open source tools for predictive analytics in the enterprise.

In-memory analytics: In-memory analytics processes data in random-access memory rather than on disk, which lets users analyze large data sets more effectively.

Predictive analytics in the cloud: The use of the public cloud for analytics appears to be increasing. Organizations are starting to investigate the cloud for business intelligence, and users are putting their big data in the cloud, where they can process real-time data as well as run predictive models on extremely large, multisource data sets.
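One of the trends above, ensemble modeling, is straightforward to demonstrate in code. The sketch below is a minimal illustration using the open source scikit-learn library in Python; the synthetic data, model choices and parameters are assumptions made for the example, not anything drawn from the TDWI post.

# Minimal ensemble-modeling sketch: combine several models' predictions
# to get a more robust estimate than any single model tends to give.
# The data here is synthetic; in practice it would be an organization's
# structured records (e.g., retention, fraud or risk features).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, VotingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=1000, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three techniques named in the article: linear regression, a decision
# tree and a tree-based ensemble (random forest).
models = [
    ("linear", LinearRegression()),
    ("tree", DecisionTreeRegressor(max_depth=5, random_state=0)),
    ("forest", RandomForestRegressor(n_estimators=100, random_state=0)),
]

# VotingRegressor fits each model and averages their predictions.
ensemble = VotingRegressor(models).fit(X_train, y_train)

for name, model in models:
    err = mean_absolute_error(y_test, model.fit(X_train, y_train).predict(X_test))
    print(f"{name:>8} MAE: {err:.2f}")
print(f"ensemble MAE: {mean_absolute_error(y_test, ensemble.predict(X_test)):.2f}")

Averaging tends to smooth out the individual models’ errors, which is the accuracy gain the ensemble-modeling trend refers to.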

Posted on Mar 17, 2014 at 10:41 AM


USGS to merge National Atlas and National Map programs

The National Atlas of the United States and the National Map will be combined into a single source for geospatial and cartographic information, the U.S. Geological Survey announced.

The purpose of the merger is to streamline access to information from the USGS National Geospatial Program, which will help set priorities for its civilian mapping role and “consolidate core investments,” the USGS said.

The National Atlas provides a map-like view of the geospatial and geostatistical data collected for the United States. It was designed to “enhance and extend our geographic knowledge and understanding and to foster national self-awareness,” according to its website. It will be removed from service on Sept. 30, 2014, as part of the conversion. Some of the products and services from the National Atlas will continue to be available from The National Map, while others will not, the agency said. A National Atlas transition page is online and will be updated with the latest news on the continuation or disposition of National Atlas products and services.

The National Map is designed to improve and deliver topographic information for the United States. It has been used for scientific analysis and emergency response, according to the USGS website.

In an effort to make the transition an easy one, the agency said it would post updates to the National Map and National Atlas websites during the conversion, including the “availability of the products and services currently delivered by nationalatlas.gov.”

“We recognize how important it is for citizens to have access to the cartographic and geographic information of our nation,” said National Geospatial Program Director Mark DeMulder. “We are committed to providing that access through nationalmap.gov.”

Posted on Mar 14, 2014 at 8:56 AM


Disorder in the court? Check your beacon

Apple introduced iBeacon, an “indoor positioning system,” at its developers conference last June, describing it as "a new class of low-powered, low-cost transmitters that can notify nearby iOS 7 devices of their presence."

Like other beacons, Apple’s iBeacon uses Bluetooth Low Energy to send notifications to nearby devices within a network of “location-aware” beacons mounted on nearby surfaces.
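As a rough illustration of what such a beacon broadcasts, the Python sketch below decodes the publicly documented iBeacon advertisement payload: a 16-byte proximity UUID, 2-byte major and minor identifiers, and a signed calibrated-power byte. The sample bytes, placeholder UUID and the floor/courtroom interpretation are invented for the example; a real deployment would receive the payload from a BLE scanning library or rely on iOS's CoreLocation framework.

# Minimal sketch of decoding an iBeacon advertisement payload.
# Format (after Apple's 0x004C company ID): 0x02 0x15, 16-byte UUID,
# 2-byte major, 2-byte minor, 1-byte signed calibrated TX power.
import struct
import uuid

def parse_ibeacon(payload: bytes):
    """Return (uuid, major, minor, tx_power) or None if not an iBeacon frame."""
    if len(payload) != 23 or payload[0] != 0x02 or payload[1] != 0x15:
        return None
    beacon_uuid = uuid.UUID(bytes=payload[2:18])
    major, minor = struct.unpack(">HH", payload[18:22])
    tx_power = struct.unpack("b", payload[22:23])[0]  # signed dBm at 1 meter
    return beacon_uuid, major, minor, tx_power

# Hypothetical advertisement bytes, for illustration only.
sample = bytes.fromhex(
    "0215"                              # iBeacon type and length bytes
    "00112233445566778899aabbccddeeff"  # proximity UUID (placeholder)
    "0001"                              # major: e.g., courthouse floor
    "000a"                              # minor: e.g., courtroom number
    "c5"                                # calibrated TX power (-59 dBm)
)
print(parse_ibeacon(sample))

Mapping a detected major/minor pair to a physical location is what makes location-aware applications like the courthouse ideas below possible.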

While the technology is new, initial interest has focused on applications in retail, events and energy savings, but it has also spawned ideas for applications in the public sector. In a blog post on the National Center for State Courts website, court technology consultant Jim McMillan proposed several, including apps for courthouse navigation and public safety.

“It would be great to be able to provide an automatic guide to assist the public in what is called ‘wayfaring’ through a courthouse facility,” McMillan wrote. Or a courtroom beacon system could be set up to locate over-scheduled attorneys trying to keep up with the typical delays and sudden shifts in the docket.

Finally, beacons could help tighten courthouse security, wrote McMillan, “by automatically locating and calling the closest bailiffs or deputies to the courtroom if there is a problem that they could address.”

For other location-aware applications of beacon technology, “our imagination is the only limit,” he said. 

Posted on Mar 13, 2014 at 9:01 AM