As the use of IPv6 broadens, mobile users should see gains in performance, according to members of a panel at the recent Consumer Electronics Show in Las Vegas.
According to a ComputerWorld article on the discussion, panel members said that one source of improved performance would be that each IPv6-connected device — whether smartphone, router, security camera or office peripheral — can communicate directly with any other over the Internet.
Service providers using IPv4, by contrast, use a process called Network Address Translation that "assigns true, unique Internet addresses to subscribers' devices only temporarily," ComputerWorld reported. The administrative overhead of sending packets back and forth to keep the connection alive slows performance and consumes power.
In addition, IPv6's massive pool of addresses allows an IP address to stay with a mobile device, eliminating the administrative traffic and dropped connections that occur when a user travels from one cell to another.
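The keepalive overhead described above can be sketched in a few lines. The snippet below is a minimal, self-contained illustration, not any carrier's actual implementation: it stands in for a mobile client that must send periodic no-op packets so an IPv4 NAT does not expire its address mapping (the interval and addresses are hypothetical; a local socket pair plays the role of the remote service).

```python
import socket
import time

# Under IPv4 NAT, an idle mapping is dropped after a timeout, so the
# client sends periodic "keepalive" packets to hold the mapping open.
# Each packet wakes the phone's radio, which costs power. With an
# end-to-end IPv6 address, this traffic is unnecessary.

KEEPALIVE_INTERVAL = 0.01  # seconds; real intervals are tens of seconds

# A local UDP socket pair keeps the sketch self-contained.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
server_addr = server.getsockname()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

received = 0
for _ in range(3):                            # three keepalive rounds
    client.sendto(b"keepalive", server_addr)  # refreshes the NAT mapping
    data, _ = server.recvfrom(64)
    received += 1
    time.sleep(KEEPALIVE_INTERVAL)

client.close()
server.close()
```

The point of the sketch is the loop itself: every iteration is pure administrative traffic that exists only to keep the translator's state alive.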
Although the Office of Management and Budget mandated that agencies enable the new protocol on public-facing services such as websites by Sept. 30, 2012, adoption has been slow.
Posted on Jan 14, 2013 at 9:39 AM
Montana is doing what other states with a modest tax base in a flat economy would do well to imitate: lean in toward regional neighbors facing similar conditions, set up arrangements to share high-cost services and use the available IT infrastructure to link resources together.
Lately the state has entered agreements with nearly a half-dozen western states, setting up a foundation for future technology sharing, Government Technology reports.
Montana’s agreements include:
- A mutual disaster-recovery agreement between Montana's and Idaho's revenue departments that provides for broader future technology sharing. The deal allows the states to share personnel in the event a disaster shuts down tax processing in either state. It also builds trust between the offices, helping clear the way for future "modules," Montana revenue director Dan Buck told GovTech.
- A disaster-recovery agreement with Oregon that could also expand technology sharing, according to state officials. The plan is already hard-wired: it sets up a shared disaster recovery site linked via a 10 gigabit/sec network to data centers in each state. Additional projects might include mainframe disaster recovery and storage sharing between the states, Montana chief technology officer Stuart Fuller told GovTech.
- Agreements with labor departments in Nevada, Michigan and Arkansas to extend a shared Web hosting service to other states. The project gives each state its own customizable website interface.
Montana is also a member of the Western States Contracting Alliance, which has set up a series of contracts with Dell, Dewberry, Esri and Unisys for public cloud GIS hosting services. But the deal is likely to lead beyond GIS management.
"Part of what we learned in the RFI process was that [cloud hosting] was bigger than GIS," Robin Trenbeath, Montana’s geographic information officer, told GCN.
Posted on Jan 10, 2013 at 9:39 AM
The Energy Department’s Pacific Northwest National Laboratory in Richland, Wash., and the University of Washington are forming the Northwest Institute for Advanced Computing to tackle big data computing and methodology.
Researchers associated with the institute will work to ensure the next generation of computers and the methods used to run them can address the upcoming big data challenges, from climate change to energy management, the lab said in its announcement.
“The expanded partnership between UW and PNNL will create tremendous new opportunities for both organizations,” Ed Lazowska, professor of computer science and engineering, said in the university's statement. “Big data is transforming the process of discovery in all fields. UW and PNNL have significant and complementary strengths.”
Working on UW's campus in Seattle, university and PNNL researchers will jointly explore advanced computer system designs, accelerate data-driven scientific discovery and improve computational modeling and simulation, focusing on areas such as computational physics, big data, cybersecurity and computing for the smart grid, according to PNNL Fellow Moe Khaleel, who directs PNNL's Computational Science and Mathematics research division.
The institute will draw on UW's expertise in computer science, engineering, applied math and natural sciences, and PNNL's expertise in designing high-performance computers and running large-scale environmental simulations.
Institute members will use existing high-performance computing resources. UW's Seattle campus houses the Hyak shared, high-performance computer cluster, and the PNNL Institutional Computing program features the 162-Teraflop Olympus supercomputer. Cloud resources also will be used extensively, the university said.
The two institutions already collaborate on the Pacific Northwest Smart Grid Demonstration Project. The two-year project is helping determine how a smarter grid can avoid congestion in the electricity transmission system and how more wind power can be used.
According to the university, initial projects for the new partnership will include algorithms and software for large graph analyses, smart grid simulation and encryption for cloud computing.
Posted on Jan 10, 2013 at 9:39 AM
Researchers at the University of Tennessee at Chattanooga are using ultra-fast, high-bandwidth network technology to design a computer-based disaster mitigation system aimed at helping emergency workers, local officials and the public predict and respond to disasters.
The pilot project runs computerized disaster scenarios using a model with a detailed layout of Chattanooga, including streets and buildings, to study what could happen if a hazardous material were released, for example. The goal of the project is to train emergency workers in advance of a disastrous event and deliver real-time information to workers and the public while the event actually is under way.
Chattanooga is an ideal city to test such a response system, since it has the largest community-wide gigabit-capable network in the country and an infrastructure able to maintain the communications and asset management needed for disaster mitigation.
The network can support high-performance tools and complex computational models that allow the researchers "to run accurate scenarios on the computer of how a disaster might unfold, so you can identify weaknesses," Henry McDonald, chair of excellence in computational engineering at the university, said in a release from the National Science Foundation, which is funding the project.
Additionally, the researchers are installing a small number of sensors that will link to the network and detect potential hazards.
Even during an emergency, "quite a bit of the infrastructure will survive," McDonald said, because the state-of-the-art fiber network has proved very resilient. "The fiber is either on top of utility poles or underground. It comes up out of the ground, up the pole and into a portal. Police cars and fire engines hook into that portal via Wi-Fi and get onto the network that way. There is a lot of communication ability we have now that we didn't have before, a lot of computing horsepower that we never had before."
NSF is funding the new disaster mitigation system research through its EArly-concept Grants for Exploratory Research (EAGER) program.
Although the pilot project for the system is set in Chattanooga, McDonald said he expects every city in the United States to benefit. "You need big computers, but you can access the cloud and access large systems over high-speed bandwidth," he said. "If it works in Chattanooga, it can work anywhere."
Posted on Jan 07, 2013 at 9:39 AM
Security researchers at Symantec have traced recent exploits of a zero-day flaw in older versions of Internet Explorer to a group the company calls the Elderwood gang, whose previous attacks include the Google Aurora attacks that have been traced to China.
An analysis of the watering hole attacks carried out against the IE flaw — for which Microsoft has issued a Fix-It workaround but not yet a patch — found similarities to previous Elderwood exploits, Symantec researchers wrote in a blog post. Among the similarities were a Flash exploit and several mentions of “HeapSpary,” which researchers said was a mistyping of Heap Spray, a common step in attacks.
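For readers unfamiliar with the term, heap spraying works by filling process memory with many copies of a "slide" region followed by a payload, so that a corrupted pointer is likely to land somewhere inside one of the sprayed blocks and slide into the payload. The benign Python sketch below illustrates only the memory-layout idea; the sizes, counts and offsets are invented for illustration and come from no real exploit:

```python
# Conceptual, harmless illustration of the heap-spray layout.
SLIDE = b"\x90" * 1024   # stand-in for a NOP sled
MARKER = b"PAYLOAD"      # stand-in for shellcode

# "Spray": allocate many identical blocks so most of the heap
# consists of slide-plus-payload copies.
heap_blocks = [SLIDE + MARKER for _ in range(1000)]

# If a hijacked pointer lands at an arbitrary offset inside any
# block's slide region, execution "slides" down to the payload.
block = heap_blocks[500]
landing = block[37:]     # pretend the corrupted pointer landed at offset 37
assert MARKER in landing
```

The defensive takeaway is why sprays work: the attacker does not need to know an exact address, only that a random landing point probably falls inside a sprayed block.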
Microsoft had issued an advisory Dec. 29, warning of the flaw in IE 6, 7 and 8 in certain configurations, and directing users and admins to a Fix-It for the problem. Microsoft’s next Patch Tuesday update, due Jan. 8, is not expected to include a fix for the flaw.
The vulnerability was first noticed as part of watering-hole attacks against the websites of the Council on Foreign Relations and of an energy equipment manufacturer, Capstone Turbine Corp. Microsoft said in its advisory that it was aware of only a few targeted attacks exploiting the flaw.
Symantec’s discovery of links to Elderwood raises the specter, at least, that the exploits could be part of a state-sponsored campaign. The company has tracked the group, also known as Aurora, since its attacks on Google and 33 other companies in 2009. In September 2012, Symantec issued a report saying the group had remained active, employing an unprecedented number of zero-day attacks that “indicates access to a high level of technical capability.”
Although Symantec’s report did not speculate on the origin of the attacks, Google and others have said the Aurora attacks came from within China.
Symantec’s report said the Aurora/Elderwood group was targeting defense and supply-chain contractors, human-rights groups, non-governmental organizations, IT services providers and other industries.
“Victims are attacked, not for petty crime or theft, but for the wholesale gathering of intelligence and intellectual property,” the report said. “The resources required to identify and acquire useful information—let alone analyze that information — could only be provided by a large criminal organization, attackers supported by a nation state or a nation state itself.”
In addition to its Fix-It, Microsoft has recommended high security zone settings for Internet and intranet zones, adding trusted sites to IE’s Trusted Sites zone and either disabling Active Scripting or configuring IE to prompt users before running Active Scripts.
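The Active Scripting setting lives in Internet Explorer's per-zone registry keys. The fragment below is a sketch based on the documented zone layout (zone 3 is the Internet zone and action value 1400 controls Active Scripting, with 0 = enable, 1 = prompt, 3 = disable); verify against Microsoft's security-zone documentation before deploying anything like it.

```reg
Windows Registry Editor Version 5.00

; Internet zone (zone 3): set Active Scripting (action 1400) to "Prompt" (1).
; Use dword:00000003 to disable Active Scripting outright.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings\Zones\3]
"1400"=dword:00000001
```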
Posted on Jan 07, 2013 at 9:39 AM