The U.S. Geological Survey has unwrapped 400 new digital topographical maps of the state of Alaska, the first of what is expected to be more than 11,000 new maps updating geospatial baselines and replacing maps that in some cases are 50 years old.
The state’s unique geophysical profile, including a challenging terrain, remote locations and a harsh climate, made mapping difficult. Prior to this effort, topographical maps for much of Alaska were about 50 years out of date and not produced to current standards, which rely largely on high-resolution digital imagery and elevation data.
The project is being spearheaded by USGS’s Alaska Mapping Initiative, a joint venture of the agency and state mapmakers to publish a complete series of digital PDF maps at a scale of 1:25,000. New satellite image layers show the latest surface views, Trans-Alaska oil pipeline data, public land survey system data and updated glacier flows.
Without the updated digital features in maps of much of the state, “essential public services have suffered,” in the areas of transportation planning, regional planning, economic development and scientific research, USGS said.
"The associated advances in human safety, navigation and natural resource management cannot be overestimated,” said Anne Castle, the Interior Department’s assistant secretary for water and science. She praised the partnership between USGS and the state for “elevating our visual record of our state to 21st century levels.”
To ensure that they meet current accuracy specifications and standards, the maps will be made using newly acquired elevation and imagery data from multiple state, federal and commercial sources.
The mapmaking will also be automated using software adapted by USGS to create approximately 11,275 digital map quadrangles, covering the entire area of the state. Ultimately, the federal and state project teams want to build a new statewide base map that would be available over the Internet, based on open standards and free of charge.
Dividends from the effort include more accurate elevation and hydrography data to help map climate change, enhanced aviation safety and streamlined disaster preparedness and response, according to USGS. “I can’t think of one thing that it doesn’t affect,” Nick Mastrodicasa, state digital mapping project manager with the Department of Transportation, told the Alaska Daily News.
Posted on Sep 06, 2013 at 9:52 AM
Military departments are following the IT playbook of other budget-strapped government agencies with a plan to share excess capacity on their networks and IT systems, a project expected to save billions of dollars in future IT costs, the Defense Department’s press service reported.
A recent opportunity to share IT systems arose when the Army, facing force structure changes, upgraded to faster multiprotocol routers and regional network security stacks. Meanwhile, the Air Force was looking to upgrade its IT systems to meet plans for a Defense Joint Information Environment.
By piggybacking on the Army’s upgrade, the Air Force would be able to avoid about $1.2 billion in IT costs, according to the press service. For its part, the Army expects to cut its IT budget by $785 million between 2015 and 2019 by consolidating hundreds of security stacks into 15 joint stacks, which the Air Force will also use.
The upgraded routers will increase backbone bandwidth to 100 gigabits/sec, while speeds at Army installations will hit 10 gigabits/sec, a huge leap from typical speeds of 650 megabits/sec at Fort Hood, Texas, for instance, according to Mike Krieger, the Army’s deputy chief information officer.
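To put those speeds in perspective, here is a rough back-of-the-envelope sketch, assuming the quoted figures are network rates in bits per second (the conventional measure of bandwidth) and using an invented 10 GB file size with decimal units:

```python
# Illustrative only: idealized transfer times at the quoted link speeds,
# ignoring protocol overhead, latency and contention.
def transfer_seconds(size_gigabytes, link_megabits_per_sec):
    """Time to move a file of the given size over an ideal link."""
    size_megabits = size_gigabytes * 8 * 1000  # GB -> megabits (decimal units)
    return size_megabits / link_megabits_per_sec

old = transfer_seconds(10, 650)     # 650 Mbps, the typical Fort Hood speed
new = transfer_seconds(10, 10_000)  # 10 Gbps upgraded installation speed
print(f"10 GB file: {old:.0f} s at 650 Mbps, {new:.0f} s at 10 Gbps")
```

Under those idealized assumptions, a transfer that took about two minutes drops to roughly eight seconds.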
The regional security stacks are designed to improve command and control and are essential to enabling a single security architecture in the joint information environment, he said.
“More and more, we’re saying that some of the service-delivery capability can be managed at the enterprise level, greatly improving efficiency, effectiveness and security,” said Richard Breakiron, network capacity domain manager for the Army’s chief information office.
The new routers will also help the Air Force and Army converge network backbones and gain additional savings. The Air Force Space Command’s Brig. Gen. Kevin Wooton said the deal allows the Air Force to bring on unified networking capabilities such as voice over IP, “and it allows us to put much more of this capability up at the enterprise level.”
Together, Multiprotocol Label Switching routers and the regional security stack improve performance and security, said Air Force Lt. Gen. Ronnie D. Hawkins, Jr., director of the Defense Information Systems Agency, which is working with the Army on the implementation. He said the project “creates a network that is fundamentally more defensible and more efficient.”
DOD chief information officer Teri Takai called the IT sharing and modernization agreement involving the Air Force, Army and Defense Information Systems Agency “an important step forward” in the military’s “aggressive” pursuit of a joint information environment.
Posted on Sep 05, 2013 at 6:33 AM
A consortium of six colleges plans to build a cloud-based streaming video infrastructure that it will share across its campuses.
Through the New York Six MediaShare Project, consortium members will share media collections and technologies, allowing them to tap common resources and enhance services. The New York Six Liberal Arts Consortium is a cooperative venture of six upstate New York liberal arts institutions: Colgate University, Hamilton College, Hobart and William Smith Colleges, St. Lawrence University, Skidmore College and Union College.
The consortium has chosen the Ensemble Video platform to provide the core video streaming technology for its six member campuses and for the MediaShare Project. The content will be housed in a private cloud, and Wowza Media Cache servers will be deployed to the campuses, maximizing streaming efficiency.
Ensemble Video will give the schools secure access to content; rich publishing options for course lectures, campus events and guest speakers; and sharing of media collections and technologies, the company said in a release. Users will be able to embed single videos and video playlists in a variety of learning management systems, HTML pages, blogs, portals, and other content management systems.
“The Ensemble Video infrastructure is an important step for the consortium in advancing our library and IT collaborations,” said Amy Doonan Cronin, executive director of the New York Six. “Not only will it enhance our MediaShare project, it will provide a framework for a wide range of multi-campus activities in the coming years.”
“This is a growing trend we’re seeing in higher ed,” said Ensemble Video founder and CEO Andy Covell. “Institutions are coming together to share media resources and technology infrastructure. Ensemble Video’s powerful sharing features allow content to flow freely between campuses, while our flexible, decentralized administration structure lets each school or department remain autonomous.”
In July, Ensemble Video announced an agreement with the State University of New York and its 64 member campuses. The SUNY Information Technology Exchange Center (ITEC) will host and support Ensemble Video, the company said, and individual institutions will have the option of hosting their own local media cache, to further increase efficiency.
Posted on Sep 04, 2013 at 11:44 AM
It’s no secret that Windows XP’s days are numbered, but agencies that cannot or will not upgrade by Microsoft’s April 8, 2014, end-of-support deadline won’t have to work entirely without a net, though it will cost them.
Microsoft will continue to issue patches for high-level vulnerabilities through “Custom Support,” a program designed for large organizations. The service will issue patches for critical vulnerabilities and some rated as important, but not for vulnerabilities rated moderate or low, Computerworld reported. It will cost about $200 per device per year, plus extra charges for some of the important patches.
Organizations looking to continue with XP beyond the deadline can sign up for Custom Support through Microsoft’s Premier Support Services program. Alternatively, IT managers can use migration tools like Zinstall to help move their users’ programs and files off XP.
For some time Microsoft has been banging the drum about support ending for the 12-year-old XP and for Office 2003. And while many agencies, organizations and individuals have moved on to Windows 7, or even Windows 8, there are still plenty of XP users worldwide. According to data from Net Applications, as of July, XP still held 37.19 percent of the operating system market, second only to Windows 7’s 44.49 percent. (Windows 8 was third, at 5.4 percent.) In fact, as the deadline gets closer, the rate of people giving up XP has slowed.
Some analysts have speculated that organizations may be trying to stare down Microsoft, hoping that the sheer number of XP users will force the company to extend regular support past the deadline. Gary Schare, president and CEO of Browsium, told Redmond Magazine in April that, "Right now this group has the numbers to back their position, with 600 million Windows XP systems still in use and only a 1 percent drop in the last six months after 5 percent in the prior six months."
Microsoft’s preference, of course, is that users upgrade. The company makes its case for agencies making the move on its Microsoft in Government blog, arguing that agencies will save money in the long run, increase efficiencies and improve security.
Custom Support may extend life for organizations that still have some or all of their systems running XP next April, but it doesn’t seem like a bargain. Depending on the size of an organization, $200 per device could add up to millions, and that’s only for partial support.
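The arithmetic behind that concern is straightforward. A quick sketch using the reported $200-per-device rate; the fleet sizes are hypothetical, chosen only to show how quickly the bill scales:

```python
# Custom Support reportedly runs about $200 per device per year.
# The fleet sizes below are invented for illustration.
RATE_PER_DEVICE = 200

for devices in (5_000, 25_000, 100_000):
    annual = devices * RATE_PER_DEVICE
    print(f"{devices:>7,} XP devices -> ${annual:,} per year")
```

Even a mid-sized fleet lands in the millions annually, and that is before the extra charges for some of the "important" patches.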
Agencies that haven’t completed a transition to Windows 7 or 8 — or, like a growing list of agencies, to the cloud with Office 365 or Google Apps for Government — may have to consider Custom Support as a temporary, if potentially expensive, contingency. The only other option would be to join the ranks of the stubborn 37.19 percent and do nothing and hope it works out.
Posted on Sep 03, 2013 at 12:42 PM
The U.S. Department of Energy last month opened the Energy Systems Integration Facility, a $135 million research center designed to test how power grids, data centers and other IT systems can be made more energy-efficient. In fact, the center itself, located in Golden, Colo., might be the most energy-efficient data center in the world.
The center’s research plan might include the use of robots for energy management and conservation. IBM and EMC recently developed robots designed to rove data centers and collect temperature, power usage and other data that could affect the performance of data center IT systems.
Cooling alone can account for 60 to 70 percent of data center power costs, according to EMC officials, costs that mount as organizations buy more capacity than they need and overcool their systems. EMC officials also estimate that about 85 percent of data centers mismanage the provisioning of infrastructure, which further increases energy consumption.
The EMC Data Center Robot helps combat these problems by patrolling for temperature fluctuations, humidity and system vibrations and locating sources of cooling leaks and other vulnerabilities.
EMC’s DC Robot collects data via digital sensors and sends it through a Wi-Fi connection for processing. An algorithm converts the temperature data into a thermal map, which can be used to identify anomalous hot and cold spots in data center aisles.
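In spirit, that mapping step can be sketched in a few lines. The readings below are invented sensor data, and a simple standard-deviation threshold stands in for whatever proprietary anomaly test the DC Robot actually applies:

```python
import numpy as np

# Hypothetical readings (deg C), one sample per aisle position, laid out
# as a robot patrolling a 4-aisle x 8-position grid might collect them.
readings = np.array([
    [21.0, 21.5, 22.0, 21.8, 27.5, 22.1, 21.7, 21.3],
    [20.8, 21.2, 21.9, 22.0, 22.3, 21.8, 21.5, 21.0],
    [21.1, 21.4, 15.9, 21.7, 22.0, 21.9, 21.4, 21.2],
    [20.9, 21.3, 21.8, 22.1, 22.4, 22.0, 21.6, 21.1],
])

# Flag cells more than two standard deviations from the mean as anomalies.
mean, std = readings.mean(), readings.std()
hot = np.argwhere(readings > mean + 2 * std)    # candidate hot spots
cold = np.argwhere(readings < mean - 2 * std)   # candidate cold spots
print("hot spots at", hot.tolist(), "cold spots at", cold.tolist())
```

`argwhere` returns grid coordinates, the analogue of the aisle positions flagged as hot or cold spots on the robot’s thermal map.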
Most data centers rely on arrays of fixed sensors to monitor temperature and other energy consumption indicators, an expense that can run into the millions of dollars. Eliminating that cost was the “low-hanging fruit” that helped justify the investment in the DC Robot, EMC officials say.
The DC Robot was one of the first robots focused on data center energy; IBM has since developed a similar model, which it offers as part of an energy management troubleshooting service.
IBM’s Measurement and Management Technologies unit will use the robo-tool to create a “robotic cooling assessment,” a three-dimensional temperature and humidity map that helps organizations identify energy sinks and other problem spots in their data centers. The assessment also establishes a data center’s baseline and high-level cooling capacity.
A third energy diagnostics tool, from Purkay Labs, is a simple portable unit that checks energy-environment data over short- or long-term intervals. The unit consists of an adjustable carbon fiber rod that measures air quality at three different heights.
While the unit is not mobile, and so technically not a robot, “it’s a product that we’ve developed so you can get the temp across the entire aisle,” said CEO Indra Purkayastha.
Posted on Aug 30, 2013 at 11:31 AM