Pulse

By GCN Staff

Cities take licensing, permit services to the cloud

Saddled with shrinking budgets, smaller cities often lack the resources to buy the leading-edge technology needed to efficiently deliver services to businesses and citizens. As a result, a growing number of municipalities are exploring ways to automate and streamline civic functions – such as asset management, land management, licensing and permitting applications – via cloud-based services.

For instance, officials in Nogales, Ariz., a city of 20,948 people on the Mexican border, are looking to revitalize the downtown community and restore historic buildings that are more than 100 years old. As part of that effort, city officials wanted the Public Works Department to track building permits daily, according to Hector Tapia, the city’s assistant public works director.

Lacking the resources to buy new IT equipment, the city turned to Accela, which offers a suite of cloud-based civic applications under the umbrella of Civic Cloud. The move eliminated the need for the city to purchase new computer hardware, software and servers, allowing the agency to buy only the licenses needed to improve office workflow, Tapia said.

The city’s goal is to implement a “one-stop shop” for development plan review and the permitting process. To accomplish this, Nogales deployed Accela Automation, a Web-based application with global search and integrated mapping capabilities. Automation lets workers share information across departments and communicate with workers in the field. Additionally, the Public Works Department will deploy Accela Citizen Access, a Web interface for initiating and tracking service requests.
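
To give a sense of what a citizen-facing permitting interface does behind the scenes, here is a minimal, hypothetical sketch of an online permit application being submitted to a web service. The endpoint, field names and response format are illustrative assumptions only, not Accela’s actual API.

    # Hypothetical sketch of an online permit application. The URL, field
    # names and response format are placeholders, not Accela's actual API.
    import json
    import urllib.request

    application = {
        "record_type": "building-permit",
        "address": "123 N. Morley Ave., Nogales, AZ",   # placeholder address
        "description": "Facade restoration of a historic downtown building",
        "applicant": {"name": "Jane Doe", "email": "jane@example.com"},
    }

    req = urllib.request.Request(
        "https://permits.example.gov/api/applications",  # placeholder endpoint
        data=json.dumps(application).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

    with urllib.request.urlopen(req) as resp:
        # A service like this would typically return a tracking number the
        # applicant can use to follow the request through review and inspection.
        print(json.loads(resp.read())["tracking_number"])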

These cloud-based solutions will eventually be extended to Apple iOS, Android and Microsoft Windows mobile tablets and smartphones to give inspectors and field agents real-time access so they can update project files from their mobile devices.

Accela offers city governments a complete suite of business services through the Civic Cloud, company officials said in a release. Specific civic solutions can be tailored to unique agency requirements by Accela or partner professional services staff. Agencies can also buy and quickly deploy the applications as packaged quick-start solutions, which are preconfigured for the most common agency needs. Initial packaged solutions are available for planning and zoning, permitting and inspection. Additional packages are being developed based on agency and market requirements, Accela officials said.

Other cities are turning to cloud-based services for licensing and permit processing. Officials in Chelmsford, Mass., recently selected a new system that will let residents apply for permits online. ViewPermit will provide an integrated system – rolled out over the next few months – that will combine GIS data and licensing information in a cloud-based environment. Neighboring communities, including Fitchburg, Lexington and Peabody, have already deployed ViewPoint.

Posted on Jul 25, 2013 at 8:57 AM


GitMachines creates FISMA-compliant virtual machines

GitMachines offers IT admins a 'virtual depot' of software tools

A group of self-described “Washington techies” and “civic innovators” recently won a $500,000 grant from the Knight Foundation to help solve a perennial problem for the government IT community: how to streamline the welter of certifications and compliance regulations that slow adoption of new technologies in government to a crawl.

The group founded “GitMachines,” which it envisions as an “open-government virtual depot” of new public-sector IT tools. GitMachines aims to help IT administrators get projects off the ground faster by offering them a source of virtual machines preconfigured for government compliance settings, what it calls “IT building blocks chock-full of open-gov and open-data goodness.”

GitMachines was one of eight projects awarded grants by the foundation in a challenge to software developers to improve how citizens and government connect.

GitMachines’ founders advocate a more open-government and open-data approach to technology adoption, portraying most IT administrators as overwhelmed by small regulations that get in the way of big improvements.

“When it comes to all the techie energy of civic hackers out there, there's a big gulf between how the private sector develops IT and government develops IT,” said Greg Elin, one of GitMachines’ co-founders, in accepting the grant last month. “We felt lowering the burden of IT certification and accreditation could improve how civic innovators and government IT administrators interact.”

GitMachines would also “dramatically lower the IT operational costs of open-government projects while also making them more robust on the security and compliance front to improve adoption,” GitMachines’ founders say.

GitMachines launched earlier this year, so it’s just getting underway with its current offerings. One of them, dubbed Minus, started at an Open Data Day event hosted in February by the World Bank and embodies the group’s philosophy. The tool is a “ready-to-run” virtual machine, targeted at researchers and government data publishers, that can be easily downloaded onto a host laptop and auto-configured with useful software and data.
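
As a rough illustration of what “ready-to-run” means in practice, the sketch below fetches a prebuilt virtual machine image and verifies its checksum before import. The download URL and expected digest are placeholders, not actual GitMachines artifacts.

    # Hypothetical sketch: download a prebuilt VM image and verify its
    # checksum before importing it into a hypervisor. The URL and digest
    # below are placeholders, not actual GitMachines artifacts.
    import hashlib
    import urllib.request

    IMAGE_URL = "https://downloads.example.org/minus.ova"  # placeholder URL
    EXPECTED_SHA256 = "0" * 64                              # placeholder digest

    urllib.request.urlretrieve(IMAGE_URL, "minus.ova")

    digest = hashlib.sha256()
    with open("minus.ova", "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)

    if digest.hexdigest() != EXPECTED_SHA256:
        raise SystemExit("Checksum mismatch -- not importing the image")
    print("Image verified; ready to import into the hypervisor.")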

Another GitMachines project, called Trailhead, provides step-by-step guides on securing software libraries to government specification. “Our long-term goal is to help open-data teams and projects go from zero to hero as fast as possible with a range of ready-to-run (tools),” the group said.

In describing the challenges faced by government IT administrators, the group paints a picture of people who would like to streamline government but are lost in a regulatory thicket.

“You want to do an open-government project,” the group said in describing its approach, “but you need a server, maybe two, maybe more. You don’t do servers, not well anyway. Servers are hard. The command line is scary.”

As a consequence, “many good programmers are not up to date on how to install ever increasingly complex server software stacks.” Further, government staffers lack administrative rights on their workstations and are thus limited in their ability to experiment. Others are reluctant to adopt more open data and open solutions because of software compliance hurdles.

“Lots of great open-government software … are under-adopted because installation is just too damn hard,” the group said.

The solution? GitMachines wants government to borrow lessons learned by Amazon, Flickr and Netflix and other large Web business operators that have “figured out how to automate the stuffing out of configuring and maintaining their back-end servers at scale.”

“We want to make it automatic to address the most common configuration and operation gotchas developers run into who do not do system configuration for a living,” said GitMachines’ founders.

To do so, the group proposes offering agencies downloadable, preconfigured virtual machines customized for open-government applications. Certification-ready VMs could be customized for the job and operate behind firewalls, bundled into existing projects or services.

Posted on Jul 25, 2013 at 1:29 PM


Upright fan in the middle of a data center

Should you warm up your data center? First, weigh the costs

Data center administrators, more than most IT managers, deal in trade-offs: they keep an eye on the cost of efficiency and weigh energy spent against performance. One dial on the dashboard they pay close attention to is the temperature in the data center as they look to deliver performance at a reasonable cost.

According to a recent Computerworld report, the General Services Administration has recommended raising data center temperatures from 72 degrees Fahrenheit to as high as 80 degrees F. For every additional degree of server inlet temperature, GSA said, a data center can save 4 percent to 5 percent in costs, according to the Computerworld article.
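
Taken at face value, those figures add up quickly. A back-of-the-envelope calculation, assuming the 4 percent to 5 percent savings compound for each degree the inlet temperature is raised from 72 F to 80 F, looks like this:

    # Back-of-the-envelope cooling-savings estimate based on GSA's cited
    # 4 to 5 percent savings per degree of inlet temperature. Whether the
    # savings compound per degree is an assumption made for illustration.
    def cumulative_savings(degrees_raised, pct_per_degree):
        remaining = (1 - pct_per_degree / 100) ** degrees_raised
        return (1 - remaining) * 100

    for pct in (4, 5):
        print(f"72 F -> 80 F at {pct} percent per degree: roughly "
              f"{cumulative_savings(8, pct):.0f} percent lower cooling costs")
    # Prints roughly 28 percent (at 4 percent) and 34 percent (at 5 percent).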

Those numbers square with 2008 guidance from the American Society of Heating, Refrigerating and Air-Conditioning Engineers, which put the recommended data center temperature range between 64.4 degrees F and 80.6 degrees F, along with a caveat that staying within that range does not ensure the data center is operating at top energy efficiency.

So, given the GSA and industry guidelines and caveats, what is the trend line on temperature ranges preferred by most data center operators?

According to a recent survey by the Uptime Institute, few data centers are being managed anywhere near the GSA limits. About half of more than 1,000 data centers from around the world keep the temperature in a spring-like range of 71 to 75 degrees F, according to the Computerworld report.

The survey did pick up a small surge toward hotter, low-cost environments, with 7 percent of data center operators keeping temperatures above 75 degrees, a jump from only 3 percent the previous year.  At the same time, fewer operators are maintaining data center temperatures at the lower end of the ASHRAE range: only 6 percent compared to 15 percent in 2011.

And if you do decide to turn up the data center thermostat, it pays to go slow, Computerworld reported. "In order to implement hotter (temps), you need to do it gradually, and make sure you're not causing problems in other parts of the data center," said Uptime Institute content director Matt Stansberry.

Posted on Jul 18, 2013 at 12:58 PM


Map showing location of Coresite data center in relation to federal buildings in Washington DC

Data center trades on its location for split-second access to federal data

In high-frequency securities trading, milliseconds – even meters – can mean money. Algorithms that govern the trading process can move transactions so quickly that a few seconds’ jump on market information can translate into a financial advantage for buyers and sellers. That’s why investment firms in New York are snapping up office space in the city’s financial district and converting it to data centers.

And according to a report by CNBC, a similar phenomenon is taking place in the nation’s capital, where market-moving economic data is released on a daily basis.

Firms that trade on government economic data are paying for server space on K Street, converting what was once an address for high-powered lobbyists into a home for high-powered analytical data centers.

CoreSite, a company that operates data centers around the country, including a data center on K Street, offers financial traders "co-located" computers right in the heart of Washington. From there, it can provide split-second access to a steady stream of economic indicators from key financial offices, including the Department of Labor’s monthly employment report, released from the agency’s Constitution Ave. headquarters; changes to the Fed funds rate, announced from the Treasury building on Pennsylvania Ave.; and other economic big data generated by the departments of Commerce and Justice along the National Mall, according to CNBC.

As long as three years ago, CoreSite said its Washington, D.C., data center could offer “more than a millisecond advantage over suburbs such as Ashburn, Virginia,” according to a report from Data Center Knowledge about CoreSite’s low-latency hub.
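
The geography behind that claim is straightforward physics: light in optical fiber travels at roughly two-thirds of its vacuum speed, so every kilometer of route adds delay. Assuming, for illustration, a 50-kilometer fiber route between downtown Washington and Ashburn, the raw propagation penalty works out as follows.

    # Rough one-way propagation delay over an assumed 50 km fiber route
    # between downtown Washington and Ashburn, Va. The route length is an
    # illustrative assumption; real paths are longer and add equipment delay.
    SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in vacuum, km/s
    FIBER_FACTOR = 2 / 3            # light travels at about 2/3 c in fiber
    route_km = 50

    delay_ms = route_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1_000
    print(f"One-way propagation delay: {delay_ms:.2f} ms")  # about 0.25 ms

Doubled for a round trip, and with longer real-world routes and per-hop equipment latency added in, that penalty is consistent with the millisecond-plus advantage CoreSite describes.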

Posted on Jul 17, 2013 at 10:14 AM



West Virginia U. brings Super Wi-Fi to campus

Super Wi-Fi systems, which became possible when TV stations abandoned analog for all-digital broadcasting, haven’t exactly taken the country by storm. But some public-sector organizations are starting to take advantage of the opportunity.

In January, Wilmington and New Hanover County in North Carolina launched the first municipal Super Wi-Fi, or “white spaces,” network. And this week, West Virginia University became the first university to deploy a Super Wi-Fi network on campus, providing free wireless access on its Public Rapid Transit platforms, whose trams carry about 15,000 riders a day.

WVU worked with the AIR.U (Advanced Internet Regions University) consortium to build the network, which uses white spaces in the radio frequency spectrum left open by the TV broadcasting shift and freed up in 2010 by the Federal Communications Commission. WVU officials called the system a test site for Super Wi-Fi that could pave the way for bringing broadband connectivity to rural areas.

Rural and other areas that lack wireless broadband access are the target for the technology, which does have a somewhat misleading name. For one thing, it’s not really Wi-Fi, since it falls outside of the specific set of interoperable IEEE 802.11 standards designated as Wi-Fi and managed by the Wi-Fi Alliance. The alliance, in fact, has publicly objected to the term Super Wi-Fi.

However, because it operates at much lower frequencies than Wi-Fi, its signals can carry much farther and reach deeper into buildings, allowing it to cover a much larger area, which makes it well suited to rural settings.
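
The range advantage follows from basic radio propagation. Using the standard free-space path loss formula, a signal at 600 MHz (a representative TV-band frequency, chosen here for illustration) loses about 12 dB less than a 2.4 GHz Wi-Fi signal over any given distance, as the sketch below shows.

    # Free-space path loss comparison between a representative TV white-space
    # frequency (600 MHz, an illustrative choice) and 2.4 GHz Wi-Fi.
    # FSPL in dB = 20*log10(distance_km) + 20*log10(freq_MHz) + 32.44
    import math

    def fspl_db(distance_km, freq_mhz):
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    for km in (1, 5, 10):
        tv_band = fspl_db(km, 600)
        wifi = fspl_db(km, 2400)
        print(f"{km:>2} km: TV band {tv_band:.1f} dB, Wi-Fi {wifi:.1f} dB, "
              f"advantage {wifi - tv_band:.1f} dB")
    # The gap is 20*log10(2400/600), about 12 dB, at every distance --
    # enough, all else being equal, for roughly four times the range.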

That’s the focus of AIR.U, which aims to bring wireless connectivity to rural campuses.

“Colleges in rural areas will be the greatest beneficiaries of Super Wi-Fi networks because they are located in communities that often lack sufficient broadband, their needs are greater and there is typically a large number of vacant TV channels outside the biggest urban markets,” said Michael Calabrese, director of the Wireless Future Project at the New America Foundation’s Open Technology Institute.  “This combination of factors makes them ideal candidates for utilizing Super Wi-Fi spectrum to complement existing broadband capabilities.”

AIR.U is a New America initiative, whose founding partners also include Microsoft, Google, the Appalachian Regional Commission and Declaration Networks Group, an organization recently established to plan, deploy and operate Super Wi-Fi technologies.

Posted on Jul 11, 2013 at 11:46 AM