Pulse

By GCN Staff


patient phone

With deadlines nearing, states, hospitals rush to build-out health IT networks

The Obama administration’s health reform program faces some significant high-tech milestones this fall, including those designed to promote interaction between healthcare providers and patients through the use of mobile technology.

In October, states will be required to launch health insurance exchanges, online marketplaces where the public can shop for and compare health insurance plans.

This fall is also when the second stage of the administration’s “meaningful use” plan kicks in. The plan provides financial incentives to hospitals and doctors who can demonstrate specified levels of “patient engagement” through the use of health IT.

In both cases, the organizations involved are demonstrating progress, if not perfection, in laying the foundation for a national healthcare information system.

To help promote meaningful use, for example, the Department of Health and Human Services has launched a series of 31 mobile apps that let Medicare and Medicaid patients interact with providers or access HHS-sanctioned information about their healthcare.

The apps include tools for finding the nearest federally funded health center, body-mass index calculators and information for physicians on identifying preventive services appropriate for their patients.
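
Body-mass index itself is a simple calculation: weight in kilograms divided by the square of height in meters. The short sketch below shows what such a calculator computes; the function name and rounding are illustrative and not drawn from the HHS apps.

    def body_mass_index(weight_kg, height_m):
        # Standard definition: BMI = weight (kg) / height (m) squared
        return round(weight_kg / (height_m ** 2), 1)

    # Example: 82 kg at 1.78 m works out to a BMI of about 25.9
    print(body_mass_index(82, 1.78))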

For its part, the Centers for Disease Control and Prevention has developed apps that offer physicians up-to-date information on influenza activity and the latest diagnostic and treatment guidance. A separate app lets the public track “influenza-like illness” levels nationally and in their own areas.

According to a report in MobiHealthNews, the technologies help support the meaningful-use requirement that half of a provider’s patients have the ability to view their health records online. Kaiser Permanente, for one, told MobiHealthNews that 4 million of its members had already viewed their health records through an online portal in 2012, with 22 percent of that traffic coming from mobile devices.

But while mobile access to healthcare providers seems to be blossoming, most consumers may have to wait until next year before using smartphones to fully tap into new state insurance exchanges.

In the rush to meet the October deadline to get the exchanges up and running, states have not generally had time to make them fully mobile-friendly. While shoppers may be able to use their mobile phones to view the portal and read some material, some features, such as detailed plan comparisons, may not be ready for smartphones.

Washington Health Benefit Exchange CIO Curtis Kwak told Government Technology that with an Oct. 1 deadline looming, the state did not have the chance to build mobile friendliness into the first version of the exchange. Instead, it’s teed up for version 2, planned for August 2014.

Still, those using larger mobile screens may be able to navigate the site or receive text reminders of their coverage status, Government Technology reported. “You always have this problem that you want to have the same content be presentable using different presentation media,” Manu Tandon, secretariat CIO at the Massachusetts Executive Office of Health and Human Services, told Government Technology. “This is no different.”

Posted on Aug 15, 2013 at 10:14 AM


API

Data.gov pilots an easier way to manage agency APIs

Data.gov is offering a new API management service intended to make it easier for agencies to release their application programming interfaces to developers.

The service, api.data.gov, is in the pilot stage — but at a point where the Data.gov team is looking for agencies to use the service and provide feedback, according to an announcement via Google Groups from Nick Muerdter of the National Renewable Energy Laboratory.

As part of the administration’s Open Government Initiative, agencies have been making data sets available on Data.gov and in many cases providing APIs, which help developers make use of the data by defining how software components should fit together. Data.gov keeps a catalog of APIs, listed by agency.

The new management service aims to help developers use APIs from across agency boundaries, Muerdter wrote, by handling such things as API keys, usage analytics and documentation. It functions as a transparent layer atop existing APIs and, according to the api.data.gov site, “helps deal with some of the repetitive parts of managing APIs.”

Agencies using the service retain full control of their APIs — and can put all of their APIs into the service or start out by trying just one — while the service will “handle the boring stuff,” according to the service’s website.

Among its features are:

API keys: Users who sign up for a key can then use that key for all participating APIs. And because they’re being managed by api.data.gov, agencies can assume that all hits are from valid users.

Analytics: The service takes care of tracking usage, offering graphs on usage trends and the ability to drill down into statistics and metrics on API performance.

Documentation: The service can host API documentation or, if an agency has its own developer portal, provide a link to it.

Rate limiting: Participating agencies can set higher or lower rate limits for individual users, protecting their servers from overloads caused by users exceeding their limits.
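
For developers, the management layer is meant to be invisible: a request to a participating API looks like any other HTTP call, with the signup key passed along. The sketch below is a hypothetical illustration assuming the common pattern of sending the key as an api_key query parameter; the endpoint path, parameters and response handling are placeholders, not a real agency API.

    import requests  # third-party HTTP client (pip install requests)

    API_KEY = "YOUR_API_DATA_GOV_KEY"  # one signup key, usable across participating APIs
    # Hypothetical endpoint for illustration only; real paths come from each agency's docs.
    URL = "https://api.data.gov/EXAMPLE-AGENCY/EXAMPLE-SERVICE/v1/records"

    response = requests.get(URL, params={"api_key": API_KEY, "limit": 10}, timeout=30)

    if response.status_code == 429:
        # Rate limits are enforced by the management layer before traffic reaches the agency.
        print("Rate limit exceeded; slow down and retry later.")
    else:
        response.raise_for_status()
        print(response.json())

Because the key check and rate limiting happen in the api.data.gov layer, the agency’s own servers see only traffic that has already been validated.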

Anyone looking to take part in the pilot can contact the team from the api.data.gov page.

Posted on Aug 02, 2013 at 7:00 AM


Nogales

Cities take licensing, permit services to the cloud

Saddled with shrinking budgets, smaller cities often lack the resources to buy the leading-edge technology needed to efficiently deliver services to businesses and citizens. As a result, a growing number of municipalities are exploring ways to automate and streamline civic functions – such as asset management, land management, licensing and permitting applications – via cloud-based services.

For instance, officials in Nogales, Ariz., a city of 20,948 people near the Mexican border, are looking to revitalize the downtown community and restore historic buildings that are more than 100 years old. As part of the revitalization effort, city officials wanted the Public Works Department to track building permits daily, according to Hector Tapia, the assistant public works director for Nogales.

Lacking the resources to buy new IT equipment, the city turned to Accela, which offers a suite of cloud-based civic applications under the umbrella of Civic Cloud. The move eliminated the need for the city to purchase new computer hardware, software and servers, allowing the agency to purchase only the licenses necessary to improve the office workflow, Tapia said.

The city’s goal is to implement a “one-stop shop” for development plan review and the permitting process. To accomplish this, Nogales deployed Accela Automation, a Web-based application with global search and integrated mapping capabilities. Automation lets workers share information across departments and communicate with workers in the field. Additionally, the Public Works Department will deploy Accela Citizen Access, a Web interface for initiating and tracking service requests.

These cloud-based solutions will eventually be extended to Apple iOS, Android and Microsoft Windows mobile tablets and smartphones to give inspectors and field agents real-time access so they can update project files from their mobile devices.

Accela offers city governments a complete suite of business services through the Civic Cloud, company officials said in a release.  Specific civic solutions can be tailored to unique agency requirements by Accela or partner professional services staff. Agencies can also buy and quickly deploy the applications as packaged quick-start solutions, which are preconfigured to the most common agency needs.  Initial packaged solutions are available for planning and zoning, permitting and inspection. Additional packages are being developed based on agency and market requirements, Accela officials said.

Other cities are turning to cloud-based services for licensing and permit processing. Officials in Chelmsford, Mass., recently selected a new system that will let residents apply for permits online. ViewPermit will provide an integrated system – rolled out over the next few months – that combines GIS data and licensing information in a cloud-based environment. Neighboring communities, including Fitchburg, Lexington and Peabody, have already deployed ViewPoint.

Posted on Jul 25, 2013 at 8:57 AM


GitMachines creates FISMA-compliant virtual machines

GitMachines offers IT admins a 'virtual depot' of software tools

A group of self-described “Washington techies” and “civic innovators” recently won a $500,000 grant from the Knight Foundation to help solve a perennial problem for the government IT community: how to streamline the welter of certifications and compliance regulations that slow adoption of new technologies in government to a crawl.

The group founded “GitMachines,” which it envisions as an “open-government virtual depot” of new public-sector IT tools. GitMachines aims to help IT administrators get projects off the ground faster by offering them virtual machines preconfigured for government compliance settings, what it calls “IT building blocks chock full of open-gov and open-data goodness.”

GitMachines was one of eight projects awarded grants by the foundation in a challenge to software developers to improve how citizens and government connect.

GitMachines’ founders advocate a more open-government and open-data approach to technology adoption, portraying most IT administrators as overwhelmed by small regulations that get in the way of big improvements.

“When it comes to all the techie energy of civic hackers out there, there's a big gulf between how the private sector develops IT and government develops IT,” said Greg Elin, one of GitMachines’ co-founders, in accepting the grant last month. “We felt lowering the burden of IT certification and accreditation could improve how civic innovators and government IT administrators interact.”

GitMachines would also “dramatically lower the IT operational costs of open-government projects while also making them more robust on the security and compliance front to improve adoption,” GitMachines founders say.

GitMachines launched earlier this year, so it’s just getting underway with its current offerings. One of them, dubbed Minus, started at an Open Data Day event hosted in February by the World Bank and embodies the group’s philosophy. The tool is a “ready-to-run” virtual machine, targeted at researchers and government data publishers, that can be easily downloaded onto a host laptop and auto-configured with useful software and data.

Another GitMachines project, called Trailhead, provides step-by-step guides on securing software libraries to government specifications. “Our long-term goal is to help open-data teams and projects go from zero to hero as fast as possible with a range of ready-to-run (tools),” the group said.

In describing the challenges they face, the group paints a picture of government IT administrators who would like to streamline government but are lost in a regulatory thicket.

“You want to do an open-government project,” the group said in describing its approach, “but you need a server, maybe two, maybe more. You don’t do servers, not well anyway. Servers are hard. The command line is scary.”

As a consequence, “many good programmers are not up to date on how to install ever increasingly complex server software stacks.” Further, government staffers lack administrative rights on their workstations and are thus limited in their ability to experiment. Others are reluctant to adopt more open data and open solutions because of software compliance hurdles.

“Lots of great open-government software … are under-adopted because installation is just too damn hard,” the group said.

The solution? GitMachines wants government to borrow lessons learned by Amazon, Flickr, Netflix and other large Web operators that have “figured out how to automate the stuffing out of configuring and maintaining their back-end servers at scale.”

“We want to make it automatic to address the most common configuration and operation gotchas developers run into who do not do system configuration for a living,” said GitMachines’ founders.

To do so, the group proposes offering agencies downloadable, preconfigured virtual machines customized for open-government applications. Certification-ready VMs could be tailored to the job and run behind firewalls, bundled into existing projects or services.

Posted on Jul 25, 2013 at 1:29 PM


Upright fan in the middle of a data center

Should you warm up your data center? First, weigh the costs

Data center administrators, more than most IT managers, deal in trade-offs: they keep an eye on the cost of efficiency and weigh energy spent against performance. One dial on the dashboard they watch closely is the temperature in the data center as they look to deliver performance at a reasonable cost.

According to a recent Computerworld report, the General Services Administration has recommended raising data center temperatures from 72 degrees Fahrenheit to as high as 80 degrees F. For every additional degree of temperature in a data center’s server inlet space, GSA said, an operator can save 4 percent to 5 percent in energy costs, according to the Computerworld article.
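
Taken at face value, that figure adds up quickly. The sketch below is a back-of-the-envelope estimate only; the $100,000 baseline cooling cost, the 4.5 percent midpoint and the compounding assumption are illustrative, not part of the GSA guidance.

    # Rough estimate of annual cooling-cost savings from raising server inlet temperature.
    baseline_cost = 100_000.0   # hypothetical annual cooling spend at 72 degrees F
    savings_per_degree = 0.045  # midpoint of GSA's 4-5 percent-per-degree figure

    for target_temp in (75, 78, 80):
        degrees_raised = target_temp - 72
        estimated_cost = baseline_cost * (1 - savings_per_degree) ** degrees_raised
        print(f"{target_temp} F: roughly ${estimated_cost:,.0f} per year")

By that math, going from 72 to 80 degrees F would trim a hypothetical $100,000 cooling bill to roughly $69,000 a year.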

Those numbers square with 2008 recommendations from the American Society of Heating, Refrigerating and Air-Conditioning Engineers, which put the recommended temperature for data centers between 64.4 degrees F and 80.6 degrees F, along with a caveat that staying within that range does not ensure the data center is operating at top energy efficiency.

So, given the GSA and industry guidelines and caveats, what is the trend line on temperature ranges preferred by most data center operators?

According to a recent survey by the Uptime Institute, few data centers are being managed anywhere near the upper end of the GSA recommendation. About half of the more than 1,000 data centers surveyed around the world keep temperatures in a spring-like range of 71-75 degrees F, according to the Computerworld report.

The survey did pick up a small surge toward hotter, low-cost environments, with 7 percent of data center operators keeping temperatures above 75 degrees, a jump from only 3 percent the previous year.  At the same time, fewer operators are maintaining data center temperatures at the lower end of the ASHRAE range: only 6 percent compared to 15 percent in 2011.

And if you do decide to turn up the data center thermostat, it pays to go slow, Computerworld reported. "In order to implement hotter (temps), you need to do it gradually, and make sure you're not causing problems in other parts of the data center," said Uptime Institute content director Matt Stansberry.

Posted on Jul 18, 2013 at 12:58 PM