Pulse


By GCN Staff



NWS powers up faster, more detailed weather forecasts

The National Weather Service recently activated a system that quickly harnesses weather data from multiple sources, integrates the information and provides a detailed picture of the current weather.

The Multiple Radar Multiple Sensor (MRMS) system combines data streams from multiple radars, satellites, surface observations, upper air observations, lightning reports, rain gauges and numerical weather prediction models to produce a suite of decision-support products every two minutes, according to the NOAA National Severe Storms Laboratory.

Because MRMS provides better depictions of high-impact weather events, forecasters can more quickly diagnose severe weather and issue earlier, more accurate forecasts for communities and air traffic managers.

“MRMS uses a holistic approach to merging multiple data sources, allowing forecasters to better analyze data and potentially make better predictions,” said Ken Howard, a research meteorologist at NOAA’s National Severe Storms Laboratory who helped design MRMS. “It was developed in collaboration with NOAA’s National Weather Service hydrologists and forecasters who tested experimental versions and provided valuable input and feedback.”

MRMS data also feed the newly launched High-Resolution Rapid Refresh weather model, which lets forecasters pinpoint neighborhoods under severe weather threats and warn residents hours before a storm hits. The model will also help forecasters provide more information to air traffic managers and pilots about hazards such as air turbulence and thunderstorms.

MRMS is also being used to develop and test new Federal Aviation Administration NextGen products, in addition to advancing techniques in quality control, icing detection and turbulence detection.

NOAA researchers developed the MRMS system in cooperation with the University of Oklahoma’s Cooperative Institute for Mesoscale Meteorological Studies, and the software is available to government agencies at no cost.

Posted on Oct 17, 2014 at 12:25 PM


PTO shopping for tech to automate acquisition systems

The Patent and Trademark Office is looking into whether off-the-shelf, “enterprisewide” products are available that would help it conduct tasks related to the acquisition process.

The products, sought by PTO’s chief financial officer, would include acquisition workload planning, distribution, transition and tracking technologies, as well as tools to facilitate the evaluation of vendor proposals.

The acquisition tech would also have to be compatible with the content management system operated by the Office of the Chief Financial Officer, an Apache Cassandra database running in a DataStax Enterprise environment, according to PTO.

The PTO is also interested in whether prospective acquisition planning technology can run on VMware, which provides cloud and virtualization software, and if not, what platforms it can run on.

Other PTO requirements include the ability to automate data integration with existing PTO systems, including its enterprise data warehouse and Momentum, PTO’s core financial system.

Support for Microsoft, Red Hat, Oracle and Apache technologies is also required, according to the RFI.

Other desirable features include support for e-signatures, single sign-on, role-based security and electronic workflow. The PTO also wants real-time integration with the Federal Business Opportunities service, as well as automated extraction of Federal Acquisition Regulation data.

Posted on Oct 15, 2014 at 7:51 AM


BYOD of choice for Congress


Apple devices have taken root on the Hill, according to a recent survey by The Hill.

Of the 102 lawmakers whose offices responded to The Hill’s questions, more than 71 percent use iPhones, 9 percent use Android phones and 28 percent carry a BlackBerry. Not surprisingly, many carry more than one device. Among those using tablets, 95 percent use iPads.

Congress is much more Apple-friendly than the nation as a whole, according to The Hill; nationwide, about 42 percent of smartphone owners have an iPhone and 52 percent have an Android.

Rep. Mike Honda (D-Calif.), who represents the Silicon Valley district that includes Apple’s headquarters, also has the full suite of an iPhone, iPad and MacBook Air — and he’s looking into picking up one of the company’s new Apple Watches, spokesman Ken Scudder said.

The lone Windows phone owner is Rep. Suzan DelBene (D-Wash.), a former Microsoft executive who now represents the district that includes the company’s Redmond, Wash., headquarters. DelBene’s staffers use Windows phones as well, her office told The Hill.

Like most Americans, lawmakers favor apps that provide news, weather and traffic information, although Reps. Jared Polis (D-Colo.) and Randy Hultgren (R-Ill.) told The Hill they were fans of Capitol Bells, an app developed by a former Capitol Hill staffer that decodes the Capitol’s buzzer system and lets the general public follow along.

Posted on Oct 14, 2014 at 9:32 AM


NIST funds research center on cybersecurity tools

The National Institute of Standards and Technology has awarded a contract to operate a Federally Funded Research and Development Center (FFRDC) to support the work of the National Cybersecurity Center of Excellence (NCCoE).

The NCCoE was set up in partnership with the state of Maryland and Montgomery County, Md., in February 2012. The center is dedicated to helping businesses secure their data by drawing experts from government, universities and industry to help identify security solutions.

FFRDCs are public-private partnerships contracted to do research for the federal government. The NIST FFRDC was awarded to the MITRE Corp., which operates six other FFRDCs.

Secretary of Commerce Penny Pritzker said the NIST contract will enable the center to accelerate public-private collaborations by working with the first FFRDC “focused on boosting the security of U.S. information systems.”

The center has been working in industry sectors such as health care and energy to identify common security concerns and to develop model cybersecurity examples and practice guides. It also works with small groups of vendors to develop “building blocks” addressing technical cybersecurity challenges that are common across multiple industry sectors, according to the NIST announcement.

NIST announced its intention to award an FFRDC contract supporting the NCCoE’s goals last year.

Federal staff will provide overall management of the center, while MITRE will support the center’s mission through three task areas: research, development, engineering and technical support; operations management; and facilities management.

The first three task orders under the contract will allow the NCCoE to expand its efforts in developing use cases and building blocks and provide operations management and facilities planning.

Posted on Oct 02, 2014 at 12:13 PM


Not all clouds created equal

A major bottleneck in scientific discovery is emerging because the amount of data available is outpacing local computing capacity, according to the authors of a new paper published in PLOS ONE.

And though cloud computing gives researchers a way to match capacity and power with demand, the authors wondered which cloud configuration would best meet their needs. According to the paper, “Benchmarking undedicated cloud computing providers for analysis of genomic datasets,” the authors benchmarked two cloud services, Amazon Web Services Elastic MapReduce (EMR) running on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic data sets and a standard bioinformatics pipeline on a Hadoop-based platform.

They found that GCE outperformed EMR both in terms of cost and wall-clock time, though EMR was more consistent, which is an important issue in undedicated cloud computing, they wrote.

The time differences, the authors said, “could be attributed to the hardware used by the Google and Amazon for their cloud services. Amazon offers a 2.0 GHz Intel Xeon Sandy Bridge CPU, whilst Google uses a 2.6 GHz Intel Xeon Sandy Bridge CPU. This clock speed variability is considered the main contributing factor to the difference between the two undedicated platforms,” they wrote.
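A rough, back-of-the-envelope check, assuming the runtime of CPU-bound stages scales inversely with clock speed, points the same way: 2.6 GHz divided by 2.0 GHz is 1.3, which would suggest GCE instances completing such work roughly 30 percent faster than comparable EMR instances, all else being equal.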

The authors did note that while cloud computing is an “efficient and potentially cost-effective alternative for analysis of large genomic data sets,” the initial transfer of the data into the cloud was still a challenge. One option, they suggested, would be for the data providers to directly deposit the information to a designated cloud service provider, thereby eliminating the need for the researcher to handle the data twice.

More detail about the benchmarking and the results is available in PLOS ONE.

Posted on Oct 01, 2014 at 1:28 PM