The National Institute of Standards and Technology awarded a contract to operate a Federally Funded Research and Development Center (FFRDC) to support the work of the National Cybersecurity Center of Excellence (NCCoE).
The NCCoE was set up in partnership with the state of Maryland and Montgomery County, Md., in February 2012. The center is dedicated to helping businesses secure their data by drawing on experts from government, universities and industry to identify security solutions.
FFRDCs are public-private partnerships contracted to do research for the federal government. The NIST FFRDC was awarded to the MITRE Corp., which operates six other FFRDCs.
Secretary of Commerce Penny Pritzker said the NIST contract will enable the center to accelerate public-private collaborations by working with the first FFRDC “focused on boosting the security of U.S. information systems.”
The center has been working in industry sectors such as health care and energy to identify common security concerns and to develop model cybersecurity examples and practice guides. It also works with small groups of vendors to develop “building blocks” addressing technical cybersecurity challenges that are common across multiple industry sectors, according to the NIST announcement.
NIST’s intention to award an FFRDC contract to support the NCCoE’s goals was announced last year.
Federal staff will provide overall management of the center, while MITRE will support the center’s mission through three task areas: research, development, engineering and technical support; operations management; and facilities management.
The first three task orders under the contract will allow the NCCoE to expand its efforts in developing use cases and building blocks and provide operations management and facilities planning.
Posted on Oct 02, 2014 at 12:13 PM
A major bottleneck in scientific discovery is now emerging because the amount of data available is outpacing local computing capacity, according to the authors of a new paper published in PLOS ONE.
And though cloud computing gives researchers a way to match capacity and power with demand, the authors wondered which cloud configuration would best meet their needs. According to the paper, “Benchmarking undedicated cloud computing providers for analysis of genomic datasets,” the authors benchmarked two cloud services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic data sets and a standard bioinformatic pipeline on a Hadoop-based platform.
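At its core, a benchmark like this comes down to timing the same pipeline run on each platform. The sketch below is a minimal illustration of that idea, not the authors' actual harness; the commands and platform labels are placeholders.

```python
import subprocess
import time

def time_pipeline(command):
    """Run one pipeline job and return its wall-clock time in seconds."""
    start = time.time()
    subprocess.run(command, shell=True, check=True)  # e.g., a Hadoop job submission
    return time.time() - start

# Placeholder commands; the real benchmark submitted a Hadoop-based
# bioinformatic pipeline to Amazon EMR and to Google Compute Engine clusters.
runs = {
    "EMR": "echo submit-hadoop-job-to-emr",
    "GCE": "echo submit-hadoop-job-to-gce",
}
for platform, cmd in runs.items():
    print(platform, round(time_pipeline(cmd), 2), "seconds")
```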
They found that GCE outperformed EMR both in terms of cost and wall-clock time, though EMR was more consistent, which is an important issue in undedicated cloud computing, they wrote.
The time differences, the authors said, “could be attributed to the hardware used by the Google and Amazon for their cloud services. Amazon offers a 2.0 GHz Intel Xeon Sandy Bridge CPU, whilst Google uses a 2.6 GHz Intel Xeon Sandy Bridge CPU. This clock speed variability is considered the main contributing factor to the difference between the two undedicated platforms,” they wrote.
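Taken at face value, the quoted clock speeds imply a modest theoretical edge for GCE. The back-of-the-envelope calculation below uses only the figures cited in the paper and ignores memory, I/O and scheduling effects.

```python
# Clock speeds quoted in the paper (GHz), both Intel Xeon Sandy Bridge parts
amazon_clock = 2.0   # Amazon EMR / EC2 instances
google_clock = 2.6   # Google Compute Engine instances

# Roughly 30% more cycles per second per core on GCE,
# assuming equal work per cycle on the same microarchitecture.
speedup = google_clock / amazon_clock
print(f"Theoretical per-core speedup: {speedup:.2f}x")  # 1.30x
```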
The authors did note that while cloud computing is an “efficient and potentially cost-effective alternative for analysis of large genomic data sets,” the initial transfer of the data into the cloud was still a challenge. One option, they suggested, would be for the data providers to directly deposit the information to a designated cloud service provider, thereby eliminating the need for the researcher to handle the data twice.
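The scale of that initial transfer is easy to underestimate. As an illustration with assumed figures (the paper does not cite these numbers), uploading a single 200 GB sequencing data set over a 100 Mbps link ties up the connection for several hours:

```python
# Assumed, illustrative figures -- not taken from the paper.
dataset_gb = 200          # size of one genomic data set, in gigabytes
link_mbps = 100           # sustained upload bandwidth, in megabits per second

bits = dataset_gb * 8 * 1e9          # gigabytes -> bits (decimal GB)
seconds = bits / (link_mbps * 1e6)   # transfer time at the assumed rate
print(f"~{seconds / 3600:.1f} hours to upload")  # ~4.4 hours
```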
More details about the benchmarking and results are available in PLOS ONE.
Posted on Oct 01, 2014 at 1:28 PM
The Chesapeake Crescent Initiative (CCI), a public-private partnership formed to support civic technology enhancement, has organized a SWAT team of sorts to help cities strengthen their technology foundations, harden their resiliency and optimize programs and services.
The Safe + Smart Cities coalition, made up of experts from the tech, higher education and financial communities, picked the city of Newark, Del., as its first pilot. A second city from the mid-Atlantic region will be announced this fall to receive pro bono recommendations from the team.
In an announcement, the coalition said its recommendations aim to provide the cities “pragmatic and feasible options” to achieve “safe and smart” objectives.
“The strategies developed as a result of this effort will allow us to maximize our limited resources in a way that best serves the citizens of Delaware,” said Delaware Gov. Jack Markell, who added the project could help “enhance our resiliency so we mitigate the damage of disaster situations before they happen.”
The Newark pilot will open with a workshop on the city’s technological maturity and vulnerabilities, the status of its infrastructure and what tools might be deployed to meet a “safe and smart” profile, according to the announcement.
That will produce a Safe + Smart City “blueprint,” a big-picture report integrating “hard and soft infrastructure functions,” including buildings, public safety and communication networks.
Herb Miller, co-founder and vice chair of CCI, said the question of how to cope with natural disasters has recently become a priority for cities, together with how to use technology to improve civic “livability” and connections to constituents.
“But these approaches are often pursued through separate channels with different stakeholders, even though they have many core commonalities,” he said.
The coalition hopes to use the lessons from its work with Newark “as a reference model for many other municipalities and the nation as a whole,” said Stephanie Carnes, CCI’s managing director.
CCI has lined up a sizeable list of participants for the project, including Cisco, Schneider Electric, AtHoc, Verint Systems and Priority 5, which will examine the “technological maturity” of each pilot city.
Virginia Tech and the universities of Maryland and Delaware will also lend their expertise in the areas of resilience and risk mitigation, according to the announcement.
Posted on Sep 30, 2014 at 10:26 AM
Originally developed in 2012 to help protect endangered right whales on the East Coast, the iOS-based Whale Alert app has been updated to provide mariners in the Pacific with the most current information available about whale movements and conservation initiatives.
The free app uses GPS, Automatic Identification System, wireless Internet and nautical charts from the National Oceanic and Atmospheric Administration to provide mariners with a single source of information about whale locations and conservation measures in their immediate vicinity.
New features include information about California Marine Protected Areas, PORTS (Physical Oceanographic Real-Time System) tide and weather data and the ability for the public to report whale sightings to databases that NOAA and whale biologists use to map whale habitats and migration patterns.
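The public-reporting feature comes down to capturing a small, structured sighting record that can be merged into the databases NOAA and whale biologists maintain. The fields below are a hypothetical illustration, not the app's actual schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class WhaleSighting:
    """Hypothetical citizen sighting report; field names are illustrative only."""
    latitude: float        # GPS position reported by the device
    longitude: float
    observed_at: str       # ISO 8601 timestamp
    species: str           # e.g., "blue whale", "unidentified"
    count: int             # number of animals seen

report = WhaleSighting(
    latitude=37.996, longitude=-123.018,   # near Cordell Bank, for illustration
    observed_at=datetime.now(timezone.utc).isoformat(),
    species="humpback whale",
    count=2,
)
print(json.dumps(asdict(report), indent=2))  # payload a reporting app might submit
```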
“Whales are important both ecologically and economically, but they continue to face a variety of threats including ship strikes,” said Michael Carver, deputy superintendent of Cordell Bank National Marine Sanctuary. “Whale Alert allows citizens to provide data scientists can use to inform management and better protect whale populations.”
Whale Alert has been developed by a collaboration of government agencies, academic institutions, non-profit conservation groups and private sector industries, led by NOAA’s Office of National Marine Sanctuaries.
Whale Alert data collected by citizens and scientists are currently available online at the Whale Alert - West Coast website. “More is usually better when it comes to data,” said Jaime Jahncke, Point Blue Conservation Science lead on the project. “Whale Alert allows us to crowdsource data collection, so that as scientists we have more information available to help protect whales from ships.”
Whale Alert can be downloaded free of charge from Apple’s App Store. More information on Whale Alert and the groups responsible for its development can be found at www.whalealert.org.
Posted on Sep 30, 2014 at 11:33 AM
The Centers for Disease Control and Prevention is looking for an all-in-one data tracking and analytics tool to help it manage the rising volume of health care trend and emergency data it tracks, according to a request for information it recently issued.
In a long wish-list of features, CDC said it was interested in ideas for a single platform that can “integrate, analyze, visualize and report on key surveillance, epidemiologic, laboratory, environmental and other types and sources of data during emergency or routine investigations in an efficient and timely manner.”
The agency requires a high level of integration because in its role as the nation’s disease tracker, it supports “disease surveillance and epidemiologic investigation activities, laboratory testing, scenario modeling, intelligence gathering, environmental investigation and medical countermeasures deployment,” according to the RFI.
While the agency can meet those requirements, it faces a number of challenges in doing so, including “many process-driven and technical challenges in [its] capacity to collect, integrate and analyze numerous data types and sources.”
Data integration and unification is a major selling point for acquiring new technology, according to the CDC request. The RFI points out that lab testing for evidence of pathogens is often performed in multiple labs, multiple CIO offices and “throughout [the] laboratory response network.” Additionally, lab results are “contained within and reported through a variety of IT systems.” On top of that, “epidemiology related systems … have evolved independently of each other.” The list of disconnected systems is a long one.
The envisioned platform would let CDC “standardize a core set of data elements across multiple surveillance programs and event responses to capture data in a consistent manner, as well as integrate new data types and unstructured data,” said CDC.
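In practice, standardizing a core set of data elements means mapping the differently named fields each surveillance or laboratory system emits onto one common record. The sketch below is a hypothetical illustration of that idea, not a design taken from the RFI.

```python
# Hypothetical mappings from two source systems onto a shared set of core data elements.
FIELD_MAPS = {
    "lab_system": {"pt_id": "case_id", "result_dt": "event_date", "organism": "pathogen"},
    "epi_system": {"case_no": "case_id", "onset": "event_date", "agent": "pathogen"},
}

def to_core_record(source: str, record: dict) -> dict:
    """Rename one source system's fields to the shared core data elements."""
    return {core: record[src] for src, core in FIELD_MAPS[source].items()}

# One record from each system, expressed in the common form.
print(to_core_record("lab_system", {"pt_id": "A-17", "result_dt": "2014-09-12", "organism": "E. coli"}))
print(to_core_record("epi_system", {"case_no": "C-204", "onset": "2014-09-10", "agent": "E. coli"}))
```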
The platform would give CDC’s external partners near real-time access to event data through a secure interface and “enable infrequent and new users as well as experienced users to successfully operate the system with limited training.”
Posted on Sep 29, 2014 at 12:59 PM