The National Institutes of Health wants a better way to find and cite biomedical research data, as well as associated publications and grants, so it is asking the community for ideas.
In a request for information, the National Human Genome Research Institute (NHGRI) said it is considering the development of a biomedical data catalog, similar to what NIH’s PubMed does for scientific publications.
NIH envisions that the data catalog, distinct from a data repository, “would help make data in such repositories more easily findable and citable in a consistent manner. In addition to supplying core, minimal metadata to ensure a valid data reference, it is envisioned that a Data Catalog would include links out to the location of the data, to the NIH Reporter record of the grant that supported the research, to relevant publications within PubMed or journals, and possibly to associated software or algorithms,” NIH said.
The RFI falls under NIH’s Big Data to Knowledge (BD2K) Initiative, “which aims to facilitate broad use of biomedical big data, develop and disseminate analysis methods and software, enhance training for disciplines relevant for large-scale data analysis and establish centers of excellence for biomedical big data,” according to the BD2K website. Responses must be submitted via email to firstname.lastname@example.org by June 25.
The effort is one of several NIH is making toward managing large stores of information. The agency, for instance, also is soliciting best practices on how to overcome the challenge of managing data generated by genome sequencing and the use of large-scale imaging technologies, which are "breaking the standard model by which researchers manage and analyze data," George Komatsoulis, chief information officer of the National Cancer Institute, part of NIH, wrote in an April blog post. NCI is asking its grantees for input on a set of pilot projects to test the feasibility of setting up a "cancer knowledge cloud" that would equip researchers with the computational tools they need to meet the big data demands of big science.
Posted on Jun 14, 2013 at 9:39 AM
The General Services Administration is looking for ideas and comments on identity management in 2014 and beyond. In an attempt to assess the validity and viability of its requirements, GSA’s request for information seeks technologies that can offer greater operational functionality, more efficiency and expanded customer support in areas including system architecture, security, billing, reporting and service-level agreements.
GSA currently delivers nationwide end-to-end Personal Identity Verification (PIV) services to approximately 95 federal departments, agencies, boards and commissions, which support 850,000 identity accounts. GSA’s Managed Services Office provides identity management services such as enrollment, PIV Card activation and backend systems infrastructure and integration for any agency that wishes to save money by leveraging GSA’s larger capital investment.
Specifically, GSA’s RFI asks companies to submit responses that address high-level requirements, including:
- Mobile/portable solutions for all issuance/post-issuance capability.
- Collection of multi-modal biometric types (iris, facial, fingerprints, etc.).
- Secure cloud services and virtualization solutions, etc.
- Transitioning existing hardware.
- Temporary credentials for forgotten, damaged or stolen cards.
No solicitation exists at this time, so companies should not submit proposals.
Posted on Jun 14, 2013 at 9:39 AM
The Homeland Security Department is looking to improve geospatial imagery and analysis to support emergency management and security planning for special events.
DHS’ Geospatial Management Office has selected BAE Systems to provide geospatial imagery for real-time intelligence as part of DHS’ Remote Sensing Services to Support Incident Management and Homeland Security contract, according to a company release. BAE Systems is one of four prime contractors in the $50 million, five-year IDIQ contract.
BAE Systems’ intelligence experts will use geospatial data and airborne imagery to produce high-resolution maps that reflect current environmental conditions, BAE officials said. The data will be used to produce real-time intelligence products to support a variety of DHS missions, including emergency management of natural and man-made disasters, and possibly security planning for special events. The geospatial intelligence products might also be used to assist public safety and law enforcement with tactical planning and incident response.
As part of the contract, BAE Systems supported the Federal Emergency Management Agency in its response to the tornadoes in Oklahoma. BAE provided high-resolution color imagery along the entire path of destruction -- information critical to recovery and cleanup efforts, a BAE official said.
BAE Systems works with geospatial firms, universities and government agencies on geospatial intelligence solutions as part of its Geospatial Operation for a Secure Homeland – Awareness, Workflow, Knowledge (GOSHAWK) program. GOSHAWK develops hybrid teams of data providers, systems integrators and IT professionals to rapidly transform geospatial data into actionable intelligence, company officials said.
In December 2012, the National Geospatial-Intelligence Agency awarded BAE Systems a multiyear, $60 million contract to provide Activity-Based Intelligence (ABI) systems, tools and support.
BAE Systems’ ABI solution uses advanced software analysis tools and commercial, off-the-shelf computing infrastructures to automate the ingestion, storage and processing of large volumes of intelligence data across multiple sources. ABI helps intelligence analysts better identify adversarial activity patterns and gives them a greater understanding of the relationships between individuals, their activities and their transactions, company officials said.
Posted on Jun 12, 2013 at 9:39 AM
At the Computex show in Taipei last week, Intel showed a prototype version of a Thunderbolt key drive that boasts bi-directional transfer speeds of 10 gigabits/sec, making it the “world’s fastest thumb drive,” according to an article from the IDG News Service. The device connects directly to a Thunderbolt port and supports both data and video transmission through a single connection, making it attractive to agencies working with video or large data sets.
The prototype Thunderbolt thumb drive, which transfers data twice as fast as current USB 3.0 connections, does not require cables as other Thunderbolt connections do, and it uses a SanDisk SSD for storage, according to IDG News. Oren Huber, a Thunderbolt engineer, told IDG the prototype is a reference design and said there has been some interest in building products based on it.
Thunderbolt connects to high-performance peripherals such as graphics adapters, video and audio editors and storage devices, and it combines bi-directional data and video I/O performance. Apple added Thunderbolt connectors to Macs in 2011, and some PCs and peripherals now have them. The technology allows users to transfer an entire HD movie in less than 30 seconds, making it ideal for fast synchronization of content between devices. For high-performance backup and restore, Thunderbolt can transfer 1TB of data in less than 5 minutes, the company said.
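As a back-of-the-envelope check on the movie figure, the transfer time over a 10 gigabit/sec link works out directly; the ~20 GB movie size below is an assumed figure for illustration, not one from the announcement:

```python
GBIT = 10**9  # bits in a gigabit (decimal, as link rates are specified)

def transfer_seconds(size_bytes, link_gbps):
    """Seconds needed to move size_bytes over a link of link_gbps gigabits/sec."""
    return (size_bytes * 8) / (link_gbps * GBIT)

# Assume a ~20 GB high-definition movie file.
hd_movie_bytes = 20 * 10**9
print(transfer_seconds(hd_movie_bytes, 10))  # 16.0 seconds -- within the sub-30-second claim
```

Real-world transfers would run somewhat slower, since the storage on each end must sustain the link rate.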
Though initially available only on Apple devices, Thunderbolt now has more than 80 enabled peripheral devices, according to Intel, covering everything from storage drives, expansion docks and displays to a variety of media capture and creation hardware. More than 220 companies worldwide are developing Thunderbolt-enabled products, the company added.
In a related development, Intel announced Thunderbolt 2, which enables simultaneous 4K video file transfer and display by combining the two previously independent 10 gigabits/sec channels into one 20 gigabits/sec bi-directional channel that supports data and/or display. Besides the benefits to those working with massive amounts of data such as video, it will allow terabytes of data to be backed up in a matter of minutes, rather than hours. Thunderbolt 2 is slated to begin production before the end of this year.
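The effect of bonding the two channels can be sketched with the same arithmetic; the link rates come from the announcement, while the one-terabyte backup size is assumed for illustration:

```python
def seconds_to_transfer(size_bytes, gbps):
    """Seconds to move size_bytes at gbps gigabits/sec."""
    return (size_bytes * 8) / (gbps * 10**9)

terabyte = 10**12  # 1 TB in bytes

# One original 10 Gbit/s channel vs. Thunderbolt 2's bonded 20 Gbit/s channel.
print(seconds_to_transfer(terabyte, 10) / 60)  # ~13.3 minutes
print(seconds_to_transfer(terabyte, 20) / 60)  # ~6.7 minutes -- minutes, not hours
```

Either way, a terabyte-scale backup lands in the minutes range the announcement describes, with the bonded channel halving the time.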
Posted on Jun 11, 2013 at 9:39 AM
Amazon.com has made no secret of its interest in pursuing government customers for Amazon Web Services, its cloud services arm, and lately it has been adding services and capacity to help meet those aims.
Amazon recently said that more than 300 government agencies, looking to become “more innovative, agile, and cost-efficient,” had already become AWS customers.
The company hoped to add to those numbers in May, when a cloud services framework it developed with the Health and Human Services Department was approved by the Federal Risk and Authorization Management Program office, giving more agencies an impetus to adopt AWS through the HHS cloud framework.
In March, AWS scored a 10-year, $600 million deal with the CIA for cloud computing services, further raising the credibility of the company – and the reputation of cloud technology – among agencies requiring the highest cloud security possible. Amazon was on a roll, but it soon hit a speed bump. On June 6, the Government Accountability Office sustained a protest by IBM against the CIA contract, putting the project back at square one. But despite the ups and downs, there are indications Amazon is steadily laying the groundwork for a growing government presence.
Last September, the U.K. research firm Netcraft reported that AWS had become the largest hosting company in the world. In the last eight months, the number of its Web-facing servers had grown by a third to 158,000, the research firm said.
AWS’s business has also been growing. In the first quarter of 2013, AWS and other non-retail business accounted for 5 percent of Amazon’s revenue, up from 3.2 percent in 2011, Netcraft reported.
And Amazon has been increasing the number of services it provides: in 2012, 159 new services and features were released, Netcraft said.
The research firm also noted that Northern Virginia, the geographic nexus of the federal government and its IT services providers, is one of the largest markets for Amazon’s Elastic Compute Cloud (EC2) service, which offers on-demand virtual computer instances by the hour.
Together with Northern Ireland, the two regions account for three-quarters of all EC2 usage measured by Netcraft.
On yet another front, in 2011, Amazon launched GovCloud, cloud services aimed at more sensitive applications that might require additional security and compliance with U.S. regulations.
Netcraft said that as of May 2013, it found that only 27 Web-facing servers were associated with GovCloud.
Some of those computers power the National Institutes of Health’s Global Rare Disease Patient Registry and Data Depository, Netcraft said, as well as GovDashboard, a software-as-a-service offering for setting up data dashboards.
Posted on Jun 10, 2013 at 9:39 AM