The General Services Administration is seeking ideas and comments on identity management for 2014 and beyond. To assess the validity and viability of its requirements, GSA’s request for information (RFI) seeks technologies that can offer greater operational functionality, more efficiency and expanded customer support in areas including system architecture, security, billing, reporting and service-level agreements.
GSA currently delivers nationwide end-to-end Personal Identity Verification (PIV) services to approximately 95 federal departments, agencies, boards and commissions, which support 850,000 identity accounts. GSA’s Managed Services Office provides identity management services such as enrollment, PIV Card activation and backend systems infrastructure and integration for any agency that wishes to save money by leveraging GSA’s larger capital investment.
Specifically, GSA’s RFI asked companies to submit responses that address high-level requirements, including:
- Mobile/portable solutions for all issuance/post-issuance capability.
- Collection of multi-modal biometric types (iris, facial, fingerprints, etc.).
- Secure cloud services and virtualization solutions, etc.
- Transitioning existing hardware.
- Temporary credentials for forgotten, damaged or stolen cards.
At this time no solicitation exists, so companies should not submit proposals.
Posted on Jun 14, 2013 at 9:39 AM
The Homeland Security Department is looking to improve geospatial imagery and analysis to support emergency management and security planning for special events.
DHS’ Geospatial Management Office has selected BAE Systems to provide geospatial imagery for real-time intelligence as part of DHS’ Remote Sensing Services to Support Incident Management and Homeland Security contract, according to a company release. BAE Systems is one of four prime contractors in the $50 million, five-year IDIQ contract.
BAE Systems’ intelligence experts will use geospatial data and airborne imagery to produce high-resolution maps that reflect current environmental conditions, BAE officials said. The data will be used to produce real-time intelligence products to support a variety of DHS missions, including emergency management of natural and man-made disasters, and possibly security planning for special events. The geospatial intelligence products might also be used to assist public safety and law enforcement with tactical planning and incident response.
As part of the contract, BAE Systems supported the Federal Emergency Management Agency in its response to the tornadoes in Oklahoma. BAE provided high-resolution, color imagery along the entire path of destruction, information that is critical to recovery and cleanup efforts, a BAE official said.
BAE Systems works with geospatial firms, universities and government agencies on geospatial intelligence solutions as part of its Geospatial Operation for a Secure Homeland – Awareness, Workflow, Knowledge (GOSHAWK) program. GOSHAWK develops hybrid teams of data providers, systems integrators and IT professionals to rapidly transform geospatial data into actionable intelligence, company officials said.
In December 2012, the National Geospatial-Intelligence Agency awarded BAE Systems a multi-year, $60 million contract to provide Activity-Based Intelligence (ABI) systems, tools and support.
BAE Systems’ ABI solution uses advanced software analysis tools and commercial, off-the-shelf computing infrastructures to automate the ingestion, storage and processing of large volumes of intelligence data across multiple sources. ABI helps intelligence analysts better identify adversarial activity patterns and gives them a greater understanding of the relationships between individuals, their activities and their transactions, company officials said.
Posted on Jun 12, 2013 at 9:39 AM
At the Computex show in Taipei last week, Intel showed a prototype version of a Thunderbolt key drive that boasts bi-directional transfer speeds of 10 gigabits/sec, making it the “world’s fastest thumb drive,” according to an article from the IDG News Service. The device connects directly to a Thunderbolt port and supports both data and video transmission through a single connection, making it attractive to agencies working with video or large data sets.
The prototype Thunderbolt thumb drive, which transfers data twice as fast as current USB 3.0 drives, does not require cables as other Thunderbolt connections do, and uses a SanDisk SSD for storage, according to IDG News. Oren Huber, a Thunderbolt engineer, told IDG the prototype is a reference design and said there has been some interest in building products based on it.
Thunderbolt connects high-performance peripherals such as graphics adapters, video and audio editors and storage devices, and it combines bi-directional data and video I/O on a single interface. Apple added Thunderbolt connectors to Macs in 2011, and some PCs and peripherals now have them. It lets users transfer an entire HD movie in less than 30 seconds, making it ideal for fast synchronization of content between devices. For high-performance backup and restore, Thunderbolt can transfer 1TB of data in less than five minutes, the company said.
Initially available only on Apple devices, Thunderbolt now has more than 80 enabled peripheral devices, according to Intel, covering everything from storage drives and expansion docks to displays and a variety of media capture and creation hardware. More than 220 companies worldwide are developing Thunderbolt-enabled products, the company added.
In a related development, Intel announced Thunderbolt 2, which enables simultaneous 4K video file transfer and display by combining the two previously independent 10 gigabit/sec channels into one 20 gigabit/sec bi-directional channel that supports data and/or display. Besides benefiting those working with massive amounts of data such as video, it will allow terabytes of data to be backed up in a matter of minutes rather than hours. Thunderbolt 2 is slated to begin production before the end of this year.
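The raw arithmetic behind such figures is easy to check. As a rough sketch, the calculation below converts a data size to a transfer time at a given line rate; it ignores protocol overhead and storage bottlenecks, so real-world times would be longer:

```python
# Back-of-the-envelope transfer times at raw Thunderbolt line rates.
# This ignores protocol overhead and SSD throughput limits, so it is
# a best case, not a benchmark.

def transfer_seconds(size_bytes: float, rate_gbps: float) -> float:
    """Time to move size_bytes at a raw line rate of rate_gbps gigabits/sec."""
    bits = size_bytes * 8
    return bits / (rate_gbps * 1e9)

one_tb = 1e12  # 1 TB (decimal) in bytes

for label, rate in [("Thunderbolt", 10), ("Thunderbolt 2", 20)]:
    minutes = transfer_seconds(one_tb, rate) / 60
    print(f"{label} ({rate} Gb/s): 1 TB in {minutes:.1f} minutes")
```

At these raw rates, a terabyte takes roughly 13 minutes on one 10 Gb/s channel and half that on Thunderbolt 2's bonded 20 Gb/s channel, which is why vendor figures in the minutes range generally assume the full aggregate bandwidth.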
Posted on Jun 11, 2013 at 9:39 AM
Amazon.com has made no secret of its interest in pursuing government customers for Amazon Web Services, its cloud services arm, and lately it has been adding services and capacity to help meet those aims.
Amazon recently said that more than 300 government agencies, looking to become “more innovative, agile, and cost-efficient,” had already become AWS customers.
The company hoped to add to those numbers in May when a cloud services framework it developed with the Health and Human Services Department was approved by the Federal Risk and Authorization Management Program office, giving more agencies an impetus to adopt the AWS-HHS cloud framework.
In March, AWS scored a 10-year, $600 million deal with the CIA for cloud computing services, further raising the credibility of the company, and the reputation of cloud technology, among agencies requiring the highest cloud security possible. Amazon was on a roll, but it soon hit a speed bump. On June 6, the Government Accountability Office sustained a protest by IBM against the CIA contract, putting the project back at square one. But despite the ups and downs, there are indications Amazon is steadily laying the groundwork for a growing government presence.
Last September, the U.K. research firm Netcraft reported that AWS had become the largest hosting company in the world. In the last eight months, the number of its Web-facing servers had grown by a third to 158,000, the research firm said.
AWS’s business has also been growing. In the first quarter of 2013, AWS and other non-retail business accounted for 5 percent of Amazon’s revenue, up from 3.2 percent in 2011, Netcraft reported.
And Amazon has been increasing the number of services it provides: in 2012, 159 new services and features were released, Netcraft said.
The research firm also noted that Northern Virginia, the geographic nexus of the federal government and its IT services providers, is one of the largest markets for Amazon’s Elastic Compute Cloud (EC2) service, which offers on-demand virtual computer instances by the hour.
Together with Northern Ireland, the two regions account for three-quarters of all EC2 usage measured by Netcraft.
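Pricing "by the hour" meant, at the time, that partial instance-hours were rounded up to the next whole hour. A minimal sketch of that billing model, with a purely hypothetical hourly rate:

```python
import math

# Illustrative sketch of per-hour on-demand billing as EC2 priced it at
# the time: partial instance-hours round up. The $0.10/hour rate below
# is hypothetical, not an actual EC2 price.

def on_demand_cost(runtime_minutes: float, hourly_rate: float) -> float:
    """Cost of one instance run, billed in whole instance-hours."""
    billed_hours = math.ceil(runtime_minutes / 60)
    return billed_hours * hourly_rate

# A 95-minute run bills as two full hours.
print(on_demand_cost(95, 0.10))
```

The rounding is why short-lived workloads were often cheaper to batch onto fewer, longer-running instances.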
On yet another front, Amazon in 2011 launched GovCloud, cloud services aimed at more sensitive applications that might require additional security and compliance with U.S. regulations.
Netcraft said that as of May 2013, it found that only 27 Web-facing servers were associated with GovCloud.
Some of those computers power the National Institutes of Health’s Global Rare Disease Patient Registry and Data Depository, Netcraft said, as well as GovDashboard, a software-as-a-service offering for setting up data dashboards.
Posted on Jun 10, 2013 at 9:39 AM
States are making progress connecting their own health exchanges with federal agencies, according to a recent report from the Government Accountability Office. But among the remaining challenges is that a federal data services hub to be used for exchanging information isn’t finished yet.
States are charged with fully activating their choice of health exchange by Oct. 1, 2013, the date set for the launch by the Affordable Care Act. GAO looked at seven states whose exchanges are in various stages of development but which are expected to be completed by the deadline.
According to GAO’s report, all seven states surveyed are developing IT infrastructure that includes upgrading or replacing their outdated Medicaid and Children's Health Insurance Program eligibility and enrollment systems. In addition, six of the states are building not only the Web interface for consumers to navigate health options but also the infrastructure needed to integrate the exchanges with federal systems to determine applicant eligibility.
The state systems are expected to tap into a federal data services hub provided by the Health and Human Services Department that will serve as a single source of the federal data that determines eligibility. State systems will transmit requests for data through the hub to multiple federal agencies, including the Homeland Security Department, IRS, Social Security Administration and Veterans Health Administration, to name a few. The hub then returns the data in near real time to the states, where it can be used to verify applicants’ eligibility.
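The hub's role as a single routing point can be sketched in a few lines. This is purely illustrative: the actual hub interface was still under development at the time, so the agency names, field names and eligibility checks below are hypothetical stand-ins, not the real API.

```python
# Hypothetical sketch of the hub's fan-out/fan-in flow: one state request
# is routed to several federal data sources and the responses aggregated.
# All names and checks here are illustrative, not the actual hub API.

def query_federal_sources(applicant: dict, sources: dict) -> dict:
    """Route one eligibility request through a 'hub' to each federal
    data source and collect the responses by agency."""
    return {agency: lookup(applicant) for agency, lookup in sources.items()}

# Stand-ins for the real agency back ends.
sources = {
    "SSA": lambda a: {"ssn_valid": len(a["ssn"]) == 9},
    "IRS": lambda a: {"income_verified": a["income"] < 45000},
    "DHS": lambda a: {"citizenship_confirmed": a["citizen"]},
}

applicant = {"ssn": "123456789", "income": 32000, "citizen": True}
result = query_federal_sources(applicant, sources)
eligible = all(v for r in result.values() for v in r.values())
```

The design point the GAO report highlights is visible even in this toy: the state system depends on every back-end check answering quickly, so one slow source (the IRS concern quoted below) gates the whole near-real-time response.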
GAO acknowledged the complexity of the task: "With the amount of data that states must share with HHS in order to verify eligibility, developing streamlined eligibility and enrollment systems is a vast undertaking requiring states to develop sophisticated IT systems."
Unfortunately for states, the federal data services hub is also under development (as are the rules governing its use), so the challenge for states is to build in enough flexibility so that the systems can communicate when they are finished.
Additionally, because tax and income information will be transmitted, “there is a laundry list of privacy and security standards that must be met,” according to Dylan Scott’s blog post on Governing. Further, Scott said, “the IRS is accustomed to receiving and then processing this kind of information over long periods of time, up to a month, while the exchange is supposed to provide verification in almost real-time. Nobody is sure if and how the hub will be equipped to handle that workload.”
Officials in six states surveyed told GAO that even though they did not have complete information on the requirements of the federal data services hub, they still needed to begin working on their end of the IT infrastructure. For its part, the Centers for Medicare & Medicaid Services has released guidance to the states on how to access or verify data through the federal data services hub through such sources as webinars, conferences and other forums. Still, most state officials told GAO they were concerned that the lack of specific IT-related federal guidance could lead to changes late in the development process.
Stan Czerwinski, who headed the group that authored the GAO report, told Governing that states will likely be working on plugging into the federal exchange up to and beyond the deadline. “Until they're able to do this testing to make sure that all these points connect, it's still unknown,” Czerwinski said. “I think they agree that it's the biggest challenge area and will need adjusting from day one.”
Posted on Jun 06, 2013 at 9:39 AM