Managers of the International Space Station (ISS) are developing what amounts to an express delivery service to rapidly ferry small packages of critical research back to Earth, maintaining scientific momentum on those projects.
The Center for the Advancement of Science in Space (CASIS), which manages the ISS research lab, has picked Houston-based Intuitive Machines to work with NASA to design a terrestrial return vehicle (TRV), the capsule that will deliver the scientific perishables safely back to Earth.
Today, retrieval and return of experiment results from the ISS are conducted only a couple of times per year and require a long planning process. In contrast, Intuitive Machines said the TRV will enable frequent and same-day delivery of samples from ISS to the researcher's laboratory.
“The timely delivery of critical or perishable samples is essential in enabling new and exciting research aboard the ISS National Laboratory,” the partners said in announcing the service. The first flight of the TRV from the ISS is planned for 2016.
Intuitive Machines will open the TRV service to scientific, academic, commercial and government researchers.
“The International Space Station, with its unique microgravity laboratories and crew, enables research over a wide range of disciplines from physics through biology,” said David Wolf, a research scientist and former astronaut.
“This small payload return capability will provide controlled conditions and flexible choices for timely sample analysis,” Wolf added. “The scientific team will be able to much more efficiently adjust experimental parameters in response to results, exploit unique results, and correct problems encountered.”
Intuitive Machines will be responsible for the design and certification of the return vehicle, as well as managing the payload return services for its customers. CASIS will manage the integration of the service onto a commercial launch vehicle to access the ISS, as well as flight operations.
The TRV contains subsystems for protecting the payload during the return trip and delivering it accurately to a landing location such as a dry lakebed, where it can be easily retrieved. Once recovered, the payload would be removed from the TRV and delivered to the customer.
The vehicle is equipped with propulsion and flight control systems, which perform maneuvers for the entry and descent through the Earth’s atmosphere.
Posted on Oct 22, 2014 at 10:42 AM
Samsung Electronics announced that its products have been placed on the Commercial Solutions for Classified (CSfC) Program Component List, making them the first consumer devices validated to handle classified information.
The National Security Agency’s CSfC process allows commercial products to be used in layered solutions to protect classified information, giving agencies the ability to securely communicate based on commercial standards.
The approved products include the Galaxy S4, Galaxy S5, Galaxy Note 3, Galaxy Note 4, Galaxy Note 10.1 (2014 Edition), Galaxy Note Edge, Galaxy Alpha, Galaxy Tab S 8.4, Galaxy Tab S 10.5 and the Galaxy IPSEC Virtual Private Network (VPN). All devices and capabilities incorporate security features powered by Samsung Knox.
Samsung Knox is an enterprise security platform that enhances the security of data and applications on Android-based devices. It secures the boot process, provisioning, application execution, data storage and transmission, while retaining compatibility with Android functionality and the broader Android ecosystem.
Earlier this year Samsung mobile devices were officially included on the Defense Information Systems Agency’s approved product list. The CSfC list for high security solutions supplements the DISA listing, enabling agencies and contractors to design solutions meeting the full range of government security objectives. Samsung is the only manufacturer with mobile devices on both lists, the company said.
Posted on Oct 21, 2014 at 10:08 AM
The National Geospatial-Intelligence Agency recently released open-source gamification software to GitHub, the collaborative software development environment.
The gamification-server software tracks gamification elements (badges, points, tags) for work pages or apps and provides a framework for granting awards and points to users or teams. It can run either standalone or integrated with other web-based applications.
"Government game development efforts are exponentially on the rise today,” said NGA director Robert Cardillo in the agency’s announcement. “The current generation of professionals is discovering the collaborative learning power of using games in standard business practices, and the newer generation is already familiar with how these new technologies are powerful learning tools.”
Hawaii, in fact, recently incorporated gaming principles and technologies into the state’s website. As a result, overall adoption of online services is up by as much as 20 percent.
NGA’s gamification software also provides a customizable web interface for displaying badges and a configurable rules engine that translates actions performed by users into awards, said Ray Bauer, an NGA information technology innovation lead.
“The use of badging and awards recognizes what achievements matter most based on agency priorities, and rewards the user in the context of their work,” said Bauer.
The gamification-server is implemented as a Django Python web service and associated web application, according to the GitHub posting. User awards can be exported into an Open Badges Backpack, allowing users to present expertise gained within other social frameworks or applications.
The software is designed so that other sites can send in "signals" that are parsed through a rules engine and generate points and badges. Also, other sites and apps can pull in JSON to list badges that a user has.
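The signal-to-award flow described above can be illustrated with a minimal sketch. This is a hypothetical example, not NGA's actual implementation: the `Signal` and `RulesEngine` classes, the rule thresholds and the JSON shape are all invented for illustration.

```python
# Hypothetical sketch of a gamification rules engine: incoming "signals"
# are parsed against rules that generate points and, past a threshold,
# badges. Class names, rules and thresholds are invented illustrations.
import json
from dataclasses import dataclass, field


@dataclass
class Signal:
    user: str
    action: str       # e.g. "edit_page", "comment"
    count: int = 1


@dataclass
class RulesEngine:
    # Maps an action to (points per occurrence, (badge, threshold))
    rules: dict = field(default_factory=lambda: {
        "edit_page": (5, ("Editor", 10)),     # 10 edits earn the "Editor" badge
        "comment":   (2, ("Commenter", 25)),
    })
    points: dict = field(default_factory=dict)
    actions: dict = field(default_factory=dict)
    badges: dict = field(default_factory=dict)

    def process(self, sig: Signal) -> None:
        pts, (badge, threshold) = self.rules[sig.action]
        self.points[sig.user] = self.points.get(sig.user, 0) + pts * sig.count
        key = (sig.user, sig.action)
        self.actions[key] = self.actions.get(key, 0) + sig.count
        if self.actions[key] >= threshold:
            self.badges.setdefault(sig.user, set()).add(badge)

    def badges_json(self, user: str) -> str:
        # The sort of JSON another site or app might "pull in" to list badges
        return json.dumps({"user": user,
                           "points": self.points.get(user, 0),
                           "badges": sorted(self.badges.get(user, set()))})


engine = RulesEngine()
engine.process(Signal(user="alice", action="edit_page", count=10))
print(engine.badges_json("alice"))
```

In a deployed service, the `process` step would sit behind a web endpoint receiving signals from other sites, and `badges_json` would back the badge-listing API.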
NGA launched its GitHub account in April 2014 and has released eight open source software packages on the platform, including:
- GeoQ allows teams to collect geographic structured observations across a large area, but manage the work in smaller geographic regions.
- RFI generator helps first responders and analysts at headquarters work with Requests for Information within a geospatial context.
- GeoWave provides geospatial and temporal indexing on top of Accumulo.
Posted on Oct 20, 2014 at 12:57 PM
The National Weather Service recently activated a system that quickly harnesses weather data from multiple sources, integrates the information and provides a detailed picture of the current weather.
The Multiple Radar Multiple Sensor (MRMS) system combines data streams from multiple radars, satellites, surface observations, upper air observations, lightning reports, rain gauges and numerical weather prediction models to produce a suite of decision-support products every two minutes, according to the NOAA National Severe Storms Laboratory.
Because MRMS provides better depictions of high-impact weather events, forecasters can more quickly diagnose severe weather and issue earlier, more accurate forecasts for communities and air traffic managers.
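The core idea of merging multiple sensor streams can be sketched in a few lines. This is a toy illustration only, not NOAA's MRMS algorithm: the sources, confidence weights and values are invented, and real multi-sensor merging involves far more sophisticated quality control.

```python
# Toy illustration of multi-sensor merging (not NOAA's MRMS algorithm):
# combine precipitation estimates from several sources for each grid cell,
# weighting each source by an assumed confidence. All numbers are invented.
def merge_estimates(cells):
    """cells: dict mapping cell id -> list of (source, value_mm, weight)."""
    merged = {}
    for cell, readings in cells.items():
        total_weight = sum(w for _, _, w in readings)
        merged[cell] = sum(v * w for _, v, w in readings) / total_weight
    return merged


obs = {
    "cell_01": [("radar", 12.0, 0.6), ("gauge", 10.0, 0.4)],
    "cell_02": [("radar", 3.0, 0.6), ("gauge", 5.0, 0.4), ("model", 4.0, 0.2)],
}
print(merge_estimates(obs))
```

A real system would run a step like this on a national grid every few minutes, with weights derived from each sensor's known error characteristics rather than fixed constants.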
“MRMS uses a holistic approach to merging multiple data sources, allowing forecasters to better analyze data and potentially make better predictions,” said Ken Howard, a research meteorologist at NOAA’s National Severe Storms Laboratory who helped design MRMS. “It was developed in collaboration with NOAA’s National Weather Service hydrologists and forecasters who tested experimental versions and provided valuable input and feedback.”
MRMS data are also an input into the newly launched High-Resolution Rapid Refresh weather model, which lets forecasters pinpoint neighborhoods under severe weather threats and warn residents hours before a storm hits. It will also help forecasters provide more information to air traffic managers and pilots about hazards such as air turbulence and thunderstorms.
MRMS is being used to develop and test new Federal Aviation Administration NextGen products in addition to advancing techniques in quality control, icing detection, and turbulence.
NOAA researchers developed the MRMS system in cooperation with the University of Oklahoma’s Cooperative Institute for Mesoscale Meteorological Studies, and the software is available to government users at no cost.
Posted on Oct 17, 2014 at 12:25 PM
The Patent and Trademark Office is looking into whether off-the-shelf, “enterprisewide” products are available that would help it conduct tasks related to the acquisition process.
The products, sought by PTO’s chief financial officer, would include acquisition workload planning, distribution, transition and tracking technologies as well as tools to facilitate the evaluation of vendor proposals.
The acquisition tech would also have to be compatible with the content management system operated by the Office of the Chief Financial Officer, an Apache Cassandra database run in a DataStax Enterprise environment, according to PTO.
The PTO is interested in finding out whether prospective acquisition planning tech is able to run on VMware, and if not, what platforms it can run on. VMware provides cloud and virtualization software.
Other PTO requirements include the ability to automate data integration with existing PTO systems, including its enterprise data warehouse and Momentum, PTO’s core financial system.
Support for Microsoft, Red Hat, Oracle and Apache technologies is also required, according to the RFI.
Other desirable features of the acquisition system include support for e-signatures, single sign-on, role-based security and electronic workflow. The PTO also wants real-time integration with the Federal Business Opportunities service as well as automated extraction of Federal Acquisition Regulation data.
Posted on Oct 15, 2014 at 7:51 AM