The military is taking a page from the smartphone industry in an effort to speed up development of intelligence, surveillance and reconnaissance (ISR) ground sensors.
Researchers for the Defense Advanced Research Projects Agency’s Adaptable Sensor System, which goes by the name ADAPT, are developing a hardware and software package with a customized Android operating system for the unattended ground sensors. The sensors -- which are small, self-powered devices that sense ground activity, including acoustic, seismic, magnetic and weather events -- can communicate wirelessly with other sensors and devices, according to DOD’s Armed With Science website.
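To make the idea concrete, here is a minimal sketch of the kind of multi-modal event record such a sensor might relay wirelessly to its neighbors. The class, field names and wire format below are purely illustrative assumptions; they are not part of any published ADAPT interface.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical event record for an unattended ground sensor; the field
# names are illustrative and not drawn from any ADAPT specification.
@dataclass
class SensorEvent:
    sensor_id: str
    modality: str      # "acoustic", "seismic", "magnetic" or "weather"
    timestamp: float   # seconds since epoch
    reading: float     # normalized signal strength

def to_wire(event: SensorEvent) -> str:
    """Serialize an event as JSON for wireless relay to neighboring sensors."""
    return json.dumps(asdict(event))

msg = to_wire(SensorEvent("ugs-17", "seismic", 1371700000.0, 0.82))
print(msg)
```

In practice a fielded system would use a far more compact binary encoding; JSON is used here only to keep the sketch readable.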
It typically takes three to eight years to develop military sensor systems using contract manufacturers, a long cycle that can mean devices are outdated by the time they’re introduced, DARPA said. By following the design processes of smartphone makers, which are always creating new or updated models, DARPA hopes to develop new devices in a year or so.
The ADAPT program focuses on three core elements: reusable hardware, reusable software and sensor applications, DARPA said.
“We believe that the ADAPT building block approach — where you take the ADAPT core and easily plug it into any number of ISR sensor reference designs — will transform how the military services and the defense industry approach ISR sensor research and development,” said DARPA program manager Mark Rich. “This method has the promise of being much more cost-effective, faster to the warfighter and easier to refresh with technology upgrades.”
DARPA plans to test new sensors based on the ADAPT reference design this summer, he said.
The agency also could develop other reference designs for air and sea vehicles. In one recent test, for example, researchers replaced the control interface of a small quadcopter UAV with the ADAPT core, which then took over flight control.
Posted on Jun 20, 2013 at 6:07 AM
Gartner analysts see three main trends framing the security discussion moving forward: mobile security, big data and advanced targeted attacks. The company presented its take on these high-level trends and more at its recent three-day security and risk management summit in National Harbor, MD.
- Mobile: As focus shifts from the device to the app and data, understanding device types and how users are using them is just as important as knowing user identities.
- Big data: Delivering risk-prioritized actionable insight will require security analytics as well as changes in information security technologies, integration methods and processes.
- Advanced targeted attacks: The latest attack strategies use custom or dynamically generated malware for the initial breach and data-gathering phase. Enterprises should employ a defense-in-depth, layered approach model.
Reporters and attendees also shared insights and factoids from the conference.
InfoSecurity magazine covered the keynote by Paul Proctor, Gartner vice president and senior analyst, who described four security scenarios that organizations will experience over the next decade:
- Regulated risk, where a government organization leverages regulation to protect enterprises and itself.
- Coalition rule, where barriers to entry for malicious actors are low, and government intervention is absent or ineffective.
- The controlling parent, where the government will step in to protect the individual.
- Neighborhood watch or anarchy, where decreasing regulation signals that government intervention will not materially impact the targeting of individuals.
Ray Wagner, managing vice president of Gartner’s secure business enablement group, spoke on trends affecting IT security managers, according to Network World.
- The use of cloud services, especially those outside the control of the IT department, means antivirus and perimeter firewalls are increasingly ineffective.
- All packets across the network are suspect, so monitoring should be considered a basic means to detect attacks.
- By 2020, 75 percent of IT budgets will be set aside for rapid detection and response approaches, up from less than 10 percent in 2012.
- Identity management and context-aware security will be key to supporting mobile devices in the enterprise.
- Identity and access management may need to recognize social-network identities.
Steve Piper at the CyberEdge Group listed his top five takeaways from the conference in a blog post:
- The exhibit hall was chock full of vendors touting their abilities to detect advanced threats: FireEye, Palo Alto Networks, Damballa, Sourcefire, Trend Micro, AhnLab, Blue Coat, Zscaler, Proofpoint and many more.
- The second-biggest theme this year was around BYOD and securing mobile devices. In a recent Gartner survey on 2012-2014 security spending priorities, mobile device management came in first place.
- The concept of big data worked its way into virtually every session that talked about security incident event management (SIEM) technology and tactics for uncovering advanced threats.
- The industry is so hot and heavy for advanced threat protection products (and rightfully so) that it seems to have forgotten about the critical importance of good old-fashioned vulnerability management and patch management solutions.
- Everyone — analysts, attendees and even vendors — agrees that it’s no longer a matter of “if” your network will be compromised. It's a matter of “when.”
Gartner’s Jay Heiser spoke on security myths — the misconceptions and exaggerations about threats and the technologies to combat those threats. Among those myths, reported by Security Week, are:
- Information security budgets are 10 percent of IT spending: Recent Gartner research shows that information security spending is closer to 5 percent of the total IT budget.
- Password expiration and complexity reduce risk: Cracking is just not the major failure mode. Passwords are not cracked, they’re sniffed.
Other tweet-worthy insights from the conference included:
- By 2019, 90 percent of organizations will have personal data on IT systems that they don't own or control. Hostreview.com.
- Monitoring employee behavior in digital environments is on the rise, with 60 percent of corporations expected to implement formal programs for monitoring external social media for security breaches and incidents by 2015. Gartner.
- Only 8 percent of organizations are running next-generation firewalls. And the organizations that purchased next-generation firewalls are not properly configuring them or using them to their fullest extent. CRN.
Posted on Jun 18, 2013 at 10:37 AM
Last week, New York City Comptroller John C. Liu unveiled the Checkbook NYC 2.0 website and announced that the source code for the financial transparency website would be available to developers on GitHub, which will allow other government organizations to use Checkbook to build similar sites.
Checkbook NYC illustrates how the city government spends its nearly $70 billion annual budget. Using a dashboard that combines graphs and user-friendly tables, the site displays up-to-date information about the city's revenues, expenditures, contracts, payroll and budget. It also offers that information programmatically via APIs.
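Structured API output of this kind is what lets other sites and tools build on the data. The sketch below shows how a client might total spending from such a response; the XML layout, attribute names and values are hypothetical stand-ins, not the documented Checkbook NYC API format.

```python
import xml.etree.ElementTree as ET

# Hypothetical sample of the kind of structured spending record a
# transparency API might return; element and attribute names are
# illustrative only, not the actual Checkbook NYC schema.
SAMPLE_RESPONSE = """
<response>
  <transactions>
    <transaction agency="Dept. of Education" amount="125000.00" fiscal_year="2013"/>
    <transaction agency="Dept. of Transportation" amount="89500.50" fiscal_year="2013"/>
  </transactions>
</response>
"""

def total_spending(xml_text: str) -> float:
    """Sum the amount attribute across all transaction elements."""
    root = ET.fromstring(xml_text)
    return sum(float(t.get("amount")) for t in root.iter("transaction"))

print(total_spending(SAMPLE_RESPONSE))
```

A real client would fetch the response over HTTP and page through results, but the parsing step would look much the same.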
Built on the Drupal open source content management platform, Checkbook NYC's data warehouse contains more than 50 million financial transactions, according to REI Systems, which worked with the city to develop the system. The data warehouse is updated daily and is growing at a rate of approximately 2 million transactions per month. REI was selected to lead the project, the comptroller’s office said, because of its experience with government transparency websites, including USASpending.gov, Data.gov, and ITDashboard.gov.
Other partners, centralized accounting software vendors Oracle and CGI, worked to develop "adapters," or automated data feeds, between their financial management systems and Checkbook NYC. These feeds will enable other state and local governments that use Oracle and CGI solutions to easily share their financial data with the public.
Together, Oracle, CGI and REI Systems are estimated to have committed more than $1 million in resources to make Checkbook NYC readily adoptable by other governments, city officials said.
Checkbook NYC is significant because it makes a vast storehouse of information available online in a timely, structured and human-readable form, according to the Sunlight Foundation. Additionally, it marks a shift to proactive civic application-sharing, Foundation officials added.
“Checkbook NYC is an outstanding example of local government adoption of the open source software model, and with this project New York City has truly stepped up and into the open IT ecosystem,” said Deborah Bryant, Open Source for America co-chair and director of the Open Source Initiative. “NYC’s highly evolved approach also increases the benefit of collaboration beyond software code – such as sharing related investments like training, knowledge base and business rules – exponentially increasing its value to the city and anyone else joining the project.”
Posted on Jun 17, 2013 at 9:39 AM
The National Institutes of Health wants a better way to find and cite biomedical research data as well as associated publications and grants, so it is asking the community for ideas.
In a request for information, the National Human Genome Research Institute (NHGRI) said it is considering the development of a biomedical data catalog, similar to what NIH’s PubMed does for scientific publications.
NIH envisions that the data catalog, as distinct from a data repository, “would help make data in such repositories more easily findable and citable in a consistent manner. In addition to supplying core, minimal metadata to ensure a valid data reference, it is envisioned that a Data Catalog would include links out to the location of the data, to the NIH Reporter record of the grant that supported the research, to relevant publications within PubMed or journals, and possibly to associated software or algorithms,” NIH said.
The RFI falls under NIH’s Big Data to Knowledge (BD2K) Initiative, “which aims to facilitate broad use of biomedical big data, develop and disseminate analysis methods and software, enhance training for disciplines relevant for large-scale data analysis and establish centers of excellence for biomedical big data,” according to the BD2K website. Responses must be submitted via email to email@example.com by June 25.
The effort is one of several NIH is making toward managing large stores of information. The agency, for instance, also is soliciting best practices on how to overcome the challenge of managing data generated by genome sequencing and the use of large-scale imaging technologies, which are "breaking the standard model by which researchers manage and analyze data," George Komatsoulis, chief information officer of the National Cancer Institute, part of NIH, wrote in an April blog post. NCI is asking its grantees for input on a set of pilot projects to test the feasibility of setting up a "cancer knowledge cloud" that would equip researchers with the computational tools they need to meet the big data demands of big science.
Posted on Jun 14, 2013 at 9:39 AM
The General Services Administration is looking for ideas and comments for identification management in 2014 and beyond. In an attempt to assess the validity and viability of its requirements, GSA’s request for information seeks technologies that can offer greater operational functionality, more efficiency and expanded customer support in areas including system architecture, security, billing, reporting and service-level agreements.
GSA currently delivers nationwide end-to-end Personal Identity Verification (PIV) services to approximately 95 federal departments, agencies, boards and commissions, which support 850,000 identity accounts. GSA’s Managed Services Office provides identity management services such as enrollment, PIV Card activation and backend systems infrastructure and integration for any agency that wishes to save money by leveraging GSA’s larger capital investment.
Specifically, GSA’s RFI asked companies to submit responses that address high-level requirements, including:
- Mobile/portable solutions for all issuance/post-issuance capability.
- Collection of multi-modal biometric types (iris, facial, fingerprints, etc.).
- Secure cloud services and virtualization solutions, etc.
- Transitioning existing hardware.
- Temporary credentials for forgotten, damaged or stolen cards.
Because no solicitation exists at this time, companies should not submit proposals.
Posted on Jun 14, 2013 at 9:39 AM