Ever since President Thomas Jefferson sent Lewis and Clark west in 1804 to find and map the most direct and practicable water route across the continent to the Pacific Ocean, Americans have been curious about the land they live in.
But it wasn’t until 1879 that the U.S. Geological Survey was established to make a geologic map of the United States. Then in 1992 Congress enacted the Geologic Mapping Act, which required the USGS and the state geological surveys to build a “national archive” of standardized geoscience map information called the National Geologic Map Database. The website debuted in 1996.
Change occurs more rapidly in the 21st century, however. USGS and the Association of American State Geologists (AASG) have just launched a redesigned website that makes significantly more advanced technology and information available to the public.
According to the USGS website, the new system better integrates publication citations, stratigraphic nomenclature, downloadable content and unpublished source information, greatly expanding public access to the archive.
One significant feature of the new site is “MapView,” a new interface that seamlessly portrays the nation’s geologic maps published by USGS, the state geological surveys, and many others. These maps, now available through the National Geologic Map Database, can be viewed in detail and downloaded from the various publishers.
According to USGS, this is just the first stage in a complete redesign of the database. Other aspects of the site will be upgraded in the months ahead.
Posted on Dec 07, 2012 at 1:14 PM | 4 comments
With Americans bracing for tax hikes in January if Congress and the White House can’t resolve the "fiscal cliff" budget dilemma, taxpayers’ personal data may also take a hit.
Personal information could be at risk from the IRS’ IT modernization efforts, warns the Treasury Department’s Inspector General for Tax Administration (TIGTA) in a report released Dec. 4. The IG office labels the modernization program "a major risk" to the data and cites two key systems, Modernized e-File and the Customer Account Data Engine 2, or CADE 2.
The report said IRS "has made progress to improve information security and personnel safety; however, it needs to continue to place emphasis on information and physical security programs in order to ensure that policies, procedures and practices adequately address security control weaknesses."
Among the weaknesses cited were system access controls, configuration management, audit trails, physical security, remediation of known security weaknesses, and oversight and coordination of security-related issues.
A summary of the IG report on the TIGTA website notes that the IRS has developed and implemented significant systems since last year’s assessment, including Release 7.0 of the Modernized e-File system in January 2012 and the daily processing and database implementation projects of CADE 2.
The CADE 2 project was in the testing phase when the IG report was written in September and was expected "to be placed into production in late 2012." CADE 2 will store all individual taxpayer account data and provide that information to "select downstream IRS systems on a daily basis."
The report says IRS data integrity testing hasn't provided sufficient assurance that CADE 2 data is consistently accurate and complete. It calls for stronger traceability controls on a database meant to become the authoritative repository of taxpayer information.
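The report does not describe the IRS' actual test procedures, but the kind of data-integrity check it implies can be sketched as a reconciliation between a source extract and the target store: compare record counts and per-record hashes, and flag anything missing or mismatched. The records and field names below are fabricated for illustration; they are not the CADE 2 schema.

```python
import hashlib

def record_hash(record: dict) -> str:
    """Stable hash over a record's sorted field names and values."""
    payload = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def reconcile(source: list, target: list, key: str = "tin") -> dict:
    """Compare two extracts: report keys missing from the target, keys the
    target has that the source lacks, and records whose contents differ."""
    src = {r[key]: record_hash(r) for r in source}
    tgt = {r[key]: record_hash(r) for r in target}
    return {
        "missing": sorted(set(src) - set(tgt)),
        "extra": sorted(set(tgt) - set(src)),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

# Fabricated sample records (not real taxpayer data): record "002" differs.
source = [{"tin": "001", "balance": "100"}, {"tin": "002", "balance": "250"}]
target = [{"tin": "001", "balance": "100"}, {"tin": "002", "balance": "999"}]
print(reconcile(source, target))
```

A check like this gives traceability in the sense the report asks for: every divergence between the legacy source and the new repository is attributable to a specific record.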
"Until the IRS addresses security weaknesses, it will continue to put the confidentiality, integrity and availability of financial and taxpayer information and employee safety at risk," the report said.
The audit was initiated as part of the TIGTA Fiscal Year 2012 Annual Audit Plan and addresses the major management challenge of modernization. TIGTA is required by the IRS Restructuring and Reform Act of 1998 to annually perform an evaluation of the adequacy and security of IRS technology.
Posted on Dec 06, 2012 at 11:46 AM | 0 comments
The National Nuclear Security Administration has launched a project management mobile app for its Global Threat Reduction Initiative that runs on both the Apple iOS and Google Android platforms, according to the agency's blog.
The app hooks mobile users into G2, GTRI’s project management system, to help GTRI project managers in their mission to secure nuclear and radiological materials around the world. The app allows mobile users to quickly filter and analyze all the real-time information about locations and coordinate that with schedules and infrastructure.
Since NNSA established GTRI in 2004 to consolidate efforts to prevent the acquisition of nuclear and radiological materials for use in weapons of mass destruction and for other acts of terrorism, the agency's workload has grown. It developed G2 to help NNSA project managers filter and analyze large amounts of real-time, geospatial data and integrate that data with scope, schedule, cost and infrastructure information for the entire portfolio of GTRI projects, according to the agency. With G2, NNSA said it was able to do more work and manage greater resources without having to hire additional staff.
The new app makes the G2 system available to mobile users, allowing the GTRI team to manage projects wherever they are in the world from their smartphones or tablets.
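G2's actual schema and API are not public, but the filtering the agency describes, narrowing a portfolio of sites by location and schedule, amounts to a query like the following sketch. All records and field names here are fabricated for illustration.

```python
from datetime import date

# Fabricated project records; G2's real data model is not public.
projects = [
    {"site": "A", "country": "Kazakhstan", "due": date(2013, 1, 15), "secured": False},
    {"site": "B", "country": "Mexico",     "due": date(2012, 12, 20), "secured": True},
    {"site": "C", "country": "Kazakhstan", "due": date(2012, 12, 31), "secured": False},
]

def open_projects_due_by(records, country, deadline):
    """Filter: unsecured sites in a country whose schedule falls on or
    before a deadline -- the kind of slice a field manager would pull up."""
    return [r["site"] for r in records
            if r["country"] == country
            and not r["secured"]
            and r["due"] <= deadline]

print(open_projects_due_by(projects, "Kazakhstan", date(2012, 12, 31)))
```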
This is the first time GTRI has released an app for more than one operating system, signaling the administration’s commitment to supporting multiple mobile platforms.
The NNSA has previously experimented with networking strategies that ended up being used departmentwide.
Posted on Dec 05, 2012 at 2:11 PM | 0 comments
Fugitive tech pioneer John McAfee, who has eluded authorities in Belize for several weeks, had his location exposed Dec. 4 by a basic smartphone feature: the inclusion of GPS location data in images taken with, or messages sent from, a phone.
McAfee, who has blogged, tweeted and sent out podcasts while on the lam, met with several journalists from Vice magazine, which posted a picture of them together on its site, under the headline, “We are with John McAfee right now, suckers,” the Washington Post reported.
Whoever took the photo hadn’t turned off the phone’s geotagging feature, however, and a hacker who goes by the handle Simple Nomad extracted the coordinates and tweeted the location, which turned out to be a villa south of the Belize border in Guatemala, the Post reported. (UPDATE: McAfee has been arrested in Guatemala for entering the country illegally.)
Security experts have tried for years to raise awareness of the risks of geotagging features that smartphone users might not know about. In 2010, researchers warned that smartphones carried by U.S. troops could reveal location information that could endanger lives or missions. Security experts have also warned users about the privacy risks of posting geotagged photos online.
In April, an alleged member of Anonymous, wanted for posting the home addresses of police officers online, was captured after pictures he posted taunting investigators revealed his location.
Inexpensive software is available for extracting location information from posted pictures taken by any device with a Global Positioning System receiver. But users also can avoid the problem by disabling the geotagging features, which is what McAfee may have thought his interviewers had done.
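The geotag itself is stored in a photo's EXIF metadata as three rationals, degrees, minutes and seconds, plus a hemisphere reference tag ('N'/'S' or 'E'/'W'). Reading the EXIF block is usually left to a library such as Pillow or exifread; the conversion to the signed decimal degrees a map expects is just arithmetic, sketched below. The sample coordinates are fabricated, not McAfee's actual location.

```python
from fractions import Fraction

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds to signed decimal degrees.

    EXIF GPSLatitude/GPSLongitude hold three rationals; GPSLatitudeRef and
    GPSLongitudeRef supply the hemisphere letter that determines the sign."""
    value = Fraction(degrees) + Fraction(minutes) / 60 + Fraction(seconds) / 3600
    if ref in ("S", "W"):
        value = -value
    return float(value)

# Fabricated example: 15 deg 47' 6" N, 90 deg 13.8' W
lat = dms_to_decimal(15, 47, 6, "N")
lon = dms_to_decimal(90, Fraction(138, 10), 0, "W")
print(lat, lon)
```

Pasting the resulting decimal pair into any mapping service is all it takes to turn a careless snapshot into a street-level position, which is why disabling geotagging before photographing anything sensitive matters.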
McAfee, who founded the IT security company that still bears his name, has been wanted for questioning since Nov. 11, when a neighbor was killed by gunshot near McAfee’s home. After his location was revealed, McAfee eventually admitted he was in Guatemala and would be contacting a lawyer, according to the Post.
Posted on Dec 05, 2012 at 7:35 AM | 1 comment
Amid growing concerns about malware threats in the IT supply chain, the Defense Advanced Research Projects Agency is looking for ways to test commercial products on a large scale to make sure they’re “clean.”
DARPA has launched the Vetting Commodity IT Software and Firmware (VET) program to find methods of ensuring that the commercial IT products the Defense Department buys, ranging from smart phones to routers, are free of backdoors, malicious code and other potential threats.
Supply-chain security has come to the fore recently, with a congressional intelligence panel warning that the United States “should view with suspicion” the growth of Chinese telecommunications companies in the U.S. market. A recent report by the Georgia Tech Information Security Center and Georgia Tech Research Institute identified supply chain attacks as a serious and hard-to-detect threat.
Back doors, spyware and other malicious code could theoretically be designed into products or added by a manufacturer, vendor or integrator.
DARPA’s VET program wants to test products before they’re installed, which would seem to be a pretty big job.
“DOD relies on millions of devices to bring network access and functionality to its users,” Tim Fraser, DARPA program manager, said in a statement. “Rigorously vetting software and firmware in each and every one of them is beyond our present capabilities, and the perception that this problem is simply unapproachable is widespread. The most significant output of the VET program will be a set of techniques, tools and demonstrations that will forever change this perception.”
With VET, DARPA wants to develop a three-step process:
- Defining malice: Given a sample device, how can DOD analysts produce a prioritized checklist of software and firmware components to examine and list broad classes of hidden malicious functionality to rule out?
- Confirming the absence of malice: How can analysts demonstrate the absence of those broad classes of hidden malicious functionality?
- Examining equipment at scale: How can the procedure scale to non-specialist technicians who must vet every individual new device used by DOD prior to deployment?
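DARPA has not published how VET's "prioritized checklist" would be built, but step one amounts to ranking a device's software and firmware components so analysts examine the riskiest first. The component list and risk weights below are entirely fabricated for illustration.

```python
# Illustrative only: DARPA has not published a VET scoring method.
# Rank a device's components by a coarse heuristic so analysts know
# what to examine first (step 1, "defining malice").

COMPONENTS = [
    {"name": "baseband firmware", "privileged": True,  "updatable": False, "third_party": True},
    {"name": "bootloader",        "privileged": True,  "updatable": True,  "third_party": False},
    {"name": "weather widget",    "privileged": False, "updatable": True,  "third_party": True},
]

def risk_score(c: dict) -> int:
    """Higher score = examine earlier. Weights are arbitrary for this sketch:
    privileged code and code that can't be patched later weigh most."""
    return 4 * c["privileged"] + 2 * (not c["updatable"]) + 1 * c["third_party"]

checklist = sorted(COMPONENTS, key=risk_score, reverse=True)
for rank, c in enumerate(checklist, 1):
    print(rank, c["name"], risk_score(c))
```

Steps two and three are the hard part: turning a ranking like this into a demonstrable absence of malicious function, repeatable by non-specialists on every device.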
DARPA will host a proposers’ day Dec. 12 in Arlington, Va., to brief interested participants in the program.
Posted on Dec 04, 2012 at 1:34 PM | 2 comments