Massive datasets created by new cloud platforms will put a heavy burden on government IT managers. Here’s what you can do about it.
Modernizing old applications could free up significant money now sunk into operations and maintenance, write two executives at Unisys Federal Systems.
NASA has launched a contest for clever programmers to come up with new tools to help mine the extensive planetary information collected in the agency's databases.
The government has been slow to take up semantic technology, but a relatively obscure interagency XML project could provide a much-needed boost in fields such as law enforcement and health care.
The Health and Human Services Department has been pushing out its data in recent months, and the results have sparked innovation and helped consumers, Chief Technology Officer Todd Park said at a new media conference.
Senators warn against reverting to the pre-9/11 days of hoarding information, saying instead that agencies need to balance security concerns with information-sharing needs.
The new NIST guidance for managing information security risk is called the capstone of the agency's work on FISMA implementation.
Mitre Corp. has launched an identity recognition competition open to techies of all kinds — amateurs and professionals, individuals and teams — who think they have an idea to improve identity recognition technologies.
The government needs to get smart about handling the coming flood of real-time, sensor-based data, writes consultant Marc Demarest.
Guest columnist Frank A. McDonough offers three examples of electronic archives that outshine NARA's troubled effort to preserve government records.
The public seems amazed by the type and amount of sensitive information that is available to people who should not have access to it, but security professionals are not, writes security consultant Shon Harris.
A State Department program lacked a feature that might have alerted officials to the unauthorized download of diplomatic cables.