The Internet of Things (IoT) is coming, and there’s no doubting its potential. Government IT managers don’t care that your fridge can tell your smartphone what you need to buy next, but they do appreciate that advances in connectivity and data collection will enable major improvements to services that government provides citizens.
Those improvements will come from linking the embedded computing systems that drive much of the country’s infrastructure and that outnumber the more familiar servers, PCs and laptops many times over. With the IoT, systems will become even more numerous and capable, and that’s one of the key factors in the growth of Smart Cities. But it poses a massive security problem.
Market researcher International Data Corp. sees strong growth for the IoT in a number of areas over the next few years, including government. It projects a 7.2 percent compound annual growth rate in environmental monitoring and detection through 2018, for example, and 6.3 percent CAGR for public infrastructure assets management.
Other large growth areas are public safety, emergency response and public transit.
“For IT, typical drivers for this growth are cost and time savings,” said Scott Tiazkun, senior research analyst for IDC’s Global Technology and Industry Research organization. “There’s the convenience factor in having all of these sensors in many places that automatically send data back versus having to send a person out to do a reading, which also decreases the chance for errors.”
Typically, however, these kinds of embedded systems have been built with cost and performance in mind, not security. Now that they are becoming more interconnected, that vulnerability has become increasingly attractive to attackers looking for protected information or seeking to disrupt public services.
The Department of Homeland Security says many of the public infrastructure sites that have recently been successfully attacked were insufficiently protected, and at times administrators weren’t even aware they needed to be secured.
Some parts of the government are keenly aware of potential security problems. Embedded computer systems play a part in just about every area of military technology, for example, and the Defense Advanced Research Projects Agency started its High Assurance Cyber Military Systems program in 2012 specifically to create technology for embedded systems “that are functionally correct and satisfy appropriate safety and security properties.”
Fortunately, it seems the security industry has begun to take notice of the needs of the IoT, though it’s debatable how far traditional IT security systems and techniques can be made to work for embedded systems. But tools specifically aimed at this market are being developed and some are already out.
Computer scientists at the University of California, San Diego, have developed a tool that allows hardware designers and system builders to test for security as they build their devices, for example. It tracks a system’s security-specific properties and makes sure they stay secure. It also detects problems in non-critical subsystems that can affect other, more critical ones.
On the software side, Real-Time Innovations has introduced what it claims is the first secure messaging software for critical industrial systems. Its machine-to-machine communication doesn’t need the centralized brokers or system administrators required by traditional IT security, which ensures the low communication latencies needed by such systems.
These tools, and others like them, will be needed. Embedded system security is still an unknown territory for many government organizations. As the IoT becomes a reality, that could put a lot of public systems and infrastructure at risk.
Posted by Brian Robinson on Jun 20, 2014 at 10:57 AM
The public has one more chance to weigh in on the selection of a Secure Hash Algorithm that will become the new standard for federal digital signatures and other hashing functions.
A hash algorithm is a cryptographic tool that can create a digest – a unique string of bits of a specific length – specific to a digital document. In an environment where most documents are created and used digitally, hashing is an essential tool for verifying the authenticity of documents.
Because the digest is unique and cryptographically tied to the message, it can be used to verify that the contents of a digital document have not been altered. If any changes are made in the document, the digests produced by the hash algorithm before and after will not match. The algorithms also can be used to create digital signatures.
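That tamper-evidence property is easy to demonstrate with a standard library. A minimal sketch using Python's hashlib with SHA-256, a member of the SHA-2 family (the sample text here is illustrative, not from any standard):

```python
import hashlib

original = b"The quick brown fox jumps over the lazy dog."
tampered = b"The quick brown fox jumps over the lazy cat."

# A digest is a fixed-length fingerprint of the input bytes.
digest_before = hashlib.sha256(original).hexdigest()
digest_after = hashlib.sha256(original).hexdigest()

# Re-hashing the unchanged document reproduces the identical digest...
assert digest_before == digest_after

# ...while even a one-word change produces a completely different digest.
assert hashlib.sha256(tampered).hexdigest() != digest_before

print(digest_before)  # 64 hex characters = 256 bits
```

If the digest computed on receipt matches the digest computed when the document was signed, the contents have not been altered; any mismatch flags tampering.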
The Keccak algorithm (pronounced “catch-ack”) was selected as the winner of a five-year public competition for a new hashing standard in 2012 by the National Institute of Standards and Technology. It will put a new cryptographic arrow in the federal quiver, supplementing the unexpectedly long-lived SHA-2 family of algorithms.
But before becoming enshrined as SHA-3 in the Federal Information Processing Standards (FIPS), there will be a final round of public comment on Keccak. Because the standard algorithm will be freely available to all users – government and private sector alike – NIST wants to make sure, among other things, that no patents will be infringed in the use of the algorithm.
NIST has announced a final three-month period for public comment on the proposed standard.
The development of SHA-3 was a response to advances over the last decade in the cryptanalysis, or breaking, of hash algorithms. New attacks introduced serious concerns about the security of the SHA-1 algorithm standard, and by 2007 cracks also had begun to appear in the algorithms that collectively make up the SHA-2 standard. So NIST began a competition to find a new, stronger algorithm.
SHA-1 has been retired, but the weaknesses in SHA-2 were not as serious as originally feared, and SHA-2 remains a viable cryptographic tool. Nevertheless, NIST continued with the competition in the expectation of identifying a new algorithm that would be not only more secure, but more efficient.
NIST received 64 entries, and after two preliminary rounds, five finalists were selected in December 2010. After 18 months of review, Keccak was selected as the winning algorithm in October 2012.
There were no published attacks that "in any real sense" threatened the practical security of any of the finalists, NIST wrote in its announcement, and all finalists had acceptable margins of security. But Keccak is a little stronger and a little faster than SHA-2, and it has the largest margin of security among the finalists. Its simplicity and flexibility mean it should be able to run efficiently on a wide variety of platforms.
Also, SHA-3 will not replace SHA-2, but will become a standard for hashing alongside it for the foreseeable future.
Draft FIPS 202 specifies six functions based on Keccak. Four are fixed-length cryptographic hash functions, and two are closely related "extendable-output" functions (XOFs). The four fixed-length hash functions provide alternatives to the SHA-2 family. The XOFs can be used in a variety of applications, including digital signature generation and verification, key derivation and random bit generation.
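The distinction between the two kinds of functions is easy to see in Python's standard hashlib module, which (in version 3.6 and later) exposes the Keccak-based SHA-3 functions: a fixed-length hash always returns the same digest size, while an XOF lets the caller choose the output length. A minimal sketch:

```python
import hashlib

msg = b"Draft FIPS 202 specifies six Keccak-based functions."

# Fixed-length SHA-3 hash function: a drop-in alternative to SHA-2,
# always producing a 256-bit (64 hex character) digest.
print(hashlib.sha3_256(msg).hexdigest())

# Extendable-output function (XOF): the caller picks the output length.
xof = hashlib.shake_128(msg)
print(xof.hexdigest(16))  # 16-byte output
print(xof.hexdigest(64))  # 64-byte output from the same input
```

A property of the SHAKE XOFs is that a shorter output is a prefix of any longer output for the same message, which is what makes the variable length practical for key derivation.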
NIST is proposing the creation of FIPS 202, specifying SHA-3 as a hashing standard, as well as changes to the existing FIPS 180-4, which contains the SHA-2 specifications, to also allow use of SHA-3. Comments should be sent by Aug. 26 to SHA3comments@nist.gov with "Comment on Draft FIPS 202" or "Comment on draft revision to the Applicability Clause of FIPS 180" in the subject line.
Posted by William Jackson on Jun 13, 2014 at 6:58 AM
Debates over the state of antivirus technology and tools have resurfaced yet again after the executive in charge of Symantec’s information security business was quoted in the Wall Street Journal a month ago as saying antivirus is dead.
Now, that should be a big deal, since Symantec made its reputation and fortune on the antivirus business, which still accounts for some 40 percent of its revenue. According to Symantec’s Brian Dye, the company no longer thinks of antivirus as any kind of money maker. Antivirus catches less than half of the cyber attacks that now occur, he said.
However, this is only the latest in a series of announced deaths of the venerable technology, which has for so long been a keystone of enterprise security. In 2012, the Flame malware was discovered to have infected systems around the world and to have been resident on those systems for up to two years without having been detected by antivirus software. It was seen as a huge failure for antivirus, and the potential death knell for the technology.
None of this is news to most security professionals, who have been preaching the vulnerability of “traditional” security for some time and the need for layered, in-depth defense. Symantec now certainly believes that, since it has a new philosophy (and new products and solutions to sell) which emphasizes this approach.
But, is antivirus now really useless? That would be bad news for many government organizations, which still rely to a great extent on legacy systems such as antivirus for the core of their security. Lastline Labs, which looks at these kinds of issues, is one outfit that isn’t ready to toll the bell for antivirus yet, though it does say it’s staggering badly.
The main problem, it believes, is that antivirus takes too long to catch up with new malware. In tests run over a full year, from May 2013 to May 2014, it found that on any given day, at least half of the AV scanners it tested failed to detect new malware. Even after two months, a third of the scanners still were not detecting it.
Eventually, AV scanners do start to catch up. Two weeks was the common lag time. But, even after a year, according to Lastline, there were malware samples that still evaded 10 percent of the scanners tested.
Source: Lastline.
As the graph shows, there’s a major problem with the 1 percent of malware that consistently evades capture by antivirus systems. That likely represents advanced malware that more sophisticated criminals use to persistently target and infiltrate organizations, Lastline said. Unfortunately, unlike more opportunistic cyber events, attacks that use such malware are the ones that usually cause the most serious security breaches.
Traditional antivirus is not dead, Lastline believes, but it does need to be complemented with other approaches, such as those based on dynamic analysis of samples and network anomaly detection. The National Security Telecommunications Advisory Committee came to a similar conclusion in a report to the president last year, and it’s the basis of many of the next generation of security tools that are now being unveiled.
Meanwhile, until budget-constrained agencies can catch up with this flow, many will have to persist with the AV systems they already have while being aware of their limitations.
Which brings up another point.
In February of this year, a Senate report on the federal government’s cybersecurity track record found that agencies that had recently suffered major breaches had consistently failed to patch security software, including antivirus, with some as many as two years behind on their updates.
Even the admittedly limited effectiveness of traditional antivirus systems won’t survive that.
Posted by Brian Robinson on Jun 06, 2014 at 9:00 AM
The influx of consumer IT into the workplace — often unmanaged and unseen by administrators — is speeding up, and it isn’t just the fault of irresponsible employees.
“People need to get their work done, and they’ll do anything to get it done,” said Oscar Fuster, director of federal sales at Acronis, a data protection company. When helpful tools appear in the marketplace, and in their own homes, workers chafe when administrators won’t let them use those tools. The result is often an unmanaged shadow infrastructure of products and services, such as mobile devices and cloud-based file sharing, that may help the worker but effectively bypasses the enterprise’s secure perimeter.
It is not all the fault of the administrators, who have policy, regulation and legislation to comply with. But unless agencies act quickly, they will soon find that their sensitive data is outside of their control.
What is needed is a more agile approach to acquiring and managing technology that doesn’t leave the government two years behind the consumer curve in acquiring tools. Departments must be willing to decentralize authority so that agencies can adapt quickly to their technology needs, and more freely interpret legislative mandates.
“It’s easier said than done,” Fuster said. But most IT legislation is technology neutral, and policies can be fashioned to accommodate new technology more quickly than is happening now, he said. “The second you fall behind, people will start cutting corners.”
Shadow IT is not a new problem. In the early days of the home PC, workers could use removable hard drives to work at home, and floppy disks could move files easily from one office to another. The difference was that 40 years ago it took more tech savvy and a little more investment to get outside the perimeter. When the world went wireless 15 or so years ago, there was an exponential jump in the ability to think and work outside the box.
Things have shifted again with handheld mobile devices and nearly ubiquitous network access. Consumer cloud services can put an entire suite of productivity tools in your hand, but it also takes data outside the administrator’s control.
The solution is two-fold. Because the enterprise itself is becoming more fluid, more attention is needed to the security of the data itself. Encryption and controls to monitor its movement, coupled with more well-defined access control, can help protect data and see who is using it and where. This addresses not just the shadow IT challenge, but the insider threat and the growing use of stealthy exploits that can sit quietly in the system and slowly export data.
At the same time, be open to accommodating workers so that they are less tempted to work around you. One powerful tool is the ability to manage mobile devices within your legacy infrastructure. Windows Phone has a small percentage of the mobile market, but the latest Windows 8.1 update allows administrators to use a common set of management tools from the server through the desktop to the handheld device. Even if your workers prefer Android or iPhone devices, this can be a reasonable compromise for making the workplace more flexible.
Posted by William Jackson on May 30, 2014 at 8:03 AM