5 critical capabilities for 2019
- By Robert Johnston
- Jan 16, 2019
We can add NASA to the list of recent federal cyber breach victims. The space agency disclosed in late December that hackers found their way into its servers in October 2018. While NASA is still investigating the extent of the breach, the agency knows the hackers accessed personal data of both former and current employees. Unfortunately, other agencies will surely find themselves in NASA’s shoes in 2019. Here’s why:
Cyber criminals know government IT pros have limited budgets that create resource challenges when it comes to securing a daunting array of technologies and data flows. This makes agencies at all levels of government target-rich environments for hackers. So, what’s the answer? How can government IT leaders take control of their data and reduce their vulnerability to bad actors in 2019?
The solution is straightforward, but multilayered. Government agency CIOs and CTOs need a hub-and-spoke system to collect and index data from all their IT touchpoints. These include network traffic, web servers, VPNs, firewalls, applications, hypervisors, GPS systems and pre-existing structured databases. For optimal cyber protection, all those data feeds should be run through an AI-driven security information and event management (SIEM) system equipped with machine-learning-powered analytics to identify anomalous and malicious patterns.
The hub-and-spoke approach should enable five critical capabilities: log/device management, analytics, account/system context, visualization of user privileges across an entire network, and long-term viability. Here’s a walk-through of the capabilities and why they matter.
1. Log/device management: This piece should include unlimited and automated coverage of logs, devices and systems as well as integrated compliance management. It should also provide real-time event log management, Windows and Linux server management, cloud and on-premises ingest, secure and encrypted log management, and log data normalization.
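Log normalization is what makes a hub-and-spoke design workable: records from dozens of formats get mapped into one common schema so downstream analytics can treat them uniformly. A minimal sketch in Python (the regex and field names here are illustrative assumptions, not any product's actual parser or schema):

```python
import re
import json

# Illustrative pattern for one syslog-style line (an assumption, not a full RFC parser)
SYSLOG_RE = re.compile(
    r"^(?P<ts>\w{3}\s+\d+\s[\d:]{8})\s(?P<host>\S+)\s(?P<proc>[\w\-/]+):\s(?P<msg>.*)$"
)

def normalize(raw_line: str) -> dict:
    """Map a raw log line to a common schema; unparsed lines are kept, not dropped."""
    m = SYSLOG_RE.match(raw_line)
    if not m:
        return {"source": "unknown", "raw": raw_line}
    return {
        "timestamp": m.group("ts"),
        "host": m.group("host"),
        "process": m.group("proc"),
        "message": m.group("msg"),
        "raw": raw_line,
    }

line = "Jan 16 03:22:01 web01 sshd: Failed password for admin from 203.0.113.9"
print(json.dumps(normalize(line), indent=2))
```

Note that the fallback branch keeps the raw line rather than discarding it: in a security context, an unparseable record is still evidence.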
2. Analytics: Data today is too voluminous for human analysis, so using AI and machine learning to analyze large amounts of data makes the most sense. Agencies should look for a single platform that provides automated threat intelligence, real-time intrusion detection alerts, 24/7 network vulnerability assessment, and user and device context.
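The machine-learning pipelines inside commercial SIEMs are proprietary, but the core idea of flagging anomalous behavior against a baseline can be illustrated with simple statistics. This z-score sketch is a stand-in for that idea, not how any specific product works; the data is made up:

```python
from statistics import mean, stdev

def flag_anomalies(counts, threshold=2.5):
    """Return indices of values more than `threshold` standard deviations above the mean."""
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:
        return []
    return [i for i, c in enumerate(counts) if (c - mu) / sigma > threshold]

# Hourly failed-login counts for one account (illustrative data)
logins = [3, 2, 4, 3, 2, 5, 3, 97, 4, 3]
print(flag_anomalies(logins))  # → [7]
```

Real systems replace the single baseline with per-user, per-device models, which is exactly the "user and device context" the paragraph above calls for.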
3. Account/system context: Speed is essential, so agencies should look for a system that provides one-click, automated risk reporting for auditors and decision-makers that takes minutes rather than days.
4. Visualized permissions: Because cybersecurity conditions and requirements quickly change, agencies need the ability to visualize privileged users and groups in real time across the network in order to understand who can touch an agency’s data.
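Answering "who can touch this data" means resolving nested group memberships into effective per-user permissions before anything can be visualized. A minimal sketch of that resolution step (the users, groups and grants are hypothetical):

```python
# Membership may be nested: users belong to groups, groups to other groups (illustrative).
MEMBER_OF = {
    "alice": {"analysts"},
    "bob": {"admins"},
    "analysts": {"staff"},
    "admins": {"staff"},
}
# Permissions granted directly to a principal, user or group (illustrative).
GRANTS = {
    "staff": {"read:reports"},
    "admins": {"write:config", "read:audit"},
}

def effective_permissions(principal: str) -> set:
    """Walk the membership graph and union every grant along the way."""
    perms, stack, seen = set(), [principal], set()
    while stack:
        p = stack.pop()
        if p in seen:
            continue  # guard against membership cycles
        seen.add(p)
        perms |= GRANTS.get(p, set())
        stack.extend(MEMBER_OF.get(p, ()))
    return perms

print(sorted(effective_permissions("bob")))  # bob inherits via admins and staff
```

The point of the traversal is that direct grants alone understate privilege: bob has no direct grants at all, yet ends up with three permissions through group nesting. That gap is what real-time privilege visualization is meant to expose.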
5. Long-term viability: Will an agency's technology still be viable in one, two or five years? It's an important question, but one that is often mistakenly answered with a yes. The era of on-premises architectures is over because they are flawed by design. Tied to the constraints of initial deployment, these systems are allergic to architecture migration, software redesign, advancements in analytic capabilities and new database implementation. In the cloud, however, organizations can develop a symbiotic relationship between the service they use and new cutting-edge technologies. With today's cybersecurity threats, agencies need to be bigger, faster and stronger than the adversary, and the cloud gives them the opportunity to deploy the best solutions available.
The hub-and-spoke approach gives government agencies a fighting chance to keep data out of hackers’ hands. What used to be nice to have is now essential. There’s just too much at stake.
Robert Johnston is the co-founder and CEO at Adlumin Inc.