Building tools to secure computer processors
- By Patrick Marshall
- Feb 14, 2018
The recent news of major vulnerabilities in computer processors -- dubbed Meltdown and Spectre -- that allow hackers to directly access data regardless of operating system protections was no surprise to researchers at Galois, a computer science research company based in Portland, Ore.
Computer processor designers simply haven’t paid attention to security, said Joe Kiniry, principal scientist at Galois. “They still basically trust that the user of the system is going to respect what was said in a developer’s guide for the CPU,” he said.
Kiniry is leading a team that has just been awarded a $4.5 million contract from the Defense Advanced Research Projects Agency to develop tools and methodologies for designing secure CPUs.
The project -- Balancing Evaluation of System Security Properties with Industrial Needs (BESSPIN) -- will develop security metrics and a framework for making decisions about the tradeoffs in securing CPUs.
Even before the metrics and tools are developed, Kiniry said focusing on the need for security in hardware is in itself a critical step. Currently, companies engaged in hardware development "rarely would involve actual cybersecurity professionals in research and development," he said.
The first step in BESSPIN, he added, is to develop metrics. “If you’re going to make decisions about the design of your system, you need quantifiable means by which to do so,” Kiniry said. “The problem is there [aren’t] any good security metrics in the world today.”
Building a product that is going to use cryptography, for example, requires a choice about what algorithm to use and what key size. “That boils down to a metric,” Kiniry said. “If you pick a big fat key -- RSA 4096, for example -- that's going to have a huge implication on the size of your circuit and the cost it takes to fabricate and the speed at which it can compute.”
Similarly, modern processors have extra bits reserved for checking memory and controlling permissions to access specific memory pages. “Deciding to add those extra bits was all about adding a layer of security to the chip,” Kiniry said. “But adding those bits had implications on performance and power.”
“The [Defense Department] wants those notions in a unified framework so that designers of new generations [of] hardware can make informed decisions about the trade-offs between modifications to an architecture for security purposes and what implications it has on power, performance, area, cost,” he added.
In addition to metrics and a security decision-making methodology, the project will also develop a suite of tools tailored for engineers to encourage better security in the design process.
“As a hardware engineer you think in terms of hardware design, hardware architecture, hardware description languages,” Kiniry said. “You're not a cybersecurity person, and you probably don't know much about software, so the tooling we build for this program must respect that perspective. It has to be usable by, comprehensible to, fit in the workflow of, a typical hardware engineer.”
The tools will distill the security implications of designs. “If the chip isn't secure it'll highlight why it's not, provide recommendations for how to fix it and give counterexamples like actual proofs-of-concept flaws in the feedback,” Kiniry said. “And if it is secure, it will provide assurance that is formal.”
Given that DARPA is funding the project, is there an expectation that the federal government might require chipmakers to meet specified security standards? “I can't speak for the government,” Kiniry said. “But people in my line of work have made similar recommendations to the government for decades.”
Patrick Marshall is a freelance technology writer for GCN.