Vinton Cerf | Internet forecast calls for clouds
Cloud computing's development mirrors the early days of the Internet
It’s been a little more than three years since Vinton Cerf joined Google, and he still looks upon his title as Google’s chief Internet evangelist with a certain amusement. His title notwithstanding, Cerf continues to play an active role in helping to advance the Internet. Thirty-five years ago, with Robert Kahn, he co-designed TCP, the protocol that was instrumental in enabling the transfer of files and e-mail messages.
GCN Editor-in-Chief Wyatt Kash caught up with Cerf recently at Google’s new offices in Reston, Va. An excerpt from this interview appeared in the IT Agenda 2009 report in GCN's Jan. 12 issue.
GCN: What do you see as some of the chief challenges facing government IT officials in the year ahead?
Vinton Cerf: First, we need to evaluate our current position within government to get some idea of the scope of what is [currently] in use — a soup-to-nuts assessment of where we are in government in terms of networks and technology.
Second, there’s no question that our dependence on computer communications in the government and outside dictates that we do a better job of security. So in the course of evaluating what we have, we also have to ask what to do about improving the security of the system.
A third thing, which is only beginning to become clear: This interest in cloud computing is appropriate. But the question is, what happens if there’s more than one cloud? And how do clouds interact with each other?
In a very funny way, I feel like this is déjà vu all over again. I’m thinking about all of the questions that arise when I’m using the facilities of two clouds and I want to initiate a computation in one cloud, have another cloud supply the data, or vice versa, or share data back and forth. If there are access controls associated with information within this cloud, how do I reinstantiate those access controls if the data moves to a new cloud? What’s my vocabulary? In fact, some clouds have no idea if there’s any other cloud in existence. They don’t know how to even refer to another cloud.
This is actually a reprise of a lot of the questions that came up in [the development of the] Internet. The Net didn’t know there was any other Net in the universe — it didn’t have any way of saying, “Send this to somebody who’s not on my network.” When you said, “Send this”…the question is: “What’s this?” and “How should I send it to the other guy?” and “What should it look like when it arrives?” and “How should it be interpreted when it arrives?”
It’s kind of the same thing all over again at a rather higher conceptual level. So intercloud stuff is going to be the next decade’s really interesting communications and networking challenge.
The next thing I’m concerned about is the government’s need to use computing to become more efficient internally. What I’d like to see is sufficient standardization so that the only reason you can’t exchange information among government systems is that you set a policy not to do so. If the policy changes, I don’t want the technology to be the obstacle. I want the technology to be the facilitator.
GCN: Do you think the Federal Desktop Core Configuration program is a step in the right direction?
Cerf: I think so, although we have to be careful about standardizing configuration [if it means that] we will only use this particular software from this particular vendor. I don’t buy that.
I’m a big fan of open standards, and I believe if we pick standard interfaces, then we should permit multiple implementations of things, as long as they meet those standards. That’s how [the] Internet works. Today you have lots and lots of people able to build software and hardware and plug it into the Internet because they have all agreed to follow those particular standards, and that means we should expect things to interface successfully.
So I’m looking more for open government standards than I am for standardizing a particular piece of software. The current chief technology officer of the District of Columbia, [Vivek Kundra], has demonstrated that open source…and cloud computing are a pretty efficient way of spending government dollars. His success ought to be factored into thinking in the larger government framework.
GCN: What are you focusing on these days with the Internet?
Cerf: My belief right now is that we have a [highly] globalized economy. We are sensing the effects of that globalization — everything is connected to everything, so we need to reinforce our ability to use information technology to interact with the rest of the world.
Small example: When we carry out electronic commerce and I conclude a contract with someone and maybe even digitally sign it, there’s a question about how a digital signature is interpreted in the two jurisdictions we might be in.
If you’re in Europe and I’m in the United States, it’s not clear to me whether the jurisdiction in Europe sees the same significance to a digital signature that we do in the United States, or vice versa.
What happens if, after concluding this contract electronically, one or the other party fails to meet [its] obligations in the contract, and you seek recourse, and someone says, “Well, it wasn’t a valid contract because the digital signature doesn’t count.” That wouldn’t be a good outcome. So we have to look for probably multilateral agreements around the world about what digital signatures mean.
To generalize that, it also means you have common agreements about what constitutes abuse on the Internet and how we can combat it. What do we consider [to be] abuse that is actionable? What should we do about privacy? It would be helpful if we had a uniform sense of what privacy is supposed to mean in this online environment, so we could all implement the same thing.
So, I see a whole lot of opportunities for multilateral interaction — for multi-stakeholder discussions. And I use that word deliberately, because multilateral usually means multiple governments. Multi-stakeholder refers to the private sector, civil society, governments and the technical community. You see it in ICANN — the Internet Corporation for Assigned Names and Numbers — a multi-stakeholder structure that pioneered this notion of multiple stakeholders having a common and equal say about policy. I’m convinced that U.S. leadership will only be manifest if it is carried out in a multi-stakeholder fashion.
GCN: What was your take on the vulnerabilities in the Domain Name System that surfaced last year, and what needs to be done going forward?
Cerf: Yes, this was a fundamental design mistake in the DNS implementation, and it got widely publicized. I know we scrambled at Google to make corrections and changes to our software to make sure we weren’t vulnerable to that particular attack. There could be other problems like that arising.
I’ve seen bugs that lay dormant for 20 years and suddenly emerged under a certain set of conditions, or somebody discovered how to exploit them, which only reinforces my belief that strong authentication of information you rely on heavily is really important.
In the case of the Domain Name System, [DNS Security Extensions] is the best step we can take in the near term to make that system more resilient and less vulnerable to various forms of spoofing or phishing.
Some of the top-level domains are already being digitally signed. Sweden, Puerto Rico, Bulgaria and Brazil, for example, are all signing their top-level domain zone files. There’s ongoing discussion about getting the root-zone file digitally signed. There are several alternative ways to do that, and the [National Telecommunications and Information Administration] put out a request for comments, which closed on Nov. 24, asking what the general public’s opinion [is]. So that’s an important step.