The Net Plumber

Comprehensive database will help efforts to fix billions of broken links

BY WILLIAM JACKSON | GCN STAFF

The Web has 19 degrees of separation, according to a study by IBM Corp., Compaq Computer Corp. and AltaVista Co.

That means it is theoretically possible to reach any Web page from any other page in no more than 19 jumps, thanks to embedded hyperlinks, about 100 billion of them, according to Franck Jeannin, chief executive officer of LinkGuard Ltd. of London. He said the count 'is doubling every 12 to 18 months.'

The problem is that up to 10 percent of those embedded links are broken at any given time. A recent survey of 28 federal Web sites and 53 state and local sites by Jupiter Media Metrix of New York found that 84 percent had broken links [GCN, Oct. 23, 2000, Page 1].

Links break when uniform resource locators change, and the average life span of a URL is short: just 44 days. 'Every six weeks, on average, every link gets broken,' Jeannin said.


LinkGuard CEO Franck Jeannin is seeking dot-gov customers' help in mapping the Web.

Jeannin founded LinkGuard to fix all those broken links on the Web, and the service is popular with federal customers, he said.

'We have an abnormal number of dot-govs in our customer base,' he said, 'more than most other domains.'

Outbound and inbound

By March, Jeannin expects to complete the first step in an ambitious project to map every link on the Web. It will take the form of a 40T distributed database, and it will let LinkGuard go beyond fixing outbound links to repair inbound links, too.

The database, called LinkMap, will reside on PowerVault 650F and 630F Fibre Channel storage devices from Dell Computer Corp., accessed through Dell PowerEdge 6450 enterprise servers.

Dell recently contracted to supply at least 1,000 terabytes, a petabyte, of storage for the Navy-Marine Corps Intranet [GCN, Dec. 11, 2000, Page 17].

'It's not unusually large,' Bruce Kornfeld, Dell's director of storage product marketing, said of the LinkMap database. 'What makes this interesting is that they've done the math and they need 40T now,' instead of scaling up from a smaller amount. In a year, the map could require an additional 40T, he said.

Putting together a static index of Web links is not too tough, Jeannin said, and could be done in 10 days. The tricky part will be keeping LinkMap updated in almost real time so that it can fix broken links on client Web sites automatically.
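The article does not describe the crawler's internals. As a rough sketch of what 'a static index of Web links' involves, the following Python fetches a page, extracts its hyperlinks, and records source-to-target edges; the in-memory dictionary stands in for LinkMap's actual 40T store, and the function names are illustrative, not LinkGuard code.

```python
# Minimal sketch of building a static link index (illustrative only;
# LinkGuard's actual crawler and schema are not public).
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def index_page(url, link_index):
    """Fetch one page and record (source, target) edges in link_index."""
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = LinkExtractor()
    parser.feed(html)
    for href in parser.links:
        target = urljoin(url, href)        # resolve relative links
        link_index.setdefault(url, set()).add(target)

link_index = {}                            # source URL -> set of target URLs
index_page("https://www.example.com/", link_index)
```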

But the real value of LinkMap lies in gathering saleable information.

Who's linking where

From a comprehensive map of the Web, a site manager could tell 'not only who is linking to you, but who is linking to the sites that are linking to you, and who is linking to your competitors,' Jeannin said.
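In graph terms, such a map is a directed graph of pages, and the queries Jeannin describes are lookups against a reverse index of its edges. A toy sketch, with a hypothetical three-site graph standing in for the real map:

```python
# Sketch of the inbound-link queries described above, over a toy
# in-memory graph; LinkMap's real storage layout is not public.
from collections import defaultdict

# Forward edges: source page -> pages it links to (made-up sites).
links_to = {
    "a.gov": {"you.gov"},
    "b.com": {"a.gov", "rival.com"},
    "c.org": {"rival.com"},
}

# Build the reverse index once: target page -> pages linking to it.
linked_from = defaultdict(set)
for source, targets in links_to.items():
    for target in targets:
        linked_from[target].add(source)

def inbound(url):
    """Who links to url?"""
    return linked_from[url]

def inbound_two_hops(url):
    """Who links to the sites that link to url?"""
    return {src for mid in inbound(url) for src in inbound(mid)}

print(inbound("you.gov"))            # {'a.gov'}
print(inbound_two_hops("you.gov"))   # {'b.com'}
print(inbound("rival.com"))          # {'b.com', 'c.org'} (order may vary)
```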

He is betting such information will be worth a lot of money. Not everyone is enthusiastic about its availability in one place, however.

'It raises some concerns,' said Wayne Madsen, senior fellow at the Electronic Privacy Information Center in Washington. He said he sees the database as a 'form of open-source intelligence' for government agencies and law enforcement.

'The government has a terrible history of creating lists of "bad people,"' Madsen said.

He did not question Jeannin's right to gather and sell the information, but he said the potential for abuse highlights the need for privacy safeguards and some regulatory authority over online information.

Regulatory jurisdiction would be a problem, because LinkGuard's headquarters are in London. The database server-storage arrays will be split among San Jose, Calif., Vienna, Va., and London sites.

LinkGuard now offers a free, limited service, at www.linkguard.com, through which customers can check the status of links on their sites and fix broken ones. The company's Classic Pro service makes corrections on an unlimited number of pages for a penny per page, with a $50-per-year minimum.
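LinkGuard has not published how its checker works, but the core of any link check is requesting each URL and flagging failures. A minimal Python sketch, assuming HTTP HEAD requests and counting 4xx/5xx responses and unreachable hosts as broken (some servers reject HEAD, so a production checker would fall back to GET):

```python
# Minimal link-status check (illustrative; not LinkGuard's implementation).
import urllib.error
import urllib.request

def is_broken(url, timeout=10):
    """Return True if the URL looks broken: HTTP error or unreachable."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout):
            return False                  # got a 2xx/3xx response
    except urllib.error.HTTPError:
        return True                       # 4xx/5xx, e.g. 404 Not Found
    except urllib.error.URLError:
        return True                       # DNS failure, refused, timeout

for link in ["https://www.example.com/", "https://www.example.com/missing"]:
    print(link, "BROKEN" if is_broken(link) else "ok")
```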

When LinkMap is up and running, customers will be able to install the company's software on their servers to automatically check and repair not only outbound links on their own sites, but also inbound links from other sites.

Here to inform

The software will automatically inform LinkMap when a URL changes so the database can be updated in near-real time.
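The article does not specify what that notification looks like on the wire. Purely as an assumption, it might be a small HTTP POST from the site's server to a LinkMap endpoint whenever a page moves; the endpoint, payload fields, and function name below are all hypothetical.

```python
# Hypothetical URL-change notification; the endpoint and payload are
# assumptions, not a published LinkGuard interface.
import urllib.parse
import urllib.request

def notify_url_change(old_url, new_url):
    """Tell the (hypothetical) LinkMap service that a page has moved."""
    payload = urllib.parse.urlencode(
        {"old": old_url, "new": new_url}
    ).encode("ascii")
    request = urllib.request.Request(
        "https://linkmap.example/url-change",   # hypothetical endpoint
        data=payload,                           # form-encoded POST body
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status == 200

notify_url_change("https://agency.gov/old-page", "https://agency.gov/new-page")
```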

LinkGuard now uses agents to crawl the Web and collect link data to populate LinkMap. A PowerVault 650F storage system can hold up to 10 internal 36G Fibre Channel drives and can be configured to communicate with a single server or with multiple servers in a storage area network. Each 650F also supports up to 11 630F expansion subsystems, raising capacity to more than 4T: 12 enclosures of 10 36G drives apiece comes to about 4.3T.

Jeannin said he initially would deploy 40 servers with storage units, each holding about 1T of data. LinkGuard is developing the database software to access the data.

'What's complex is the way we synchronize the servers,' Jeannin said.

Constant updates

Automated agents crawling the entire Web can gather link data in about 10 days, Jeannin said. But 10 days is not fast enough to make the data valuable. The key to making LinkMap work will be getting enough customers to install its software on their servers so that the database will be constantly refreshed.

'It is absolutely vital to us to have collaboration with webmasters,' Jeannin said. 'I believe critical mass would be webmasters representing 50 percent of the Web.' With that many signed on, the database would have enough real-time information to make it a valuable resource, Jeannin said.

Jeannin said he hopes the automatic fixing of broken links will be enticement enough for webmasters to subscribe to LinkMap, giving his company valuable information to sell.
