Google 2.0 embraces Semantic Web
- By Joab Jackson
- May 18, 2007
The recently announced revisions to the Google search service may require agency Webmasters to do more preparation to get their agency's content properly indexed by the site. Those agencies that undertake the effort, however, should enjoy significantly greater exposure as a result, said search consultant Steve Arnold.
"It's rule-changing," Arnold said of Google's new approach to search. "This is will be a different game going forward with how information will be presented and what IT people will be expected to do."
Earlier this week, Google announced that it was revamping its Internet search engine. Google promised that when people search on its site, they should start getting a wider range of results, including more links to videos, images, news, maps and books.
Among the changes, according to a report issued by the equity research firm Bear, Stearns & Co., is added capability for semantic reasoning about the material the site indexes. The company has applied for a number of patents around a technology it calls the Programmable Search Engine, which will look for metadata posted on Web sites that defines the material on those sites.
According to the report, PSE will allow Webmasters to program an Internet search engine to categorize site content in very specific ways. This will allow Google and other search engines to identify the content as belonging to specific domains of expertise as well as to identify complex relationships with other sources of information. PSE is expected to be in use by the end of the year.
Google is, in effect, attempting to change the pecking order of the Web, from having search engines scan entire sites themselves to asking Webmasters "to tell us what you got," Arnold said.
Arnold said that preparing the PSE metadata will require some work, however. The work should be easy for those agencies that have already participated in the Google SiteMaps initiative, in which agency Webmasters provided a list of links to database queries so that they could be indexed by the search engine.
Those unfamiliar with the conventions of sitemaps will find the task more difficult, Arnold said.
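For readers unfamiliar with those conventions, a sitemap is simply an XML file listing the pages a site wants indexed. Below is a minimal sketch of one, built with Python's standard library; the agency URLs and dates are hypothetical placeholders, not taken from any real agency site.

```python
# Minimal sketch of the sitemap XML format, using only Python's
# standard library. All URLs here are hypothetical examples.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Return a sitemap XML string listing the given pages.

    Each page is a dict with a required "loc" (the URL) and an
    optional "lastmod" date, per the sitemap conventions.
    """
    urlset = ET.Element("urlset", xmlns=NS)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["loc"]
        if "lastmod" in page:
            ET.SubElement(url, "lastmod").text = page["lastmod"]
    return ET.tostring(urlset, encoding="unicode")

# Example: two database-query links, as described in the article.
sitemap = build_sitemap([
    {"loc": "https://www.example.gov/reports?id=1", "lastmod": "2007-05-18"},
    {"loc": "https://www.example.gov/reports?id=2"},
])
print(sitemap)
```

A Webmaster would publish the resulting file (typically as sitemap.xml at the site root) so that a crawler can discover the listed pages without scanning the whole site.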
Overall, the job of preparing metadata for PSE will be large enough that agencies would need to devote additional funding to get the job done, Arnold predicted. The Bear, Stearns report notes that "For large Internet sites ... we think their management will need to assess the implications of Google's PSE on their business. ... If Google's dominance grows, for Webmasters, we believe they will have little choice but to play ball with Google."
Agencies are increasingly finding that more of their virtual visitors arrive by way of a search engine rather than by typing the address of the agency's Web site into the browser, so preparing a site for search engines is becoming increasingly important. "Visibility is important," Arnold said. "Smart Web sites will get the clicks."
Although Google garners the lion's share of Internet searches, industry observers predict that other search services offered by Yahoo!, Microsoft and others will use the PSE format for their own services as well.
Arnold, who heads the ArnoldIT consulting firm, is completing a book on Google, "Google 2.0: The Subtle Predator." He provided some of the information for the Bear, Stearns report.
Joab Jackson is the senior technology editor for Government Computer News.