Bundling Web pages could make future surfing much faster

What actually happens, of course, is far more complex. I didn't go into detail earlier
because there's little more that users or even webmasters can do to speed things up.
Loading speed depends on how the Hypertext Markup Language pages work their way through
the layers of Internet protocols.


It isn't the amount of information on the Web page that matters so much as how
efficiently it moves across the Net.


Some users have internal modems without indicator lights, or browsers with no software
indicator to show when information is being sent or received. If everyone could watch
what's actually happening, they might stop shopping for faster modems and instead make
what they have run more efficiently.


The truth is that unless you're transferring a single large file, most of the time your
modem is practically idle, anyway.


Web pages aren't static images. They contain hot buttons and links to other data. HTML
treats each part as a separate file, so a page with 10 segments can't be displayed
completely until 10 different network connections have been made, one for each segment,
each perhaps moving over a completely different Net path.
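To see why, consider what the browser has to do before it can draw anything: scan the page for embedded objects and open a fresh connection for each one. Here's a minimal sketch in Python; the sample page and its file names are invented for illustration, and it counts only image and stylesheet references:

```python
from html.parser import HTMLParser

class ResourceCounter(HTMLParser):
    """Counts the embedded objects a browser must fetch separately."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.resources.append(attrs["src"])
        elif tag == "link" and "href" in attrs:
            self.resources.append(attrs["href"])

# Invented sample page with one stylesheet and three images.
page = """
<html><head><link href="style.css"></head>
<body>
<img src="logo.gif"><img src="photo1.gif"><img src="photo2.gif">
</body></html>
"""

counter = ResourceCounter()
counter.feed(page)
# Each entry means one more network connection under old-style HTTP.
print(len(counter.resources))  # 4 fetches beyond the page itself
```

Four segments, four extra connections, and the page itself makes five.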


The Netscape Navigator browser lets you watch this happen at the bottom of the screen.
A dozen or more small files may transfer before anything appears on the page. That, as
much as anything, is what's causing all the traffic on the Internet.


Browsers may attempt to download several Web objects simultaneously, but that's a patch,
not a real fix, because just as many connections are still being created. The load on the
Net is unaffected.


There's a ray of hope on the horizon: a new HTML version that will bundle an entire Web
page into a single data exchange. It will create persistent links rather than close them
and make new ones each time an object is transmitted.
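The arithmetic behind that improvement is easy to sketch. In the toy model below, the handshake and transfer costs are invented numbers, not measurements, but the ratio tells the story: with a separate connection per object, the handshakes dominate the load time.

```python
HANDSHAKE_MS = 200  # assumed cost of opening one connection
TRANSFER_MS = 50    # assumed cost of moving one small object

def page_load_time(num_objects, persistent):
    """Rough model: every object pays a transfer cost, and every new
    connection pays a handshake cost. A persistent link is opened once."""
    connections = 1 if persistent else 1 + num_objects
    transfers = 1 + num_objects  # the page itself plus its objects
    return connections * HANDSHAKE_MS + transfers * TRANSFER_MS

print(page_load_time(10, persistent=False))  # 2750 ms
print(page_load_time(10, persistent=True))   # 750 ms
```

Same data, same number of transfers, but the 10-segment page loads in roughly a quarter of the time once the connection overhead is paid only once.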


The volume of data won't decrease, but at least the Net won't have to cope with making
and closing dozens of separate links per page. You can view information about the proposed
changes on the World Wide Web Consortium's site at http://www.w3.org
and the Internet Engineering Task Force's site at http://www.ietf.org.


This information doesn't really help you today, but it may keep you saner to know that
groups are attacking the Web's slowness and have solid ideas for fixing the leaky pipes.


One obvious solution would be to cache the Web pages you visit more than once. If the
cache software were smart enough, it could even detect alterations to the page and
download a new version only when necessary.
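A conditional fetch of that kind is simple to sketch. In this toy model (the server function and its timestamps are invented stand-ins), the cache asks the server whether the page has changed since the copy it holds, and downloads the body only when the answer is yes:

```python
class SimpleCache:
    def __init__(self):
        self.store = {}  # url -> (last_modified, body)

    def fetch(self, url, server):
        cached = self.store.get(url)
        since = cached[0] if cached else None
        status, modified, body = server(url, since)
        if status == 304:  # unchanged: serve the cached copy, no download
            return cached[1]
        self.store[url] = (modified, body)
        return body

def server(url, if_modified_since):
    """Toy origin server whose page last changed at time 100."""
    last_modified = 100
    if if_modified_since is not None and if_modified_since >= last_modified:
        return 304, last_modified, None
    return 200, last_modified, "<html>page body</html>"

cache = SimpleCache()
cache.fetch("/index.html", server)         # first visit: full download
body = cache.fetch("/index.html", server)  # revisit: "not modified" reply
print(body)  # <html>page body</html>
```

The second visit costs one tiny exchange instead of a full page transfer.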


What if the cache software could also look at a page you are viewing and begin
downloading pages to which it links, even before you clicked on them?
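The first step of such a prefetcher would be to scan the page you're reading for its links. A sketch, using an invented sample page:

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collects the pages a prefetcher could download ahead of time."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Invented sample page with two outbound links.
page = '<p>See <a href="/news.html">news</a> and <a href="/tips.html">tips</a>.</p>'

finder = LinkFinder()
finder.feed(page)
print(finder.links)  # ['/news.html', '/tips.html']
```

Everything in that list could be fetched quietly in the background, so the page is already on your disk by the time you click.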


Now if only the cache software could display real genius and store pages based not on
the sequence in which you access them but on how often you go back. That would really be
useful.
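That idea, keeping pages by how often you return to them rather than by when you last saw them, can be sketched in a few lines. The URLs and the two-page capacity here are invented for illustration:

```python
class FrequencyCache:
    """Keeps the most *often* visited pages, not the most recent ones."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = {}  # url -> body
        self.hits = {}   # url -> visit count

    def visit(self, url, body):
        self.hits[url] = self.hits.get(url, 0) + 1
        if url in self.pages:
            return
        if len(self.pages) >= self.capacity:
            # Evict the stored page you go back to least often.
            victim = min(self.pages, key=lambda u: self.hits[u])
            del self.pages[victim]
        self.pages[url] = body

cache = FrequencyCache(capacity=2)
for url in ["/home", "/news", "/home", "/home", "/tips"]:
    cache.visit(url, body="...")

# The thrice-visited home page survives; the once-visited news page goes.
print(sorted(cache.pages))  # ['/home', '/tips']
```

A recency-only cache would have thrown out whichever page you happened to see longest ago, even if it's the one you visit every day.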


There are several programs that already do most of these things. For example, the $40
Got It package is downloadable from Go Ahead Software Inc. of Redmond, Wash., at http://www.goahead.com.


Yes, the predictor feature that downloads pages in anticipation of your requesting them
would increase Net load, but that should be offset by the elimination of lots of downloads
of frequently accessed pages.


Even as the standards groups work to improve HTML, two leading players are making the
situation worse with different, competing versions of dynamic HTML. Dynamic in this case
means greater interaction between viewer and Web page.


Microsoft Corp.'s Dynamic HTML would let developers build Web pages that could be
user-customized with, say, personal color schemes. Netscape Communications Corp.'s dynamic
HTML (note the lowercase d) would do approximately the same thing, but in a different way.


My, my, we all need something to complain about, and it might as well be conflicting
standards.


Server push delivers information automatically from the PointCast Network and other
publishers, rather than waiting for you to seek it out with a browser. It has flourished
mostly because it's so easy to bundle advertising with the push content.


There's nothing wrong with push, except that you could fill up an entire hard disk just
with the different providers' setups. We all love innovation, but this is one area where I
hope the small fry go away and leave, say, three push providers. That should be enough for
diversity without overloading us.


I really like push technology. When it works, it can be a great time saver. I'm willing
to fight the bugs in a couple of push techniques, but not 20 or more of them.


John McCormick, a free-lance writer and computer consultant, has been working with
computers since the early 1960s.

