Re: Upgrading the news URL



Andrew Pam wrote:
 
> I would prefer to use the extension proposed in my Internet-Draft
> "Fine-Grained Transclusion in the Hypertext Markup Language"
> <http://xanadu.com.au/xanadu/draft-pam-html-fine-trans-00.txt>
> because the principle applies in exactly the same way to all 
> documents regardless of the transmission protocol (e.g. HTTP or NNTP).

Very impressive!  So when's this going to be an RFC?  :)

> I had a similar idea quite some time ago (a few years, I believe) but the
> major difficulty is that a Xanadu system is designed to retain all documents
> indefinitely, and this is currently infeasible with Usenet because of the
> sheer volume of articles.  Of course, compressing quotes from other messages
> into transclusions would assist enormously but the problem of the enormous
> quantity of spam (both horizontal and vertical), regularly repeated postings
> and often irrelevant or improperly posted binaries seems to me to make it
> very difficult for anyone to afford the quantity of online storage required
> for such an undertaking.

I've been thinking about that a great deal since my original post, and
I've had some thoughts that you might be interested in.

It would require, as we both realize, tremendous resources to keep a
permanent archive of everything that passes through Usenet or WWW, and
as the Internet continues to grow, this problem will only get worse.
Worse still, most of the data that would be archived would be so
useless (read: unprofitable) that nobody would want to archive it.

So, if no one could profit from maintaining a universal permanent
archive, how could it be accomplished?

Here's my idea: rather than trying to archive everything forever in one
centralized (or even decentralized) system, people all over the Internet
could become "library servers" by archiving only the articles which
interest them, and then serving them to the world.  This is definitely
within the realm of the feasible, as articles would link to other
articles with globally unique message-ids.  And considering the number
of people already on the Internet and the variety of their interests,
these personal archives could become the ultimate repository of all
knowledge in no time at all.
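To make this concrete, here's a minimal sketch of what one of these
personal archives could look like, assuming Python and its standard
"dbm" module; the archive filename is just an illustration.  Each
article is filed under its globally unique Message-ID, so anyone who
knows the id can ask for the article:
---
import dbm

ARCHIVE = "my-articles.db"   # illustrative filename

def store_article(message_id, raw_article):
    # File an article under its globally unique Message-ID.
    with dbm.open(ARCHIVE, "c") as db:
        db[message_id] = raw_article

def fetch_article(message_id):
    # Return the raw article, or None if this library never kept it.
    with dbm.open(ARCHIVE, "c") as db:
        return db[message_id] if message_id in db else None
---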

It would work as a Broadcast Automated Request.  Basically, a user
interested in a given article could request it over Usenet in a
specifically designed newsgroup ...
---
Newsgroups: comp.hypertext.requests
From: dfab@xxxxxxxxxxx
Subject: Request: 5ferud$241vg@xxxxxxxxxxxxxxxxxxxxx
---
... and the server, which may house only one or two articles or may be
an archive of thousands, would monitor these subject headers to check
for requests for articles it has on file.  If the server held the
requested article, it could initiate a connection with the requester
to transmit it.
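Here's a sketch of that server-side loop, under some stated
assumptions: Python's nntplib (part of the standard library up to
Python 3.12), the fetch_article() helper from the sketch above, and
mail as the return path, since the exact "connection with the
requester" is left open here.  The host and group names are
placeholders:
---
import nntplib
import smtplib

REQUEST_GROUP = "comp.hypertext.requests"

def send_to_requester(requester, article):
    # One possible delivery step (an assumption): mail the raw
    # article back to the address in the request's From: header.
    with smtplib.SMTP("localhost") as mail:
        mail.sendmail("library@localhost", [requester], article)

def poll_requests(host, last_seen):
    # Scan the request group for new "Subject: Request: <id>" posts
    # and answer any whose Message-ID we hold in the local archive.
    with nntplib.NNTP(host) as news:
        resp, count, first, last, name = news.group(REQUEST_GROUP)
        if last <= last_seen:
            return last_seen         # nothing new since the last poll
        resp, overviews = news.over((last_seen + 1, last))
        for number, fields in overviews:
            subject = fields.get("subject", "")
            if not subject.startswith("Request: "):
                continue
            wanted = subject[len("Request: "):].strip()
            article = fetch_article(wanted)
            if article is not None:
                send_to_requester(fields.get("from", ""), article)
        return last
---
A server could run poll_requests() on a timer, remembering the last
article number it has seen, so even a machine holding just one or two
articles could take part without any new protocol machinery.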

A library like this could preserve far more material than any
centrally managed library ever could.  The magic of this system is
that if ANYBODY wanted to archive a given article, if even one person
thought it important or interesting, it would be available to anyone,
anytime, anywhere.

Best of all, a BAR system needn't require that everyone install new
server software, unlike HyperWave.  As we speak, an IETF working group
is considering extensions to the standard NNTP protocol, and how new
NNTP extensions could be deployed automatically.  So rather than
trying to get people to use new software, we could just implement an
extension to the already existing network.
<http://www.ietf.org/html.charters/nntpext-charter.html>

Thoughts?

     -Give me ambiguity or give me something else-
                          dAN