Usenet History: Implementation and User Experience

webmaster · 30 May 2021

Before you can understand the implementation of Usenet, you need to know about two things that shaped it. First, the University of North Carolina Computer Science department had a Unix machine with a small, slow disk, a slow CPU, and, most importantly, very little RAM. It was slower than most time-sharing machines of 1979. Duke CS had a faster computer, a PDP-11/70, but since the first implementation was done at UNC, its PDP-11/45 had to be used. In 1979 there was no Internet, and neither department was connected to the ARPANET, so logging in remotely was not an option. Dial-up connections were billed per minute at daytime rates. Connections through the Gandalf port selector could run at up to 9600 bps.

The second thing to bear in mind is that the first implementation would involve some experimentation. The first public announcement of Usenet was frank about the problems with the design: several amateurs had collaborated on the plan, but it was time to get started. Once the net was in place, a committee could be formed, and that committee would actually be using the net, so it would know what the problems were. None of the people involved had designed a network protocol before, so a few experiments were needed to get things right. Do keep in mind, though, that Tom Truscott and Jim Ellis were experienced programmers and system administrators. Tom had worked on communications software and had been programming, including kernel-level code, for about 14 years.

Implementation of Usenet

The development strategy for Usenet was rapid prototyping. The first version of the Netnews software was a Bourne shell script, about 150 lines long, and it already supported multiple newsgroups and cross-posting.

But why write the prototype as a shell script? The simple reason: compiling a program took a very long time, and the long waits tempted Tom to wander off and start something else. Most of the code was string handling, and the C of that era was poor at it. A string library could have been written, but with compilation that slow, even that would have been time-consuming. With a shell script, you could try out new ideas and grow the code incrementally. A shell script runs slowly, but that was not much of a hindrance, because this was not a production program. It was a prototype, intended to settle the file format. Once everything was in place, it was rewritten in C.
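To give a feel for why the shell was so convenient, here is a minimal, purely hypothetical sketch: pulling a comma-separated newsgroup list out of an article header takes one pipeline of standard tools, where 1979-era C would have needed hand-written string code. The "Newsgroups:" header name and the article format are assumptions for illustration, not details of the actual prototype.

    #!/bin/sh
    # Hypothetical illustration only: extract the newsgroup list from an
    # article header and loop over the individual groups.  The "Newsgroups:"
    # header and comma-separated format are assumed, not the 1979 format.

    article=${1:?usage: $0 article-file}

    # One pipeline does the string handling that early C would have needed
    # a hand-rolled loop (or a custom string library) to do.
    groups=$(sed -n 's/^Newsgroups: *//p' "$article" | tr ',' ' ')

    for g in $groups
    do
        echo "would file this article under: $g"
    done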

Implementation details

Regrettably, neither the shell-script version nor the C version of the implementation survives today. However, Tom remembered a few of the implementation details. A user's subscribed newsgroups were stored in an environment variable set in the .profile file. Which articles a user had already read was tracked through $HOME/.netnews, a file whose timestamp was set to the time of the current reading session; only on a successful exit was the last-read time saved. The script had no ability to read articles out of order, to skip an article and come back to it later, or even to stop partway through an article. That limitation stemmed from a faulty assumption: that only a couple of articles would arrive per day. Incoming Usenet traffic today is around 60 tebibytes per day, so the prediction was off by many orders of magnitude.
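Based on those recollections, a reading loop along these lines is plausible; it is a sketch, not the lost script. The mechanism (a subscription list in an environment variable, a $HOME/.netnews file whose timestamp is updated only on a clean exit, strictly sequential reading) follows the description above, while the spool path, directory layout, and the NETNEWS variable name are assumptions.

    #!/bin/sh
    # Hypothetical sketch in the spirit of the 1979 shell-script prototype.
    # Paths, layout, and the NETNEWS variable name are assumptions.

    SPOOL=/usr/spool/news        # assumed layout: one directory per newsgroup
    STAMP=$HOME/.netnews         # its timestamp records when news was last read

    # NETNEWS holds the subscription list, set once in .profile, e.g.:
    #   NETNEWS="general chess"; export NETNEWS
    for group in ${NETNEWS:-general}
    do
        dir=$SPOOL/$group
        [ -d "$dir" ] || continue
        if [ -f "$STAMP" ]; then
            new=$(find "$dir" -type f -newer "$STAMP" | sort)
        else
            new=$(find "$dir" -type f | sort)   # first run: everything is new
        fi
        # Strictly sequential: no skipping, no reading out of order.
        for article in $new
        do
            echo "===== $group / $(basename "$article") ====="
            cat "$article"
        done
    done

    # Record the last-read time only on a successful pass, so an interrupted
    # session does not mark unread articles as read.
    touch "$STAMP"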

Other implementation decisions

Another decision was not to present cross-posted articles to a reader more than once. A cross-posted article was stored as a single file linked into multiple newsgroup directories. This not only made duplicates easy to detect but also saved a fair amount of disk space, and disk space at the time was quite expensive.
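In shell terms, the trick might have looked roughly like this: store the article once, then hard-link it into each additional newsgroup directory. The single-file-with-multiple-links idea is from the description above; the paths, the use of ln, and the inode check are illustrative assumptions.

    #!/bin/sh
    # Hypothetical sketch of cross-posting via hard links: the article body
    # is stored once and linked into every additional newsgroup directory.
    # Paths and names are assumptions for illustration.

    SPOOL=/usr/spool/news
    id=${1:?usage: $0 article-id group [group ...]}
    first=${2:?need at least one newsgroup}
    shift 2

    mkdir -p "$SPOOL/$first"
    cat > "$SPOOL/$first/$id"            # article body from standard input

    for group in "$@"
    do
        mkdir -p "$SPOOL/$group"
        ln "$SPOOL/$first/$id" "$SPOOL/$group/$id"   # new name, same inode
    done

    # Duplicates are easy to spot: directory entries that share an inode
    # number are the same article, stored only once on disk.
    ls -i "$SPOOL/$first/$id"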

A few other points worth noting: there was no need for a global coordinator, because each article's ID consisted of the originating site's name, a period, and a per-site sequence number. The article IDs doubled as filenames, which were subject to the filename length limit of 7th Edition Unix (14 characters). A database might have been a natural way to store the articles, but a single database of all news would have required locking, which was hard to do on 7th Edition Unix. Pipes had to be created before the processes that used them, so unrelated programs could not coordinate through them; the file system had to be relied upon instead. The user interface resembled the 7th Edition mail command: simple, and perfectly adequate for exchanging low volumes of messages.
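A sketch of how such IDs could be generated locally, with no global coordinator: the site name keeps different machines' IDs from colliding, and a small per-site counter supplies the sequence number. The "sitename.sequence" scheme is from the description above; the site name, counter file, and paths are assumptions.

    #!/bin/sh
    # Hypothetical sketch of article-ID generation.  Each site names its own
    # articles "sitename.sequence", so no global coordinator is needed: the
    # site name keeps IDs from different machines from colliding.  The site
    # name, counter file, and paths below are assumptions.

    SITE=unc                        # this machine's site name (assumed)
    SEQFILE=/usr/spool/news/.seq    # per-site sequence counter (assumed)

    mkdir -p /usr/spool/news
    seq=$(cat "$SEQFILE" 2>/dev/null)
    seq=$((${seq:-0} + 1))
    echo "$seq" > "$SEQFILE"

    id=$SITE.$seq
    echo "new article ID (also its file name): $id"

    # File names on 7th Edition Unix were limited to 14 characters, so the
    # site name plus the sequence number had to stay short.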
