Reading news on a UNIX box over a slow connection can be painful.
A 14.4kbps modem can take several minutes to transfer the Usenet active file (the list of newsgroups the server carries) from an NNTP server. Every time you start one of the common UNIX newsreaders (trn, strn, or tin), it sucks this file down from your provider's news server. This translates into a congested network connection, a long wait before you can read news, and frustrated users.
Caching part of the news feed can ease this problem.
The fine people at Suburbia.net have released a program called nntpcache that implements a caching news server that pulls down only the articles requested. It not only caches the active files and newsgroup files, but also the articles people actually read.
While this product is still in beta, I have seen a dramatic performance improvement in reading news locally. The leap has been enough to make me switch from reading all my news on a SPARCstation 20 connected by Ethernet to the news server to reading my news on an itty bitty 486-33 in my basement.
Once upon a time, I used a program that came with the strn news reader to grab the "active" and "newsgroups" files from the server periodically. I then hacked the strn newsreader to support these cached local files. I consider this solution obsolete now, though.
My initial implementation used strn-0.9.2 as a base: I changed the strn code so that the configuration process offers an option to enable a local active file.
The code is available here for historical purposes:
To support this, you need to run a cron job that sucks the news active file down from your provider's news server every day, or as often as you want news to be updated. Details on how to do this are included with the distribution.
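The distribution's own instructions are authoritative, but the idea looks roughly like the following crontab fragment. Note that `fetch-active` is a hypothetical stand-in for whatever fetch script the distribution ships, and the server name and paths are placeholders:

```shell
# Hypothetical crontab entry: refresh the cached active file from the
# provider's news server once a night, at 4:00 a.m.
#
# "fetch-active" is a stand-in name, not a real program from the
# distribution; news.example.com and the paths are placeholders.
0 4 * * * /usr/local/news/bin/fetch-active news.example.com /usr/local/news/active
```

Run it nightly, or more often if you want new newsgroups and article ranges to show up sooner; each run costs one full transfer of the active file, so there is a trade-off against the modem congestion this scheme is meant to avoid.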
Note that articles posted since the last fetch may not be visible until you pull a fresh active file from your server.