
Post #89845

Author
JB522
Parent topic
Usenet tutorial?
Link to post in topic
https://originaltrilogy.com/post/id/89845/action/topic#89845
Date created
25-Jan-2005, 10:54 PM
Originally posted by: eDroj
Go with Newshosting or Giganews..... the ones that the ISPs provide are practically bantha poodoo.......
And Forté Agent (the NON-free version) works splendidly with binaries, at least it does for me.


The big problem with Free Agent is that, last I checked, it hasn't been updated to work with par files, whereas I believe Agent (the pay version) has. What this basically means is that Free Agent doesn't recognize the multiple posts that make up a par set as being parts of the same file, and doesn't automatically decode them. I was trying to get my first OOT disc using Free Agent (Dr. G's ANH, before I discovered this website), and it was a pain to manually save the messages (which Free Agent treats as plain text) to disk and then decode them by hand.
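The grouping step that a par-aware reader automates is roughly this: parse the "(part/total)" counter out of each subject line, bucket the posts by file name, and order the buckets by part number. A minimal sketch, assuming a made-up subject-line format and file names (real binaries posts use varied conventions, and a real reader would also decode the yEnc/uuencoded bodies and verify against the par data):

```python
import re
from collections import defaultdict

# Hypothetical subject format: "name (part/total)". Real posts vary widely.
SUBJECT_RE = re.compile(r"^(?P<name>.+?)\s+\((?P<part>\d+)/(?P<total>\d+)\)$")

def group_parts(subjects):
    """Group multi-part post subjects by file name, ordered by part number."""
    files = defaultdict(dict)
    for subject in subjects:
        m = SUBJECT_RE.match(subject)
        if not m:
            continue  # not a recognizable multi-part post
        files[m["name"]][int(m["part"])] = subject
    # Each file's subjects, sorted by part number, ready to fetch and decode.
    return {name: [parts[i] for i in sorted(parts)]
            for name, parts in files.items()}

subjects = [
    "anh.part01.rar (2/3)",
    "anh.part01.rar (1/3)",
    "anh.part01.rar (3/3)",
]
ordered = group_parts(subjects)
```

Free Agent, by contrast, hands you the three posts above as three unrelated text messages and leaves the collating to you.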

Especially if you're going to shell out the money for Giganews or another high-availability news feed, I recommend taking a look at NewsLeecher. It's a multi-threaded reader, meaning that it will open multiple connections to a news server. Most providers cap the rate of each connection but allow some number of simultaneous connections, and if your ISP or news provider has multiple servers, you can set it up to open multiple connections to each of them too. I didn't buy NewsLeecher, but I did use it during the free trial. With three servers and four connections to each, I had roughly 12x the throughput I was getting from Free Agent's single connection. And it seamlessly collected the multiple parts of the par set and decoded them too.
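The multi-connection idea can be sketched as a pool of workers, one per connection, all pulling article jobs from a shared queue. This is a simulation only (the server names are invented and the download is a stand-in; a real reader would hold one NNTP connection per worker), but the 3-servers-by-4-connections layout mirrors the setup described above:

```python
import queue
import threading

# Invented server names; the 3 x 4 layout matches the trial setup above.
SERVERS = ["news1.example.com", "news2.example.com", "news3.example.com"]
CONNECTIONS_PER_SERVER = 4

def download(server, segment):
    """Stand-in for fetching one article over an open NNTP connection."""
    return (segment, f"data-for-{segment}-from-{server}")

def worker(server, jobs, results):
    # Each worker models one persistent connection draining the job queue.
    while True:
        try:
            segment = jobs.get_nowait()
        except queue.Empty:
            return
        results.append(download(server, segment))

jobs = queue.Queue()
for seg in range(300):          # e.g. ~300 rar-chunk articles for one DVD
    jobs.put(seg)

results = []
threads = [
    threading.Thread(target=worker, args=(srv, jobs, results))
    for srv in SERVERS
    for _ in range(CONNECTIONS_PER_SERVER)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

With each connection individually rate-capped, twelve of them draining the same queue is where the 12x figure comes from.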

With no offense meant to you guys, it cracks me up that guys who are so smart about image processing, BitTorrent, etc. don't know Usenet. For me, Usenet WAS the internet in the beginning- I started regularly reading news in 1989, back when the groups really were for news and commentary, not spam, porn, and file sharing. I haven't read actively in many years, since web-based news and boards such as this suit me better, but I remember spending hours reading rec.arts.sf-lovers (when the Usenet community was small enough that SW, Trek, and everything else could share a group), then rec.arts.sf.starwars, and later a bunch of splinter r.a.s.starwars groups.

And somebody mentioned Grabit. I think I tried it before NewsLeecher, but there was some reason I didn't like it; I think it saw each micro-part of a file as a separate message.

Some background on how this works: most of the posters in alt.binaries.starwars will archive a DVD into a rar archive and then add par information, which carries redundant data that helps reconstruct the files if a few isolated pieces get lost. They split the rar into manageable chunks and post each chunk as a message of about 100,000 lines, IIRC, and there will be 100-300 or so of these for a DVD. But the news servers actually break these up into smaller pieces to exchange them, maybe 500 smaller pieces per big chunk, which I'm calling micro-parts. Free Agent and most other readers I've used will report each big chunk as one message and tell you if not all of its micro-parts are available, but I'm thinking that Grabit showed each of the micros as an individual message, meaning it listed tens of thousands of messages instead of hundreds. I may be wrong on this last part (it might not have been Grabit), so it might be worth a try, or I might not have had it set up right.
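The redundancy trick behind par files can be illustrated in miniature. Real par/par2 data uses Reed-Solomon coding, which can recover multiple lost blocks; a single XOR parity block is the simplest one-lost-block version of the same idea, shown here with toy four-byte chunks standing in for rar pieces:

```python
# Simplified illustration of par-style redundancy. Real par/par2 files use
# Reed-Solomon coding (multiple recoverable blocks); one XOR parity block
# is the degenerate single-block-recovery case of the same idea.

def xor_parity(chunks):
    """Compute one parity chunk as the byte-wise XOR of equal-sized chunks."""
    parity = bytearray(len(chunks[0]))
    for chunk in chunks:
        for i, b in enumerate(chunk):
            parity[i] ^= b
    return bytes(parity)

def recover(surviving_chunks, parity):
    """Rebuild the one missing chunk: XOR the parity with all survivors."""
    return xor_parity(list(surviving_chunks) + [parity])

chunks = [b"AAAA", b"BBBB", b"CCCC", b"DDDD"]   # toy stand-ins for rar pieces
parity = xor_parity(chunks)                      # posted alongside the chunks
rebuilt = recover(chunks[:2] + chunks[3:], parity)  # chunk 3 "lost" in transit
```

That's why a download with a few missing articles can still yield a complete DVD: the par blocks stand in for whichever pieces never arrived.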

Sorry for the rambling at the end, but even if I'm wrong about Grabit, I figure it might help someone understand a little more about how Usenet works. I think it's pretty amazing that a system designed for exchanging 100-line text messages works at all for multi-gigabyte files.