Post by Bill (Adopt)
So where is all this excessive download coming from?
The night before last it was alt.binaries.dvd, so presumably one of our
customers was downloading entire DVDs. I don't have a problem with that,
but there are *much* more effective ways of downloading DVDs than via
newsgroups - BitTorrent, LimeWire and the like.
Post by Bill (Adopt)
In fact quite light ..2100+ for seven days would be high use, but not
'over-use'...
It's not the article count but the actual data size that matters: 2000 1K
messages only come to around 2MB, whereas two or three 100MB messages come
to 200-300MB.
Post by Bill (Adopt)
I thought you had determined this - and were looking at a particular
source...
Determining a particular source is like looking for a needle in a haystack
- I can't spend all month going through the logs line by line, and there
doesn't seem to be any decent log analyser software available. It would be
much easier if there were something like Webalizer for news server logs. I
would write one myself, but again, it's finding the time! :-(
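As a rough sketch of the sort of tool I have in mind (the log format below
is hypothetical - a real news server's logs will differ, so the parsing
would need adapting):

#!/usr/bin/env python3
# Rough sketch of a news log analyser. Assumes a hypothetical log format,
# one fetch per line: <timestamp> <client-ip> <newsgroup> <bytes>
import sys
from collections import defaultdict

bytes_by_group = defaultdict(int)
bytes_by_ip = defaultdict(int)

for line in open(sys.argv[1]):
    fields = line.split()
    if len(fields) != 4 or not fields[3].isdigit():
        continue                      # skip malformed lines
    stamp, ip, group, size = fields
    bytes_by_group[group] += int(size)
    bytes_by_ip[ip] += int(size)

def top10(table):
    return sorted(table.items(), key=lambda kv: kv[1], reverse=True)[:10]

print("Top newsgroups by bytes:")
for group, total in top10(bytes_by_group):
    print("%12d  %s" % (total, group))

print("\nTop clients by bytes:")
for ip, total in top10(bytes_by_ip):
    print("%12d  %s" % (total, ip))

Run that against a day's log and the heavy newsgroups and heavy users
would fall straight out of the top of each list.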
I do have the IP address of whoever was attempting to download DVDs, and
I've asked BT to trace it. I only keep a record of those people who have
static IP addresses.
Post by Bill (Adopt)
Have you managed to identify the 'culprit' yet? After all, somebody
must be accessing the download. Is it not identifiable? I doubt it's
If it's one of our customers (and the DVD downloader is), then yes, I can
identify them (if BT get back to me, of course), but there are other ways
the newsgroup service can be abused, and the sci.crypt method is one of
them.
Someone, somewhere in the world (Russia, we think) is spamming sci.crypt
with approx 20,000 random messages a night, some of them 10K long each -
at that rate it's in the order of 200MB a night, so you can see it's a
*huge* amount of data. This was *one* newsgroup I identified, and only
because I happen to subscribe to sci.crypt. However, I only subscribe to
around 15 newsgroups out of the full 110,000, so for all I know someone
could be doing the same to any number of other newsgroups I don't know
about.
Our own news server is much more efficient than the old Argonet one: it
caches articles for the newsgroups people request. In theory, if two or
more of our customers subscribe to the same newsgroup, the server caches
the articles and doesn't have to refetch them from the upstream news
server until new articles arrive. This saves bandwidth enormously, which
is why (if you look at the graphs on the newsgroup support page at
http://www.orpheusinternet.co.uk/support/news.html ) you'll see a maximum
of 10 connections. That's not 10 customers, but 10 connections from our
own news server.
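The caching idea itself is simple enough - something like this in outline
(a simplified sketch; fetch_from_upstream() is a made-up stand-in, not our
actual server code):

# Simplified sketch of the cache-then-fetch idea behind the news server.
cache = {}  # (newsgroup, article_number) -> article text

def fetch_from_upstream(newsgroup, artnum):
    # Placeholder: the real server does an NNTP fetch here, which is
    # the part that costs Giganews bandwidth.
    return "<article %d in %s>" % (artnum, newsgroup)

def get_article(newsgroup, artnum):
    key = (newsgroup, artnum)
    if key not in cache:
        cache[key] = fetch_from_upstream(newsgroup, artnum)
    return cache[key]   # every reader after the first is served from cache

So the first customer to read an article pays the upstream cost once, and
everyone after them gets it from the cache for free.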
Post by Bill (Adopt)
Are they? ..but too many ISPs still do run them as part of their service...
Yes. Pretty much all the large ISPs have now dropped newsgroups - BT,
Tiscali, Orange, Talk Talk, Virgin, etc. Only some of the smaller ones
which peer from Pipex (like Argonet used to) still offer them. Pipex in
turn get their feed from Giganews.
I've cut out the middle man (Pipex) and buy my feed direct from Giganews -
but it's not cheap. Spread over our customers, though, it works out as
excellent value for money, probably costing people less than 10p a month
as part of their subscription.
Post by Bill (Adopt)
Is it not possible to put a block on the high-use groups?
No, you can't block newsgroups - you can only supply a list of the
newsgroups you wish to carry. That's why removing individual newsgroups is
harder: I can only make the list longer, so instead of saying we wish to
carry alt.*, I have to individually specify alt.coffee, for example.
Post by Bill (Adopt)
alt.film-festivals
alt.film-festivals.sundance together about 10 a month including
However, those could be covered by a single rule allowing
alt.film-festivals.*, but as you can see, with lots of individual
newsgroups the list soon gets rather difficult to manage.
Post by Bill (Adopt)
sci.space.science about 10 a year
Again, if there are several space-related newsgroups I could probably
define a catch-all such as sci.space.*, but of course if there were a
sub-group I needed to block (as an example, not in this particular case),
I'd have to go back to listing individual groups.
It would be really nice if you could just specify which newsgroups to block
- but you can't.
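To illustrate (the syntax here is invented for the example - the real
carry-list format depends on the feed provider), the list is purely
inclusive:

  rec.*
  alt.coffee
  alt.film-festivals.*
  sci.space.*

Anything not matched by one of those lines simply isn't carried; what you
can't write is "carry alt.* except alt.binaries.*".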
Post by Bill (Adopt)
rec.aviation.misc about 10 only every ten years,
rec.* is available anyway. It's only alt.* and sci.* which have been
temporarily removed.
Post by Bill (Adopt)
So ..the question is do you believe yourself capable of providing
the Usenet service - or is it now costing you just too much..?
..and, if so, is there another model that may be used -
Actually, I don't need a different model - the model we have generally
works very well. Until this month, our maximum ever usage (in the two
years we've had a news feed) was no more than 40%, usually averaging 25% a
month. It's only in the past couple of weeks that usage shot up to over
100%.
The reason there's a problem at the moment is possibly down to me. Because
we'd never been over our quota before, I looked at extending it
temporarily, paying a one-off penalty for a single month's excessive use,
so I extended our quota to 200%. However, I then found that in one night
we went up to 150% of quota, meaning the extra allowance would have been
used up in two or three days rather than lasting the rest of the month. So
I took the decision to revert the change and put the quota back to 100% to
prevent any more usage. That's how we ended up with two weeks of no news
until the end of the month (actually the 15th, as it's a billing month,
not a calendar month).
When the 15th came round (Sunday), I noticed that instead of being billed
for the amount we were over (actually a relief), our quota was back to the
usual 100%, but the excess bandwidth had been 'carried forward' -
presumably the roughly 50% by which we'd overshot. So instead of starting
the month on 0%, we started this month on around 48% usage, which then
shot up again.
Now that I know how Giganews' bandwidth allocation works, I won't attempt
to increase our quota again, which means usage will simply stop at 100%
and we may lose the newsfeed again until the 15th of next month. However,
that will ensure we start next month on 0%, and everything should be back
to normal.
I've now got logging in place, so I can detect whether people are
downloading vast amounts of data (such as alt.binaries.dvd). That not only
eats into the bandwidth of our Giganews feed, but also affects the
bandwidth our servers use in the London data centre, as our server has to
download the same amount of data from Giganews in the first place.
We're currently (as of about 10 minutes ago) running at 96.02%, within a
cat's whisker of being blocked again, but if I can drop a few high-traffic
newsgroups to make the remaining 4% last as long as possible, we may get
closer to the 15th of next month without losing news again.
Anyway, hopefully this gives a more detailed breakdown of the problem.
Paul
--
Usenet replies: To contact me, visit www.vigay.com/feedback/
Life, the Universe, RISC OS Help and Everything - www.vigay.com/
Share and discuss ideas or chat about the above - http://forum.vigay.com/
Quality Internet, Domain Registration & Hosting - www.orpheusinternet.co.uk/