Denial Of Service - reprise

From lsi <lsi@lsi.clara.net>
Date Sat, 6 Jan 2001 05:18:40 -0000


[: hacktivism :]

OK so you're under siege and have been for months.  Those nice
folk with the scripts have spiked your dataset and slowed your
server.  But you're still online - and so are they.

What's next?

The most effective deterrent yet discovered is the statistic.  The
robots attacking my site go for certain files.  If I make a list of the
most frequently accessed pages, and then post that back to the
website, anyone who attempts a DOS contributes to this set of
statistics.  That is, the files they go for the most will be the ones at
the top of the list.  If I then hyperlink this list, that makes it
extremely simple for my legitimate visitors to access those pages
the DOSers want to take down.

The more they try, the higher and more significant the numbers
become.
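
Something along these lines does the trick - a minimal sketch in
Python, assuming a Common Log Format logfile called access.log and a
top-20 cut-off (the filename, format and cut-off are assumptions, not
my actual setup):

#!/usr/bin/env python3
"""Sketch: build a hyperlinked 'most requested pages' list from a
Common Log Format webserver log.  Filenames and format are assumptions."""
import re
from collections import Counter

LOGFILE = "access.log"   # assumed location of the webserver logfile
TOP_N = 20               # how many pages to publish

# CLF request field looks like: "GET /some/page.html HTTP/1.0"
request_re = re.compile(r'"(?:GET|HEAD|POST) (\S+) HTTP/[\d.]+"')

counts = Counter()
with open(LOGFILE, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = request_re.search(line)
        if m:
            counts[m.group(1)] += 1

# Write a plain HTML fragment to paste into (or include from) the site.
with open("most-requested.html", "w", encoding="utf-8") as out:
    out.write("<ol>\n")
    for path, hits in counts.most_common(TOP_N):
        out.write(f'  <li><a href="{path}">{path}</a> ({hits} hits)</li>\n')
    out.write("</ol>\n")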

I have found this strategy effective in reducing the incidence of
attack.

Since my attention was now focused on these files which had been
picked out for me, I took the opportunity to redevelop and expand
upon these themes (to the tune of 7Mb of TIFFs, in this case).

I also mirrored my site, and implemented a firewall, an intrusion
detection system and a second virus scanner on my home PC.

So now, the direct result of all this DOSing is TWO sparkling
servers with more content than ever - and a new and very seductive
way of finding the files under attack.

The point of this email is to alert fellow hacktivists as to the likely
outcome of their DOS attempts.  These attempts serve to highlight
an attacker's SORE POINT.  The attempts can be TURNED ON
THE ATTACKER with the aid of traffic analysers.  And long-term
the victim will likely Dig In, at a minimum.

To maximise your chances of working toward your cause, I
suggest you avoid feeding the objects of your discontent in this
way.

Adversity encourages innovation... this effect is also noted in [all
other living systems] such as bacteria, which develop resistance to
antibiotics after a while.

 ***

There is another type of robot hanging around my site.  These
robots check for certain files at a certain interval.  Sometimes they
just see whether it's still there - sometimes they download some or
all of the file, presumably to see whether it's the same.  These
robots always have similar IP addresses (in contrast to DDOS
above) which usually resolve to one of the "internet intelligence
agencies" we have seen mention of recently - such as www.digital-
integrity.com ....

These robots also do a neat job of highlighting the most interesting
files.  Likewise the traffic analyser generates a hyperlinked most-
frequently-monitored list, which my visitors are also using.
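
A rough sketch of how such robots can be picked out of a logfile - it
assumes Common Log Format, treats a /24 as one visitor and calls
anything with near-constant gaps between hits a monitor (all of these
are assumptions):

#!/usr/bin/env python3
"""Sketch: flag 'monitoring' robots that re-fetch the same file at
regular intervals from nearby IP addresses.  Log format, the /24
grouping and the 10% regularity threshold are all assumptions."""
import re
from collections import defaultdict
from datetime import datetime

line_re = re.compile(
    r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|HEAD) (\S+) HTTP/[\d.]+"')

# (first three octets of the client IP, path) -> list of hit times
hits = defaultdict(list)
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        m = line_re.match(line)
        if not m:
            continue
        ip, when, path = m.groups()
        prefix = ".".join(ip.split(".")[:3])   # treat a /24 as one visitor
        t = datetime.strptime(when.split()[0], "%d/%b/%Y:%H:%M:%S")
        hits[(prefix, path)].append(t)

for (prefix, path), times in sorted(hits.items()):
    if len(times) < 5:
        continue
    times.sort()
    gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
    avg = sum(gaps) / len(gaps)
    # crude regularity test: every gap within 10% of the average gap
    if avg > 0 and all(abs(g - avg) / avg < 0.1 for g in gaps):
        print(f"{prefix}.x polls {path} roughly every {avg / 3600:.1f} hours")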

Someone mentioned getting used to the surveillance - I thought I'd
chime in and say that I am monitoring them back, and using the
stats I get to enhance my site.  Every one of the stats pages is
legit 'spider fodder'.

Here is a list of signatures used by monitoring agencies.  There are
no doubt more - either I have yet to isolate them in my logs, or they
have yet to visit me.  These sigs come up in the useragent field of
the webserver logfile.

The Informant
diibot
surfwalker
linkwalker
N2H2
netmechanic
scorebot/
linklint
leia
contype

I have researched each useragent I have classified; I know
netmechanic is a benign service - but it can be, and IS, used for
monitoring.
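
Matching these sigs against a logfile is straightforward - a rough
sketch, assuming the Combined Log Format with the useragent as the
last quoted field (adjust the regex for your own logs):

#!/usr/bin/env python3
"""Sketch: tally which pages the known monitoring agents fetch, using
the useragent sigs listed above.  Assumes the Combined Log Format,
with the useragent as the last quoted field."""
import re
from collections import Counter

SIGNATURES = ["The Informant", "diibot", "surfwalker", "linkwalker",
              "N2H2", "netmechanic", "scorebot/", "linklint",
              "leia", "contype"]

line_re = re.compile(r'"(?:GET|HEAD|POST) (\S+) HTTP/[\d.]+".*"([^"]*)"$')

per_sig = Counter()
per_page = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        m = line_re.search(line)
        if not m:
            continue
        path, agent = m.groups()
        for sig in SIGNATURES:
            if sig.lower() in agent.lower():
                per_sig[sig] += 1
                per_page[path] += 1
                break

print("hits per signature:", per_sig.most_common())
print("most-monitored pages:", per_page.most_common(10))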

I expect some sort of pattern analysis will identify robots that
refuse to identify themselves.  I have another page, 'most frequently
accessed pages by mystery useragents' .. you get the idea.  If I
get the pattern analysis going I'll have a page "most frequently
accessed pages by useragents attempting to hide themselves".
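
And a first cut at the mystery-useragent page - a minimal sketch,
again assuming Combined Log Format, counting hits whose useragent
field is empty or just '-':

#!/usr/bin/env python3
"""Sketch: 'most frequently accessed pages by mystery useragents' --
count hits whose useragent field is empty or just '-'.  Combined Log
Format assumed; a fuller version would also exclude known browsers."""
import re
from collections import Counter

line_re = re.compile(r'"(?:GET|HEAD|POST) (\S+) HTTP/[\d.]+".*"([^"]*)"$')

mystery = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        m = line_re.search(line)
        if m and m.group(2).strip() in ("", "-"):   # no useragent offered
            mystery[m.group(1)] += 1

for path, hits in mystery.most_common(20):
    print(f"{hits:6d}  {path}")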

Stuart

PS: I just read that in the USA "any citizen or business" can buy a
background check on you for around $5 from your local police
station.  I wonder if any citizen can get a background check on any
business as easily?

PPS: if anyone has a webserver logfile they want munched, send it
to me. :)

PPPS: for long-term monitoring avoidance, check out stealth
scripts.  These sweeties serve different pages to different visitors,
depending on IP address, useragent field, etc.  So you can
simply tell it to serve sugar/salt/pickles to whoever you want.
Needed: distributed protocol to exchange database of target IP
addresses for such filtered content.  So a network of sites can
share a single database and consistently filter the content served
to addresses in the database.  Stealth scripts are reportedly used
for search engine positioning.  The mere existence of stealth
scripts means that monitoring companies are selling a bunch of
bullshit - they can't be sure the content they are checking even
exists.  The trick is to name a script with an HTML extension, so the
spider swallows it.  Then verify the address in realtime and send a
predefined page if the address is in the hitlist, else send the usual
content.
</script>
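
For illustration, a minimal stealth-script sketch along these lines -
the hitlist prefixes, sigs and page names are made up, and it assumes
the server is configured to execute the script even though the URL
ends in .html (a handler mapping on Apache; details vary):

#!/usr/bin/env python3
"""Sketch of a stealth script along the lines described above: run as a
CGI under a URL that looks like a plain .html page, it checks who is
asking and serves a decoy to addresses/useragents on a hitlist.  The
hitlist entries and page names here are made up for illustration."""
import os

HITLIST_PREFIXES = ("192.0.2.",)             # hypothetical monitoring IPs
HITLIST_AGENTS = ("diibot", "linkwalker")    # sigs from the list above

REAL_PAGE = "real-content.html"
DECOY_PAGE = "decoy-content.html"

addr = os.environ.get("REMOTE_ADDR", "")
agent = os.environ.get("HTTP_USER_AGENT", "").lower()

watched = (addr.startswith(HITLIST_PREFIXES)
           or any(sig.lower() in agent for sig in HITLIST_AGENTS))

page = DECOY_PAGE if watched else REAL_PAGE

# standard CGI response: header, blank line, then the chosen page
print("Content-Type: text/html\r\n\r\n", end="")
with open(page, encoding="utf-8") as f:
    print(f.read(), end="")

The REMOTE_ADDR check is exactly where the shared hitlist database
mentioned above would plug in.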

PPPPS: the quick fix for the current crop of monitors seems to be
to make a new page with the same (or updated) content and a
different filename, and change your links to point at the new page.
Leave the old page online - the robot has no idea you're not pointing
to that page anymore; in fact it's the only thing on the entire
internet still accessing it, and nobody has a brain as small as it -
it has no idea at all.  It needs a human to notice and reprogram it.
Add case-sensitive filenames, unicode etc. and you're in Paradise.htm
.. or was that %2E%2E/paradise.html ... humans aren't so hot at
noticing unexpected changes either.  But just change the timestamp
on that monitored file, and you can bet that Monday AM someone will
be clearing an alert from their intray.


------------------------------
. ^               Stuart Udall
.~X\     stuart@cyberdelix.net
.~ \    http://cyberdelix.net/

..revolution through evolution


[: hacktivism :]
[: for unsubscribe instructions or list info consult the list FAQ :]
[: http://hacktivism.tao.ca/ :]