RISKS-LIST: Risks-Forum Digest  Thursday 30 March 2000  Volume 20 : Issue 86

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS (comp.risks)
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

***** See last item for further information, disclaimers, caveats, etc. *****
This issue is archived at <URL:http://catless.ncl.ac.uk/Risks/20.86.html>
and by anonymous ftp at ftp.sri.com, cd risks .

  Contents:
More NASA woes in stress testing (PGN)
Re: Faster, cheaper *not* better (PGN)
More details on the Sea Launch failure (Steven Huang)
Stephen King eBook cracked (Re: Pack, RISKS-20.85)
California privacy legislation (Dan Gillmor)
Criminal records in North Carolina (Joe Thompson)
Judge issues injunction in software reverse-engineering case (NewsScan)
Re: Hackers sued by software-filtering company (PGN, Ross Oliver)
German ministry of family et al. and links to porn (Klaus Brunnstein)
Privacy problems with HTTP cache-control (Martin Pool)
Re: Northwest grounded for 3.5 hours after cable cut (Henry Spencer,
    Bob Dubery)
Northwest Air fallout: MN backhoe affects FL hotel bookings! (William Smith)
Re: MIT grade spreadsheet problem (Allan Duncan, Tony Lima, John Pearson)
Abridged info on RISKS (comp.risks)

----------------------------------------------------------------------

Date: Thu, 30 Mar 2000 12:12:19 PST
From: "Peter G. Neumann" <neumann@csl.sri.com>
Subject: More NASA woes in stress testing

NASA subjected its $75M 850-pound High Energy Solar Spectroscopic Imager
spacecraft to preflight stress testing, and inadvertently managed to shake
it for about 200 milliseconds at 20 (instead of 2) times the force of
gravity.  Computationally it looks like a small off-by-one error, except
that it was one order of magnitude.  The HESSI was seriously damaged, with
two of its four solar panels cracked.  However, it may be salvageable,
having continued to function through the testing!  [PGN-ed from various
sources.  It may need HESSIan solders to fix it?]

http://www.abcnews.go.com/sections/science/DailyNews/hessi000323.html
http://www.nytimes.com/library/national/science/032400sci-nasa-satellite.html
  
  [Note: the 2000-pound PC noted in RISKS-20.85 was pounds Sterling.  The
  850 pounds above is of course weight.  English is wonderfully ambiguous.
  Sorry for the confusion.  PGN]

------------------------------

Date: Thu, 30 Mar 2000 12:19:52 PST
From: "Peter G. Neumann" <neumann@csl.sri.com>
Subject: Re: Faster, cheaper *not* better

One of the problems associated with the hard landing of the Mars Polar
Lander is now believed to have been a software flaw: when the landing gear
deployed, the software erroneously concluded that landing had been achieved
and ordered the engines shut down prematurely.  [See media reports on 29 Mar
2000.]  It
is once again clear that faster and cheaper are typically not better.
Lowest-common-denominator systems are sure-fire candidates for subsequent
appearances in RISKS.
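
To make the reported failure mode concrete, here is a minimal illustrative
sketch in Python of how a spuriously latched touchdown indication could cut
the engines early.  All names and thresholds are hypothetical; this is a
reconstruction from the media accounts, not the actual flight code.

  # A transient from leg-deployment shock latches a "touchdown" flag;
  # when the shutdown check is enabled near the surface, the stale flag
  # shuts the engines down while the lander is still high up.
  class DescentControl:
      def __init__(self):
          self.touchdown_latched = False  # set once, never cleared: the bug
          self.shutdown_enabled = False
          self.engines_on = True

      def control_step(self, altitude_m, leg_sensor_signal):
          if leg_sensor_signal:           # transient and real touchdown
              self.touchdown_latched = True  # are indistinguishable here
          if altitude_m < 40:             # hypothetical enable altitude
              self.shutdown_enabled = True
          if self.shutdown_enabled and self.touchdown_latched:
              self.engines_on = False     # premature cutoff

  ctl = DescentControl()
  ctl.control_step(altitude_m=1500, leg_sensor_signal=True)  # deployment jolt
  ctl.control_step(altitude_m=39, leg_sensor_signal=False)
  print(ctl.engines_on)   # False: engines cut with the lander still aloft

A fix along the usual lines would ignore any indication recorded before the
check is enabled, or require the signal to persist across consecutive
readings.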

Doneel Edelson noted an AP item in *USA Today* on 22 Mar 2000 that discusses
some of NASA's problems -- cutting employees too deeply (from 25,000 to
18,500 over 7 years) and losing veteran engineers, failures in
communications between technical people and managers, etc.  The article also
notes that NASA administrator Daniel Goldin contradicted reports that NASA
knew of a rocket-engine flaw that resulted in a mission loss.

------------------------------

Date: Thu, 30 Mar 2000 16:50:30 -0500
From: Steven Huang <sthuang@hns.com>
Subject: More details on the Sea Launch failure (RISKS-20.84)

According to a report filed by the Associated Press
(http://www.cnn.com/2000/TECH/space/03/30/sealaunchfailure.ap/index.html),
the investigation has identified a configuration error that caused "a valve
to remain open in the second stage pneumatic system".  That system is
"involved in the operation and steering of the engine, and the loss of
pressure would have reduced the performance so much that an onboard
automatic flight termination system would have been triggered."  The error
is blamed on a ground-based system.

Steven Huang, MobileSat, Hughes Network Systems, 11717 Exploration Lane 
Germantown, MD 20876   (240) 453-2357

------------------------------

Date: Mon, 27 Mar 2000 16:56:33 PST
From: "Peter G. Neumann" <neumann@csl.sri.com>
Subject: Stephen King eBook cracked (Re: Pack, RISKS-20.85)

Pirated PDF versions of Stephen King's "Riding the Bullet" have been
circulating on the Internet since 17 Mar 2000.  While many ISPs have forced
members to remove the decrypted files, they are still available from a Swiss
site, providing stark evidence of security weaknesses in PC-based eBook
distribution systems.  The episode has irked the companies developing such
systems, who complain that export restrictions have kept them from using
more powerful encryption techniques.  [Source: "Cracking the Bullet: Hackers
Decrypt PDF Version of Stephen King eBook," by Glenn Sanders and Wade Roush,
23 Mar 2000, full text at http://www.ebooknet.com/story.jsp?id=1671]

  [But as RISKS readers know, strong crypto by itself is not enough.]

------------------------------

Date: Tue, 28 Mar 2000 07:14:16 -0800
From: "Gillmor, Dan" <DGillmor@sjmercury.com>
Subject: California privacy legislation

  [We have been warning about identity theft for many years.
  It is now becoming a criminal art form.  The following item is
  from the IP list of David Farber <farber@cis.upenn.edu>.  PGN]

http://www.mercurycenter.com/svtech/columns/gillmor/docs/dg032800.htm

Not too long ago, someone I know well suffered that most modern of crimes,
identity theft. A crook got hold of useful information -- including her
Social Security number -- and used it to create a fraudulent identity.

The victim discovered the fraud when bills started coming in for things she
hadn't bought. Then ``I got letters from lawyers saying they were suing me
because I hadn't paid,'' she says. The onus was on her to make things right
with credit bureaus, financial institutions and the like -- and the
paperwork was massive.

This kind of outrage is all too common. American businesses are all too
casual with our Social Security numbers and other information. Greasing the
wheels of commerce has been far, far more important than protecting people's
privacy. Law enforcement, meanwhile, believes it has better things to do
than investigate, much less prosecute, such crimes.

But you can almost feel privacy gaining strength as a public issue. The
Internet Age has opened people's eyes, because people are beginning to see
the consequences when all kinds of data ends up in databases that are open
to anyone with sufficient cash.

Not many legislators -- federal, state or local -- have grasped the growing
public angst until recently. One of several in the California Legislature
who understood the issue early is state Sen. Debra Bowen, D-Redondo Beach,
who has introduced several bills that would go a long way toward protecting
you and me from predatory data practices.

------------------------------

Date: Sun, 26 Mar 2000 20:18:38 -0500 (EST)
From: kensey@crowley.orion-com.com (Joe Thompson)
Subject: Criminal records in North Carolina

Some time ago I sent in an item (RISKS-20.17) about the new sex offender
database in Virginia and how quickly errors were revealed.  On a trip this
weekend to the North Carolina Renaissance Faire, I was watching TV the
evening I checked in and saw ads for "123nc.com".  Apparently North Carolina
has gone Virginia one better -- the Website allows visitors to search *all*
criminal records in the state of North Carolina.  The risks are the same
kind but much magnified.

It's also worth noting that this site does *not* appear to be a state
government operation. -- Joe

Joe Thompson | http://www.orion-com.com/~kensey/  spam+@orion-com.com

  [Yes, Virginia, there is a sanity clause (bad pun prompted by the famous
  1897 "Yes, Virginia" editorial in the *New York Sun*).  Virginia was also the
  first state to pass UCITA, the Uniform Computer Information Transactions 
  Act, a horrendously bad piece of legislation.  Incidentally, Maryland has
  just jumped on what we hope will not become a bandwagon.  PGN]

------------------------------

Date: Mon, 20 Mar 2000 08:25:03 -0700
From: "NewsScan" <newsscan@newsscan.com>
Subject: Judge issues injunction in software reverse-engineering case

A federal judge in Boston has ordered a halt to distribution of the "cphack"
software created by two computer hackers by reverse-engineering the
commercially distributed "Cyber Patrol" program that allows parents to
shield their children from pornography on the Internet.  The judge's order
also applies to any mirror Websites where the program has been made
available.  Peter Junger, a law professor and free speech advocate, calls
the ruling "a rather horrifying challenge to people's right to write
software" and to figure out how it works by taking it apart and examining
it.  [*USA Today* 17 Mar 2000; NewsScan Daily, 20 Mar 2000]
http://www.usatoday.com/life/cyber/tech/cth570.htm

  [Reverse engineering would be effectively outlawed wherever UCITA (noted
  above) passes.  PGN]

------------------------------

Date: Thu, 30 Mar 2000 12:29:33 PST
From: "Peter G. Neumann" <neumann@csl.sri.com>
Subject: Re: Hackers sued by software-filtering company (RISKS-20.84-85)

However, cphack had been *copylefted* under the Free Software Foundation's
GNU General Public License (http://www.gnu.org), which among other things
explicitly permits redistribution.

For background, see Judge Harrington's order:
  http://www.politechbot.com/cyberpatrol/final-injunction.html
Declan McCullagh's reportage:
  http://www.wired.com/news/politics/0,1283,35244,00.html
  http://www.wired.com/news/politics/0,1283,35226,00.html
  http://www.wired.com/news/politics/0,1283,35216,00.html

------------------------------

Date: Wed, 29 Mar 2000 13:58:23 -0800
From: "Ross Oliver" <reo@iwi.com>
Subject: Re: Hackers sued by software-filtering company (RISKS-20.84-85)

In RISKS 20.85, Bear Giles writes:

>> To a critical mind, several questions scream out:
>> - why are the blacklists encrypted?  [...]
>> - how would knowing that a site is on the blacklist permit a kid
>>   to access the blocked site?

Jansson and Skala do use the term "encrypted" to describe how CyberPatrol
stores its blocklists.  However, after reading the technical details, I
think the term "compiled" is more accurate.  The file format seems to be
optimized for space and efficiency of loading and parsing, with obfuscation
as only a side effect.
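
By way of illustration, here is a hypothetical sketch in Python of a
"compiled" blocklist: compact, fast to load, unreadable as text, yet in no
way cryptographically protected.  (This is not CyberPatrol's actual format.)

  # Pack domain hashes into a binary table: space- and load-efficient,
  # obfuscated only as a side effect.  Anyone who learns the layout can
  # test or enumerate entries; CRC32 here is illustrative, not protective.
  import struct
  import zlib

  def compile_blocklist(domains):
      records = sorted(zlib.crc32(d.encode()) for d in domains)
      return struct.pack("<I%dI" % len(records), len(records), *records)

  def is_blocked(blob, domain):
      count = struct.unpack_from("<I", blob)[0]
      records = struct.unpack_from("<%dI" % count, blob, 4)
      return zlib.crc32(domain.encode()) in records

  blob = compile_blocklist(["example.org", "example.net"])
  print(is_blocked(blob, "example.org"))   # True
  print(is_blocked(blob, "example.com"))   # False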

As to the second question, Jansson and Skala flatly state in their essay:
"Now, let's review our goals. First, we want to break the authentication, so
let's talk about that."  They refer to the authentication for gaining
administrative access, which can then be used to bypass the filter.  And
they proceed to do so, and tell the world how.  This is what got CyberPatrol
so ticked off.  After the excrement hit the ventilating device, they draped
themselves in free speech rhetoric and claims of fair use to justify their
actions.  Had they limited their analysis to the blocklist file format,
CyberPatrol might have had much less legal and public relations leverage.

I am not attempting to defend the actions of the filter vendors.  However,
when dealing with any organization that claims the moral high ground, your
credibility can be deeply affected if your actions are perceived to be
questionable in any way.  This is a common trap for many techno-rebels,
especially youthful ones.

It is interesting to note that the major content filter vendors are
repositioning themselves toward the business market.  This shift could be
because businesses have more money to spend than parents and libraries,
and businesses have much more latitude in the workplace to impose arbitrary
restrictions.

Ross Oliver <reo@airaffair.com>

------------------------------

Date: Wed, 29 Mar 2000 13:47:13 +0200
From: Klaus Brunnstein <brunnstein@informatik.uni-hamburg.de>
Subject: German ministry of family et al. and links to porn

German tabloids and other media are discussing links to porn- and sex-related
Websites that could be found on the homepage of the federal ministry for
family, senior citizens, women and youth.  After some public uproar, the
Website has now been taken down.  Some media, as well as "experts" from
parties in the federal parliament, tend to assume that the inclusion of these
links originated in the ministry itself (which would indeed be a serious
matter), but almost nobody is aware of how easy it is to hack unprotected
Websites (in the absence of proper auditing, nothing is known about how the
links got there).

This case demonstrates several serious aspects of risks:

* Despite assumptions that information flows into even the remotest corners
  of the "global village", German media and politicians are unaware of
  well-reported previous events in which Websites of governments and other
  institutions were hacked (cases such as the U.S. Department of Justice
  defacement and the Website hacking during the Kosovo war were reported via
  the Internet :-)

* Awareness of Internet insecurity, and demand for protective action, seem
  to develop only after a malevolent experience; in this sense, hacking may
  be understood as contributing to improved security, whereas the simple
  ways to protect oneself from the beginning (e.g., serving one's Webpages
  from read-only media such as a CD-R, deploying firewalls, or properly
  administering the Website) are, it seems, unattractively easy.

* Media and politicians approach the "Information Society" in too uncritical
  a manner to notice its inherent insecurity.  In related discussions I am
  often told (even by people with good knowledge of some area of computer
  science :-) that the Internet was founded as military technology, so it
  must be inherently secure.  Since contradicting facts are easily available
  (when searched for), the assumption that the Internet is a haven of
  knowledge is hardly justified.

Regrettably, the security community contributes to the misunderstanding of
risks by using terms such as "weaknesses" and "exploits" for software that
is inherently insecure and unsafe: it is NOT a WEAKNESS that is exploited;
it is the basic nature of the software to be INSECURE and UNSAFE - at any
speed (especially at giga-instructions per second, gigabytes of storage, and
gigabaud links :-).  Evidently, it is high time for some "Ralph Nader" to
rewrite that famous book *Unsafe at Any Speed* (which then addressed
problems in automobile manufacturing) for the carriers of the "Information
Society", The Internet especially included.

  [More on 30 Mar 2000]

The basic assumption that a ministry responsible for protecting youth
against illicit information (such as porn-related sites) would guarantee, at
least to some degree, the adequacy of the content of its own Websites
(including ensuring that its links give no direct access to porn sites)
proved to be wrong in the reported case.  Indeed, the ministry's Webpage
linked to a Website that led directly (among many hundreds of links related
to "women's interests") to Websites such as callboy services.

I am glad to note that, so far, no attack on a German federal government
Website has been publicly observed.  So our federal government remains in
its innocent state (in this respect :-).

BUT: evidently, Webpage content quality assurance needs development,
especially at the government level.  One interesting risk now rears its
head: how many link levels deep should a site be checked for some sort of
"coherence" with (or at least absence of direct contradiction of) the
intentions of the owner of the original Website?  Setting aside the argument
that the later addition of links on referenced Websites cannot practically
be controlled by the administrators of the linking Website, the
responsibility of Website owners should *at least guarantee* that
*first-level links* do not point directly to Websites that contradict the
intentions and interests of the original Website's owner.  In critical
cases, one might even require that second-level links be checked as well.
If it is true that every Website on the Internet can be reached in at most 7
clicks (as some German "experts" have publicly argued), then it seems
impracticable to control more than 2 link levels.
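
A check of the kind just described is easy to automate.  Here is a minimal
sketch in Python (standard library only); the acceptability test is a
placeholder assumption, and a real audit would also need politeness delays
and robots.txt handling.

  # Depth-limited outbound-link audit: check the page, its links, and its
  # links' links (depth 2, per the discussion above) against a policy test.
  from html.parser import HTMLParser
  from urllib.parse import urljoin
  from urllib.request import urlopen

  class LinkExtractor(HTMLParser):
      def __init__(self):
          super().__init__()
          self.links = []
      def handle_starttag(self, tag, attrs):
          if tag == "a":
              for name, value in attrs:
                  if name == "href" and value:
                      self.links.append(value)

  def acceptable(url):
      # Placeholder policy: flag obviously off-mission destinations.
      return not any(w in url.lower() for w in ("porn", "callboy"))

  def audit(url, depth=2, seen=None):
      seen = set() if seen is None else seen
      if url in seen:
          return
      seen.add(url)
      if not acceptable(url):
          print("policy violation:", url)
          return
      if depth == 0:
          return                    # checked, but do not descend further
      try:
          page = urlopen(url, timeout=10).read().decode("utf-8", "replace")
      except OSError:
          return
      parser = LinkExtractor()
      parser.feed(page)
      for link in parser.links:
          audit(urljoin(url, link), depth - 1, seen)

  # audit("http://www.example-ministry.de/", depth=2)  # hypothetical URL:
  # checks the page itself plus first- and second-level links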

Moreover, every government Website should carry a disclaimer stating that
the owner is not responsible for links at deeper levels, and that the
owner's responsibility holds only as of the date/time given as the current
status ("last updated").

Consequently, "netiquette" must address the responsible behaviour not only
of consumers but also of those offering information on the Internet.

Klaus Brunnstein (30 Mar 2000)
 
------------------------------

Date: Wed, 29 Mar 2000 15:19:21 +1000
From: Martin Pool <mbp@LINUXCARE.COM>
Subject: Privacy problems with HTTP cache-control

  [Forwarded to RISKS by Lindsay Marshall, from junkbuster-users.  PGN]

executive summary

     HTTP cache-control headers such as If-Modified-Since allow servers
     to track individual users in a manner similar to cookies, but with
     fewer constraints.  This is a problem for user privacy against which
     browsers currently provide little protection.

  problem statement

     Alice is browsing the Web; Bob runs a number of otherwise-unrelated
     Web servers. Alice makes several requests to Bob's servers over
     time. Bob would like to tie together as many as possible of the
     requests made by Alice to learn more about Alice's usage patterns
     and identity: we call this identifying the request chain. Alice
     would like to access Bob's servers but not give away this
     information.

  existing approaches

    cookies

   The standard approach for associating user requests across several
   responses is the HTTP `Cookie' state-management extension. The Cookie
   response header allows a server to ask the client to store arbitrary
   short opaque data, which should be returned for future requests of
   that server matching particular criteria. Cookies are commonly used to
   store per-user form defaults, to manage Web application sessions, and
   to associate requests between executions of the user agent.

   The user agent always has the option simply to ignore the Set-Cookie
   response header, but most implementations default to obeying it to
   preserve functionality.  Cookies can optionally specify an expiry time
   after which they should no longer be used, whether they should persist on
   disk between client sessions, and whether they should be passed only over
   transport-level-secure connections.

   The privacy implications of cookies have been extensively discussed,
   and several problems have been found and rectified in the past.  One
   example of privacy compromise through cookies is the use of cookies
   attached to banner images downloaded from a central banner server: the
   same cookie is used within images linked from several servers, and so the
   user can be tracked as they move around.

    other approaches

   An obvious means to associate requests is by source IP address. Over
   the short term this will generally work quite well, as a client is
   likely to use a single IP address during a browsing session. Even then
   it is complicated by proxies acting for multiple clients, network
   address translation, or multiuser machines. Over a longer term, the
   information is muddied by dynamically-assigned IPs, mobile computers
   moving between networks, dialup pools and the like. Indeed, cookies
   were proposed in large part to allow legitimate stateful applications
   to cope with the impossibility of uniquely identifying users by IP
   address.

  the meantime exploit

   The essence of the meantime exploit is that the server wishes to
   `tag' the client with some information that will later be reported
   back, allowing the server to identify a chain. Cookies are a good
   approach to this, but their privacy implications are well known and so
   Bob requires a more surreptitious approach.

   The HTTP cache-control headers are perfect for this: the data is
   provided by the server, stored but not verified by the client, and
   then provided verbatim back to the server on the next matching
   request.

   Two headers in particular are useful: Last-Modified and ETag. Both are
   designed to help the client and server negotiate whether to use a
   cached copy or fetch the resource again.

   The general approach of meantime is that rather than using the headers
   for their intended purpose, Bob's servers will instead send down a
   unique tag for the client.

   Last-Modified is constrained to be a date, and therefore is somewhat
   inflexible. Nevertheless, the server can reasonably choose any second
   since the Unix epoch, which allows it to tag on the order of one
   billion distinct clients.

   ETag allows an arbitrary short string to be stored and passed. It is
   not so commonly implemented in user agents at the moment, and so not
   such a good choice.

   In both cases the tag will be lost if the client discards the resource
   from its cache, or if it does not request the exact same resource in
   the future, or if the request is unconditional. (For example, Netscape
   sends an unconditional request when the user presses Shift+Reload.)
   Bob has less control over this than he has with cookies, which can be
   instructed to persist for an arbitrarily long period.

   The date is sent back only for the exact same URL, including any query
   parameters. By contrast, cookies can be returned for all resources in
   a site or section of a site. This makes Bob's job a little harder.

   Bob therefore should make sure that all pages link to a small common
   resource: perhaps a one-pixel image. This image is generated by a
   script that supplies a unique timestamp to each new client and records
   whatever timestamp a returning client already presents.
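
For concreteness, here is a minimal sketch in Python of the server side of
such a scheme.  It is illustrative only, and is not Martin Pool's
demonstration code; see his URL below for the real thing.

  # "Meantime"-style tracker: the one-pixel image is served with a unique
  # Last-Modified value per new client; a returning client echoes that
  # value back in If-Modified-Since, re-identifying itself.
  import time
  from email.utils import formatdate, parsedate_to_datetime
  from http.server import BaseHTTPRequestHandler, HTTPServer

  next_tag = int(time.time())  # seconds since the epoch double as client IDs

  class TrackingHandler(BaseHTTPRequestHandler):
      def do_GET(self):
          global next_tag
          echoed = self.headers.get("If-Modified-Since")
          if echoed:                      # returning client: decode its tag
              client_id = int(parsedate_to_datetime(echoed).timestamp())
              print("returning client", client_id, "fetched", self.path)
              self.send_response(304)     # "not modified": the tag is kept
              self.end_headers()
              return
          next_tag += 1                   # new client: issue a fresh tag
          print("tagging new client as", next_tag)
          self.send_response(200)
          self.send_header("Content-Type", "image/gif")
          self.send_header("Last-Modified", formatdate(next_tag, usegmt=True))
          # force revalidation so the tag is echoed back on every visit
          self.send_header("Cache-Control", "max-age=0, must-revalidate")
          self.end_headers()
          self.wfile.write(b"GIF89a")     # placeholder pixel payload

  # HTTPServer(("", 8000), TrackingHandler).serve_forever()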

For a demonstration, more explanation and details, please see

  http://www.linuxcare.com.au/mbp/meantime/

Martin Pool, Linuxcare, Inc.
+61 2 6262 8990  mbp@linuxcare.com, http://www.linuxcare.com/

------------------------------

Date: Sat, 25 Mar 2000 22:03:26 -0500 (EST)
From: Henry Spencer <henry@spsystems.net>
Subject: Re: Northwest grounded for 3.5 hours after cable cut (Dixon, R-20.85)

>When will people learn they need to know where their redundancy lies?
>Cables run through the same conduit are only partially redundant...

Alas, merely wanting the information is not enough.  The wire/fiber
providers often are not particularly forthcoming with this information;
worse, typically it is subject to change without notice.  They have
deeply-ingrained organizational beliefs that (a) it's nobody's business but
theirs where the wires run, and (b) a wire is a wire and which conduit it
goes through doesn't matter.  If memory serves, even customers whose
contracts explicitly called for routing diversity have been bitten by this.

  [Virginia, again.  Two adjacent fiber cables were severed in Annandale VA
  on 14 Jun 1991, taking out 80K circuits.  I believe that after the White
  Plains NY cut that severed all 7 ARPAnet links to New England, either AP
  or UPI (or perhaps both) had insisted that their connections be in
  different conduits.  That outage affected AP, UPI, and the Pentagon,
  among others.  (See RISKS-11.92.)  PGN]

------------------------------

Date: Sat, 25 Mar 2000 07:13:51 +0200
From: "Bob Dubery" <bdubery@netcare.co.za>
Subject: Re: Northwest grounded for 3.5 hours after cable cut (Dixon, R-20.85)

A recent edition of *New Scientist* carried a short report on an
international telecomm conference.  One of the interesting points in that
report was that Singapore has an enviably low rate of telephone and data
cable outages. The reason? If a cable is cut by a building crew then the
foreman gets to spend time in jail.

That's draconian. But cables, and the information that they carry, are now
so important to businesses, commerce, and, increasingly, to public safety
and transport, that contracts should stipulate penalties to be imposed in
the case of cable breaks.

The risk of not doing so is becoming increasingly obvious.

------------------------------

Date: Tue, 28 Mar 2000 10:48:18 -0500
From: "William P. N. Smith" <wpns@compusmiths.com>
Subject: Northwest Air fallout: MN backhoe affects FL hotel bookings!

According to the *Orlando Sentinel* (orlandosentinel.com), Northwest Airlines had to
book 50 rooms at the Orlando Airport Hyatt Regency when they lost most of
their comm lines recently due to backhoe fade.  They didn't say if other
Northwest counters had to book hotels in other airports, but I can't imagine
they didn't.

Yet another example of how interconnected things are, how single points of
failure you never knew existed can cause havoc, and how we discover those
same single points of failure (the hard way).  Don't get me started on how
tightly scheduled the airlines, airports, and flight crews are, where a few
minutes of delay in one flight can ripple through the system and cause
innumerable delays for the rest of the day.

William Smith    wpns@compusmiths.com    N1JBJ@amsat.org
ComputerSmiths Consulting, Inc.    www.compusmiths.com

------------------------------

Date: Mon, 20 Mar 2000 13:05:46 +1100 (EST)
From: Allan Duncan <a.duncan@trl.telstra.com.au>
Subject: Re: MIT grade spreadsheet problem (Franklin, RISKS-20.85)

The problem of the spreadsheet sort scrambling data has been around for a
while.  MS Excel in Office 97 and earlier can do it, but only recently did I
catch it in the act and as a result deduce the trigger.

If you have a spreadsheet with some blank entries in the top row, as may
well happen if there is no headings row, then the columns with blank top
elements will not be included in the sort.

There may be other requirements as well, but in the case at hand that was
the necessary condition.
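
The effect is easy to model: if a sort reorders some columns of a table but
not others, the rows are silently torn apart.  A tiny Python illustration of
the consequence (of the effect, that is, not of Excel's internals):

  # If only a subset of the columns participates in a sort, each row's
  # fields no longer belong together, and nothing flags the damage.
  names  = ["Chan", "Adams", "Baker"]   # column with a header: sorted
  grades = [72, 91, 85]                 # column whose top cell was blank:
                                        # excluded from the sort
  names.sort()                          # only one column is reordered
  for name, grade in zip(names, grades):
      print(name, grade)                # Adams now has Chan's 72!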

------------------------------

Date: Fri, 24 Mar 2000 18:55:55 -0800
From: Tony Lima <TonyLima@ms.spacebbs.com>
Subject: Re: MIT grade spreadsheet problem (Franklin, RISKS-20.85)

"When the only tool you have is a hammer, every problem looks like a nail."
The real problem here is using spreadsheet software when record integrity is
required.  That implies use of some sort of decent database program -- which
Excel and other spreadsheets are not.

Over the 25 years I've been teaching college, I've experimented with a
variety of grading media.  On the rare occasions when I teach a class with
over 100 students, I use a standard dbf file structure (usually with some
version of Foxpro).  Otherwise, I use the only tool I've found that meets my
criteria of durability, portability and readability: a deck of 3x5 index
cards, one card per student. - Tony Lima (professor of economics, Cal. State
Hayward)

------------------------------

Date: Sat, 25 Mar 2000 16:58:09 +1030
From: John Pearson <huiac@camtech.net.au>
Subject: Re: MIT grade spreadsheet problem (Franklin, RISKS-20.85)

Am I alone in thinking that this misses the obvious point that the error
arose because the coordinator used a spreadsheet to do a database's job?
  [Evidently not.  See Tony Lima's message!  PGN]

Application vendors have spent considerable effort adding features to word
processors and spreadsheets that extend their areas of application, without
necessarily improving the usability and reliability of their product; in
this case it sounds like vi, sort and awk may have been more reliable
candidates for the job.

This specific risk is one I encountered more than once while working in the
public sector; at the time it was tolerated because of the huge disparity
between the cost and availability of spreadsheets (typically, bundled with
your word processor or the PC itself) and database software (several hundred
dollars, often with only rudimentary reporting and presentation
capabilities).

Are things still that bad, or are people really that resistant to the idea
of using the right tool for the job?

  [It's as old as the Code of Hammer-Robbie!  PGN]

John P. <huiac@camtech.net.au> <john@huiac.apana.org.au>

------------------------------

Date: 13 Dec 1999 (LAST-MODIFIED)
From: RISKS-request@csl.sri.com
Subject: Abridged info on RISKS (comp.risks)

 The RISKS Forum is a MODERATED digest.  Its Usenet equivalent is comp.risks.
=> SUBSCRIPTIONS: PLEASE read RISKS as a newsgroup (comp.risks or equivalent) 
 if possible and convenient for you.  Alternatively, via majordomo, 
 SEND DIRECT E-MAIL REQUESTS to <risks-request@csl.sri.com> with one-line, 
   SUBSCRIBE (or UNSUBSCRIBE) [with net address if different from FROM:] or
   INFO     [for unabridged version of RISKS information]
 .MIL users should contact <risks-request@pica.army.mil> (Dennis Rears).
 .UK users should contact <Lindsay.Marshall@newcastle.ac.uk>.
=> The INFO file (submissions, default disclaimers, archive sites, 
 copyright policy, PRIVACY digests, etc.) is also obtainable from
 http://www.CSL.sri.com/risksinfo.html  ftp://www.CSL.sri.com/pub/risks.info
 The full info file will appear now and then in future issues.  *** All 
 contributors are assumed to have read the full info file for guidelines. ***
=> SUBMISSIONS: to risks@CSL.sri.com with meaningful SUBJECT: line.
=> ARCHIVES are available: ftp://ftp.sri.com/risks or
 ftp ftp.sri.com<CR>login anonymous<CR>[YourNetAddress]<CR>cd risks
   [volume-summary issues are in risks-*.00]
   [back volumes have their own subdirectories, e.g., "cd 19" for volume 19]
 or http://catless.ncl.ac.uk/Risks/VL.IS.html      [i.e., VoLume, ISsue].
 Also, new AUSTRALIAN archives at http://mirror.aarnet.edu.au/risks/ and
   http://the.wiretapped.net/security/textfiles/risks-digest/ .
 PostScript copy of PGN's comprehensive historical summary of one liners:
   illustrative.PS at ftp.sri.com/risks .

------------------------------

End of RISKS-FORUM Digest 20.86
************************

