CRYPTO-GRAM, June 15, 2000
From: Bruce Schneier <schneier@counterpane.com>
Date: Thu, 15 Jun 2000 21:41:35 -0500
CRYPTO-GRAM
June 15, 2000
by Bruce Schneier
Founder and CTO
Counterpane Internet Security, Inc.
schneier@counterpane.com
http://www.counterpane.com
A free monthly newsletter providing summaries, analyses, insights, and
commentaries on computer security and cryptography.
Back issues are available at http://www.counterpane.com. To subscribe or
unsubscribe, see below.
Copyright (c) 2000 by Counterpane Internet Security, Inc.
** *** ***** ******* *********** *************
In this issue:
SOAP
Crypto-Gram Reprints
News
Counterpane Internet Security News
Java and Viruses
The Doghouse: Infraworks
The Data Encryption Standard (DES)
Comments from Readers
** *** ***** ******* *********** *************
SOAP
SOAP (Simple Object Access Protocol) is a proposed standard for linking
Internet applications running on different platforms, using XML
messages. SOAP is designed to connect programs running on different
machines, regardless of what OS or CPU each one uses. It's basically
remote procedure calls (RPC) implemented via HTTP with XML content.
Because neither HTTP nor XML nor SOAP requires any security, it's a
pretty safe bet that different people will bungle any embedded security
in different ways, leading to different holes in different
implementations. SOAP is going to open up a whole new avenue for
security vulnerabilities.
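Here's a minimal sketch of what such a remote call looks like on the
wire; the host, service path, and method name are hypothetical, and the
example uses Python's http.client purely for illustration. To a packet
filter, it is just an HTTP POST to port 80.

    # A SOAP-style remote procedure call: an XML envelope POSTed over
    # plain HTTP.  Host, path, and method name are hypothetical.
    import http.client

    envelope = """<?xml version="1.0"?>
    <SOAP-ENV:Envelope
        xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
      <SOAP-ENV:Body>
        <m:DeleteFile xmlns:m="urn:example-file-service">
          <m:Path>C:\\budget.xls</m:Path>
        </m:DeleteFile>
      </SOAP-ENV:Body>
    </SOAP-ENV:Envelope>"""

    conn = http.client.HTTPConnection("server.example.com", 80)
    conn.request("POST", "/file-service", body=envelope, headers={
        "Content-Type": "text/xml",
        "SOAPAction": '"urn:example-file-service#DeleteFile"'})
    print(conn.getresponse().status)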
SOAP has been developed by a bunch of companies, but it's instructive to
read Microsoft's own words on security and SOAP:
"Currently, developers struggle to make their distributed applications work
across the Internet when firewalls get in the way. Since most firewalls
block all but a few ports, such as the standard HTTP port 80, all of
today's distributed object protocols like DCOM suffer because they rely on
dynamically assigned ports for remote method invocations. If you can
persuade your system administrator to open a range of ports through the
firewall, you may be able to get around this problem as long as the ports
used by the distributed object protocol are included.
"To make matters worse, clients of your distributed application that lie
behind another corporate firewall suffer the same problems. If they don't
configure their firewall to open the same port, they won't be able to use
your application. Making clients reconfigure their firewalls to
accommodate your application is just not practical.
"Since SOAP relies on HTTP as the transport mechanism, and most firewalls
allow HTTP to pass through, you'll have no problem invoking SOAP endpoints
from either side of a firewall. Don't forget that SOAP makes it possible
for system administrators to configure firewalls to selectively block out
SOAP requests using SOAP-specific HTTP headers."
That's right. Those pesky firewalls prevent applications from sending
commands to each other, so SOAP lets vendors hide those commands as HTTP so
the firewall won't notice.
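The "selective blocking" in Microsoft's last paragraph amounts to
filtering on SOAP-specific headers at the firewall or proxy. A rough
sketch of that idea (illustrative only; the client controls these
headers, so this helps mainly against well-behaved SOAP
implementations):

    # Drop POSTs that advertise themselves as SOAP.  HTTP header names
    # are case-insensitive, so normalize before checking.
    def allow_request(method, headers):
        h = {k.lower(): v for k, v in headers.items()}
        if method == "POST":
            if "soapaction" in h:
                return False
            if h.get("content-type", "").startswith("text/xml"):
                return False
        return True

    print(allow_request("POST", {"SOAPAction": '"urn:x#DeleteFile"',
                                 "Content-Type": "text/xml"}))   # False
    print(allow_request("POST", {"Content-Type":
                                 "application/x-www-form-urlencoded"}))  # True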
Let's continue the DCOM example. So what happens if DCOM can get through
a firewall?
DCOM is Microsoft's main protocol for inter-application
communication. It's not just used by programs that are intended to be
servers; it's used for all sorts of desktop communication and remote
access. The result is that an average machine has dozens of programs using
DCOM. Mine shows 48, ranging from "Microsoft PowerPoint Presentation" to
"logagent" and including the catchily named
"{000C101C-0000-0000-C000-000000000046}"; you may be able to list yours by
bringing up a Command Prompt and typing "dcomcnfg".
Now, there are lots and lots of ways to secure DCOM applications, so maybe
all of those applications are happily responding only to authenticated
requests from the local machine. On the other hand, there are lots and
lots of ways to make DCOM applications insecure, so maybe one of them is
just waiting for somebody to send it an entirely unauthenticated request to
overwrite selected files on my hard disk.
Firewalls have good reasons for blocking protocols like DCOM coming from
untrusted sources. Protocols that sneak them through are not what's wanted.
Information on SOAP:
<http://soap.weblogs.com/>
Microsoft's document (which includes the quoted paragraphs):
<http://msdn.microsoft.com/library/periodic/period00/soap.htm>
** *** ***** ******* *********** *************
Crypto-Gram Reprints
Those of you who have subscribed recently might have missed these essays
from back issues.
Timing attacks, power analysis, and other "side-channel" attacks against
smart cards:
<http://www.counterpane.com/crypto-gram-9806.html#side>
The internationalization of cryptography, policy:
<http://www.counterpane.com/crypto-gram-9906.html#policy>
and products:
<http://www.counterpane.com/crypto-gram-9906.html#products>
The new breeds of viruses, worms, and other malware:
<http://www.counterpane.com/crypto-gram-9906.html#viruses>
** *** ***** ******* *********** *************
News
New kinds of network devices to attack:
<http://news.cnet.com/news/0-1003-200-1864025.html?tag=st.ne.ron.lthd.ni>
(Note the invocation of "128-bit encryption," as if that just solves
everything.)
The UK advertising authority upholds a complaint that an RSA flyer was
misleading when it implied that freeware encryption was somehow inadequate:
<http://www.asa.org.uk/adj/adj_4765.htm>
One project to reduce buffer overflow vulnerabilities:
<http://www.bell-labs.com/org/11356/libsafe.html>
Modern phone phreaking:
<http://www.wired.com/news/business/0,1367,36309,00.html>
What if smart people wrote computer viruses?:
<http://lcamtuf.na.export.pl/worm.txt>
Excellent speech by Dan Geer on risk management and security:
<http://www.stanford.edu/~hodges/doc/Geer-RiskManagement.txt>
Just because you configure Microsoft Windows 2000 to use triple-DES with
IPsec doesn't mean you get it:
<http://www.wired.com/news/technology/0,1282,36336,00.html>
In some circumstances the software only uses single DES. And to make
matters worse, it never bothers alerting the user.
The Canadian government intended to put all of its data on citizens in one
big database, including tax return information (which, by law, Revenue
Canada is forbidden to disclose to other arms of government).
<http://www.wired.com/news/politics/0,1283,36435,00.html>
<http://www.canoe.ca/CNEWSArchiveMay00/candigest_may16.html>
<http://www.canoe.ca/CNEWSArchiveMay00/candigest_may17.html>
But the plan was scrapped.
<http://www.hrdc-drhc.gc.ca/common/news/dept/00-39.shtml>
<http://www.wired.com/news/politics/0,1283,36649,00.html>
More on the dangers of relying on PKI:
<http://www.DGA.co.uk/customer/publicdo.nsf/public/WP-HERESY>
Microsoft's Office Assistant -- that annoying paper-clip helper in Office
-- has some nasty security vulnerabilities in Office 2000. It seems that
an attacker can write scripts for the assistant that can do all sorts of
damaging things, and that these scripts can run automatically when the user
clicks on a Web page or opens an HTML-enabled e-mail. This is an amazing
breach of security for Microsoft. Because they chose to mark all Office
Assistant scripts as "safe," these scripts can do anything they want. This
is exactly the sort of vulnerability that virus writers exploit. This
isn't a programming error; this is a deliberate design decision made at a
very high level. Even more evidence that Microsoft doesn't take security
seriously.
<http://www.zdnet.com/zdnn/stories/news/0,4586,2570727,00.html>
A Microsoft bulletin that really downplays the importance:
<http://www.microsoft.com/technet/security/bulletin/ms00-034.asp>
A patch from Microsoft:
<http://officeupdate.microsoft.com/2000/downloadDetails/Uactlsec.htm>
Someone actually patented using a tattooed bar code to verify a person's
identity. Isn't Revelation 13:16-18 prior art?
<http://patents.uspto.gov/cgi-bin/ifetch4?ENG+PATBIB-ALL+0+946309+0+7+25907+OF+1+1+1+PN%2f5%2c878%2c155>
Ph.D. dissertation on the incidence of serious Internet attacks. According
to this research, it happens less often than people claim:
<http://www.cert.org/research/JHThesis/Start.html>
Tired of all of these Microsoft Visual Basic viruses? Turn off the
Scripting Host entirely:
<http://www.fsecure.com/virus-info/u-vbs/>
Real Networks demonstrates that they just don't learn:
<http://www.vortex.com/privacy/priv.09.15>
Smart electronics that knows its location. Want to bet that no one has
thought about the security ramifications of this technology?
<http://www.telegraph.co.uk/et?ac=000111464113065&pg=/et/00/5/7/ntac07.html>
Not that I mind the U.S. government studying privacy, but in this case an
18-month study means 18 months of no action:
<http://www.cnn.com/2000/TECH/computing/05/22/new.privacy.study.idg/index.html>
According to a news report, the EU lifted all restrictions on
encryption, despite the protests of the United States.
<http://www.heise.de/tp/english/inhalt/te/8179/1.html>
The truth wasn't nearly as good. At a meeting on 22 May, the European
Ministers of Foreign Affairs withdrew the proposal from their agenda at the
last minute. No reason for this change of heart by European ministers was
given. Officials from both France and the UK expressed reservations about
the measure, and officials have confirmed that the U.S. pressured the
EU to block the decision. As far as I know, no official statement has been
made by anyone.
<http://www.wired.com/news/politics/0,1283,36623,00.html>
Good intro article on crypto from IEEE Spectrum:
<http://www.spectrum.ieee.org/pubs/spectrum/0400/enc.html>
XML and how to secure it:
<http://www.zdnet.co.uk/news/2000/20/ns-15500.html>
The FTC finally gives up on Internet privacy self-regulation:
<http://www.zdnet.com/zdnn/stories/news/0,4586,2574082,00.html>
The FTC report:
<http://www.ftc.gov/os/2000/05/index.htm#22>
Social engineering in the real world: using fake IDs to penetrate
government buildings.
<http://www.cnn.com/2000/US/05/25/security.breaches.01/index.html>
The best line is: "'I think any time you expose vulnerabilities it's a good
thing,' said Attorney General Janet Reno...." This, of course, means that
she is in favor of full disclosure of network vulnerabilities.
Keystroke Surveillance Tool: A small piece of hardware can covertly record
and store half a million keystrokes of information. The device can be
hidden in a keyboard or a PS/2 plug, and requires no software to install.
<http://www.zdnet.co.uk/news/2000/12/ns-14347.html>
The myth of open source security: An essay by the author of the open
source Mailman program explains why open source is not as secure as you
might think -- using security holes in his own code as an example.
<http://developer.earthweb.com/journal/techfocus/052600_security.html>
And Slashdot reactions to the essay:
<http://slashdot.org/articles/00/05/28/1838201.shtml>
The sorry state of the CIA and NSA:
<http://www.washingtonpost.com/wp-dyn/articles/A22957-2000May28.html>
NPR ran a story on the shortwave numbers stations. Schneier was
interviewed for the story.
<http://www.npr.org/programs/lnfsound/stories/000526.stories.html>
EFF's testimony on the Digital Millennium Copyright Act. Really good reading.
<http://cryptome.org/dmca-eff.htm>
E-mail virus stolen from a researcher's computer:
<http://www.herald.co.nz/storydisplay.cfm?storyID=138622&thesection=technology&thesubsection=general>
Pennsylvania makes it a crime to spread a computer virus. I don't know how
this affects Outlook viruses that spread on their own.
<http://www.cnn.com/2000/TECH/computing/06/01/virus.crack.down.idg/index.html>
SANS releases the top ten critical Internet security threats.
<http://www.sans.org/topten.htm>
Evidence from the CD Universe credit-card theft was tainted, so they can't
prosecute:
<http://www.msnbc.com/news/417406.asp?cp1=1>
** *** ***** ******* *********** *************
Counterpane Internet Security News
Counterpane's Managed Security Monitoring service has been running smoothly
for several months now. We're monitoring customer networks across the
U.S., and are starting to look at expanding to Europe and Asia. If you're
interested in learning how we can monitor your network, contact us at (888)
710-8175 or sales@sj.counterpane.com.
Interview with Bruce Schneier in Information Security Magazine:
<http://www.infosecuritymag.com/jun2000/junqa.htm>
Fast Company magazine profiled Counterpane:
<http://www.fastcompany.com/online/35/ifaqs.html>
Giga Research issues an opinion of Counterpane's monitoring service:
<http://www.counterpane.com/giga.pdf>
Computer Security Incident Handling Conference (FIRST), June 26-30,
Chicago: Bruce Schneier will be speaking on June 27 at 2:00 PM and giving
the keynote address on June 28 at 9:15 AM.
<http://www.first.org/conference/2000/>
PC Week/DCI Security Summit, June 27-29, Boston: Bruce Schneier is
co-chair, and will deliver the keynote address at 9:00 AM on June 29.
<http://www.dci.com/brochure/secbos/>
Black Hat Briefings, July 26-27, Las Vegas: Bruce Schneier will be speaking
in the morning of July 26th.
<http://www.blackhat.com>
The SecurityFocus Web site has an audio interview with Bruce Schneier:
<http://www.securityfocus.com/templates/media.html?id=25>
** *** ***** ******* *********** *************
Java and Viruses
At the JavaONE conference earlier this month, Scott McNealy made an
erroneous comment about Java. Here's how CNet reported it:
>In his [JavaONE] keynote, [...] McNealy said that
>Java is immune to viruses such as Melissa and "I
>Love You" that have spread through Microsoft Outlook,
>a statement that security experts generally back up.
My guess is that security experts don't back that statement up, because
it's wrong. McNealy is confusing an application (Microsoft Outlook) with a
programming language (Java). And he is confusing design flaws with
implementation flaws.
The main problems with Outlook's security are that it:
a) defaults to effectively "no security" in many cases.
b) hides essential risk-assessment information by default (e.g., file
extensions).
c) is easily told to do things (scripted) by other programs.
All of these are design flaws. That is, Microsoft Outlook was designed by
Microsoft to have these problems. They are not implementation flaws:
errors made by the programmers during development. The fundamental problem
is that the security implications of the chosen design and feature-set were
never examined in any meaningful way.
None of the above design flaws is prevented by Java's "sandbox" model, by
its security policy mechanism, or by any other aspect of Java's security
model. In fact, I could easily write an e-mail application in Java with
design flaws identical to Outlook Express's. Nothing in Java or its
security model can prevent, or even hinder, such a design.
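To make that concrete, here is a sketch -- in Python rather than Java,
purely for brevity -- of the kind of design that no language security
model forbids: a mail client that saves and runs executable attachments
with the user's full authority. The message-handling details are
hypothetical; the point is that the flaw lives in the design, not the
language.

    # Hypothetical mail-client code that treats attachments as trusted.
    # No sandbox, no prompt: the attachment runs with full user authority.
    import email, os, subprocess, tempfile

    def open_message(raw_bytes):
        msg = email.message_from_bytes(raw_bytes)
        for part in msg.walk():
            name = part.get_filename()
            if name and name.lower().endswith((".vbs", ".exe", ".scr")):
                path = os.path.join(tempfile.mkdtemp(), name)
                with open(path, "wb") as f:
                    f.write(part.get_payload(decode=True))
                os.chmod(path, 0o755)
                subprocess.run([path])   # design flaw: full user privileges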
What McNealy probably means to distinguish is the permissive nature of
Microsoft Outlook when it comes to executing attachments, and the more
secure nature of the Java sandbox. Java contains specific provisions to
deal with potentially dangerous applets, and tries to execute them in a
protected mode that severely limits the effects they can have on other
applications and the host computer. The Java design team has spent a lot
of time worrying about malicious executables and how they can be prevented
from running amok.
The CNet story:
<http://news.cnet.com/news/0-1003-200-2026364.html>
** *** ***** ******* *********** *************
The Doghouse: Infraworks
Yet another company is claiming to have "revolutionary, patented new
technology" to control the use of data on other people's computers.
I quote from their press release: "It means that you can send digital
files to anyone without fear of unauthorized redistribution. For example,
you could attach a Word or Excel document in an e-mail to anyone anywhere,
and prohibit them from forwarding it, printing it, copying it or cutting
and pasting any part of it. You set the permissions, and the data is
deleted when the permissions are exhausted. You are in total control of
your digital property."
"If hacking is attempted, the file self-destructs. Just like Mission
Impossible. No other technology can do this."
Someone remind these nice people that Mission Impossible is fiction.
<http://www.infraworks.com/>
** *** ***** ******* *********** *************
The Data Encryption Standard (DES)
The Data Encryption Standard (DES) has been the most popular encryption
algorithm of the past twenty-five years. Originally developed at IBM
Corporation, it was chosen by the National Bureau of Standards (NBS) as the
government-standard encryption algorithm in 1976. Since then, it has
become a domestic and international encryption standard, and has been used
in thousands of applications. Concerns about its short key length have
dogged the algorithm since the beginning, and in 1998 a brute-force machine
capable of breaking DES was built. Today, modifications to DES, such as
triple-DES, ensure that it will remain secure for the foreseeable future.
In 1972, the NBS (since renamed the National Institute of Standards and
Technology, or NIST) initiated a program to protect computer and
communications data. As part of that program, they wanted to standardize
on a single encryption algorithm. After two public requests for
algorithms, they received a candidate from IBM based on research being done
in its Yorktown Heights and Kingston laboratories. Among the people
working on this candidate were Roy Adler, Don Coppersmith, Horst Feistel,
Edna Grossman, Alan Konheim, Carl Meyer, Bill Notz, Lynn Smith, Walt
Tuchman, and Bryant Tuckerman.
The algorithm, although complicated, was straightforward. It used only
simple logical operations on small groups of bits, and it could be
implemented fairly efficiently in the hardware of the mid-1970s. DES is
not very efficient in software, especially on the 32-bit architectures
that are common today.
Its overall structure was something called a Feistel network, also used in
another IBM design called Lucifer. DES is a block cipher, meaning that it
encrypts and decrypts data in blocks: 64-bit blocks. DES is an iterated
cipher, meaning that it contains 16 iterations (called rounds) of a simpler
cipher. The algorithm's primary strength came from something called an
S-box, a non-linear table-lookup operation by which groups of six bits
would be replaced by groups of four bits. These table lookups were
expressed as strings of constants.
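To make the structure concrete, here is a schematic Feistel network in
Python. This is not DES -- the round function, S-box table, and subkeys
below are invented placeholders -- but it shows the shape of the
algorithm: the halves are swapped and mixed each round, and decryption
is the same network run with the subkeys in reverse order.

    TOY_SBOX = [(i * 7 + 3) % 16 for i in range(16)]   # placeholder S-box

    def f(half, subkey):
        # Toy round function on a 32-bit half.  Real DES expands the half
        # to 48 bits, XORs a subkey, then applies eight 6-bit-to-4-bit
        # S-box lookups.
        x = (half ^ subkey) & 0xFFFFFFFF
        out = 0
        for i in range(8):                             # eight 4-bit groups
            out |= TOY_SBOX[(x >> (4 * i)) & 0xF] << (4 * i)
        return out

    def feistel_encrypt(left, right, subkeys):
        for k in subkeys:                              # DES uses 16 rounds
            left, right = right, left ^ f(right, k)
        return right, left                             # final swap

    def feistel_decrypt(left, right, subkeys):
        return feistel_encrypt(left, right, list(reversed(subkeys)))

    subkeys = [0x0F1E2D3C + r for r in range(16)]      # 16 toy round keys
    ct = feistel_encrypt(0x01234567, 0x89ABCDEF, subkeys)
    assert feistel_decrypt(*ct, subkeys) == (0x01234567, 0x89ABCDEF)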
NBS lacked the ability to evaluate the algorithm, so they turned to the
National Security Agency (NSA) for help. The NSA did two things: they
changed the constants in the S-boxes, and they reduced the key size from
its original 128 bits to 56 bits.
The revised algorithm, called DES, was published by NBS in March
1975. There was considerable public outcry, both regarding the "invisible
hand" of the NSA -- the changes they made were not made public, and no
rationale was given for the S-box constants -- and the short key
length. Originally the key length was supposed to be reduced to 64 bits,
but when the standard was published, it turned out 8 of those bits were
"parity bits" used to confirm the integrity of the other 56 bits, and not
part of the key at all.
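A small illustration of that convention (DES specifies odd parity on
each of the eight key bytes; the sample key here is arbitrary):

    # Each key byte holds 7 key bits plus 1 odd-parity bit, so a "64-bit"
    # DES key contains only 56 secret bits.
    def has_valid_des_parity(key):          # key: 8 bytes
        return all(bin(b).count("1") % 2 == 1 for b in key)

    key = bytes([0x13, 0x34, 0x57, 0x79, 0x9B, 0xBC, 0xDF, 0xF1])
    print(has_valid_des_parity(key))        # True: every byte has odd parity
    print(64 - 8)                           # 56 bits of actual key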
Despite criticism, the DES was adopted as a Federal Information Processing
Standard in November 1976. It was the first time an NSA-evaluated
encryption algorithm was ever made public, and it was one of the two most
important events that spurred the growth of public cryptography research
(the other was the invention of public-key cryptography).
After becoming a U.S. government standard, DES was adopted by other
standards bodies worldwide, including ANSI and ISO. It became the standard
encryption algorithm in the banking industry, and was used in many
different applications around the world. The terms of the standard
stipulated that it would be reviewed and recertified every five years. NBS
recertified DES for the first time in 1987. NIST (NBS after the name
change) recertified DES in 1993. In 1997 they initiated a program to
replace DES: the Advanced Encryption Standard.
The NSA's involvement in the S-box values became clear in the early
1990s. Two Israeli cryptographers, Eli Biham and Adi Shamir, invented a
powerful cryptanalytic attack called "differential cryptanalysis," and
showed that the DES S-boxes were optimized to resist this heretofore
unknown attack. It later became public that the IBM team had developed
this attack themselves while creating Lucifer and DES, and that the NSA
classified their research.
In the late 1990s, it became widely believed that the NSA was able to break
DES by trying every possible key, something called "brute force"
cryptanalysis. This ability was graphically demonstrated by the Electronic
Frontier Foundation in July 1998, when John Gilmore built a machine for
$250,000 that could brute-force a DES key in a few days.
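Some back-of-the-envelope arithmetic shows why a 56-bit key is within
reach; the search rate below is an illustrative assumption, not the EFF
machine's published specification:

    keyspace = 2 ** 56              # 72,057,594,037,927,936 possible keys
    rate = 1e11                     # assume 100 billion keys per second
    seconds = keyspace / rate
    print(seconds / 86400)          # about 8.3 days to sweep the whole space
    print(seconds / 86400 / 2)      # about 4.2 days on average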
Years before this, more secure applications had already converted to an
encryption algorithm called triple-DES (also referred to as
3DES). Triple-DES is the repeated application of three DES encryptions,
using two or three different keys. This algorithm leverages all the
security of DES while effectively lengthening the key, and is in wide use
today to protect all kinds of personal, business, and financial secrets.
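In its most common form, the middle operation is actually a DES
decryption -- the so-called EDE construction -- which doesn't weaken the
scheme and lets a triple-DES implementation interoperate with single-DES
hardware when all three keys are set equal. A sketch of the
construction; des_encrypt and des_decrypt below stand in for single-DES
primitives and are assumed, not implemented:

    # Triple-DES (EDE).  Two-key triple-DES sets k3 == k1.
    def triple_des_encrypt(block, k1, k2, k3, des_encrypt, des_decrypt):
        return des_encrypt(des_decrypt(des_encrypt(block, k1), k2), k3)

    def triple_des_decrypt(block, k1, k2, k3, des_encrypt, des_decrypt):
        return des_decrypt(des_encrypt(des_decrypt(block, k3), k2), k1)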
DES is the most important encryption algorithm ever made. Because it had
an NSA
pedigree, it was widely believed to be secure. It is also the most studied
encryption algorithm ever invented, and many cryptographers "went to
school" on DES. Almost all of the newer encryption algorithms in use today
can trace their roots back to DES, and papers analyzing different aspects
of DES are still being published today.
** *** ***** ******* *********** *************
Comments from Readers
From: "J. Christopher Williams" <jcw@datarave.com>
Subject: Phil Agre on the Outlook Worms
Phil Agre wrote:
"I received about 60 copies of the latest Microsoft e-mail virus and its
variants. How many did you get? Fortunately I manage my e-mail with
Berkeley mailx and Emacs keyboard macros, so I wasn't at risk. But if
we're talking about billions of dollars in damage, which equates roughly to
millions of lost work days, then I think that we and Microsoft need to have
a little talk."
While many may find the author's above comments relevant to the issue at
hand, branding products as a method of securing their computer strikes me
as one-upmanship. The fact is there are several ways to get around such
Virii & Worms, some of which do not require additional
products. Admittedly it would be nice if these features were enabled by
default. The one item that could be used to prevent viral infection not
mentioned by the above author is the human brain. Odd, considering it is
the weapon of choice.
"... This is by-design behavior, not a security vulnerability.
"More odd language. It's like saying, 'This is a rock, not something that
can fall to the ground'. It's confusing to even think about it. ..."
The author may be a security or computer expert, but his grasp of basic
grammar is less than firm. The more accurate grammar conversion would be
"This is an object thrown to the ground, not a free-falling object." The
statement in and of itself merely assists in defining what can be called
"thrown" and what can be called "free-falling" object. It is not
misleading unless the reader believes in the kind of misinformation
campaign the author suggests. To me the author is seemingly engaging in
the same "blame shifting tactic" that he accuses Microsoft of.
"...This particular blame-shifting tactic is particularly disingenuous
given that the virus spread rapidly through Microsoft itself, to the point
that the company had to block all incoming e-mail (Wall Street Journal
5/5/00). ..."
My company had fewer than 20 employees infected out of 1,600+ nation-wide
employees, yet we closed down our Exchange Servers as well. For
statistical comparison, every employee at the office where I work is
provided a computer, without exception. We number 250+ employees with
significant growth experienced during the last 90 days and a forecasted
doubling of employee base over the next 90. Other offices tend to be
similarly equipped, although their growth varies. The decision to do so
was based more upon the potential to infect non-company computers (i.e.,
repercussions of infecting a customer and losing their good will) and the
risk of receiving additional infectious material during the ILOVEYOU window
of opportunity.
"Microsoft shouldn't be broken up. It should be shut down."
I agree with the balance of the author's message, if not its
presentation, with the exception of the above statement. That statement
is very anti-individual (or, in this case, anti-entity), which equates to
prejudice.
I think the author's message can be summed up in "Microsoft should protect
a customer from his/her-self whether the customer understands that
protection or not."
I personally believe in security. I believe in encryption keys and
signatures. I believe in both the pure research & practical application of
encryption theory. I also believe that it is not "Big Brother's" place to
decide what kind or how much encryption I use. I'd much rather Microsoft
provide a stable operating system with features into which encryption can
be incorporated. I'd much rather people like Counterpane provide
encryption alternatives & research to allow me to choose the best solution.
Either way, I don't believe that a "perfect" or "flawless" product
exists. As Counterpane has said repeatedly: Security is not a
destination, it is a journey. Microsoft cannot plug every flaw or security
hole that exists in their operating system or their applications; what they
can do is be responsive to revealed security flaws. That is the standard
we should hold them to.
To sum up my message: I believe software development companies should be
held accountable for their negligence. A feature that is demonstrably
flawed yet still shipped, or left unpatched by the development company,
should expose it to suit for damages by the damaged party. Phil's obvious
(to me) anti-Microsoft bent leads me to question his objectivity and
therefore the purpose to which he wrote his article.
From: Bob Smart <Bob.Smart@cmis.CSIRO.AU>
Subject: ILOVEYOU worm
I believe the message of the ILOVEYOU event has not been well
understood. John Carder gets it right in the specific case in a response
to James Gleick's SLATE article: "Windows Scripting Host gives a Visual
Basic script the authority of the user executing the script, not the
authority of the author of the script."
The problem of securely executing content that arrives across the network
is not restricted to Windows. Mobile code is a fact of life, whether it be
software downloads or Java applets. The Unix community have to pay
attention to this issue because you can be sure Microsoft now will.
When our receptionist sent around a Christmas greeting executable which a
friend had sent to her, I naturally didn't run it. However if software is
properly designed then it should be possible for people to send such things
to their friends. It should be possible to run them with no danger of
harm: just like a Java applet in a secure JVM.
From: phred@teleport.com
Subject: RTM vs. the "Love Bug"
Microsoft claims in its response to Jim Gleick's essay at Slate that the
"Love Bug" situation is somehow akin to the RTM worm.
Errant nonsense.
It's quite simple: RTM devised a stack-smasher and dictionary password
guesser with a clumsy forwarding mechanism and a rickety little
back-propagation communications channel. No user intervention was necessary.
The RTM worm was based on program bugs. The "Love Bug" was based on
Microsoft program *features*. It makes all the difference.
From: Jeff <jeff@antistatic.com>
Subject: Social Engineering and the ILOVEYOU worm
The ILOVEYOU worm was social engineering at a kindergarten
level. Microsoft was lucky that it got handed a "slow pitch". It could
have been MUCH worse.
If I were the malicious coder I would have added a critical step. After
getting the address book entry, the script should have checked for the last
message sent from that user and used THAT subject line.
1) Alice e-mails Bob with subject line "Picnic on Saturday?"
2) Carl e-mails Bob with the virus.
3) Virus sends copy of itself from Bob to Alice with subject line "Re:
Picnic on Saturday?"
The damage would have been much worse, and have been harder to filter at
the server level. Add in a level of code morphing, and this could have
completely shut down whole e-mail systems.
From: Peter Houppermans <Peter.Houppermans@pa-consulting.com>
Subject: Buffer Overflows
I'm not sure it's just a lack of interest in good quality; it strikes me
that some software houses have decided to ACCEPT that 'kind of OK' is
good enough for sale. The article below makes some interesting
observations that I can actually agree with, and it might point at a
slightly deeper cause: a lack of good design fundamentals. Buffer
overflows strike me as the result of poorly managed development processes
-- and some would argue the consequence of using C ('providing enough
rope to hang yourself' or, in my opinion, with flexibility and power
comes responsibility).
<http://www2.linuxjournal.com/articles/currents/019.html>
On a more amusing note, MS has found itself exposed to the law, as in
France (and, as far as I can tell, in Europe as a whole) the exclusion of
warranty is not legal -- i.e., it is null and void. So, some apparently
enterprising users are taking MS to court over the wonderful virus whose
name we can't mention because string filters will bounce the mail ;-),
claiming negligence, since Microsoft should have covered this exposure
after the Melissa virus.
From: Joe Harrison <joe-harrison@MailAndNews.com>
Subject: Vendor liability for bad software
I am puzzled by your frequent claims that software manufacturers can
uniquely get away with releasing products that are of such poor quality
that they would be considered unsaleably defective in other contexts.
Perhaps this is the case in the USA but I would have thought that many
(most?) other countries treat software just like any other item in disputes
about fitness for purpose. Certainly here in the UK vendors have been
successfully sued and precedent is set: your stuff legally has to work
right for whatever it's supposed to do!
Have a look at:
<http://elj.warwick.ac.uk/jilt/cases/97_3stal/stalban.htm>
From: Ray Jones <rjones@pobox.com>
Subject: Trusted Clients and Computer Games
You paint a pretty dreary picture of the gaming communities. Quake has
been having serious problems of late, but Netrek dealt with trusted clients
back in 1992 (they had a rudimentary system in place before that), and
seems to have avoided serious problems since then.
The Netrek community seems to have adapted to Borgs [computer programs that
assist play] via two methods:
1- binary blessing with a nontrivial scheme
2- providing some servers that welcome borgs
The blessing scheme is RSA-based, with the client's private key obscured in
the binary. This allows the open distribution of the blessed client key
list, so anyone can run a server without having to be on some trusted list
of server gods. Having borg-friendly servers provides an approved forum
for borg authors to show off their hacking skills.
The details of the verification aren't that important. What's interesting
is the attacks that people have used to get around them. In particular, of
the two hacks I've heard of, neither relied on extracting the key from the
binary. The first used a kernel modification to allow a client to reopen a
socket after the blessed client had authenticated itself. The second hack
was on the key generation code (poor random number generation, mea culpa).
It's certainly possible to extract the key, and then embed it in a borg,
but the payoff is low. Someone is likely to notice and revoke the key, and
borg authors are already able to use their clients in arenas where they can
get some actual competition (borg vs borg).
Admittedly, Netrek has less mindshare than Quake, which is also more
borgable (being more tactical and less strategic than Netrek, IMO). The
lack of
attacks might be because fewer people care. However, the combination of
the two elements above (stick, carrot) seems to have worked pretty
well. As you write elsewhere in this month's Crypto-Gram, security is
about risk reduction, not threat avoidance.
From: Ian Mason <ian@ian.co.uk>
Subject: Re: More on Microsoft Kerberos
If, as I did, you download the Microsoft Kerberos self-extracting .exe but
use WinZip to extract it you don't get to be forced to agree to their
so-called contract. That leaves me free to operate within normal copyright
laws with my copy of it. Specifically that means that I can use the
contents without any specious trade secret stuff.
It's fairly clear that Microsoft are operating in a fashion that is
anti-competitive. They claim adherence to a public Kerberos 'standard' but
then 'extend' it in an incompatible fashion. Then, with a dominant
position in the desktop marketplace, they make it difficult to build a
compatible server product by restricting access to the details of the
extensions. Perhaps it's time for the IETF et al. to get heavy with
them. Perhaps an action for 'passing-off' of the Kerberos name.
Here in the EU we have statutes that make it illegal to prevent reverse
engineering for compatibility purposes. Unfortunately we don't have a law
that prevents a dominant player with a war chest for feeding lawyers from
tying a small player up in court, but then where does?
From: Graystreak <wex@media.mit.edu>
Subject: Remotely Disabling Software
Vendors will not have to tunnel through firewalls to remotely disable
software. Today most, if not all, software packages auto-update
themselves. Look at Norton Anti-Virus's "LiveUpdate" feature for one
prominent method. Other programs prompt you to go to a Web site and
download an "update."
Even without advance planning, such auto-updates can be used to selectively
disable users. This is precisely how Napster effected the ban on the
300,000+ users identified by Metallica. If you connected to Napster with
their version 2.5 client you were told to wait while the 2.6 version
downloaded. When 2.6 was run, suddenly you were blocked, if you were on
the banned list.
Clearly this kind of event was not in Napster's original plans, yet they
were able to implement it smoothly. I expect Norton and Microsoft and
other software makers to be equally smooth about programming in
user-specific expirations.
From: Ian Mason <ian@ian.co.uk>
Subject: Cybercrime Treaty
The current wording does not prohibit research. To fall foul of the
proposed treaty one would have to have an intent to commit an offence
["designed or adapted for the *purpose* of committing any offence"]. If
one's purpose was to research vulnerabilities to prevent offenses then
one's purpose is clearly not to commit the offenses. If you don't believe
me, ask any good lawyer about the concept of 'mens rea' [literally
"criminal mind", usually read as 'criminal intent'] in the context of the
words used. It doesn't propose to create an absolute offense -- one where
only an 'actus reus', a criminal action regardless of intention, is necessary
to prove guilt. It is legal for me to make, own and use a crowbar. It is
illegal for me to own one with the intent to commit burglaries with
it. The treaty addresses only the latter type of case.
Sorry, but for once, this is a case of the technologists not understanding
the lawyers instead of the other way around as is more common.
From: Johan Ovlinger <johan@ccs.neu.edu>
Subject: Lawyer Suing USWest for Insecure DSL Connection
Whilst I agree on principle about vendor liability, this is more a case of
he got what he ordered, but didn't order what he wanted. Pacific Bell
connected his computer to the Internet constantly. That's what he ordered
and that's what he got. That he failed to turn off file sharing is his
fault. If anyone is to blame, it would be Microsoft for making it so
deplorably easy to shoot yourself in the foot like this (not that Red Hat
ships with significantly saner defaults, but they at least make you enter a
root password as part of the install).
I really don't see that Pacific Bell have anything to do with his
file-sharing, unless they turned it on during installation.
I'd like to see this suit dismissed or settled out of court so as not to
set a bad precedent ('cause I hope he'll lose). Then I'd like to see
someone take some of the true offenders to task for the horrible state of
their "security". Unfortunately (as I'd like to come across as not too
much of an MS basher), the current political climate makes Microsoft the
only viable
target. If they get split up or significantly slapped, their EULAs will be
seen as not worth the paper they're printed on, thus opening them up to
these sorts of class action lawsuits.
Even if it would never succeed, it would be a grand sight to have a Fortune
500 class action suit against them for the 10-odd gigabucks that were lost
due to the latest Internet worm, wouldn't it?
** *** ***** ******* *********** *************
CRYPTO-GRAM is a free monthly newsletter providing summaries, analyses,
insights, and commentaries on computer security and cryptography.
To subscribe, visit http://www.counterpane.com/crypto-gram.html or send a
blank message to crypto-gram-subscribe@chaparraltree.com. To unsubscribe,
visit http://www.counterpane.com/unsubform.html. Back issues are available
on http://www.counterpane.com.
Please feel free to forward CRYPTO-GRAM to colleagues and friends who will
find it valuable. Permission is granted to reprint CRYPTO-GRAM, as long as
it is reprinted in its entirety.
CRYPTO-GRAM is written by Bruce Schneier. Schneier is founder and CTO of
Counterpane Internet Security Inc., the author of "Applied Cryptography,"
and an inventor of the Blowfish, Twofish, and Yarrow algorithms. He served
on the board of the International Association for Cryptologic Research,
EPIC, and VTW. He is a frequent writer and lecturer on computer security
and cryptography.
Counterpane Internet Security, Inc. is a venture-funded company bringing
innovative managed security solutions to the enterprise.
http://www.counterpane.com/
Copyright (c) 2000 by Counterpane Internet Security, Inc.