[svlug] A followup on UnZipping in Linux.

Mark S Bilk mark at cosmicpenguin.com
Wed Jan 23 05:15:02 PST 2002

In-Reply-To: <1011694375.1040.160.camel at theo.theotiwii.org>; 
from theotiwii at earthlink.net on Tue, Jan 22, 2002 at 02:12:54AM -0800

On Tue, Jan 22, 2002 at 02:12:54AM -0800, Ron wrote:
>I have a followup question on "unzipping" in Linux.
>I initially attempted to unzip the foo.zip file with "mc", it created
>the file structure but moved the files out of the archive with zero
>size, BOTH as a user and as ROOT (and no error or warning message).

Maybe the Swiss Army Knife has a bent spring.  If unzip
was only putting out error messages, mc should have just
reported them to you, and not created anything.

>Using the command line to unzip (as ROOT) I received a series of
>"checkdir error: cannot create / unable to process" messages - and
>failure. (I mention this only because a number of people posted me
>privately suggesting the ROOT user would have free rein at unzipping
>the file).

Well, I try to avoid running possibly buggy programs as 
root, so I'm not going to test that out.  I would have
thought it would be successful.  

But in looking around the Info-ZIP website (that's the
portable version of (un)zip that's used with Linux and 
other flavors of Unix), I found in their FAQ some discussion 
about archives with absolute paths.  The extraction of
archive member-files to destinations outside the 
directory that unzip is running in is referred to as 
"directory traversal", and is regarded as dangerous:


  All known versions of UnZip, including 5.42, have a 
  directory-traversal vulnerability that allows them to
  unpack files in unexpected places. Specifically, if
  an archive contains files with leading "/" characters
  (i.e., relative to the top-level/root directory)
  or with ".." components ("previous directory level"),
  UnZip will unpack the files in the indicated locations,
  possibly creating directory trees in the process--and, if
  the -o ("overwrite") option is given, quietly destroying
  existing files outside the intended directory tree. A 
  patch (slightly overkill, but apparently effective) is 
  available on the Bugtraq page that reported the problem. 
  (Thanks to Anya Berdichevskaya for the pointer.)

This is the original report they refer to:

Upon consideration, I agree with them.  There is a design
principle that says software should never inflict a 
dangerous surprise upon a reasonably careful and informed
user.  Extracting an archive should not be able to trash 
your files or silently install a trojan or virus-laden
executable, in places you're not expecting, even when you
perform the extraction as root.  
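
To make the hazard concrete, here's a small illustrative
sketch (Python's os.path, with a hypothetical destination
directory name -- not Info-ZIP's actual patch) of the check
an extractor can apply to each member name before writing
anything:

```python
import os

def is_unsafe(member_name, dest="extract_dir"):
    """Return True if extracting member_name under dest would
    land outside dest (leading '/' or '..' components)."""
    target = os.path.normpath(os.path.join(dest, member_name))
    return not target.startswith(os.path.normpath(dest) + os.sep)

# A leading "/" or a ".." component escapes the extraction directory:
print(is_unsafe("/etc/passwd"))      # True  -> refuse
print(is_unsafe("../outside.txt"))   # True  -> refuse
print(is_unsafe("java/Hello.java"))  # False -> safe to extract
```

A patched extractor can simply refuse (or skip) any member
this check flags -- which is presumably the refusal behavior
you ran into.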

So, maybe the version of unzip that we're using has been
patched to make it refuse to perform the operation in 
question.  That's a good thing.  (Which should be 
documented in the man page and/or the help printout.)

>My question today is:
>     Who was responsible for the problem to unzip?

Assuming that unzip's refusal to extract the files is due
to a security patch having been applied to prevent it, the
answer to the question is to _reframe_ the situation:

No one is to blame.  Your Java author had not realized
(as you and I had not) the danger inherent in archives 
containing members with absolute paths.  So you can now
tell him about the above two URLs, and he can reformulate 
his zipfiles with relative paths.  Problem solved, and
we're all a little wiser.

Consider:  Is the Java tutorial directed only at system
administrators foolish enough to run it with root 
privileges?  In a multi-user operating system with even 
elementary security, only root can create a directory /foo 
at the top of the filesystem.  So the present zip files with 
absolute paths will only work under insecure single-user OS's 
like Microsoft Windows.  Thus the most capable and best-
informed members of the potential audience for the tutorial 
-- users of good operating systems -- are prevented from 
using it.

>Quite frankly, the zip file unzips for me without a 
>problem on a Windows machine, 

Well, it shouldn't have.  That zip archive could just as 
easily have contained members that would have replaced 
components of MS-Windows and caused an unrecoverable crash.
(But with Windows you would probably have figured that was
just its usual behavior -- time for the periodic reinstall.)

>I'm not sure what to say to the author...
>     A. Learn to zip...

Learn to zip using relative paths.
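
For illustration, here's a minimal sketch using Python's
zipfile module (hypothetical file names; the tutorial's
author presumably used the zip CLI or a Java tool) of
storing members under relative paths:

```python
import io
import zipfile

# Build an archive in memory whose members carry relative
# paths, so extraction stays inside whatever directory the
# *user* chooses.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("java/Hello.java", "class Hello {}\n")  # relative: good
    # An absolute name like "/java/Hello.java" is the dangerous form.

with zipfile.ZipFile(buf) as zf:
    print(zf.namelist())  # ['java/Hello.java']
```

With the command-line zip, the same effect comes from
archiving relative to the parent directory (e.g.
"zip -r foo.zip java/") instead of naming absolute paths.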

>     B. Don't send zip files to Linux users,
>        they may not have memorized the manual...

Whoa, that's awfully harsh!  For one thing, you don't even 
have to look at the manual to make the current zipfiles
work.  The little help text that unzip prints out when you 
run it without parameters describes the -d option pretty 
clearly.

But mainly, don't blame Linux software for enforcing good
security.  Zipfiles with absolute paths in them should not 
be sent to anybody.  

>     C. Unzip in Linux is broken...

We now see that whether we think something is broken or not 
may depend on the state of our knowledge.

>     D. Unzip in Linux is user malicious,
>        good luck if the user isn't perfect...

I think you've got it exactly backwards.  Unzip in Linux 
is user-friendly in preventing people from trashing their 
filesystem or installing a malicious program by merely 
unzipping an archive as root.

>Any advise?

In the early 1980s, I was surfing the Net (which meant 
Usenet back then) using my new dual-floppy IBM PC, a 1200 
baud modem, and a comm program written in interpreted BASIC.  
And I would watch in amazement as people like Henry Spencer 
(famous for his regexp code) wrote and published open source 
Unix software of great power and sophistication.  

The Unix community has been creating this labor of love for 
decades, and GNU/Linux is its latest flowering.  So when 
you encounter a problem, remember that some pretty smart 
and kind people have developed this software and given it
to all of us.  Remember to thank them, if only in your 
heart.  And remember that a lot of people have used these
programs for many years.  So if you think you've found a bug,
research it before you declare its existence; the Web, and 
Google, make that easy (as in this case).  Most problems 
have been spotted and fixed long ago, so apparent problems 
usually result from the user not having RTFM, which now 
includes relevant areas of the Planetary Mind (the Web).

And please don't entertain ideas that GNU/Linux software is 
malicious, bug-ridden, unstable, "not ready for the desktop", 
etc.  That's all Microsoft propaganda.  Here are some 
pointers to the evidence:

