comp.os.linux.misc-backing_up_onto_multiple_cds

This is part of The Pile, a partial archive of some open source mailing lists and newsgroups.



Newsgroups: comp.os.linux.misc
From: Dale R Worley <worley@shell01.TheWorld.com>
Subject: Backing up onto multiple CDs
Date: Sat, 8 Feb 2003 23:06:47 GMT

I have a problem which is the opposite of one that was posted a few
days ago.  I want to create a backup image and then write it to a set
of CDs, since it may be too large to fit on one CD.

(Currently, I'm considering either iso9660 or squashfs as the backup
image format, but that probably isn't important.)

The interesting question is how to write the CDs.  I could cut the
file into chunks with 'cut' and then write them individually, but that
would require quite a bit of disk space.

If cdrecord had an option to tell it where in the source file to start
reading data, it would be helpful, but it doesn't have such an option.

However, the best approach would be to pipe the output from mkisofs into a
series of cdrecord runs.  Looking at the documentation, I *think* mkisofs
generates its output serially -- it doesn't need to seek in its output
file.  And it appears that cdrecord reads its input serially.  So one
could wrap cdrecord in a script loop that would execute cdrecord,
prompt the user to change CDs, execute cdrecord again, etc.  All of
this could have a pipe from mkisofs as input and it should all Just
Work.

A few gotchas that come to mind:

- does mkisofs really generate its output serially?

- does cdrecord really read its input serially?

- does cdrecord read *exactly* the number of bytes it is going to burn
  to the CD from its input?

- does cdrecord not need to know the length of the data it is going to
  write before it starts writing?

- how large a filesystem can mkisofs write before it breaks?

Also, the related questions:

- what would it take to have a -offset option to cdrecord, so that it
  could record a segment of a file?

- is there a way of concatenating two block special devices (e.g., two
  CD drives) to create one block special device, which can then be
  mounted so the files can be seen?
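[A minimal sketch of that wrapper loop in shell (untested; the cdrecord device name is an assumption, and it compromises on pure streaming by buffering one disc-sized chunk to a temp file, which also sidesteps the exact-byte-count question above):]

```shell
# Hypothetical sketch of the cdrecord wrapper loop.  Reads an image
# stream on stdin, carves it into disc-sized chunks, and hands each
# chunk to a burn command.  Buffering one chunk at a time needs only
# ~700 MB of scratch space, not room for the whole image.
burn_in_chunks() {
    chunk_size=$1        # bytes per disc, e.g. 734003200 for a 700 MB CD
    burn_cmd=$2          # e.g. 'cdrecord dev=0,0,0 -data' (device name assumed)
    tmp=${TMPDIR:-/tmp}/cdchunk.$$
    n=1
    while :; do
        head -c "$chunk_size" > "$tmp"     # carve one disc off stdin
        [ -s "$tmp" ] || break             # empty chunk: stream exhausted
        $burn_cmd "$tmp" || break
        if [ "${PROMPT:-yes}" = yes ]; then
            printf 'Disc %d written; insert the next blank and press Enter\n' "$n" >&2
            read dummy < /dev/tty
        fi
        n=$((n + 1))
    done
    rm -f "$tmp"
}

# Intended use (not run here):
#   mkisofs -R /home | burn_in_chunks 734003200 'cdrecord dev=0,0,0 -data'
```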

===
From: Jean-David Beyer <jdbeyer@exit109.com>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: Sun, 09 Feb 2003 08:50:31 -0500

Dale R Worley wrote:
 > I have a problem which is the opposite of one that was posted a few
 > days ago.  I want to create a backup image and then write it to a set
 >  of CDs, since it may be too long to fit on one CD.

I use magnetic tapes that can hold up to 65 GBytes to avoid this problem.
 >
 > (Currently, I'm considering either iso9660 or squashfs as the backup
 > image format, but that probably isn't important.)
 >
 > The interesting question is how to write the CDs.  I could cut the
 > file into chunks with 'cut' and then write them individually, but
 > that would require quite a bit of disk space.
 >
 > If cdrecord had an option to tell it where in the source file to
 > start reading data, it would be helpful, but it doesn't have such an
 > option.
 >
 > However, the best would be to pipe the output from mkisofs into a
 > series of cdrecord's.  Looking at the documentation, I *think*
 > mkisofs generates its output serially, it doesn't need to seek in its
 > output file.  And it appears that cdrecord reads its input serially.
 > So one could wrap cdrecord in a script loop that would execute
 > cdrecord, prompt the user to change CDs, execute cdrecord again, etc.
 > All of this could have a pipe from mkisofs as input and it should all
 > Just Work.
 >
 > A few gotchas that come to mind:
 >
 > - does mkisofs really generate its output serially?

It seems to.
 >
 > - does cdrecord really read its input serially?

It seems to.

I give these answers because you can pipe the output of mkisofs into
cdrecord and it works just fine (there might be a problem if the machine
is too slow and the CD burner does not have "burnfree" support).
 >
 > - does cdrecord read *exactly* the number of bytes it is going to
 > burn to the CD from its input?

I do not know. It might pad the sector and write the minimum number of
sectors.
 >
 > - does cdrecord not need to know the length of the data it is going
 > to write before it starts writing?

It does not need to know the length. If you do not tell it, though, it
advises that it may not all fit on a CD.
 >
 > - how large a filesystem can mkisofs write before it breaks?

Why should it break?
 >
 > Also, the related questions:
 >
 > - what would it take to have a -offset option to cdrecord, so that it
 >  could record a segment of a file?
 >
 > - is there a way of concatenating two block special devices (e.g.,
 > two CD drives) to create one block special device, which can then be
 > mounted so the files can be seen?
 >
 > Thanks,
 >
 > Dale



-- 
   .~.  Jean-David Beyer           Registered Linux User 85642.
   /V\                             Registered Machine    73926.
  /( )\ Shrewsbury, New Jersey     http://counter.li.org
  ^^-^^ 8:45am up 2 days, 8:52, 2 users, load average: 2.23, 2.17, 2.18


===

From: John Thompson <john@starfleet.thompson.us>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: Sun, 09 Feb 2003 16:57:54 GMT

Dale R Worley wrote:

> I have a problem which is the opposite of one that was posted a few
> days ago.  I want to create a backup image and then write it to a set
> of CDs, since it may be too long to fit on one CD.
> 
> (Currently, I'm considering either iso9660 or squashfs as the backup
> image format, but that probably isn't important.)
> 
> The interesting question is how to write the CDs.  I could cut the
> file into chunks with 'cut' and then write them individually, but that
> would require quite a bit of disk space.
> 
> If cdrecord had an option to tell it where in the source file to start
> reading data, it would be helpful, but it doesn't have such an option.
> 
> However, the best would be to pipe the output from mkisofs into a
> series of cdrecord's.  Looking at the documentation, I *think* mkisofs
> generates its output serially, it doesn't need to seek in its output
> file.  And it appears that cdrecord reads its input serially.  So one
> could wrap cdrecord in a script loop that would execute cdrecord,
> prompt the user to change CDs, execute cdrecord again, etc.  All of
> this could have a pipe from mkisofs as input and it should all Just
> Work.
> 
> A few gotchas that come to mind:
> 
> - does mkisofs really generate its output serially?
> 
> - does cdrecord really read its input serially?
> 
> - does cdrecord read *exactly* the number of bytes it is going to burn
>   to the CD from its input?
> 
> - does cdrecord not need to know the length of the data it is going to
>   write before it starts writing?
> 
> - how large a filesystem can mkisofs write before it breaks?
> 
> Also, the related questions:
> 
> - what would it take to have a -offset option to cdrecord, so that it
>   could record a segment of a file?
> 
> - is there a way of concatenating two block special devices (e.g., two
>   CD drives) to create one block special device, which can then be
>   mounted so the files can be seen?

Wow.

Why not get a tape drive of sufficient capacity and use that for your 
backups?

===

From: zentara@highstream.net
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: Sun, 09 Feb 2003 13:50:03 -0500

Dale R Worley <worley@shell01.TheWorld.com> wrote:

>I have a problem which is the opposite of one that was posted a few
>days ago.  I want to create a backup image and then write it to a set
>of CDs, since it may be too long to fit on one CD.

>- does mkisofs really generate its output serially?
>- does cdrecord really read its input serially?
>- does cdrecord read *exactly* the number of bytes it is going to burn
>  to the CD from its input?
>- does cdrecord not need to know the length of the data it is going to
>  write before it starts writing?
>- how large a filesystem can mkisofs write before it breaks?
>Also, the related questions:

If you want an easy way to do this, try "cddump".
It will write to CDs and prompt you to insert new ones.

It cheats on your specs a little: it makes one temporary ISO image
at a time.

It's simple to run -- just "cddump /" -- and it does incremental backups
too.

http://www.joat.ca/software/cddump.html





===

From: Pat D <news**NO_SPAM**@psychogenic.com>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: Mon, 10 Feb 2003 01:51:58 -0500

Dale R Worley wrote:
> I have a problem which is the opposite of one that was posted a few
> days ago.  I want to create a backup image and then write it to a set
> of CDs, since it may be too long to fit on one CD.
> 

Have a look at the "Make CD-ROM Recovery" - http://mkcdrec.ota.be/

 From the site:

mkCDrec makes a bootable (El Torito) disaster recovery image (CDrec.iso), including backups of the
Linux system to the same CD-ROM (or CD-RW) if space permits, or to a multi-volume CD-ROM set.

HTH

-- 
Pat Deegan,
Registered Linux User #128131
http://www.psychogenic.com/


===

From: david.cook@pobox.com
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Organization: Home, Melbourne, Australia
Date: 21 Feb 2003 18:58:18 +1050

John Thompson  <john.thompson@attglobal.net> wrote:
>Dale R Worley wrote:

>> - is there a way of concatenating two block special devices (e.g., two
>>   CD drives) to create one block special device, which can then be
>>   mounted so the files can be seen?
>
>Wow.
>
>Why not get a tape drive of sufficient capacity and use that for your 
>backups?

Because of the cost, most likely.
In my case - CD-RW was about $AUS 200, blank CDs are $AUS 1 or less.
The cheapest tape drives here seem to be various Travan IDE drives,
starting at around $AUS 350, and the tapes are over $AUS 50 each.

For use at home, for relatively infrequent backups, CDs are quite
a bit cheaper, I'd say.


(note:  Prices are a little out of date, since I haven't gone CD-RW
or tape-drive shopping recently ... )


===

From: Jean-David Beyer <jdbeyer@exit109.com>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: Fri, 21 Feb 2003 08:38:44 -0500

david.cook@pobox.com wrote:
 >John Thompson <john.thompson@attglobal.net> wrote:
 >
 >> Dale R Worley wrote:

 >>> - is there a way of concatenating two block special devices
 >>> (e.g., two CD drives) to create one block special device, which
 >>> can then be mounted so the files can be seen?
 >>
 >> Wow.
 >>
 >> Why not get a tape drive of sufficient capacity and use that for
 >> your backups?
 >
 >
 > Because of the cost, most likely. In my case - CD-RW was about $AUS
 > 200, blank CDs are $AUS 1 or less. The cheapest tape drives here seem
 > to be various Travan IDE drives, starting at around $AUS 350, and the
 > tapes are over $AUS 50 each.
 >
 > For use at home, for relatively infrequent backups, CDs are quite a
 > bit cheaper, I'd say.

No question about that.

The reason I do not care for it is that I back up _everything_ every day
and I would have to use an awful lot of CDs to cover my two 9 GByte hard
drives on this machine and my two smaller hard drives on my other
machine. Better than floppies, I guess. ;-)

The reason I back up _everything_ is that otherwise, I would either
forget something I need, or I would have to go through all sorts of CDs
to restore a system after a crash. I have the original distribution CDs
for my system, but it has been upgraded and reconfigured so many times
that I would never get it back to where I started. This way, even if I
must replace my hard drives, all I need do is stick in two floppies and
a tape after putting in the new hard drives and I am back to where I was
that morning. (I use cron to do the backup about 1 AM every morning when
I should be in bed.)

 > (note:  Prices are a little out of date, since I haven't gone CD-RW
 > or tape-drive shopping recently ... )
 >
If you are lucky, they are less now. Here in the USA, you can get CD-R disks
for about US$0.25 in 100 lots without jewel cases, and CD-RW disks for
about US$0.99 in 10 lots with jewel case for 4x disks and US$1.49 for
10x disks.

I paid about US$310, IIRC, for a Plextor PlexWriter 12/10/32S external
burner on an Ultra SCSI interface a while ago; may be less (or
discontinued) by now. My VXA-1 tape drive (Ultra-2 LVD SCSI interface)
was a bit under US$800 way back when (mid 2000). But I have been in the
computer biz since the late 1950s and am a bit paranoid about backups from
bitter experience. Of course way back then, there was no need for backups
because there were no hard drives. The IBM 350 RAMAC (or whatever it was
called) came out a bit later. I believe the removable disk drives were a bit later still. My
first two had 40 Megabyte capacity, and that was really great. These
days, I have over 12x that in RAM alone. How things change!


===
From: "Ed Skinner" <ed@REMOVE.flat5.net>
Subject: Re: Backing up onto multiple CDs
Newsgroups: comp.os.linux.misc
Date: Fri, 21 Feb 2003 07:07:06 -0700

     Here are two answers.
     First, for a complete backup of everything, take a look at "mondo"
(do a search at freshmeat.net). It will create a bootable CDROM set that
will do an image restore. I use this when I want to create disaster
recovery set. If your disk drives are big and have a lot of data, this can
run into an amazing number of disks but, more importantly, a large number
of hours to burn everything. During that interval, if something dies
you'll have to start over creating the CDROMs. Hence, I don't do a
complete backup very often.
     On a more frequent basis, I manually create a tar of the really
important files. I then use "split" to break the tar into 650MB chunks and
then mkisofs on each of those, followed by cdrecord to burn the iso file
systems to CDROM. Recovery is the opposite: copy the tar-fragment from
each of the CDROMs into the system, "cat" them all together and then
un-tar the result. Note that one bad CDROM will make the entire backup set
useless. I therefore do a dry-run of the recovery procedure after creating
a backup set and then put the CDROMs in a safe place.
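[As shell commands, that split-tar scheme might look roughly like the following (a sketch, function-ized so the pieces can be tried without a burner; the mkisofs/cdrecord step is shown only as a comment, with an assumed device name, and BURN defaults to a harmless echo):]

```shell
# Sketch of the split-tar backup.  backup_split takes an existing
# archive, splits it into CD-sized pieces, and runs a burn hook on
# each piece.  In real use the hook would be something like:
#   mkisofs -R -o "$part.iso" "$part" && cdrecord dev=0,0,0 "$part.iso"
# (device name assumed).
BURN=${BURN:-echo would burn}

backup_split() {            # $1 = archive file, $2 = piece size (split -b syntax)
    split -b "$2" "$1" "$1.part."
    for part in "$1".part.*; do
        $BURN "$part"
    done
}

restore_cat() {             # $1 = archive name; copy the fragments back first
    cat "$1".part.* > "$1"  # reassemble in order, then un-tar the result
}
```

For a real run the piece size would be something like `650m`; note that one bad disc still loses everything past it, hence the dry-run of the recovery procedure.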

===

From: John Thompson <john@starfleet.thompson.us>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: Fri, 21 Feb 2003 22:49:24 GMT

david.cook@pobox.com wrote:
> John Thompson  <john.thompson@attglobal.net> wrote:
>> Dale R Worley wrote:
>>
>>> - is there a way of concatenating two block special devices (e.g., two
>>>   CD drives) to create one block special device, which can then be
>>>   mounted so the files can be seen?
>>
>>Wow.
>>
>>Why not get a tape drive of sufficient capacity and use that for your 
>>backups?

> Because of the cost, most likely.
> In my case - CD-RW was about $AUS 200, blank CDs are $AUS 1 or less.
> The cheapest tape drives here seem to be various Travan IDE drives,
> starting at around $AUS 350, and the tapes are over $AUS 50 each.
> 
> For use at home, for relatively infrequent backups, CDs are quite
> a bit cheaper, I'd say.

Yeah, but if you have more than a few GBs to back up, you're going to get 
awfully sick of babysitting the process, which tends to make your backups 
even less frequent.

I used to back up my giant 20MB (that's megabyte, not gigabyte) HD onto 
floppy disks.  With compression, it took about a dozen disks.  I hated it.  
When I got a flashy new 486 with a 250MB HD, I decided there was no way I 
was going to back that up onto floppies, so I got a tape drive and haven't 
looked back.

Now I just pop in a tape, set up a cron job to run the backup overnight 
and go to bed.  When I get up next morning, the backup is done.  There's 
no pain in doing regular backups that way.

BTW, the last tape drive I bought was a SCSI unit on eBay for about US$20.  
I picked up a dozen tapes out of the "closeout" bin at the local Office 
Depot for about US$4/each.  That was several years ago.

===

Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
From: * Tong * <sun_tong@users.sourceforge.net>
Date: 21 Feb 2003 21:42:43 -0500

"Ed Skinner" <ed@REMOVE.flat5.net> writes:

> un-tar the result. Note that one bad CDROM will make the entire backup set
> useless. I therefore do a dry-run of the recovery procedure after creating
> a backup set and then put the CDROMs in a safe place.

Well, that doesn't stop wear and tear. What I plan to do is to chop
them into 7MB pieces, and use par to create a redundant soft RAID for
each CD... Up till now, I've been using rar to do this, and it has
already saved my day at least once. :-)
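[The chop-and-add-parity idea can be sketched with split plus par2 -- par2cmdline, a successor to the par tool named in the post; the sizes here are toy values, and the block skips quietly if par2 is not installed:]

```shell
# Sketch of "chop into pieces, add parity, survive a bad disc".
# Skips gracefully if par2cmdline is not installed.
command -v par2 >/dev/null 2>&1 || exit 0
work=$(mktemp -d) && cd "$work"

head -c 50000 /dev/urandom > backup.tar   # stand-in for a real archive
split -b 7000 backup.tar piece.           # 7 MB pieces in real use; tiny here
par2 create -r30 -q recovery.par2 piece.* # ~30% parity data alongside the pieces

rm piece.ab                               # simulate one piece lost to disc damage
par2 repair -q recovery.par2              # rebuild the missing piece from parity
cat piece.* | cmp - backup.tar && echo "archive recovered intact"
```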


===

From: Dale R Worley <worley@shell01.TheWorld.com>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: 23 Feb 2003 11:51:15 -0500

david.cook@pobox.com writes:

> John Thompson  <john.thompson@attglobal.net> wrote:

> >Dale R Worley wrote:

> >> - is there a way of concatenating two block special devices (e.g., two
> >>   CD drives) to create one block special device, which can then be
> >>   mounted so the files can be seen?
> >
> >Why not get a tape drive of sufficient capacity and use that for your 
> >backups?
> 
> Because of the cost, most likely.
> In my case - CD-RW was about $AUS 200, blank CDs are $AUS 1 or less.
> The cheapest tape drives here seem to be various Travan IDE drives,
> starting at around $AUS 350, and the tapes are over $AUS 50 each.

Actually, I already have a tape drive of adequate capacity, but it's
quite a bit slower than burining CDs.  And, CD-Rs are cheap, about
US$0.20 each, compared to tapes.

More importantly, though, is that I can write a file system on CDs,
and thus browse them using the ordinary Unix tools.

I've not worked out the details, but it seems that if certain tools
were available, they could be assembled into a kick-ass backup system.

===

Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
From: nobody@koala.samiam.org (Sam Trenholme)
Date: Mon, 24 Feb 2003 08:29:51 -0800

>Actually, I already have a tape drive of adequate capacity, but it's
>quite a bit slower than burining CDs.  And, CD-Rs are cheap, about
>US$0.20 each, compared to tapes.
>
>More importantly, though, is that I can write a file system on CDs,
>and thus browse them using the ordinary Unix tools.

Back when I was backing up 4-gig hard disks to Iomega Zip drives, what I 
would do to make this easier was to split things up into partitions
small enough to fit entirely on a Zip disk.  This made it easier to back
up; unfortunately, Linux has a 14-partition limit, so the biggest
hard disk this trick will work with is a 9.8-gig hard disk.

For bigger hard disks, one idea is to have a number of 700mb partitions 
which can be backed up to CDROM, then have a huge partition which is only
backed up to tape.

===

From: William Park <opengeometry@yahoo.ca>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: 24 Feb 2003 21:08:05 GMT

Dale R Worley <worley@shell01.theworld.com> wrote:
> I have a problem which is the opposite of one that was posted a few
> days ago.  I want to create a backup image and then write it to a set
> of CDs, since it may be too long to fit on one CD.
> 
> (Currently, I'm considering either iso9660 or squashfs as the backup
> image format, but that probably isn't important.)
> 
> The interesting question is how to write the CDs.  I could cut the
> file into chunks with 'cut' and then write them individually, but that
> would require quite a bit of disk space.
> 
> If cdrecord had an option to tell it where in the source file to start
> reading data, it would be helpful, but it doesn't have such an option.

Have you really read the manpage?  How about the '-multi' option?  You
can always 'split' the file as a last resort.

Probably a better option is to avoid CDs and get a 5400rpm 60GB hard
disk with a removable hard-disk tray.


===

From: Dale R Worley <worley@shell01.TheWorld.com>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: 25 Feb 2003 10:42:44 -0500

William Park <opengeometry@yahoo.ca> writes:
> Have you really read the manpage?  How about '-multi' option.

I have, and I don't see how cdrecord -multi will help me write a
gigantic filesystem to multiple CDs.  From the man page, it seems that
-multi is for writing several chunks of data to one CD.

> You can always 'split' the file as last resort.

True, but that's precisely what I want to avoid -- realizing the
output filesystem.  I want to generate it as a stream, and write the
stream in 700MB chunks to CDs.

===

From: Dances With Crows <danceswithcrows@usa.net>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: 25 Feb 2003 15:58:30 GMT

On 25 Feb 2003 10:42:44 -0500, Dale R Worley staggered into the Black
Sun and said:
> William Park <opengeometry@yahoo.ca> writes:
>> Have you really read the manpage?  How about '-multi' option.
> I have, and I don't see how cdrecord -multi will help me write a
> gigantic filesystem to multiple CDs.  From the man page, it seems that
> -multi is for writing several chunks of data to one CD.

Yes.

>> You can always 'split' the file as last resort.
> True, but that's precisely what I want to avoid -- realizing the
> output filesystem.

?  "realizing the output filesystem"?  Er...  I think you mean, "writing
many 697M files out to disk".  "realize" has a different meaning in many
European languages than it does in American English.

> I want to generate it as a stream, and write the stream in 700MB
> chunks to CDs.

freshmeat.net, search for "CD backup".  This is not a new problem and
solutions have been figured out in various ways.


===
From: Jonathan Rawle <jr36@le.ac.uk>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: Tue, 25 Feb 2003 17:50:47 +0000

William Park wrote:

> Dale R Worley <worley@shell01.theworld.com> wrote:
>> I have a problem which is the opposite of one that was posted a few
>> days ago.  I want to create a backup image and then write it to a set
>> of CDs, since it may be too long to fit on one CD.
>> 
>> (Currently, I'm considering either iso9660 or squashfs as the backup
>> image format, but that probably isn't important.)
>> 
>> The interesting question is how to write the CDs.  I could cut the
>> file into chunks with 'cut' and then write them individually, but that
>> would require quite a bit of disk space.
>> 

I tend to use "zip" for my backups. I create a zip file of the data I want 
to back up, then use "zipsplit" to create CD-sized chunks. I believe 
zipsplit has an option that makes it pause and wait for user input between 
writing parts, so you never need to have more than one on disk at once if 
space is tight.

Zip has the added advantage that it compresses the data, so you need fewer 
CDs - it's more robust than using tar.gz as each file is compressed 
separately (so a disk error will only cause one file to be lost, not the 
whole archive). You can also encrypt your backed-up data if necessary.

Jonathan
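[In command form, that zip/zipsplit scheme might look like the following (a sketch on toy data; zipsplit's -n takes a maximum piece size in bytes, and -p is the pause-between-parts option mentioned above, omitted here so the sketch runs unattended; it skips quietly if Info-ZIP's tools are not installed):]

```shell
# Sketch of the zip/zipsplit backup on a tiny sample tree.
command -v zipsplit >/dev/null 2>&1 || exit 0
work=$(mktemp -d) && cd "$work"

mkdir -p data                              # stand-in for the data to back up
head -c 1500 /dev/urandom > data/one.bin
head -c 1500 /dev/urandom > data/two.bin

zip -qr backup.zip data                    # one archive, each file compressed separately
zipsplit -n 2000 backup.zip                # pieces of at most 2000 bytes each
                                           # (~650000000 for real CDs; add -p to
                                           # pause between pieces so only one
                                           # chunk need exist at a time)
ls *.zip                                   # each numbered piece is itself a valid zip
```

Because every piece is a self-contained zip, a bad CD loses only the files in that piece, which is the robustness advantage over one long tar.gz stream.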


===
From: "Ed Skinner" <ed@REMOVE.flat5.net>
Subject: Re: Backing up onto multiple CDs
Newsgroups: comp.os.linux.misc
Date: Wed, 26 Feb 2003 07:13:23 -0700

Jonathan Rawle wrote:
> I tend to use "zip" for my backups. I create a zip file of the data I want 
> to back up, then use "zipsplit" to create CD-sized chunks. I believe 
> zipsplit has an option that makes it pause and wait for user input between 
> writing parts, so you never need to have more than one on disk at once if 
> space is tight.

     The man-pages are all within "man zip".
     After five minutes of brief experimenting, I'm sold. I'm changing my
backup process to start using these instead of tar's because the split
goes so fast, and the individual splits can be unzip'd individually. No
more cat'ing together all my split-tar's into one humongous thing just to
recover one file, and no more worries about one bad CDROM spoiling the
whole backup.
     This is a better solution.
     Thanks for the tip!


===
From: "Ed Skinner" <ed@REMOVE.flat5.net>
Subject: Re: Backing up onto multiple CDs
Newsgroups: comp.os.linux.misc
Date: Thu, 27 Feb 2003 07:47:48 -0700

     Whoops, spoke too soon.
     Apparently "zip" has a 2 gigabyte archive size limit. When a set of
backup files reaches that size, "zip" dies.
     I'm back to "tar" and "split" again for incrementals, and "mondo"
(see freshmeat.net) for full images.

===
From: Jonathan Rawle <jr36@le.ac.uk>
Newsgroups: comp.os.linux.misc
Subject: Re: Backing up onto multiple CDs
Date: Thu, 27 Feb 2003 15:46:27 +0000

Ed Skinner wrote:

>      Whoops, spoke too soon.

>      Apparently "zip" has a 2 gigabyte archive size
> limit. When a set of backup files reaches that size, "zip"
> dies.

>      I'm back to "tar" and "split" again for incrementals,
> and "mondo" (see freshmeat.net) for full images.

Whoops indeed! That just goes to show that I don't have a huge amount of 
data to back up (well, I tend to burn things such as MP3s to CD without 
archiving them as they can be used directly from the disk).

I suppose I'll have to find a new method eventually. Just had a quick look, 
and DAR seems promising (http://dar.linux.free.fr/). It seems to do 
everything my zip method does. Anyone have any experience of using this?

===



