Tuesday, June 21, 2005

Note to self

If you install Linux on a Windows machine on a separate partition and then decide to just format that partition and blow it away to have more drive space for Windows - you will render Windows unbootable, because the master boot record will be hosed.

To fix this, boot from your Windows CD and run the fixmbr command from the Recovery Console.

If you don't have a Windows boot CD because your IT department controls this, download and burn Knoppix (www.knoppix.net) - the excellent Linux live CD distribution - and run this command from a shell:
sudo install-mbr /dev/hda

Hope I don't have to remember this trick.


Friday, June 17, 2005

Back up large amounts of data on multiple CDs

At work I am getting a computer upgrade next month and I have about 8 GB of data (mostly digital images) I need to move over to my new machine. Plus, I have been lazy about making a backup. So I started to wonder how to back this all up on CDs. I had hoped my CD burning software (Nero 5.5) would take care of it, but it doesn't seem to do this.

Here is what I came up with as a solution (Unix to the rescue or more specifically cygwin on Windows):
1. Use tar to make the whole tree of files and folders into one file:
$ tar -cvf /cygdrive/f/pix.tar My\ Documents/My\ Pictures
This creates a file called pix.tar; in my case it was over 6 GB.
2. Use bzip2 to compress the tar archive:
$ bzip2 pix.tar
This creates a file called pix.tar.bz2. I use bzip2 instead of gzip or just zip because I read it provides better compression and is more forgiving of being split apart and put back together. bzip2 compressed it down to just over 4 GB.
3. Use split to break the file into CD-sized chunks. split takes the chunk size as an option (-b, in bytes), then the file to split, then a prefix for the output file names. In this case I want 650 MB chunks. But how many bytes is 650 MB? I used the conversion calculator on this site: http://www.t1shopper.com/tools/calculate/. It comes out to 681574400 bytes (650 x 1024 x 1024). So here is the command:
$ split -b 681574400 pix.tar.bz2 mypixback
This creates this listing of files:
mypixbackaa mypixbackab mypixbackac mypixbackad mypixbackae mypixbackaf mypixbackag
each 650 MB except the last, which was 500+ MB.
4. At this point we have CD-sized data chunks ready to burn.
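The three steps above can be sketched end to end on a small throwaway directory - the `demo` paths and the tiny 64 KB chunk size here are stand-ins for the real My Pictures tree and the 681574400-byte chunks:

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the photo tree (random data instead of real images)
mkdir -p demo/pics
head -c 100000 /dev/urandom > demo/pics/img1.raw
head -c 100000 /dev/urandom > demo/pics/img2.raw

tar -cf pix.tar demo/pics              # step 1: one archive
bzip2 pix.tar                          # step 2: compress -> pix.tar.bz2
split -b 65536 pix.tar.bz2 mypixback   # step 3: chunks (use -b 681574400 for 650 MB)

ls mypixback*                          # mypixbackaa, mypixbackab, ...
```

Concatenating the chunks back together yields a file byte-identical to pix.tar.bz2, which is what makes the rejoin step below safe.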

Putting Humpty Dumpty together again
1. Copy all the files from the CDs into one folder on a hard drive.
2. Use the cat command to join the files:
$ cat mypixbacka* > pix.tar.bz2
3. Use bunzip2 to uncompress the archive:
$ bunzip2 pix.tar.bz2
4. Untar the tar archive:
$ tar -xvf pix.tar
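Before deleting anything, it's worth confirming the rejoined file matches the original. A sketch using md5sum (the file contents here are made-up stand-ins; the file names follow the post):

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

head -c 300000 /dev/urandom > pix.tar.bz2   # stand-in for the real archive
md5sum pix.tar.bz2 > pix.md5                # record checksum before burning
split -b 65536 pix.tar.bz2 mypixback
rm pix.tar.bz2                              # simulate moving via CDs

cat mypixbacka* > pix.tar.bz2               # rejoin on the new machine
md5sum -c pix.md5                           # prints "pix.tar.bz2: OK"
```

Keep pix.md5 with the burned chunks; if the check fails, one of the CDs copied badly and you can reburn just that chunk.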


Notes: To compress and split a 6 GB file you need much more than 6 GB of free disk space - probably twice as much, since the original tree, the compressed archive, and the split chunks all exist at some point.
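One way to dodge the intermediate files is to pipe tar straight through bzip2 into split, so only the final chunks ever hit the disk. A sketch on throwaway data - for the real run, point tar at My Documents/My Pictures and use -b 681574400:

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"
mkdir demo
head -c 200000 /dev/urandom > demo/file.bin   # stand-in data

# tar to stdout, compress the stream, split reads stdin ("-")
tar -cf - demo | bzip2 -c | split -b 65536 - mypixback

# restore is the same idea in reverse (here just listing the contents)
cat mypixbacka* | bunzip2 -c | tar -tf - > listing.txt
cat listing.txt
```

The restore pipe also means you never need room for pix.tar and pix.tar.bz2 side by side on the new machine.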

Also, using tar, bzip2, cat, and the like on files this big takes quite a while. Don't expect a 6 GB tarball to show up in a few seconds, or even minutes.
