Bosco So wrote in a message to Mike Bilow:

BS> PS: I seem to recall reading somewhere that tar files do
BS> not contain error correction/detection information. So if
BS> the huge tar file that I made in step #1 is corrupt, and I
BS> whack the original partition in step #2, then I'm in deep
BS> doo-doo, right? Will gzip compression add verification
BS> info? If so, can I tune the compression ratio way low so I
BS> don't take any performance hits?

No! Using gzip will *reduce* resistance to errors.

By default, tar stores data in fixed blocks, so a single damaged block will only cost you the file or files contained in that block. The usual block size is about 10 KB, although it can be made larger. All modern tar implementations have error detection, but not error correction. However, if you pass the tar stream through a gzip filter, then all of the data past the point of corruption will be lost. Because of this, gzip filtering should only be used with tar where the media provides its own error control, and even then it is risky.

Dealing with a bad tar block can be a mildly involved manual procedure, but tar was originally designed for magnetic tape, where this sort of thing is not unusual, and provisions were therefore put into tar to handle it. In fact, tar is more robust than QIC-117 or similar formats, because tar stores directory information physically adjacent to the files, while QIC-117 tacks on a global directory at the end.

-- Mike
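The difference is easy to demonstrate. Below is a minimal sketch using Python's standard tarfile and gzip modules (the file names and the corruption offsets are illustrative, not from the original message): flipping a byte in one member's data region of a plain tar archive leaves the other members readable, while flipping a byte mid-stream in the gzipped version makes everything past that point unrecoverable.

```python
import io
import tarfile
import gzip

# Build a small two-file tar archive in memory (hypothetical file names).
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tf:
    for name, data in [("first.txt", b"A" * 1024), ("second.txt", b"B" * 1024)]:
        info = tarfile.TarInfo(name)
        info.size = len(data)
        tf.addfile(info, io.BytesIO(data))
raw = buf.getvalue()

# Case 1: corrupt one byte inside first.txt's data region (its 512-byte
# header occupies offsets 0-511, so offset 600 is file data, not a header).
damaged = bytearray(raw)
damaged[600] ^= 0xFF
with tarfile.open(fileobj=io.BytesIO(bytes(damaged))) as tf:
    names = tf.getnames()                         # both members still listed
    second = tf.extractfile("second.txt").read()  # undamaged file extracts fine

# Case 2: gzip the same archive and corrupt one byte in mid-stream.
gz = bytearray(gzip.compress(raw))
gz[len(gz) // 2] ^= 0xFF
try:
    with tarfile.open(fileobj=io.BytesIO(bytes(gz)), mode="r:gz") as tf:
        for m in tf.getmembers():
            tf.extractfile(m).read()
    survived = True
except Exception:  # zlib/tarfile error: the stream is unreadable past the damage
    survived = False
```

In case 1, tar's fixed 512-byte headers let the reader skip over the damaged data and resynchronize on the next header; in case 2, the deflate stream desynchronizes at the flipped byte and nothing after it can be decoded.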