| Hi BLU,
|
| Maybe this is a newbie error and I've been lucky not to encounter it before,
| but your feedback is appreciated.
|
| I ran a command and unintentionally created an enormous text file that ate up
| all the gigs of open space on the server:
| find -print0 | xargs -0 grep -i blah > blahInstances.txt
|
| I'm puzzled because when I ran the output to the screen, it was only something
| like 1000 lines and it quit like I expected. What's also strange to me is
| that I ran the command on my local suse 9.3 box and it didn't get caught up
| and generate a huge file. It ran as expected. Not sure exactly what OS
| version is on the machine that had the issue.
|
| My main confusion is why running output to the screen would be fine, but
| running it to the file would get caught in a loop or something similar.

What happened was that on that one system, a file containing "blah" was
found before find reached blahInstances.txt. Then when blahInstances.txt
was finally scanned, it already contained one or more lines that matched
"blah", so each match was appended to blahInstances.txt, and you're now in
a loop trying to find the end of a file that grows as fast as you read it,
because you're copying the data to the end as you go.

On the suse box, blahInstances.txt was scanned before "blah" was found
anywhere, so it was still empty. And when you write the data to stdout, no
file is created, so the loop isn't possible.

This is an inherent problem with find. It's best to make sure that the
output isn't written into any directory in the search tree. If you must,
try using the -prune arg to skip the directory that contains the output
file. /tmp is always a good place for such files, and a good directory to
prune, since you never know what sort of intermediate file may be there as
you pass through.

I've seen people do the same sort of thing by putting tar's output into
the directory being tarred. I did it myself once. Oops.

--
   _, O  John Chambers
 <:#/>   <jc at trillian.mit.edu>
   +     <jc1742 at gmail.com>
  /#\    in Waltham, Massachusetts, USA, Earth
  | |
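As a rough illustration of the advice above (not from the original thread),
here are a few variants of the command that avoid the feedback loop. They
assume GNU find and grep; the pattern "blah" and the file blahInstances.txt
come from the post, while the out/ directory is a hypothetical example:

    # Write the output outside the search tree (e.g. /tmp), so grep can
    # never re-read its own output:
    find . -type f -print0 | xargs -0 grep -i blah > /tmp/blahInstances.txt

    # Or keep the output local, but have find skip that one file by name:
    find . -type f ! -name blahInstances.txt -print0 \
        | xargs -0 grep -i blah > blahInstances.txt

    # Or use -prune to skip the whole directory that holds the output
    # (here a hypothetical ./out directory):
    mkdir -p out
    find . -path ./out -prune -o -type f -print0 \
        | xargs -0 grep -i blah > out/blahInstances.txt

(The original command gave find no starting path; GNU find then defaults
to ".", which is written out explicitly here.)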