[NTLUG:Discuss] rm doesn't recurse
David White
davidnwhite at optusnet.com.au
Fri May 7 11:42:37 CDT 2004
> You are correct in saying that this is A BAD IDEA because there is a
> hard limit on the maximum size of a shell command line (I think it's
> 10k bytes or so). If the combined lengths of the file names of all
> those zip files exceeds 10k (don't forget the spaces between
> them)...then you'll get a shell error.
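(As an aside, the exact limit is system-dependent rather than a fixed
10k; on systems that have getconf you can query the real limit on the
size of arguments passed to exec() directly:

getconf ARG_MAX    # maximum combined size, in bytes, of exec() arguments

On Linux this is typically well above 10k.)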
On my shell, bash, I tried the following:
- create a text file, 'test', which contains 1,000,000 lines, each of 10
characters (11 including the newline) - a total size of over 10 megabytes
(verified with ls -l)
- run the command:
for i in `cat test`; do echo $i >> test_out; done
The command ran smoothly. I made sure the output was as expected:
diff test test_out
and the files were identical.
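In case anyone wants to reproduce this, one quick way to generate such a
file (any method that produces a million 10-character lines will do) is:

yes aaaaaaaaaa | head -n 1000000 > test    # 1,000,000 lines x 11 bytes = about 11 MB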
The for-loop style thus doesn't have any apparent significant space
limitations on modern shells. It does have a performance penalty in that
the shell has to expand the command out, but unless you're dealing with a
lot of files, this isn't going to matter much. Indeed, in my 10 megabyte
test it didn't take very long to run, and when I ran it again redirecting
output to /dev/null it finished in very little time, so I would imagine
the cost of the expansion is not large at all.
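If you want to put a number on the expansion cost yourself, bash's time
keyword can time the whole loop with the output discarded:

time for i in `cat test`; do echo $i; done > /dev/null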
Admittedly the rm `find . -name "*.zip"` solution is simpler than the
for-loop solution, but I do like the for-loop solution because it's the
most general. If I'm writing a one-off command, I can write the for loop
easily, and without thought, whether I'm getting the list of files from
find, ls, a file, or somewhere else. If you were writing a script that
was to be run often, it'd probably be better to use the find -exec
method, since everything is driven from within the 'find' program, and
thus should be blazingly quick.
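For reference, here's roughly what each approach looks like. Note that
the backtick and for-loop forms both word-split the file list, so
filenames containing spaces will trip them up; the -exec forms don't have
that problem:

# general for-loop form - the list can come from find, ls, a file, etc.
for i in `find . -name "*.zip"`; do rm "$i"; done

# find -exec form - find runs rm once per file
find . -name "*.zip" -exec rm {} \;

# if your find supports it, '+' batches many files into each rm call
find . -name "*.zip" -exec rm {} +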
David