[NTLUG:Discuss] Finding huge files...
Kevin Brannen
kbrannen at pwhome.com
Sat May 8 13:52:17 CDT 2004
Steve Southwell wrote:
> Odd thing happened to me yesterday... One of my clients' Apache
> servers just bombed, and SSL wouldn't run, but normal http would. As
> it turned out, the ssl_engine_log had hit the 2GB limit. It took me
> a while to find out what happened, so I wrote this little script to
> find the largest files on the filesystem. I'd be interested in
> feedback or improvements, or perhaps I'm missing some sort of built-in
> command to do this...
>
> #!/bin/sh
> # findbig.sh - Find the largest files on the filesystem
> cd /
> for line in ` du -a -b 2>/dev/null| sort -g -r |head -n 200| sed
> s/^[0-9]*s*//g$
> do
> if [ -f $line ]; then
> echo `du -h $line`
> fi
> done
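A quick aside on the script above: the backtick word-splitting in the
for loop breaks on filenames containing spaces. A minimal space-safe
sketch of the same idea (assuming GNU du/sort; the findbig2.sh name is
just for illustration, and filenames with embedded newlines will still
confuse it):

#!/bin/sh
# findbig2.sh - space-safe sketch of the same idea
cd /
du -a -b 2>/dev/null | sort -rn | head -n 200 |
while read -r size name; do
    # read assigns everything after the first field to $name,
    # so spaces inside the filename survive
    [ -f "$name" ] && du -h "$name"
done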
I just got this emailed to me the other day from "Unix Tip of the Day"
(see http://www.ugu.com/sui/ugu/show?tip.today to check them out):
find / -xdev -size +1024 -exec ls -al {} \; | sort -rn -k 5
though I find -exec inefficient (it forks one ls per file) and prefer
xargs to batch the work, so:
find / -xdev -size +1024 -print | xargs ls -al | sort -rn -k 5
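(If your find is new enough, the POSIX '+' terminator to -exec batches
arguments much like xargs does, with no extra pipeline stage; a sketch,
assuming a find that supports it:)

find / -xdev -size +1024 -exec ls -al {} + | sort -rn -k 5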
and in testing it out I found files with spaces; I also like the big
files at the bottom, wanted to start at the current dir, and wanted
only the 10 biggest, so:
find . -xdev -size +1024 -print0 | xargs -0 ls -al | sort -n -k 5 | tail -10
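(For reference: -size with no suffix counts 512-byte blocks, so +1024
means roughly "bigger than 512KB". GNU find also accepts size suffixes,
so an equivalent and more readable sketch, assuming GNU find, would be:)

find . -xdev -size +512k -print0 | xargs -0 ls -al | sort -n -k 5 | tail -10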
Is Unix great or what?! :-)
HTH,
Kevin