Anything faster than 'du' ?
Michael Hipp
Tue Jul 3 12:17:05 PDT 2007
David Bandel wrote:
> On 7/2/07, Bill Campbell <linux-sxs at celestial.com> wrote:
>> On Mon, Jul 02, 2007, Michael Hipp wrote:
>>> Is there a faster way to determine the size of a deep directory of files
>>> than using 'du'? It (du) is really painful if there are lots of
>>> files or the medium is slow, like a USB drive.
>> Anything that has to scan an entire directory structure, and stat()
>> all the entries to get the sizes is going to take a fair amount
>> of time. I haven't looked at the code for ``du'', but I would
>> have to guess that it's pretty lean.
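To make the stat() cost concrete, here's a rough shell sketch of the per-file work du has to do: walk the whole tree and read each entry's allocated size. This isn't du's actual code, just an approximation using GNU find (the /tmp/du-demo tree is a made-up example); -printf '%b' prints each entry's disk usage in 512-byte blocks, which is roughly what du sums.

```shell
# Rough sketch of du's "brute force": every entry must be stat()ed.
# /tmp/du-demo is a throwaway example tree built for illustration.
mkdir -p /tmp/du-demo/sub
printf 'hello' > /tmp/du-demo/a
printf 'world!' > /tmp/du-demo/sub/b
# %b = allocated 512-byte blocks per entry; summing approximates du's total
find /tmp/du-demo -printf '%b\n' | awk '{s += $1} END {print s * 512, "bytes allocated"}'
```

There's no way to get the total without visiting every inode, which is why a deep tree on slow media stays slow no matter which tool does the walking.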
>>
>
> Bill's right. This is a _very_ mature program. Most of the time it
> takes to run through the directory structure, though, goes to screen
> display rather than to du doing its job. If you don't believe me,
> redirect the output to a file (/dev/null is a good test) and watch how
> fast you get a prompt.
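The test David describes might look like this (the path is just an example; substitute your own tree):

```shell
# Discarding du's output isolates traversal cost from terminal
# rendering cost; compare the timing against a plain run of the
# same command printed to the screen.
time du -h --max-depth=1 /tmp > /dev/null
```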
Well, I was just wondering if there was a tool with some trick for
determining the size of a deep directory, rather than du's "brute
force" method.
And the problem I'm having isn't related to the screen. Here's a typical
output:
# du -h --max-depth=1 archive
16K archive/lost+found
51G archive/church
11G archive/rhodes
154G archive/garreco
3.3G archive/dorcas
4.0K archive/michael
4.0K archive/test
65G archive/esther
35G archive/cornerstone
21G archive/distcourt
339G archive
There's probably close to a million files represented there, but it
doesn't take long for the screen to scroll a dozen lines.
Thanks. Guess I'll just have to be patient. :-)
Michael
More information about the Linux-users mailing list