[comp.unix.xenix] Huge directories in XENIX

richard@neabbs.UUCP (RICHARD RONTELTAP) (03/21/89)

" Huge directory < /bbs/files/ibm >--call administrator "

This is the friendly message that greets me whenever I use du, find, or
tar on the directory mentioned above.
The directory is indeed pretty big: at one point it held over 2K files,
which means the directory file itself is now over 32K bytes.
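For reference, the arithmetic matches the classic V7/System V directory
format: each entry is a fixed 16 bytes (a 2-byte inode number plus a
14-character name), so 2K entries fill exactly 32K. A quick sketch,
assuming the old-style 16-byte entries rather than any newer variant:

```shell
# Classic V7/SysV directory entry: 2-byte inode number plus a
# 14-byte fixed-width name (DIRSIZ) = 16 bytes per entry.
entries=2048
entry_size=16
echo $(( entries * entry_size ))   # 32768 bytes, i.e. the 32K mark
```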

No problem, I thought, just ignore it. Performance is slow on large
directories, but this is just a download directory, not accessed too
frequently.

Wrong! tar doesn't work anymore. It repetitively backs up the same files
from the 'huge' directory.

--- WHY??

Is it written somewhere: Thou shalt not have a directory with over 2K entries?
Do tar (and others) try to read the directory in one go, with a 32K
buffer allocated? (That seems pretty 'naive', to put it mildly.)

Since tar, du and find all give exactly the same message, I assume the
culprit piece of code is in a standard library call; ftw(S) is a good
candidate. The manual, however, says nothing about a maximum directory
length; only subdirectory depth is mentioned.

Any advice is welcome.

Oh yeah, the software is: XENIX /386 2.2.3

Richard
(...!mcvax!neabbs!richard)

jack@turnkey.TCC.COM (Jack F. Vogel) (03/23/89)

In article <118947@neabbs.UUCP> richard@neabbs.UUCP (RICHARD RONTELTAP) writes:
>" Huge directory < /bbs/files/ibm >--call administrator "
>
>This is the friendly message that greets me whenever I use du, find, or
>tar on the directory mentioned above.
>The directory is indeed pretty big: at one point it held over 2K files,
>which means the directory file itself is now over 32K bytes.
>
.....
>Wrong! tar doesn't work anymore. It repetitively backs up the same files
>from the 'huge' directory.
>--- WHY??
 
I do not recall precisely where the directory limitation lies. I suspect
you are correct about it being in a library call; I suppose I could take
the time to look it up, but I don't think it would be useful to you
anyway. After all, are 2K files in one directory reasonable? That's why
we have directory trees. I also thought it worthwhile to point out that
this is not a XENIX-specific problem: I have seen it in AIX, and I
suspect the same limitation is general to System V.

Do not lose hope, however, for I believe you should still be able to back
up the directory. You do not say how you invoke tar, but if you do it by
going into that directory and saying "tar cvf /dev/rct0 *", it will fail,
since the shell expansion of the asterisk will fail. What did work with
AIX, and thus should work here, is the following:

		find . -print | xargs tar uvf /dev/rct0

find will still give you the message about the huge directory, but that
did not stop it from working. I have not tested this with SCO, so I would
be interested in hearing about your success or failure.

					Good luck,


-- 
Jack F. Vogel
Turnkey Computer Consultants, Westchester,CA
UUCP: ...{nosc|uunet|gryphon}!turnkey!jack 
Internet: jack@turnkey.TCC.COM || lcc!jackv@CS.UCLA.EDU

richard@neabbs.UUCP (RICHARD RONTELTAP) (03/25/89)

[ Huge directories ]
 
(I'm on a BBS now, and can't quote)
 
The 32K limit on a directory file really did seem to be the problem.
Since there are only about 1700 files in the directory now, I resolved
the problem by creating a new directory, moving all the files from the
huge directory into it, and renaming the new directory to the old name.
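For anyone hitting the same wall, that rebuild can be sketched roughly
like this (the paths are demo stand-ins under /tmp, not the real
/bbs/files/ibm; on these old filesystems a directory file never shrinks
when entries are removed, so a fresh directory is what reclaims the
space):

```shell
cd /tmp
rm -rf ibm.demo ibm.new
mkdir ibm.demo && touch ibm.demo/file1 ibm.demo/file2
mkdir ibm.new

# With thousands of entries, 'mv ibm.demo/*' could overflow the
# shell's argument limit; find piped into xargs sidesteps that.
find ibm.demo -type f -print | xargs -I{} mv {} ibm.new/

rmdir ibm.demo          # now empty, and its oversized file goes with it
mv ibm.new ibm.demo     # the rebuilt directory takes the old name
```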
 
find . -print seemed to work OK for backing up, but I didn't know
about using xargs with tar.
 
Just out of curiosity, could the 64K limit of small-model 286 programs
have anything to do with it? Anyway, although there won't be many UNIX
users who have a directory with >2K files, in this case rebuilding the
directory really was the most practical solution. The 32K restriction
seems pointless, or am I missing something?
 
Richard
(...!mcvax!neabbs!richard)

jim@applix.UUCP (Jim Morton) (03/26/89)

In article <118947@neabbs.UUCP>, richard@neabbs.UUCP (RICHARD RONTELTAP) writes:
> " Huge directory < /bbs/files/ibm >--call administrator "
> Is it written somewhere: Thou shalt not have a directory with over 2K entries?

Not that I know of, but thou shalt not have a directory with more than
1000 subdirectories: that is the maximum number of links a directory
can have in System V, and attempting to create the 1001st directory
link will get you the EMLINK error. Hopefully this will be extended in
System V.4.
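That link ceiling is per-filesystem and queryable: pathconf() and the
getconf utility expose it as LINK_MAX. A one-liner to check the ceiling
for a given directory (the value printed is whatever the local
filesystem reports, not necessarily the old SysV 1000):

```shell
# Ask the filesystem holding /tmp how many links one file may have;
# each subdirectory costs its parent one link, since '..' points back up.
getconf LINK_MAX /tmp
```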

--
Jim Morton, APPLiX Inc., Westboro, MA
UUCP: ...harvard!m2c!applix!jim
      jim@applix.m2c.org