mort@dhump.lakesys.COM (Mort d`Hump) (01/07/90)
I am having trouble using compress to de-compress large .Z files.

The operation terminates with a "File too large" error.

I have a PLEXUS P-20 running vanilla Sys VR2 w/ 2Meg of RAM and over
25 Meg of free space on the file system I am working in.

The results of compress -V:
	Based on compress.c,v 4.0 85/07/30 12:50:00 joe Release
	Options: BITS = 16

This is the first file I have trouble with:
	mort users 1059441 Dec 21 21:01 SML3.1.cpio.Z

Does anyone have any pointers?

Thanks!
--
Marty Wiedmeyer
mort@dhump.lakesys.COM
{uunet!marque,uwvax!uwm}!lakesys!dhump!mort
tkevans@fallst.UUCP (Tim Evans) (01/08/90)
In article <493@dhump.lakesys.COM>, mort@dhump.lakesys.COM (Mort d`Hump) writes:
> I am having trouble using compress to de-compress large .Z files.
>
> The operation terminates with a "File too large" error.
>
What's the system 'ulimit'?  The uncompressed file may be larger than
the largest file 'ulimit' allows to be created.
--
UUCP:		{rutgers|ames|uunet}!mimsy!woodb!fallst!tkevans
INTERNET:	tkevans@wb3ffv.ampr.org
Tim Evans	2201 Brookhaven Ct, Fallston, MD 21047	(301) 965-3286
wescott@Columbia.NCR.COM (Mike Wescott) (01/09/90)
In article <493@dhump.lakesys.COM> mort@dhump.lakesys.COM (Mort d`Hump) writes:
> I am having trouble using compress to de-compress large .Z files.
> The operation terminates with a "File too large" error.
> This is the first file I have trouble with:
> mort users 1059441 Dec 21 21:01 SML3.1.cpio.Z
I'll bet that your ulimit is too small for the resulting uncompress'd file.
Try increasing your ulimit, or dearchiving as you go:

	zcat SML3.1.cpio.Z | cpio -ivcBd
--
	-Mike Wescott
	mike.wescott@ncrcae.Columbia.NCR.COM
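Mike's pipeline works because zcat streams the decompressed data to cpio
without ever writing a single huge intermediate file, so the per-file
ulimit only applies to the individual archive members.  A minimal,
self-contained sketch of the streaming idea (this demo builds its test
file with gzip, since compress(1) may not be installed on a modern box;
GNU zcat reads gzip data, though a 1990 Sys V zcat handled only .Z files):

```shell
# Build a small compressed file, then decompress it to a pipe instead of
# writing an uncompressed copy to disk first.
printf 'hello, world\n' > /tmp/demo.txt
gzip -f /tmp/demo.txt                  # produces /tmp/demo.txt.gz
zcat /tmp/demo.txt.gz                  # streams the contents to stdout
```

With a real archive you would pipe that stream straight into
cpio -ivcBd, exactly as Mike shows above.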
sauron@dsoft.UUCP (Ron Stanions) (01/09/90)
In article <493@dhump.lakesys.COM> mort@dhump.lakesys.COM (Mort d`Hump) writes:
>I am having trouble using compress to de-compress large .Z files.
>
>The operation terminates with a "File too large" error.
>
I remember having this very problem myself once, a long time ago under
a 286 Xenix.  Unfortunately, I don't remember exactly how I got around
it, but I can suggest two things.

First, make sure you have enough room to decompress to.  If you're
working on a disk that doesn't have enough free space, then move to
another disk or clean up some space.

Second, what kind of space are you allowed in your user area?  You may
have 60 Meg of free space on the drive, but you as a user may have a
limit of 500K.  If you find you don't have enough space, again, clean
up your user area and try again.  (You could try a dirty trick: copy
your file to /tmp/something, chown it to bin or something so it's no
longer your file, and chmod it to 777 so you can still remove it when
you're done.  This way you can remove your original copy and get back
that much user space.)

Lastly, try using "zcat > newfile" instead.  I somehow don't think the
problem is really with compress, but rather with your user allocation,
in which case zcat won't do it either.
--
Ron Stanions -- sauron@dsoft        \_/\--/\_/   All things posted by me are
dsoft system administrator         <  \    /  >  by-products of a deranged mind
Dragonsoft Development                \    /     from spending too many hours
...!uunet!tronsbox!dsoft!sauron      `\oo/'      trying to make uucp work!
mort@dhump.lakesys.COM (Mort d`Hump) (01/14/90)
In article <493@dhump.lakesys.COM> I wrote:
>I am having trouble using compress to de-compress large .Z files.
>
>The operation terminates with a "File too large" error.

Thanks to all those who replied; the trouble was with the system
ulimit.  As root I bumped the ulimit with "ulimit 8000", and all
worked as expected.
--
Marty Wiedmeyer
mort@dhump.lakesys.COM
{uunet!marque,uwvax!uwm}!lakesys!dhump!mort
kaleb@mars.jpl.nasa.gov (Kaleb Keithley) (01/16/90)
>In article <493@dhump.lakesys.COM> mort@dhump.lakesys.COM (Mort d`Hump)
>writes:
>I am having trouble using compress to de-compress large .Z files.
>
>The operation terminates with a "File too large" error.
>
The problem is probably your ulimit; you have three options that I know of:

1) set ULIMIT higher in /etc/conf/cf.d/stune and rebuild the kernel.
2) set ULIMIT higher in /etc/defaults/login.
3) as root, type ulimit <big number><CR>, then try uncompressing your
   large .Z file.

I ran into this while trying to uncompress gcc on my unix pc running
ESIX.  The default ULIMIT allows files of about 2Meg, while the
uncompressed tar is about 8Meg.  For an 8Meg file, you need a ulimit
of 16384, since ulimit is expressed in blocks of 512 bytes.

Chewey, get us outta here!

kaleb@mars.jpl.nasa.gov		(818)354-8771
Kaleb Keithley
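Kaleb's arithmetic generalizes: since ulimit counts 512-byte blocks,
divide the expected uncompressed size by 512 (rounding up) to get the
minimum setting.  A quick sketch, using the 8Meg figure from his gcc
example:

```shell
# Minimum ulimit (in 512-byte blocks) for a file of a given size.
bytes=8388608                                  # 8 Meg uncompressed tar
blocks=`expr \( $bytes + 511 \) / 512`
echo "need a ulimit of at least $blocks"       # prints 16384
```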
tinle@aimt.UU.NET (Tin Le) (01/18/90)
In article <493@dhump.lakesys.COM>, mort@dhump.lakesys.COM (Mort d`Hump) writes:
] I am having trouble using compress to de-compress large .Z files.
]
] The operation terminates with a "File too large" error.
]
] I have a PLEXUS P-20 running vanilla Sys VR2 w/ 2Meg of RAM and over
] 25 Meg of free space on the file system I am working in.
]
] The results of compress -V:
] Based on compress.c,v 4.0 85/07/30 12:50:00 joe Release
] Options: BITS = 16
]
] This is the first file I have trouble with:
] mort users 1059441 Dec 21 21:01 SML3.1.cpio.Z
]
] Does anyone have any pointers?
The problem is your ulimit is too low.  If you are in sh, check by simply
typing ulimit; that will print a number telling you the current ulimit.
My guess is that it's set to the default of 2048 blocks (1MB).  Increase
that and you will be fine.  If you are using csh, you will need to change
to sh (temporarily) while you are uncompressing the file.
One more note: you will need root privilege in order to raise your ulimit.
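The check Tin Le describes looks like this in a Bourne-style shell (with
no argument, ulimit prints the current file-size limit; on old Sys V
that's a count of 512-byte blocks, while modern shells may print kbytes
or "unlimited" -- and on Sys VR2 only root may raise it):

```shell
# Show the current file-size limit; raising it is a root-only operation.
ulimit               # print the current limit
# ulimit 16384       # as root: allow files up to 8 Meg (16384 blocks)
```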
--
Tin Le | UUCP: {wyse, claris, uunet}!aimt!tinle
Sr. Software Engineer | Internet: tinle@aimt.uu.net
AIM Technology | XBBS (408)-739-1520 19.2K Telebit+
Santa Clara, CA 95054 | "'tis an ill wind that blows no mind..."