[mod.computers.vax] sort makes big working files

Wahl.ES@XEROX.ARPA (10/21/85)

Try:

$SORT/PROC=TAG

I found that I HAD to do that to sort some files or the sort would fill
up the ENTIRE disk and then die because of lack of space.
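Spelled out, that might look like the following (a sketch; the file names are
placeholders, the qualifier is VMS SORT's /PROCESS=TAG):

	$ SORT/PROCESS=TAG BIGFILE.TXT BIGFILE.SRT

A tag sort keeps only the keys and record pointers in the work files and
re-reads the input to write the output, so the work files stay much smaller
than with the default record sort.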

--Lisa

hutch@sdcsvax.UUCP (10/23/85)

In article <4594@cca.UUCP> you write:
>From: Jeff Deifik <JDEIFIK@USC-ISIB.ARPA>
>
>I am attempting to sort a text file with the following characteristics:
>2722 blocks
>11343 lines
>variable length lines maximum 32K
>stream LF
>carriage return carriage control
>1848 characters/line actual maximum
>
>I defined sortwork0..9 to be on 5 different big disks and said
>$sort/work_files=1 foo bar
>
>My working set went up to 6087
>The free disk space on one disk went down by 500,000 blocks
>Sort seemed to use space from mainly sortwork1
>I aborted sort when the disk space on one disk ran low
>On other runs sort reported out of disk space, and aborted
>
>Why did sort try to use 500,000 blocks of space?
>What can I do to sort this file?
>	(I really want the key to be the entire line of text)
>
>	Jeff Deifik	jdeifik@usc-isib.ARPA	jdeifik@isi.ARPA
>-------

Sort uses buckets; those work files are the buckets.  There is an option
to control them, but I don't have those manuals at hand (down a floor).
It is also nice to know that if you drop the temp file size down, small
input files will get sorted faster (less filesystem overhead).

Shrinking the work-file size for big sorts will make them take longer, but
then again longer is sooner than never.
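For spreading the work files across disks, something like this should do it
(a sketch; disk and directory names are placeholders, the SORTWORKn logicals
and /WORK_FILES qualifier are VMS SORT's):

	$ DEFINE SORTWORK0 DISKA:[SCRATCH]
	$ DEFINE SORTWORK1 DISKB:[SCRATCH]
	$ SORT/WORK_FILES=2/PROCESS=TAG FOO.TXT BAR.TXT

Combining the spread with /PROCESS=TAG keeps each work file small as well
as putting them on separate spindles.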

/*
	Jim Hutchison	UUCP:	{dcdwest,ucbvax}!sdcsvax!hutch
			ARPA:	hutch@sdcsvax
  [ Of course, these statements were typed into my terminal while I was away. ]
*/