harless@sdd.hp.com (Mike Harless) (11/03/90)
I'm seeing a problem with perl3.0 at patchlevel 37 on HP-UX 7.0 when I
try to suck up an entire file into an array variable.  While running
monitor to see what was happening, the number of I/O bytes looks normal,
but the data space for the process goes out of sight.  For a 750k file,
perl tries to malloc over 50MB.  This kind of puts a cramp on the files
I can read in and process!  :-)

I can reproduce this with perl using the system's malloc or the one that
comes with perl.  I don't know exactly when this broke, as the only old
copy of perl that I've got around is patchlevel 12, and it works fine
there.

Any ideas as to what the problem might be?
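The slurp itself is just the obvious idiom, something like this (the
handle and filename here are stand-ins, not my actual names):

    open(HANDLE, "bigfile") || die "can't open bigfile: $!";
    @array = <HANDLE>;          # read every line of the file into @array
    close(HANDLE);

...Mike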
lwall@jpl-devvax.JPL.NASA.GOV (Larry Wall) (11/03/90)
In article <1990Nov03.001316.26904@sdd.hp.com> harless@sdd.hp.com (Mike Harless) writes:
: I'm seeing a problem with perl3.0 at patchlevel 37 on HP-UX 7.0 when I try
: to suck up an entire file into an array variable.  While running monitor
: to see what was happening, the number of I/O bytes looks normal, but the
: data space for the process goes out of sight.  For a 750k file, perl tries
: to malloc over 50MB.  This kind of puts a cramp on the files I can read in
: and process!  :-)  I can reproduce this with perl using the system's malloc
: or the one that comes with perl.  I don't know exactly when this broke, as
: the only old copy of perl that I've got around is patchlevel 12, and it
: works fine there.
:
: Any ideas as to what the problem might be?
Yes, I see the problem, and will fix it, one way or another. I tried to
avoid copying a string and ended up making each string in the array
several K long. Sigh. I've sometimes traded memory for speed, but
this is ridiculous.
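(That would square with the numbers reported above: assuming lines of,
say, around 60 characters, a 750k file is roughly 12,500 lines, and if
each element drags along a buffer of about 4K, that's 12,500 x 4K, or a
bit over 50MB.  The line length and buffer size here are guesses, not
measurements.)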
For the moment, slurp a file into an array with
push(@array, $_) while <HANDLE>;
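A minimal, self-contained version of that workaround (the handle and
filename are placeholders):

    open(HANDLE, "bigfile") || die "can't open bigfile: $!";
    push(@array, $_) while <HANDLE>;    # each push copies $_ into @array
    close(HANDLE);

The copy made by push() is presumably what sidesteps the bug: each array
element ends up as a string of just the right size, instead of keeping
whatever oversized buffer the read left behind.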
Larry