[comp.sys.amiga] Long binaries in Facc

bryce@COGSCI.BERKELEY.EDU.UUCP (06/14/87)

In article <3015@> walton@tybalt.caltech.edu.UUCP (Steve Walton) writes:
>In article <AA08259@> bryce@COGSCI.BERKELEY.EDU (Bryce Nesbitt) writes:
>>
>>1> When a large file is loaded in as a chunk it is unlikely that it will be
>>   accessed again soon.  Think about a large binary like an assembler, text
>>   editor or "DeluxeMusic".  Any buffers used to hold these are almost
>>   completely wasted.
>
>I did think about it. When I type "make mg," generally two or more
>source files are going to be recompiled.  If Facc's buffer is large
>enough (256K), both cc and as (Manx) stay in its buffer during this
>process and get re-used for the second and subsequent compiles. 

Good point.  However, given the choice between countless scattered little
cached files and one huge one, I will take the smaller files.  Per byte,
large files are much cheaper to pull back in from secondary storage, so they
gain the least from being cached.

The comment was also directed at files LARGER than the buffer.  With far too
many schemes a large file will come in and wipe the buffer clean.  Since the
file wrapped around past the end of the buffer, the next access will come
entirely from disk, and no cache advantage will be realized.
There are lots of ways to fix this.  With AmigaDOS a simple scheme like this
could work: check the header key before you flush a block; if it belongs to a
future part of a file you plan to load all of, don't replace it.  (However,
Facc can't look into the future and tell whether all of a file will be
loaded in.)