pl@kuling.UUCP (Per Lindgren) (08/09/89)
OS page size query.

I wonder if someone out there can explain the following for me.

Background: This summer some students at our department prepared some computer architecture experiments on Apollo DN4000 workstations. The aim of one experiment was to find out the page size used (for VM paging).

Method: They wrote a program that allocated a big array (4 MB), and then a loop accessed elements in that array at certain intervals (the interval being the guessed page size). After the loop they compared the number of array references with the number of generated page faults, forming the ratio page faults / references made.

Results: For interval sizes (guessed page sizes) of less than 1024 bytes this ratio was less than 1.0, which makes sense. For a guessed page size of 1024 the ratio was ~1.0, and likewise for a guessed page size of 2048 bytes. But for guessed sizes over 2048 the ratio drops below 1.0 again.

Question: It makes sense that for a page size of 2048 the ratio should still be ~1.0, but why does it drop below 1.0 for bigger (guessed) page sizes? I would expect the ratio to stay at 1.0! It is possible that I have missed something fundamental, but I can't figure out what.

If someone can help me with an explanation I would be very happy. Please respond by e-mail, since I don't read this newsgroup regularly.

Per Lindgren
Dept. of Computer Systems
Uppsala University, Sweden
E-mail: pl@mizar.docs.uu.se
--
Per Lindgren                      pl@kuling.UU.SE
Dept. of Computer Systems         or
Uppsala University, SWEDEN        pl%kuling.UU.SE@uunet.UU.NET