sandra%defun.utah.edu@cs.utah.edu (Sandra J Loosemore) (10/02/89)
In article <26873@genrad.UUCP> charlie@genrad.com (Charlie D. Havener) writes:
>It is not at all clear to me how to apply OOD or to use Object
>oriented programming style to the design of program language
>compilers or interpreters.

I've been working on some code that is being used as part of both a
compiler and interpreter for Common Lisp.  The front end does certain
kinds of syntactic preprocessing on ordinary Lisp code (alphatization,
macroexpansion, and the like), and converts it to a parse tree
representation where each of the special forms in the language is a
different kind of object.  Each one of these special form types has
its own methods for evaluation, code generation, prettyprinting, etc.

I think that eventually it will be extended to support two other
dialects of Lisp (Standard Lisp and Scheme) as well.  Each dialect
will need its own codewalker, but they will be able to share quite a
few of the special form types and methods, since there are a lot of
similarities between the semantics of the different dialects.

-Sandra Loosemore (sandra@cs.utah.edu)
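The parse-tree organization described above can be sketched in a few lines of Python. This is a hypothetical illustration under invented names, not code from the Utah compiler: each special form is its own class, and each class carries its own methods for evaluation and prettyprinting.

```python
# Hypothetical sketch: each special form in the language is a distinct
# kind of object, with its own methods for evaluation, prettyprinting,
# (and, in a real compiler, code generation).  Names are invented here.

class SpecialForm:
    """Base class: every special form answers the same messages."""
    def evaluate(self, env):
        raise NotImplementedError
    def prettyprint(self):
        raise NotImplementedError

class Constant(SpecialForm):
    def __init__(self, value):
        self.value = value
    def evaluate(self, env):
        return self.value
    def prettyprint(self):
        return repr(self.value)

class If(SpecialForm):
    def __init__(self, test, then, else_):
        self.test, self.then, self.else_ = test, then, else_
    def evaluate(self, env):
        # Each form knows how to evaluate itself; no central dispatch.
        branch = self.then if self.test.evaluate(env) else self.else_
        return branch.evaluate(env)
    def prettyprint(self):
        return "(if %s %s %s)" % (self.test.prettyprint(),
                                  self.then.prettyprint(),
                                  self.else_.prettyprint())

expr = If(Constant(True), Constant(1), Constant(2))
print(expr.evaluate({}))   # 1
print(expr.prettyprint())  # (if True 1 2)
```

A second dialect could then share `Constant` and `If` wholesale and subclass only the forms whose semantics differ, which is the sharing between dialects the post describes.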
charlie@genrad.com (Charlie D. Havener) (10/02/89)
The examples I have seen on Object Oriented Design apply nicely to
problems like graphic window systems and to problems in which there
are real physical objects one can think about.  It is not at all clear
to me how to apply OOD or to use Object oriented programming style to
the design of program language compilers or interpreters.

There seems to be a classic way to do these things, i.e. the
well-established lexer-parser-code generator pipeline.

Can someone who is comfortable with OOD comment on its applicability
to such problems?  Is OOD a poor match to some problems?  Will anyone
admit it?

Comments pro and con appreciated, thanks
Charlie Havener  GenRad Inc.  (508-369-4400 x3302)  charlie@genrad.com
alms@cambridge.apple.com (Andrew L. M. Shalit) (10/03/89)
In article <26873@genrad.UUCP> charlie@genrad.com (Charlie D. Havener) writes:
The examples I have seen on Object Oriented Design apply
nicely to problems like graphic window systems and to problems
in which there are real physical objects one can think about.
It is not at all clear to me how to apply OOD or to use Object
oriented programming style to the design of program language
compilers or interpreters.
There seems to be a classic way to do these things, i.e. lexer-
parser-code generator that is well established.
Can someone who is comfortable with OOD comment on its applicability
to such problems? Is OOD a poor match to some problems? Will anyone
admit it?
Henry Lieberman (then of the MIT AI-Lab, now of the Media Lab) wrote
an interesting series of papers on object-oriented interpreters. I
don't know if he ever got past the prototype stage, but he did
demonstrate some very powerful uses of objects in interpreter design.
If I remember correctly, he started with the notion of a 'program'
object, which knows how to respond to the 'run' message. (He actually
worked in Lisp, so it was an 'expression' object and an 'eval'
message.) By specializing 'run', you can incrementally modify the
semantics of the language you are interpreting.
Sorry, I can't supply references. I'm actually not sure if these
papers were ever really published. Perhaps someone else reading
this would know.
Piersol@apple.com (Kurt Piersol) (10/03/89)
In article <26873@genrad.UUCP> charlie@genrad.com (Charlie D. Havener) writes:
> It is not at all clear to me how to apply OOD or to use Object
> oriented programming style to the design of program language
> compilers or interpreters.

Peter Deutsch at ParcPlace has built some very interesting parser
generators using O-O techniques.  He probably has some excellent
thoughts on the matter.  However, there are a few points which come to
mind almost immediately.

One of the characteristics of Smalltalk-like O-O systems is
polymorphism, which, when combined with standard message protocols,
forms a concept of types for such languages.  This characteristic can
be exploited by clever compilation engines to provide easily replaced
lexical analysers and parsers.  While it is certainly possible to
accomplish this in non-O-O languages, it appears much easier in an O-O
style.

Consider the concept of a token object, which has language-specific
contents but language-independent properties.  It might contain an
identifier in Kanji, yet have properties which identify it as an
identifier with a certain identity.  Or perhaps it is a binary
operator, with the basic properties of such an operator but as yet
undefined function.  This object could be used by a
language-independent parser and code generator, which could be the
back end for a language-independent compiler.  The analyzer, which has
localized properties, produces a stream of these generalized tokens,
which can then be used to generate the code.  However, each token
still retains its full information content, and can be used to easily
reconstruct the syntax for debugging purposes.

The concept can be applied with greater levels of generalization, to
larger and larger language constructions.  Given time, it can serve as
a basis for support of multiple syntax forms in a single language and
programming environment.
By using this sort of construction, O-O systems can naturally
represent structures which have taken a great deal of work in the
traditional compiler community.  I'm interested in such concepts,
having implemented a few compilers in ST80, and would love to talk
offline with anyone interested.

Kurt Piersol, Senior Scientist
Usenet: {sun,...}!apple!Piersol   Internet: Piersol@apple.com   AppleLink: Piersol.k
Disclaimer: The opinions presented in this flame do not in any way
represent the opinions of anyone, even myself whilst I was writing it.
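The token object described above, language-specific contents behind language-independent properties, might look like this. A minimal sketch with invented names, not ParcPlace code:

```python
# Hypothetical token object: the lexeme (its contents) is language-
# specific -- it may be spelled in Kanji -- while the properties a
# parser cares about (kind, operator precedence) are language-
# independent.  All names here are invented for illustration.

class Token:
    def __init__(self, kind, lexeme, precedence=None):
        self.kind = kind              # e.g. 'identifier', 'binary-op'
        self.lexeme = lexeme          # original spelling, any script
        self.precedence = precedence  # meaningful only for operators

    def is_identifier(self):
        return self.kind == 'identifier'

    def __repr__(self):
        # Full information content is retained, so the original
        # syntax can be reconstructed for debugging.
        return "Token(%r, %r)" % (self.kind, self.lexeme)

# An identifier spelled in Kanji still answers identifier questions:
t = Token('identifier', '変数')
assert t.is_identifier()

# A binary operator with the basic properties of one, but with its
# function as yet undefined:
plus = Token('binary-op', '+', precedence=10)
```

A language-independent parser need only consult `kind` and `precedence`, so localized lexers for different languages can feed the same back end, as the post suggests.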
steve@arc.UUCP (Steve Savitzky) (10/03/89)
In article <26873@genrad.UUCP> charlie@genrad.com (Charlie D. Havener) writes:
>The examples I have seen on Object Oriented Design apply
>nicely to problems like graphic window systems and to problems
>in which there are real physical objects one can think about.
>It is not at all clear to me how to apply OOD or to use Object
>oriented programming style to the design of program language
>compilers or interpreters.
>
>There seems to be a classic way to do these things, i.e. lexer-
>parser-code generator that is well established.
>
>Can someone who is comfortable with OOD comment on its applicability
>to such problems?  Is OOD a poor match to some problems?  Will anyone
>admit it?

Well, the last time I wrote a Smalltalkish byte compiler, there were
three obvious candidates for "physical" objects: the lexer, the syntax
productions, and the abstract syntax trees.  Basically, you do
something like:

    (Program parse: aStream tokenize) emitCode

i.e. first you tell a character stream to pass you back a stream of
tokens, then you tell the top-level production to parse the tokens,
then you tell the root of the resulting tree to emit some code.  You
may end up using the same set of classes for both syntax and
semantics; i.e. the *class* Expression may parse an input stream,
producing an *instance* of Expression in the process.

As for interpreters, that's basically what a Lisp interpreter does:
take an Expression object and send it the message "value".  You can
refine this by having different subclasses of Expression like If, For,
and so on.  An interpreter written this way will typically run slower
than one that takes the "standard" approach of compiling into byte
codes or machine language, but has advantages for debugging, algorithm
animation, and the like.  Actually, it may run faster if things
evaluate themselves by performing incremental compilation.

>Comments pro and con appreciated, thanks
>Charlie Havener GenRad Inc. (508-369-4400 x3302) charlie@genrad.com

You're welcome.
--
Steve Savitzky              | steve@arc.uucp | apple.com!arc!steve
ADVANsoft Research Corp.    | (408) 727-3357(w) / 294-6492(h)
4301 Great America Parkway  | #include<disclaimer.h>
Santa Clara, CA 95054       | May the Source be with you!
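The Smalltalk one-liner `(Program parse: aStream tokenize) emitCode` in the post above transliterates directly into Python. The grammar and "byte codes" below are invented placeholders; the point is the pipeline of messages, and that the *class* parses while the result is an *instance*:

```python
# A minimal transliteration of (Program parse: aStream tokenize) emitCode:
# tokenize a stream, hand the tokens to the top-level production, tell
# the resulting tree to emit code.  Grammar and codes are invented.

def tokenize(text):
    """Stand-in lexer: the 'stream of tokens' from the character stream."""
    return text.split()

class Program:
    def __init__(self, words):
        self.words = words

    @classmethod
    def parse(cls, tokens):
        # The *class* Program parses tokens, producing an *instance*.
        return cls(list(tokens))

    def emit_code(self):
        # The root of the tree emits some (made-up) byte codes.
        return [("PUSH", w) for w in self.words]

code = Program.parse(tokenize("a b c")).emit_code()
print(code)  # [('PUSH', 'a'), ('PUSH', 'b'), ('PUSH', 'c')]
```

Each stage is an object answering one message, so a different lexer or a different top-level production can be substituted without touching the others.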
preston@titan.rice.edu (Preston Briggs) (10/03/89)
In article <26873@genrad.UUCP> charlie@genrad.com (Charlie D. Havener) writes:
>It is not at all clear to me how to apply OOD or to use Object
>oriented programming style to the design of program language
>compilers or interpreters.
>There seems to be a classic way to do these things, i.e. lexer-
>parser-code generator that is well established.
>Charlie Havener

It's no big deal.  There are lots of opportunities to do beautiful OO
things with any large project, compilers included.  For an existence
proof, see the Smalltalk system.

Compilers need symbol tables of various sorts.  Sounds like a great
object to me!  Need a syntax tree?  Another object.  A flow graph?
Another object.  Bit vectors (or more generally, lattices)?  Another
object.

In a nice system, I can imagine applying, very cleanly, all the
various graph manipulation and traversal algorithms to my specialized
graph objects.  In C, you invent some interesting interlinking
structure to represent a flow graph.  Then you hack up a version of
Tarjan's reducibility checker.  Then you spend some days debugging it.
In a nice language, I would subclass a more general, existing graph
class and use the existing message "reducible?" (or I'd like to,
someday).  Of course, it may run very slowly and take a lot of memory,
but the design part is still valid.  The implementation, however, may
need work.

Preston Briggs
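The symbol table named above is perhaps the easiest of these to make concrete. A hypothetical sketch (not from any particular compiler): a table is an object that chains to its enclosing scope, so "lookup" is one message rather than scattered search code.

```python
# Hypothetical symbol-table object: each table chains to an enclosing
# scope, so nested scopes are just nested objects.  Invented names.

class SymbolTable:
    def __init__(self, parent=None):
        self.parent = parent    # enclosing scope, or None at top level
        self.bindings = {}

    def define(self, name, info):
        self.bindings[name] = info

    def lookup(self, name):
        # Search this scope, then delegate outward to the enclosing one.
        if name in self.bindings:
            return self.bindings[name]
        if self.parent is not None:
            return self.parent.lookup(name)
        raise KeyError("undeclared: %s" % name)

globals_ = SymbolTable()
globals_.define("x", "int")
locals_ = SymbolTable(parent=globals_)
locals_.define("y", "float")

print(locals_.lookup("x"))  # int   (found in the enclosing scope)
print(locals_.lookup("y"))  # float
```

The syntax tree, flow graph, and lattice objects mentioned in the post would follow the same pattern: one class per structure, with the traversal algorithms as methods.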
ttwang@polyslo.CalPoly.EDU (Thomas Wang) (10/03/89)
charlie@genrad.com (Charlie D. Havener) writes:
>The examples I have seen on Object Oriented Design apply
>nicely to problems like graphic window systems and to problems
>in which there are real physical objects one can think about.
>It is not at all clear to me how to apply OOD or to use Object
>oriented programming style to the design of program language
>compilers or interpreters.

An object language is well suited to the development of compilers.
There are many structures inside a compiler where an object language
can help out.  For example: sets, graphs, hash tables, stacks, symbol
tables, and sparse matrices.  I managed to write an LL(1) backtracking
parser generator in one month using C++, so I can comfortably say that
the above things definitely helped.

>Charlie Havener GenRad Inc. (508-369-4400 x3302) charlie@genrad.com

-Thomas Wang ("I am, therefore I am." - Akira)  ttwang@polyslo.calpoly.edu
barmar@kulla (Barry Margolin) (10/03/89)
In article <4514@internal.Apple.COM> Piersol@apple.com (Kurt Piersol) writes:
>In article <26873@genrad.UUCP> charlie@genrad.com (Charlie D. Havener)
>writes:
>> It is not at all clear to me how to apply OOD or to use Object
>> oriented programming style to the design of program language
>> compilers or interpreters.
>Consider the concept of a token object, which has language specific
>contents but language independent properties.

In fact, I believe that Symbolics's "conventional language" compilers
for their Lisp Machines are heavily object oriented.  I haven't seen
much of their source code, but I've been in the debugger while using
their C and Fortran compilers, and I've seen lots of objects that
represent tokens, expressions, statements, types, etc.

It seems like OOP would be a good foundation for building universal
compilers, where different front and back ends can be plugged in.  It
should be more elegant than defining a universal intermediate
representation.

Barry Margolin, Thinking Machines Corp.
barmar@think.com  {uunet,harvard}!think!barmar
dchapman@portia.Stanford.EDU (David Chapman) (10/03/89)
In article <26873@genrad.UUCP> charlie@genrad.com (Charlie D. Havener) writes:
> [Can OOD be used for compilers and interpreters?]

Sure.  I'm working on a compiler for an internal language that has two
different code generators.  I defined a "code generator" interface
(the language, MAINSAIL, isn't quite object-oriented in the sense that
Smalltalk or C++ are) and wrote two different modules that implemented
that interface.

The primary reason that you don't think in terms of objects for
something like a compiler is that you only have _one_ of the "objects"
at a time - one lexical analyzer feeds one parser feeds one optimizer
feeds one code generator.

On the other hand, if you wrote a compiler in the way that the FORTH
language is defined - as a set of words to call, such that the parsing
function (for example) is distributed among the words - you could
define a powerful and easily extensible (if unconventional) compiler.
I've thought of doing it myself (one of these days; it's number 973 on
my list of things to do :-).

>The examples I have seen on Object Oriented Design apply
>nicely to problems like graphic window systems and to problems
>in which there are real physical objects one can think about.

These are "classic" examples in much the way that you noted there is a
classic way of writing compilers and interpreters.  That abstract
examples do not exist yet is, in my opinion, more a function of the
relative newness of OOP.

>Can someone who is comfortable with OOD comment on its applicability
>to such problems?  Is OOD a poor match to some problems?  Will anyone
>admit it?

There is real power in just about any programming technique you want
to try - procedural (Pascal, C); functional (LISP, APL); logical
(PROLOG); or object-oriented.  Which works best?  Whichever one you
yourself are most comfortable with and find easiest to use.  Why learn
new ones?  You might find that a new paradigm fits your style of
thinking better.
I'm slowly shifting from procedural languages to object-oriented ones. Using C++ makes that transition easier for me, but that's personal preference only and I won't criticize your choice of language.
stephen@temvax.UUCP (Stephen C. Arnold) (10/03/89)
In article <26873@genrad.UUCP> charlie@genrad.com (Charlie D. Havener) writes:
>The examples I have seen on Object Oriented Design apply
>nicely to problems like graphic window systems and to problems
>in which there are real physical objects one can think about.
>It is not at all clear to me how to apply OOD or to use Object
>oriented programming style to the design of program language
>compilers or interpreters.
>
>There seems to be a classic way to do these things, i.e. lexer-
>parser-code generator that is well established.
>
>Can someone who is comfortable with OOD comment on its applicability
>to such problems?  Is OOD a poor match to some problems?  Will anyone
>admit it?
>
>Comments pro and con appreciated, thanks
>Charlie Havener GenRad Inc. (508-369-4400 x3302) charlie@genrad.com

I really don't count as someone comfortable with OOD, but I have
considered the problem.  The making of lexical analyser and parser
generators is well studied and the techniques are quite good.  My
interest is in the extension of these tools to attribute grammars.

From attempts to implement an attribute grammar in C (using lex and
yacc), I found the nodes of the parse tree acting very much like
objects.  The active attribute instances appear to be messages from
one node to the next, and the passive attribute instances appear to be
the public data space.  Since each node of the parse tree is so
similar in structure, it appears that a superclass of node, with
specific subclasses for each type of node, may be an easy way of
creating the "data structure" and many of the operations needed for
each data structure.  If the node is considered to evaluate its
passive attribute instances to produce its active attribute instances,
this could be internal behavior of the object.

When all the messages containing the passive attribute instances
needed to evaluate an active attribute instance have come in, the
message containing the active attribute instance could be generated
and sent to the node in which this attribute instance is passive.
I've never seen such an implementation, but it does seem a natural
application of OOD to attribute grammars, and also a natural
application of OOD to objects that do not correspond to some physical
object (such as window displays).
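The node-superclass idea above can be sketched for a toy grammar of sums. This is a hypothetical illustration with invented names: each kind of parse-tree node is a subclass of a common Node class, its stored fields play the role of passive attribute instances, and the value a node passes upward plays the role of an active (synthesized) attribute delivered by a message from each child.

```python
# Sketch of parse-tree nodes as objects for an attribute grammar:
# a Node superclass, one subclass per node type.  Stored fields act
# as passive attributes; value() delivers the active (synthesized)
# attribute as a message result.  Grammar and names are invented.

class Node:
    def value(self):
        """Evaluate passive attributes to produce the active one."""
        raise NotImplementedError

class Num(Node):
    def __init__(self, n):
        self.n = n                  # passive attribute: the literal
    def value(self):
        return self.n

class Add(Node):
    def __init__(self, left, right):
        self.left, self.right = left, right
    def value(self):
        # The children's messages deliver their active attributes,
        # which this node combines into its own.
        return self.left.value() + self.right.value()

tree = Add(Add(Num(1), Num(2)), Num(3))
print(tree.value())  # 6
```

Inherited attributes would flow the other way, as arguments passed down through the same messages, but the single-superclass shape stays the same.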
djones@megatest.UUCP (Dave Jones) (10/04/89)
From article <1989Oct2.204603.10320@polyslo.CalPoly.EDU>, by ttwang@polyslo.CalPoly.EDU (Thomas Wang):
> I managed to write an LL(1) backtracking parser generator in one
> month using C++, so I can comfortably say ...

I managed to write an LR(1) parser-generator with default reductions
and equivalent-state merging, in C, in under three weeks.

Going once, going twice...
grover%brahmand@Sun.COM (Vinod Grover) (10/04/89)
In article <26873@genrad.UUCP> charlie@genrad.com (Charlie D. Havener) writes:
>The examples I have seen on Object Oriented Design apply
>nicely to problems like graphic window systems and to problems
>in which there are real physical objects one can think about.
>It is not at all clear to me how to apply OOD or to use Object
>oriented programming style to the design of program language
>compilers or interpreters.
>
>Can someone who is comfortable with OOD comment on its applicability
>to such problems?  Is OOD a poor match to some problems?  Will anyone
>admit it?

IF (big if:) we understand OOD to include the use of inheritance, THEN
IDL (Interface Description Language) can be said to be
"object-oriented".  Many compilers use IDL to describe the
intermediate language and data structures.  You define a class of data
for some program node, and various backend phases inherit from this
class of data to add attributes to that class of data.  DIANA is a
publicly available IDL description for use in Ada compilers.

-- Vinod Grover
markg@ashtate (Mark Grand) (10/05/89)
dchapman@portia.Stanford.EDU (David Chapman) writes:
> Sure.  I'm working on a compiler for an internal language that has two
> different code generators.  I defined a "code generator" interface (the
> language, MAINSAIL, isn't quite object-oriented in the sense that
> Smalltalk or C++ are) and wrote two different modules that implemented
> that interface.
>
> The primary reason that you don't think in terms of objects for
> something like a compiler is that you only have _one_ of the "objects"
> at a time - one lexical analyzer feeds one parser feeds one optimizer

Some years ago I was writing a compiler for a language that really did
need to be able to have multiple instances of its lexical analyzer,
parser, and friends.  The reason for this was that when an undefined
reference to a certain class of object was encountered in the source,
the compiler was supposed to suspend what it was doing and go compile
another file that was supposed to define the object.  This compiler
was written in MAINSAIL.

----------------
Mark Grand                          markg@alexis.a-t.com
Ashton-Tate                         ...!ashtate!alexis!markg
Walnut Creek Development Center     (415) 746-1570
2033 N. Main Street, Suite 980
Walnut Creek, CA 94596-3722
schrader@super.ORG (Jennifer A. Schrader) (10/10/89)
In article <8497@goofy.megatest.UUCP> djones@megatest.UUCP (Dave Jones) writes:
>From article <1989Oct2.204603.10320@polyslo.CalPoly.EDU>, by ttwang@polyslo.CalPoly.EDU (Thomas Wang):
>
>> I managed to write a LL(1) backtracing parser generator in one
>> month using C++, so I can comfortably say ...
>
>I managed to write an LR(1) parser-generator with default reductions
>and equivalent state merging, in C, in under three weeks.
>
>Going once, going twice...

I managed to write an LW(3,498) concurrent parser-generator with
default reductions, equivalent state-merging and sub-differentiation
with seven yak-and-poker retrieval and an alternate compiler-optimizer
with a small coffee-maker in just under three minutes in True Basic.
And it worked the first time.  (SOLD!)

Jenn
nsw@cbnewsm.ATT.COM (Neil Weinstock) (10/11/89)
In article <15231@super.ORG> schrader@metropolis.UUCP (Jennifer A. Schrader) writes: [ ... ] >I managed to write an LW(3,498) concurrent parser-generator with >default reductions, equivalent state-merging and sub-differentiation with >seven yak-and-poker retrieval and an alternate compliler-optimizer >with a small coffee-maker in just under three minutes in True >Basic. And it worked the first time. Yeah, I once did something similar, but in Vax microcode. The hard part was typing it in hex... ________________ __________________ ____________________________ //// \\// \\// \\\\ \\\\ Neil Weinstock //\\ att!cord!nsw or //\\ "Oh dear, now I shall have //// //// AT&T Bell Labs \\// nsw@cord.att.com \\// to create more Martians." \\\\ \\\\________________//\\__________________//\\____________________________////
david@jpl-devvax.JPL.NASA.GOV (David E. Smyth) (10/13/89)
In article <26873@genrad.UUCP> charlie@genrad.com (Charlie D. Havener) writes:
>The examples I have seen on Object Oriented Design apply
>nicely to problems like graphic window systems and to problems
>in which there are real physical objects one can think about.
>It is not at all clear to me how to apply OOD or to use Object
>oriented programming style to the design of program language
>compilers or interpreters.
>
>There seems to be a classic way to do these things, i.e. lexer-
>parser-code generator that is well established.
>
>Can someone who is comfortable with OOD comment on its applicability
>to such problems?  Is OOD a poor match to some problems?  Will anyone
>admit it?

Actually, I've found that OOD and OOP work GREAT for implementing
compilers and interpreters.  I still use lex and yacc or similar
tools, but the action code in all cases deals with objects.  It's
really very clean: a symbol table is a container of symbols, each of
which is an object; some operations apply to symbol tables, some to
symbols.  The same operation applied to two different kinds of symbols
may do different things: "allocate space" for a floating point
variable gives, say, 4 bytes; for a boolean it perhaps packs the bit
into a word.

I talked with a few people implementing SOTA compilers last week at
OOPSLA and they have been finding the same thing: for example,
instructions can generate their own bit patterns.  Note that Peter
Deutsch is renowned both for OO stuff (Smalltalk) and SOTA compilers.

The implementation of compiler concepts via encapsulation of code and
data (i.e., objects) allows the compiler to evolve easily: not only
does OOP work for a quick initial implementation, but the code is
easier to enhance to get a really good compiler over time.  I think
that compiler implementation is one of the BEST places to use OOD and
OOP I have ever seen.

In fact, conceptually OOP is kinda like yacc-generated compilers: the
actions are short, the methods are short.  The actions have a very
well-constrained environment within which they are invoked; the
methods on an object are likewise very constrained and localized.
Try it, you'll like it.

I used C++ with yacc: just use methods and instance variables, virtual
functions, and single inheritance: "a better C."  Avoid the overloaded
operators and multiple inheritance like the plague, and you'll be
productive very quickly.  Note that you really don't need to use C++;
I've actually gone "back" to using C with structs for classes and
instances.  OOP doesn't require an OOPL like C++ any more than
structured programming required COBOL.  OOP in C really works very
easily, and I think it gives me more control over the resultant
software (that's what I've always liked about C).
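The "allocate space" example in the post above, the same message doing different things for different kinds of symbols, can be sketched as follows. The sizes and packing scheme are illustrative assumptions, not from any real compiler:

```python
# Sketch of the polymorphic "allocate space" operation described
# above: the same message sent to different kinds of symbols does
# different things.  Sizes and packing are invented for illustration.

class Symbol:
    def allocate(self, layout):
        raise NotImplementedError

class FloatSymbol(Symbol):
    def allocate(self, layout):
        # A float gets, say, 4 bytes of its own.
        offset = layout["bytes"]
        layout["bytes"] += 4
        return ("byte-offset", offset)

class BoolSymbol(Symbol):
    def allocate(self, layout):
        # A boolean is packed bit-by-bit into a word.
        bit = layout["bits"]
        layout["bits"] += 1
        return ("bit-offset", bit)

layout = {"bytes": 0, "bits": 0}
symbols = [FloatSymbol(), BoolSymbol(), FloatSymbol()]
addrs = [s.allocate(layout) for s in symbols]
print(addrs)  # [('byte-offset', 0), ('bit-offset', 0), ('byte-offset', 4)]
```

The caller, here the list comprehension standing in for a yacc action, never inspects the symbol's kind; each symbol knows its own storage discipline, which is exactly the cleanliness the post claims.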
djones@megatest.UUCP (Dave Jones) (10/13/89)
From article <5226@cbnewsm.ATT.COM>, by nsw@cbnewsm.ATT.COM (Neil Weinstock):
> Yeah, I once did something similar, but in Vax microcode.  The hard
> part was typing it in hex...

Typing it?  TYPING IT?  You kids got it too easy today.  When I was
your age, we used to enter the operating system into core each morning
using the toggle switches on the front panel.
davidc@vlsisj.VLSI.COM (David Chapman) (10/17/89)
In article <8721@goofy.megatest.UUCP> djones@megatest.UUCP (Dave Jones) writes:
<From article <5226@cbnewsm.ATT.COM>, by nsw@cbnewsm.ATT.COM (Neil Weinstock):
<...
<> Yeah, I once did something similar, but in Vax microcode. The hard part was
<> typing it in hex...
<
<Typing it? TYPING IT? You kids got it too easy today. When I was your
<age, we used to enter the operating system into core each morning using
<the toggle switches on the front panel.
You had an operating system? Oh how I wished I could have one when I was
younger...
And just be glad you had core instead of mercury delay lines!
--
David Chapman
{known world}!decwrl!vlsisj!fndry!davidc
vlsisj!fndry!davidc@decwrl.dec.com
barmar@kulla (Barry Margolin) (10/20/89)
In article <15367@vlsisj.VLSI.COM> davidc@vlsisj.UUCP (David Chapman) writes:
>In article <8721@goofy.megatest.UUCP> djones@megatest.UUCP (Dave Jones) writes:
>You had an operating system?  Oh how I wished I could have one when I was
>younger...
>And just be glad you had core instead of mercury delay lines!

At least you HAD computer memory!  When I was a lad, the computer
would ask ME for the value of every word.  It did it in binary with a
knife, slashing my left side for a 0 and my right side for a 1 in the
address.  I- and D-space were indicated by poking one of my eyes out.

Barry Margolin, Thinking Machines Corp.
barmar@think.com  {uunet,harvard}!think!barmar
peter@ficc.uu.net (Peter da Silva) (10/20/89)
> <> typing it in hex...
> <we used to enter the operating system into core using the toggle switches...
> And just be glad you had core instead of mercury delay lines!

Wimps.  Hiding behind electromechanical hardware.  What about Babbage
engines?
-- 
Peter da Silva, *NIX support guy @ Ferranti International Controls Corporation.
Biz: peter@ficc.uu.net, +1 713 274 5180.  Fun: peter@sugar.hackercorp.com.
`-_-'  "You can tell when a USENET discussion is getting old when one of the
'U`    participants drags out Hitler and the Nazis" -- Richard Sexton
levin@bbn.com (Joel B Levin) (10/20/89)
In article <31004@news.Think.COM> barmar@kulla (Barry Margolin) writes:
|At least you HAD computer memory!  When I was a lad, the computer would ask
|ME for the value of every word.  It did it in binary with a knife, slashing
|my left side for a 0 and my right side for a 1 in the address.  I- and
|D-space were indicated by poking one of my eyes out.

At least on my first machine the stones I moved from pile to pile were
mostly(*) harmless.

/JBL

(*) There was the time a scorpion was nesting in one pile.

Nets: levin@bbn.com or {...}!bbn!levin | POTS: (617)873-3463
eachus@mbunix.mitre.org (Robert Eachus) (10/21/89)
In article <15367@vlsisj.VLSI.COM> davidc@vlsisj.UUCP (David Chapman) writes:
>You had an operating system?  Oh how I wished I could have one when I was
>younger...
>And just be glad you had core instead of mercury delay lines!

This has gone far enough (well, almost :-)...  Mercury delay lines?
Sounds like a Univac I or EDVAC.  REAL old timers can tell you about
programming the ENIAC using rotary switches.  (Or programming an IBM
403 {Yetch! Acch! Pfft!} or a Burroughs E-101 using plugboards.)  The
first time I saw RPG-II, it looked vaguely familiar; I finally
realized that it was originally designed to allow reuse of old IBM 403
plugboard programs.  Or isn't that the kind of software reuse you had
in mind....

Seriously, although I never programmed the ENIAC, when the HP-55 came
out (yes, the handheld programmable calculator), my father and I dug
out some old ENIAC code, since the architectures were so similar, and
did some comparative benchmarks.  The HP-55 weighed in at 50 times the
speed of the ENIAC, cost a lot less, and was much more portable.

The smallest and most ancient machine for which I ever attempted a
parser and code generator was a Royal McBee RPC-4000 with 24-bit words
and an 8000-odd-word drum memory.  (The last few tracks stored eight
words each, but they came around eight times as fast due to the
creative use of additional read and write heads.)  Input-output was
paper tape and console switches, and console switches ONLY.  (The
display was an oscilloscope trace of the register contents; no
blinking lights even.)  Made debugging real fun.  Code patches to a
program were done by splicing or creative use of Scotch tape and a
hole punch, and when I was working on a bootstrap loader our Friden
Flexowriter's punch was broken, so...  I wrote a bootstrap loader for
an RPC-4000... AND HAND PUNCHED IT INTO THE PAPER TAPE!  (It was about
fifty words long, written in hex so that I could put key entry points
in words with the "right" addresses.  This allowed us to hook the
Flexowriter up directly to the machine and type things like LOAD
PUNCH.  Woopie!)  We sure have it easy nowadays.

I would have directed followups to comp.history.ancient, but as far as
I know, those machines are not connected to the Internet.

Robert I. Eachus

with STANDARD_DISCLAIMER; use STANDARD_DISCLAIMER;
function MESSAGE (TEXT: in CLEVER_IDEAS) return BETTER_IDEAS is...
eliot@phoenix.Princeton.EDU (Eliot Handelman) (10/21/89)
In article <31004@news.Think.COM> barmar@kulla (Barry Margolin) writes:
;At least you HAD computer memory!  When I was a lad, the computer would ask
;ME for the value of every word.  It did it in binary with a knife, slashing
;my left side for a 0 and my right side for a 1 in the address.  I- and
;D-space were indicated by poking one of my eyes out.

You had BINARY?  Well that shore beats all get out.  Back when I was a
youngster we had a unary system, it were almost plum useless.  If you
wanted to add 2+2 you needed 4 separate machines and you got your
answer by counting the number of machines present.
dwiggins@atsun.a-t.com (Don Dwiggins) (10/25/89)
> You had BINARY?  Well that shore beats all get out.  Back when I was a
> youngster we had a unary system, it were almost plum useless.  If you
> wanted to add 2+2 you needed 4 separate machines and you got your
> answer by counting the number of machines present.

Ahh, you folks are reminding me of my experiences in the '50s (19,
that is), working on the THROBAC (THrifty ROman numeral BAckward-looking
Calculator).  I was designing the divide unit -- a truly challenging
task.  Unfortunately, the whole project was scrapped when it was
discovered that we had no way to represent zero.  A pity; it was going
to be a truly impressive-looking machine.
--
Don Dwiggins                 "Solvitur Ambulando"
Ashton-Tate, Inc.            dwiggins@ashtate.uucp
                             dwiggins@ashtate.a-t.com