bob@boulder.colorado.edu (Robert Gray ) (02/17/89)
Recent discussions of lexical analysis have focused on the problem of scanning speed. Both Flex and GLA generate very fast scanners. GLA-generated scanners are slightly faster, but Flex has the advantage of being compatible with lex. In this note I'd like to emphasize what I consider GLA's more significant contribution - that of encapsulating knowledge of lexical analysis. GLA provides a simple, yet powerful, interface to a regular expression translator. As an analogy, consider the troff macro packages -me, -ms and -man. They simplify writing papers by encapsulating knowledge of how to begin paragraphs, build numbered lists, ... etc. Most troff users avoid the horrendous task of dealing with troff directly by using one of these macro packages. In addition to simplifying the user's task, they provide standard formats that increase the uniformity of the documents produced using them. (This is particularly true of -man.) Likewise, GLA and its libraries have captured knowledge about lexical analysis. For example, the GLA macro C_STRING_LITERAL arranges for proper scanning and processing of legal C strings and flags illegal ones. This includes the tedious work of handling arbitrary length, multi-line strings, escaped embedded double quotes, null strings, and converting strings with escape sequences such as \\, \r and \013. Coupling GLA with libraries for string storage, identifier hash tables and error reporting (all of which are available from the compiler group at the University of Colorado) allows compiler writers to spend their time on problems that have not yet been solved. Bob Gray (bob@boulder.colorado.edu) Computer Science Dept 430 University of Colorado Boulder Co 80309-0430 -- Send compilers articles to ima!compilers or, in a pinch, to Levine@YALE.EDU Plausible paths are { decvax | harvard | yale | bbn}!ima Please send responses to the originator of the message -- I cannot forward mail accidentally sent back to compilers. Meta-mail to ima!compilers-request