fs@rex.cs.tulane.edu (Frank Silbermann) (06/07/90)
In article <1555@sys.uea.ac.uk>, jrk@sys.uea.ac.uk (Richard Kennaway CMP RA) writes:

> I prefer not to think of pattern-matching as syntactic sugar.
> Why is it better to define pattern-matching by compilation
> to lambda calculus than taking pattern-matching seriously
> in its own right?

The syntactic-sugar interpretation seems to me the most direct way to understand pattern-matching as it is currently implemented in _today's_ languages. You admit you would like to see a different kind of language -- one in which pattern-matching is accepted at face value. For that you will need a set of constructs which is sufficiently expressive, and an evaluation procedure which is correct and complete. You must also be able to rule out invalid programs (e.g. a set of equations which implies 1 = 2).

> I'm not sure what you meant above by "higher order functions".
> In my f/g example:
>
>>> f Nil Nil        = 1        g Nil Nil        = 1
>>> f Nil (Cons x y) = 2        g (Cons x y) Nil = 2
>>> f (Cons x y) z   = 3        g z (Cons x y)   = 3
>
> if I apply f to Nil, I get a function which maps
> Nil to 1 and (Cons ? ?) to 2.  I can also define
> the argument-switching combinator:  c x y z = x z y
> Is this higher-order enough for you?

First, you must convince me that 'f' and 'g' are functions. I admit that this program's operational behavior could be interpreted as the implementation of a few familiar functions, but do the _formal semantics_ interpret them as such? The term rewriting I have studied concerns equivalence classes within a universe of syntactic terms -- functions are never explicitly mentioned. There, 'f' and 'g' would be ordinary constructors.

Denotational semantics deals _explicitly_ with functions and their domains, so it seems a natural framework for describing a functional language (in its pure form). You speculated that the ADJ group's initial-algebra work might bear on this question; I don't know enough to comment.

	Frank Silbermann	fs@rex.cs.tulane.edu
	Tulane University	New Orleans, Louisiana USA
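[A sketch of the f/g example under discussion, rendered in Haskell with built-in lists standing in for Nil/Cons. It also shows the syntactic-sugar reading: the equations for f desugared into a lambda over nested case expressions. The names fNil and f' are illustrative, not from the original posts.]

```haskell
-- The f/g example, with [] for Nil and (:) for Cons.
f :: [a] -> [b] -> Int
f []    []    = 1
f []    (_:_) = 2
f (_:_) _     = 3

g :: [a] -> [b] -> Int
g []    []    = 1
g (_:_) []    = 2
g _     (_:_) = 3

-- Partially applying f to Nil yields a function mapping
-- Nil to 1 and (Cons ? ?) to 2:
fNil :: [b] -> Int
fNil = f []

-- The argument-switching combinator (Haskell's Prelude calls it flip):
c :: (a -> b -> r) -> b -> a -> r
c x y z = x z y

-- Under the syntactic-sugar interpretation, f's three equations
-- compile to a single lambda term over nested case expressions:
f' :: [a] -> [b] -> Int
f' = \u v -> case u of
               []    -> case v of
                          []    -> 1
                          (_:_) -> 2
               (_:_) -> 3

main :: IO ()
main = do
  print (fNil [])      -- 1
  print (fNil [0])     -- 2
  print (c f [] [0])   -- = f [0] [] = 3
  print (f' [0] [])    -- 3, agreeing with f
```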