[net.ai] Understanding speech vs. hearing words

Feuerman.pasa@XEROX.ARPA (09/04/84)

The subject has come up about whether one need understand the semantics
of an utterance before one can recognize words, or even syllables.
While it seems some research has been cited on both sides, I
thought it would be interesting to offer an experience of mine as
evidence:

I was travelling in Italy, and it was that time of the evening again,
time to find our daily ration of gelato (Italian ice cream)!  Our search
brought us into a bar of sorts, with Paul Simon's (I think it was Paul
Simon) recording of "Slip Sliding Away" playing in the background.  The
bartender was singing along, only it didn't quite come out right.  What
he was singing was more like "Sleep Sliding Ayway" (all of the vowels
being rather exaggerated).  I regret that I had no way of knowing whether
he had seen the words written down before (which could account for some
of his mispronunciations), but it was pretty clear that he had no idea
of the meaning of what he was singing.


--Ken.

[It seems to me that the same sort of anecdote could be told of any
child; they frequently store and repeat phrases that are to them
merely nonsense (e.g., the alphabet, especially LMNOP).  More to the
point, a good first step in learning any new oral language is to listen
to it, sans understanding, long enough to begin to identify syllables.
This greatly simplifies later word drills since the student can then
grasp the phonetic distinctions that the teacher considers important
(and obvious).  The implication for speech understanding is that it is
indeed possible to identify syllables without understanding, but only
after some training and the development of fairly sophisticated
discriminant capabilities.  -- KIL]