[net.nlang] Diacritics...

jbdp@jenny.UUCP (Julian Pardoe) (08/15/85)

In article <talcott.483> tmb@talcott.UUCP (Thomas M. Breuel) writes:
> Diacritical marks,  contracted letters, and special characters are not
> a sign of cultural identity -- they  are  annoying  leftovers  from  a
> time  in  which people used to do most of their writing with a pen (or
> a brush, on the other side of the world).  Let's hope they'll soon get
> out of fashion!

They  are  not  annoying  leftovers  but  necessary compensation for the
insufficiency of the Latin alphabet,  in particular its lack of  symbols
for  the  sounds  sh,  zh  &c...  Remember  that  `j',  `v' and `w' were
originally ``contracted  letters,  and  special  characters''.  (And  of
course  the  Romans  borrowed  `y' and `z' from Greek fairly late in the
day -- and modified `c' to produce `g'...)

[So...]

English  is one of the feuu languages that can get by uuithout adding to
that alphabet (and one of the feuu  that  uses  all  of  it),  but  only
because  uue're  prepared  to  put  up  uuith  such  a  loose connection
betuueen sound and symbol.  I iust  don't  belieue  it  uuould  euen  be
possible  to deuise *usable* orthographies for the many languages of the
uuorld that relied  on  combinations  of  letters  rather  than  special
letters and diacritics.

Houu  one  can  uurite  a  C program in Hungarian or Serbo-Croat I don't
knouu... It's an interesting problem!

Julian Pardoe

-------------

University of Cambridge         Tel:     +44 223 352435 ext. 265
        Computer Laboratory     Arpa:    <@ucl-cs: jbdp@cl.cam.ac.uk>
Corn Exchange Street            Janet:   jbdp@UK.AC.Cam.CL
CAMBRIDGE, CB2 3QG              UUCP:    mcvax!ukc!cl-jenny!jbdp
Great Britain

tmb@talcott.UUCP (Thomas M. Breuel) (08/17/85)

In article <258@jenny.UUCP>, jbdp@jenny.UUCP (Julian Pardoe) writes:
> They  are  not  annoying  leftovers  but  necessary compensation for the
> insufficiency of the Latin alphabet,  in particular its lack of  symbols
> for  the  sounds  sh,  zh  &c...  Remember  that  `j',  `v' and `w' were
> originally ``contracted  letters,  and  special  characters''.  (And  of

But I claim that the Roman alphabet is likely to be sufficient for the
representation of most languages (notable exceptions:  Chinese, because
of tones and homophones, and Japanese because of an even larger number
of homophones than Chinese):  German, for example,  has a large number
of vowels, which are represented by combinations of the basic vowels
'aeiou' with 'h' or 'e' (or doubled) to indicate length, vowel followed
by 'e' to indicate umlaut formation, and vowel followed by double
consonant to indicate brevity.  The special sounds 'sh', 'zh', 'ch',
and 'kh' are all represented by unique combinations of consonants. In
principle, the letter 'c' could be eliminated from the German alphabet
(in East Germany it has been, except in the combinations 'sch' and 'ch'
and in a small number of foreign words), as well as the letter 'v', which
can be replaced by either 'f' or 'w' in all contexts. Nasalised 'n' is
written as 'ng'.  And so on. 

The point is that German is phonetically not simple, but nevertheless
has a relatively straightforward orthography (i.e. you can spell a word
by sound) and can do without special characters.
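
A minimal sketch of that idea (not from the original post): a small
filter that rewrites the German national characters as the letter
combinations described above.  Treating the input bytes as ISO 8859-1
code points for Ä Ö Ü ä ö ü ß is my own assumption.

    #include <stdio.h>

    /* Rewrite German national characters as letter combinations:
     * umlaut -> vowel + 'e', sharp s -> "ss".  Input bytes are assumed
     * to be ISO 8859-1 (an assumption, not part of the original post). */
    static const char *translit(unsigned char c)
    {
        switch (c) {
        case 0xC4: return "Ae";   /* Ä */
        case 0xD6: return "Oe";   /* Ö */
        case 0xDC: return "Ue";   /* Ü */
        case 0xE4: return "ae";   /* ä */
        case 0xF6: return "oe";   /* ö */
        case 0xFC: return "ue";   /* ü */
        case 0xDF: return "ss";   /* ß */
        default:   return NULL;   /* everything else passes through */
        }
    }

    int main(void)
    {
        int c;
        while ((c = getchar()) != EOF) {
            const char *r = translit((unsigned char)c);
            if (r != NULL)
                fputs(r, stdout);
            else
                putchar(c);
        }
        return 0;
    }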

> English is one of the feuu languages that can get by uuithout adding to
> that alphabet (and one of the feuu  that  uses  all  of  it),  but  only
> because  uue're  prepared  to  put  up  uuith  such  a  loose connection
> betuueen sound and symbol.  I iust  don't  belieue  it  uuould  euen  be
> possible  to deuise *usable* orthographies for the many languages of the
> uuorld that relied  on  combinations  of  letters  rather  than  special
> letters and diacritics.

Given how complicated orthography is in English, people are doing very
well. In fact, English is one of the easiest languages to learn.
Therefore, even if a spelling reform that eliminated all national
characters were to complicate the orthography slightly (which I strongly
doubt), it would probably not harm the language much. And if a
government undertook such a reform with the goal of eliminating
national characters, it would probably also correct some unrelated
spelling problems at the same time, which would improve rather than
worsen matters.

Why am I arguing about this at all? The existence of national
characters is a problem: it requires special equipment and impedes
trade and information exchange. I have experienced these problems
myself (being German), and I believe that the most reasonable solution
is to eliminate national characters rather than to live with the
burden, unless such an elimination is linguistically unacceptable, as
in the case of Chinese or Japanese. If there are such linguistic
reasons in the case of the Scandinavian languages, I would like to hear
about them. Mere flaming or insistence is not going to help anyone.

						Thomas.

P.S.:

The English, by the way, have shown that it is possible to eliminate
letters from their alphabet and replace them with letter combinations
(I don't know the historical details or reasons for this, though): they
dropped the single letters for the 'th' and 'dh' sounds (thorn and eth,
which are still present in modern Icelandic).

P.P.S.:

> Houu  one  can  uurite  a  C program in Hungarian or Serbo-Croat I don't
> knouu... It's an interesting problem!

Again, I can only tell you about my experience in German: it is
actually quite nice to have keywords and identifiers in different
languages, since one is unlikely to use a keyword as an identifier
accidentally. PASCAL and 'C' programs just look horrible, though, when
printed on a printer in 'German mode' (i.e. with brackets and braces
mapped to umlauts). Since my old Epson printer could not be switched in
software, I just used it in American mode and wrote all my papers using
the vanilla 26 letter alphabet, which is marginally acceptable for
school papers and certain kinds of publications (in particular in
computer science).
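
For illustration only, and not part of the original post: the mapping
below is my reading of the German 7-bit character set (DIN 66003), in
which the code points for @ [ \ ] { | } ~ print as § Ä Ö Ü ä ö ü ß.
The sketch shows what a C fragment turns into on such a printer.

    #include <stdio.h>
    #include <string.h>

    /* How a 'German mode' printer renders ASCII punctuation, assuming
     * the DIN 66003 7-bit set: @ [ \ ] { | } ~ come out as the national
     * characters listed in the same order below. */
    static const char ascii_chars[] = "@[\\]{|}~";
    static const char *const german[] = {
        "§", "Ä", "Ö", "Ü", "ä", "ö", "ü", "ß"
    };

    int main(void)
    {
        const char *line = "if (a[i] != 0) { x = a[i] | mask; }";
        printf("as written : %s\n", line);
        printf("as printed : ");
        for (const char *p = line; *p != '\0'; p++) {
            const char *hit = strchr(ascii_chars, *p);
            if (hit != NULL)
                fputs(german[hit - ascii_chars], stdout);
            else
                putchar(*p);
        }
        putchar('\n');
        return 0;
    }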

sommar@enea.UUCP (Erland Sommarskog) (08/23/85)

Thomas Breuel writes:

>Why am I arguing about this at all? The existence of national
>characters is a problem: it requires special equipment and impedes
>trade and information exchange. I have experienced these problems

You've got it all wrong. Their existence is not the problem. The
problem is that some people - like you, for instance - think the
world is completely computerized. It isn't so. The vast majority of
the people in the world don't know anything about computers or
ASCII.

As a consequence, they would very much disagree with changing
the spelling just "because of the computer".

If you think I'm wrong about this, try to convince your grandmother
- or any other person who doesn't deal with computers - of the
absolute necessity of removing "A, "O and "U from German.
Do that before you write your next reply.

aer@alice.UUCP (y) (09/02/85)

There are a few languages that use near-phonetic spelling, Italian among
them. There is even a language hardly anyone uses that has a *pure*
phonetic notation (and, yes, diacritical marks): Esperanto. As a matter
of fact, I made a rule table for a speech synthesizer in Esperanto in
under thirty seconds, since all the sounds agree with something called
the International Phonetic Alphabet. The diacritical marks (the ^'s)
serve to harden the sound of a consonant.
Touche' to Thomas' challenge.
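
A sketch of what such a rule table might look like, with the
circumflexed letters written in an ASCII 'x' convention (cx for
c-circumflex, and so on); the rough sound values on the right are my
own approximations, not taken from the post.

    #include <stdio.h>

    /* A toy grapheme-to-phoneme table for Esperanto: one letter, one
     * sound.  Circumflexed letters use an ASCII 'x' convention; the
     * sound values are rough English-style approximations. */
    static const struct rule { const char *letter; const char *sound; } rules[] = {
        {"a","a"},  {"b","b"},  {"c","ts"}, {"cx","ch"},
        {"d","d"},  {"e","e"},  {"f","f"},  {"g","g"},
        {"gx","dj"},{"h","h"},  {"hx","kh"},{"i","i"},
        {"j","y"},  {"jx","zh"},{"k","k"},  {"l","l"},
        {"m","m"},  {"n","n"},  {"o","o"},  {"p","p"},
        {"r","r"},  {"s","s"},  {"sx","sh"},{"t","t"},
        {"u","u"},  {"ux","w"}, {"v","v"},  {"z","z"},
    };

    int main(void)
    {
        /* 28 letters, 28 context-free rules: that is the whole table. */
        for (size_t i = 0; i < sizeof rules / sizeof rules[0]; i++)
            printf("%-2s -> %s\n", rules[i].letter, rules[i].sound);
        return 0;
    }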