[comp.ai.neural-nets] Neuron Digest V6 #50

neuron-request@HPLMS2.HPL.HP.COM ("Neuron-Digest Moderator Peter Marvit") (08/24/90)

Neuron Digest   Thursday, 23 Aug 1990
                Volume 6 : Issue 50

Today's Topics:
               neuron digest submission (protein folding)
                          Help with reference.
                        Re: Help with reference.
   Papers on applications of N.N.s to constraint satisfaction problem
                        Re NN-Definition Language
                     SF Bay area - AI Forum Meeting
                       Post-Doc positions in U.K.
                Re: request for simple NN program (LONG)


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: neuron digest submission (protein folding)
From:    smuskal%calv01.hepnet@Csa2.LBL.Gov
Date:    Sun, 19 Aug 90 15:24:18 -0700

   Recently, the neural network community has turned its attention to one
of molecular biology's most challenging computational problems, the
"protein folding problem."  Our lab and others have applied feed-forward,
supervised networks to the most obvious subproblems, such as secondary
structure prediction, solvent accessibility prediction, disulfide-bonding
state prediction, and small tertiary structure prediction.

   Aside from using similar types of neural networks, each approach to
these subproblems has one thing in common: mapping LOCAL amino acid
sequence to some protein structural feature. And while people have
experimented with representations of the 20 common amino acids, no one
has yet provided a good method of representing the ENTIRE amino acid
sequence for predictive purposes. This is, in essence, the key problem,
since a protein's folded structure depends on ALL of the competing
interactions within and surrounding the COMPLETE amino acid sequence. As
experts in representation, perhaps some of you have ideas on describing
the complete amino acid sequence of a protein in as few computational
nodes as possible. Keep in mind that there are 20 common amino acids and
that most proteins of interest are over 100 amino acids long.
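
   For concreteness, here is a minimal sketch (in Python; the window size
and helper names are illustrative, not taken from our lab's code) of the
usual local-window, one-unit-per-residue encoding whose limitations are
described above:

    # Illustrative only: local-window, one-unit-per-residue ("one-hot")
    # encoding of an amino acid sequence for a feed-forward network.
    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"         # the 20 common amino acids

    def encode_window(sequence, center, window=13):
        """Encode `window` residues centered on `center` as a flat 0/1
        vector of length window * 21 (the 21st unit is an off-end spacer)."""
        vec = []
        half = window // 2
        for pos in range(center - half, center + half + 1):
            unit = [0] * (len(AMINO_ACIDS) + 1)
            if 0 <= pos < len(sequence) and sequence[pos] in AMINO_ACIDS:
                unit[AMINO_ACIDS.index(sequence[pos])] = 1
            else:
                unit[-1] = 1                     # window hangs off the end
            vec.extend(unit)
        return vec

    # Even a modest 13-residue window needs 13 * 21 = 273 input nodes;
    # encoding a COMPLETE protein of 100+ residues this way needs
    # thousands, which is exactly why a more compact scheme is needed.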

                                                Steve
                                                smmuskal@lbl.gov

------------------------------

Subject: Help with reference.
From:    "Paulo V. Rocha" <P.Rocha@cs.ucl.ac.uk>
Date:    Tue, 21 Aug 90 09:15:55 +0100


I wonder if anyone could help me find a journal.
The reference I have is somewhat incomplete, but ...

        Bulletin of the Electrotechnical Laboratory
        Vol 53, No 10 (1989)

I suppose the lab referred to is one of the NTT labs in Japan.
The article I am interested in is

    Toward Soft Logic for the Foundation of Flexible Information Processing
    Nobuyuki OTSU

I would appreciate it if someone could supply the author's address, email,
or any other means by which I could reach him, or, even better :-), tell me
where I could find the journal itself (here in London it has been very hard;
none of the university libraries seem to have it).


The article is said to be an edited English version of the Japanese papers
(if that helps...)

        Towards Soft Logic
        N. Otsu & K. Tamura
        Journal of IPS (Information Processing Society?), 28 629-636 (1987)

        Soft Logic for Recognition and Understanding - Recapitulation of
            Pattern Recognition
        Journal of IEICE (?), 71, 1231-1240 (1988)

Thanks for your time,

P.

+-----------------------------+---------------------------------------------+
Paulo Valverde de L. P. Rocha |   JANET:procha@uk.ac.ucl.cs
Department of Computer Science|  BITNET:procha%uk.ac.ucl.cs@UKACRL
University College London     |Internet:procha%cs.ucl.ac.uk@nsfnet-relay.ac.uk
Gower Street                  | ARPANet:procha@cs.ucl.ac.uk
London WC1E 6BT               |    UUCP:...!mcvax!ukc!ucl-cs!procha
England                       |     tel: +44 (071) 387 7050 x 3719
                              |     fax: +44 (071) 387 1397
+-----------------------------+---------------------------------------------+

------------------------------

Subject: Re: Help with reference.
From:    yakiyama@etl.go.jp (Yutaka Akiyama)
Organization: Electrotechnical Laboratory, Tsukuba Science City
Date:    22 Aug 90 01:19:14 +0000

Dear Dr. Rocha,
 I'm writing from the Electrotechnical Laboratory...

In article <1128@ucl-cs.UUCP> P.Rocha@ucl-cs.UUCP writes:
> I wonder if anyone could help me find a journal.
> The reference I have is somewhat incomplete, but ...
> 
>       Bulletin of the Electrotechnical Laboratory
>       Vol 53, No 10 (1989)

You can reach the author by:
        e-mail: otsu@etl.go.jp
        snail:  Dr. Nobuyuki Otsu
                Mathematical Informatics Section,
                Information Science Division, ETL
                Umezono 1-1-4, Tsukuba, 305, JAPAN
 
> I suppose the lab referred to is one of the NTT labs in Japan.

 The Electrotechnical Laboratory (ETL) belongs to the Agency of
Industrial Science and Technology (AIST) of the Ministry of International
Trade and Industry (MITI).

 ETL is the largest national research institute in Japan; it was founded
in 1891 and is dedicated to basic research and development in fields such
as electronics, information processing, energy technology, and standards
and measurements.

 Dr. Otsu is currently the chief of the Math. Info. Section, and serves
concurrently as the head researcher of the ETL.
 I will be sure to relay your request to him. (^_^)

Yutaka Akiyama (yakiyama@etl.go.jp)
 Computation Models Section,  Computer Science Division,
 Electrotechnical Laboratory, Umezono, Tsukuba Science City, 305 JAPAN
 Interest: Optimization by Neural Networks, VLSI impl. of Gaussian Machines

------------------------------

Subject: Papers on applications of N.N.s to constraint satisfaction problem
From:    qian@icopen.ICO.OLIVETTI.COM (DA QUN QIAN)
Date:    Tue, 21 Aug 90 15:58:29 +0100


I am looking for papers on how to use neural nets to solve constraint
satisfaction problems and how to generate production rules or other
representations of knowledge from neural networks.

I would be grateful if you could offer me such information.

Best regards.

Qian Da Qun

Email: qian@icopen.ico.olivetti.com

Olivetti Artificial Intelligence Center
Nuova ICO 3 Piano
Via Jervis 77, 10015 Ivrea(TO)
Italy

------------------------------

Subject: Re NN-Definition Language
From:    Gary Fleming <72260.2544@compuserve.com>
Date:    21 Aug 90 13:13:32 -0400

Date:          21 August 1990
From:          Gary Fleming
               72260.2544@CompuServe.COM
Organization:  American Electronics, Inc., and
               International Neural Network Society / SIG 
               Washington

Dear Ms. Thalmann and Mr. Almassy,

One item I found missing from your definition language was any mention of
the nodal activation (or transfer function).  I assume you have
implemented your network with the logistic activation function f(x) =
1/(1+exp(-x)).  However, is this true of the nodes comprising the input
and output layers?  You might investigate the necessity (and
desirability) of using anything but a linear activation function for the
input and output layers.

It is my belief that the logistic function is not a particularly good
choice for the activation function of the hidden layer nodes, either.  It
is this premise that has forced me to generate my own NN software rather
than purchase an off-the-shelf product.  You might consider how the user
could specify an arbitrary activation function (and its derivative) for
each of the nodes comprising your network.  I think the observation that
every researcher wishes to do something a little differently will quickly
render any NN-definition language non-universal.
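
To make the suggestion concrete, here is a minimal sketch (in Python; the
Node class and function names are purely illustrative, not taken from any
existing package) of letting the user attach an activation function and
its derivative to each node:

    import math

    # Illustrative only: each node carries its own activation function
    # and derivative, supplied by the user, instead of a hard-wired
    # logistic.
    def logistic(x):
        return 1.0 / (1.0 + math.exp(-x))

    def logistic_prime(x):
        y = logistic(x)
        return y * (1.0 - y)

    def linear(x):
        return x

    def linear_prime(x):
        return 1.0

    def tanh_prime(x):
        return 1.0 - math.tanh(x) ** 2

    class Node:
        def __init__(self, activation, derivative):
            self.f = activation        # used in the forward pass
            self.fprime = derivative   # used when back-propagating error

        def output(self, net_input):
            return self.f(net_input)

    # Linear input and output nodes, and whatever the researcher
    # prefers for the hidden layer:
    input_node  = Node(linear, linear_prime)
    hidden_node = Node(math.tanh, tanh_prime)
    output_node = Node(logistic, logistic_prime)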

However, from a broader standpoint, I must say that a general-purpose
language is premature.  A more desirable first step is the establishment
of a common artificial neural network (ANN) vocabulary to be shared by
all of the researchers and practitioners who represent many
diverse fields.  As you know, electrical engineers, neurophysiologists,
cognitive psychologists, mathematicians, et al., can easily use
different terms for the same concept or (worse) use the same term for
differing concepts.  This intellectual Tower of Babel hobbles the sister
field of nonlinear systems theory (chaos) as well.  There are several
members of the INNS/SIG Washington interested in a resolution to this
problem.

At any rate, a universally accepted NN-definition language should
implement universally accepted ANN concepts and methodologies.

Gary Fleming
72260.2544@CompuServe.COM


------------------------------

Subject: SF Bay area - AI Forum Meeting
From:    kingsley@hpwrc02.hp.com
Date:    Tue, 21 Aug 90 15:52:37 -0700

**************************************************************
*                                                            *
*            A I     F O R U M     M E E T I N G             *
*                                                            *
*                                                            *
*        SPEAKER: Andreas Weigend                            *
*        TOPIC:   Pruning neural nets and predicting         *
*                 Sun Spots                                  *
*        WHEN:    7PM Tuesday 8/28/90                        *
*        WHERE:   Lockheed building 202, auditorium          *
*                 3251 Hanover Street                        *
*                 Palo Alto, CA                              *
*                                                            *
*        AI Forum meetings are free, open and monthly!       *
*              Call (415) 594-1685 for more info             *
**************************************************************


------------------------------

Subject: Post-Doc positions in U.K.
From:    PSS001%VAXA.BANGOR.AC.UK@VMA.CC.CMU.EDU
Date:    Wed, 22 Aug 90 18:47:17 +0000


Department of Psychology, University of Wales, Bangor
and Department of Psychology, University of York


CONNECTIONISM AND PSYCHOLOGY

THREE POST-DOCTORAL RESEARCH FELLOWSHIPS

Applications are invited for three post-doctoral research fellowships to
work on the connectionist and psychological modelling of human short-term
memory and spelling development.

Two Fellowships are available for three years, on an ESRC- funded project
concerned with the development and evaluation of a connectionist model of
short-term memory.  One Fellow will be based with Dr. Gordon Brown in the
Cognitive Neurocomputation Unit at Bangor and will be responsible for
implementing the model.  The other Fellow, based at York with Dr. Charles
Hulme, will be responsible for undertaking psychological experiments with
children and adults to evaluate the model.  Starting salary for both
posts is on the research 1A grade, up to £13,495.

One two-year Fellowship is available to work on an MRC-funded project to
develop a sequential connectionist model of the development of spelling
and phonemic awareness in children.  This post is based in Bangor with
Dr. Gordon Brown.  Starting salary is on the research 1A grade, up to £14,744.


Applicants should have postgraduate research experience or interest in
cognitive psychology/cognitive science or connectionist/ neural network
modelling and computer science.  Good computing skills are essential for
the posts based in Bangor, and experience in running psychological
experiments is required for the York-based post.  Excellent computational
and research facilities will be available to the successful applicants.

The appointments may commence from 1st October 1990, but start could be
delayed until 1st January 1991.  Closing date for applications is 7th
September 1990, but intending applicants should get in touch as soon as
possible.  Informal enquiries regarding the Bangor-based posts, and
requests for further details of the posts and host departments, to Gordon
Brown (0248 351151 Ext 2624; email PSS001@uk.ac.bangor.vaxa); informal
enquiries concerning the York-based post to Charles Hulme ( 0904 433145;
email ch1@uk.ac.york.vaxa).  Applications (in the form of a curriculum
vitae and the names and addresses of two referees) should be sent to Mr.
Alan James, Personnel Office, University of Wales, Bangor, Gwynedd LL57
2DG, UK.

(Apologies to anyone who receives this posting through more than one list
or newsgroup)

------------------------------

Subject: Re: request for simple NN program (LONG)
From:    chuck@utkux1.utk.edu (chuck)
Organization: University of Tennessee Computing Center, Knoxville
Date:    22 Aug 90 21:05:06 +0000

[[ Editor's Note: Every so often, a series of requests comes for public
domain or other software to run neural network simulations.  The list
below is primarily for UNIX machines, though some DOS and Mac code is
included.  No commercial packages are listed.  I'm sure we all thank
Chuck for the effort he put in.  -PM ]]

        I had occasion recently to compile this list as an appendix to
my MS thesis.  Enjoy, but please try to be considerate of the authors.
If you want one at your site, have it installed by your system folks
(really only true of the larger ones. . .) to prevent multiple copies and
mass FTP loads!

        Enjoy.  It's been fun!

chuck

________

A list of publicly available simulators for artificial neural networks.

        The following list is by no means complete or exhaustive; it
merely represents those simulators which have come to my attention
through postings on the Internet and articles in various journals.
Inclusion in this list represents no particular endorsement, and omission
from it should be regarded as an unfortunate oversight on the author's
part.  The list is provided as a starting point for individuals and
institutions who wish to examine these offerings and obtain software
suited to their needs.

        The term "publically available" may be taken to mean anything
available at nominal or no cost through channels other than those
conventionally regarded as commercial software products.  Since the term
'public domain' has become synonimous with free and unrestricted software
it should be noted in passing that much of what is available is, in fact,
copyrighted work which must be licensed --albeit at no cost.

        The term 'ftp' used below stands for file transfer protocol,
which is a means of obtaining and transferring files between computers
linked by an international computer network collectively known as the
Internet.  To obtain more information concerning ftp or the Internet,
please consult the operators of your local computing facility.  The
format used for site addresses and electronic mail addresses below
conforms to that currently used on the Internet.  I have included
alternate means of obtaining the simulators where they have come to my
attention.
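
        As an illustration of the pattern that most of the entries below
assume (the host name, directory, and file name here are placeholders,
not a real archive), an anonymous ftp retrieval can also be scripted, for
example with Python's standard ftplib module:

    from ftplib import FTP

    # Illustrative only: the anonymous-ftp pattern used by most entries
    # below.  Substitute the host, directory, and file name given in the
    # entry you are interested in.
    ftp = FTP("some.host.edu")
    ftp.login()                        # anonymous login
    ftp.cwd("pub")                     # change to the advertised directory
    with open("simulator.tar.Z", "wb") as out:
        ftp.retrbinary("RETR simulator.tar.Z", out.write)   # binary mode
    ftp.quit()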

++++++++++++++++++++++++
bps - George Mason University Back Prop Simulator
Current version is 1.01 (Nov., 1989)

A special-purpose simulator for Back Propagation and a BP speedup
technique called 'gradient correlation' [IJCNN, Jan., 1990].

Available via anonymous ftp from gmuvax2.gmu.edu (129.174.1.8).

Distributed as executables for the VAX 8530 under Ultrix 3.0, with
versions for 8088-based and 80286/386-based IBM PC machines.  Includes
examples and a tutorial document.  A source code license is available.
Contact:

Eugene Norris                           (703) 323-2713
Computer Science Department             enorris@gmuvax2.gmu.edu
George Mason University
Fairfax, Virginia  22032

++++++++++++++++++++++++
Back Propagation simulator [Name and version not known.]

A special-purpose simulator for back propagation using several training
methods.

Distributed on disk for the IBM PC, with mouse and VGA or EGA display.

To obtain it, send a stamped, self-addressed envelope and a floppy disk
(3.5" or 5.25") to:

                Universität Kassel
                Fachbereich Mathematik
                Forschungsgruppe Neuronale Netzwerke
                Heinrich-Plett-Str 40
                3500 Kassel
                West Germany

[Report taken from Internet News, October, 1989. ]

++++++++++++++++++++++++
MIRRORS/II -- Maryland MIRRORS/II Connectionist Simulator
A general-purpose connectionist simulator.

To obtain this simulator you must sign an institutional site license.  A
license for individuals is not acceptable.  The only costs incurred are
for postage for a printed copy of the manual and tape cartridge (you send
your own 1/4" cartridge or TK50 cartridge to them, if desired.)
Instructions for obtaining the software via ftp are returned to you upon
receipt of the license agreement.  To obtain a copy of the license send
your physical mail address via e-mail to:

                          mirrors@cs.umd.edu

or by U.S. Mail to:

                       Lynne D'Autrechy
                       University of Maryland
                       Department of Computer Science
                       College Park, MD  20742

        MIRRORS/II is implemented in Franz Lisp and will run under
Opuses 38, 42, and 43 of Franz Lisp on UNIX systems. It is currently
running on a MicroVAX, VAX and SUN 3.

++++++++++++++++++++++++
NeurDS -- The Neural Design and Simulation System.
Current Version is 3.1 (May, 1989.)

A general purpose simulator.

The system is licensed on a no-fee basis to educational institutions by
Digital Equipment Corporation.  To obtain information, send your
physical or electronic mail address to:

Max McClanahan                  mcclanahan%cookie.dec.com@decwrl.dec.com
Digital Equipment Corporation
1175 Chapel Hills Drive
Colorado Springs, Colorado  80920-3952

You should receive instructions on how to obtain a copy of the manual
and copies of the license agreement.  [Beyond receipt of the license
agreement, I do not know the details of distribution.]

The NeurDS system will run on any Digital platform including Vax/VMS,
Vax/Ultrix, and DECsystem/Ultrix.  A graphics terminal is not required
to support the window interface.

Specific models are described using a superset of the C programming
language, and compiled into a simulator form.  This simulator can accept
command scripts or interactive commands.  Output can take the form of a
window-type environment on VT100 terminals, or non-window output on any
terminal.

++++++++++++++++++++++++
FULL -- Fully connected temporally recurrent neural networks.
A demonstration network described in "Learning State Space Trajectories
in Recurrent Neural Networks" [no other reference material!]

        The author (Barak Pearlmutter, of the journal Neural Computation,
'bap@f.gp.cs.cmu.edu') describes this as "a bare bones simulator for
[. . .] temporally recurrent neural networks" and claims that it should
vectorize and parallelize well.  It is available via ftp from
doghen.boltz.cs.cmu.edu.  Login as 'ftpguest' with password 'oaklisp'.
Be sure to transfer the file 'full/full.tar.Z' in binary mode (you must
either use a directory named full on your local machine, or use 'get'
and let it prompt you for remote and local file names).  Do not attempt
to change directories.  The code is copyrighted and is given out for
academic purposes.

[The information dates from November of 1988.]

++++++++++++++++++++++++
GRADSIM Connectionist Network Simulator. A special-purpose
simulator specifically designed for experiments with the temporal
flow model.

Latest Version 1.7 

The simulator is available for anonymous ftp from ai.toronto.edu
(128.100.1.65).  For information contact:

Raymond Watrous                 watrous@ai.toronto.edu
Department of Computer Science
University of Toronto
Toronto, Ontario M5S 1A4

        Written in C; implementations on VAX (VMS & Ultrix), Sun, and
CYBER are mentioned.  A graphical interface is 'under development' (as of
March, 1988).  Includes an excellent article with references.

++++++++++++++++++++++++
opt -- A Neural-Net Training Program Based on Conjugate-Gradient
Optimization.  A special-purpose simulator. [Current version unknown.]

Available for anonymous ftp from cse.ogc.edu (129.95.40.2) from the
directory "/ogc2/guest/ftp/pub/nnvowels".  Consult the file 'README'
for more instructions.  For further information, you might contact
'opt-dist@cse.ogc.edu'.

        Basically  C code to be compiled under BSD Unix, with no
graphic interface. They do maintain a list of users, perhaps a
mailing list.  An unsigned paper describing the technique and use
of the simulator is included.

++++++++++++++++++++++++

CasCor1 -- Cascade-Correlation Simulator
A special-purpose simulator for experimenting with the
Cascade-Correlation algorithm described in:

Fahlman, Scott E., and C. Lebiere. "The Cascade-Correlation Learning
        Architecture."  In _Advances in Neural Information
        Processing Systems 2_, edited by D. S. Touretzky. San Mateo, CA:
        Morgan Kaufmann Publishers, 1990.

It is available for anonymous ftp from pt.cs.cmu.edu (128.2.254.155) in
the directory "/afs/cs/project/connect/code" (subdirectories may be
available, but parent directories may not be.)  There are Lisp and C
versions available, as well as several other programs.  The simulator is
placed in the public domain.  For information contact:

Scott E. Fahlman                        fahlman@cs.cmu.edu
School of Computer Science
Carnegie-Mellon University
Pittsburgh, PA  15217

The original version by Scott Fahlman was written in Common Lisp and has
been tested on CMU Common Lisp on the IBM RT, Allegro Common Lisp (beta
test) for Decstation 3100, and Sun/Lucid Common Lisp on the Sun 3.  This
program was translated into C by Scott Crowder.

++++++++++++++++++++++++
GENESIS - GEneral NEural SImulation System with
XODUS   - X-windows Output and Display Utility for Simulations
A general simulator.

Currently Beta-Test Version, 1990.

From the release announcement ( January, 1990 by Jim Bower ):

"       Full source for the simulator is available via FTP from
genesis.cns.caltech.edu (131.215.135.64).  To acquire FTP access to
this machine it is necessary to first register for distribution by
using telnet or rlogin to login under user "genesis" and then follow
the instructions (e.g. 'telnet genesis.cns.caltech.edu' and login as
'genesis').  When necessary, tapes can be provided for a small handling
fee ($50).  Those requiring tapes should send requests to
genesis-req@caltech.bitnet.  Any other questions about the system or
its distribution should also be sent to this address. 
 
        GENESIS and XODUS are written in C and run on SUN and DEC
graphics workstations under UNIX (version 4.0 and up), and X-windows
(version 11).  The software requires 14 meg of disk space and the tar
file is approximately 1 meg.

        The current distribution includes full source for both GENESIS
and XODUS as well as three tutorial simulations (squid axon, multicell,
visual cortex).  Documentation for these tutorials as well as three
papers describing the structure of the simulator are also included. As
described in more detail in the "readme" file at the FTP address, those
interested in developing new GENESIS applications are encouraged to
become registered members of the GENESIS users group (BABEL) for an
additional one-time $200 registration fee.  As a registered user, one
is provided documentation on the simulator itself (currently in an
early stage), access to additional simulator components, bug report
listings, and access to a user's bulletin board.  In addition we are
establishing a depository for additional completed simulations. "

++++++++++++++++++++++++
SunNet
A generalized simulator.

Version 5.5.2.4

Available for anonymous ftp from boulder.colorado.edu (128.138.240.1).

While this program was obviously written for Sun workstations (versions
for suntools and the X-window environment), the documents list other
configurations.  These include a non-graphic version which runs on "any
UNIX machine", and versions which run on an Alliant or UNIX machine and
send data to a graphics support program running on a Sun workstation. 
It is very easy to install.

A mailing list exists for users of the simulator.

++++++++++++++++++++++++
RCS - The Rochester Connectionist Simulator
A general simulator.

Version 4.2

Available for anonymous ftp from cs.rochester.edu (192.5.53.209).
Tapes may be purchased (1600 bpi 1/2" reel or QIC-24 Sun 1/4" cartridge)
from:
                Peg Meeker
                Computer Science Department
                University of Rochester
                Rochester, New York  14627

C source code is provided, including a graphic interface which may 
function under X Windows or SunView on Sun Workstations.  A wide 
variety of Unix machines are supported, and the simulator may be
used without the graphics interface.  A version for the Macintosh
is included in the distribution.  Mailing lists exist for users
and bug reports.

++++++++++++++++++++++++
SFINX  -- Structure and Function in Neural Connections
A General Simulator.

Version 2.0 ( November, 1989 )

In order to ftp this simulator, a license agreement must be submitted.
Upon receipt of this agreement, instructions and the password to ftp
the software are made available.  To obtain the license write:

                Machine Perception Laboratory
                Computer Science Department
                University of California
                Los Angeles, CA 90024

This system requires color to operate the graphics interface, but may
be operated without graphics.  Support for Sun, Ardent Titan, HP 300,
and IBM PC RT machines is specifically mentioned --but other Unix
platforms should function as well.  Specific graphics support is provided
for Matrox VIP 1024, Imagraph AGC-1010P, HP Starbase and X Windows.  A
version providing support for monochrome graphics is expected to be
released in Fall, 1990.

++++++++++++++++++++++++
Mactivation
A specialized simulator for investigating associative memory using the
delta rule and Hebbian Learning.

Version 3.2

A public domain version is available for anonymous ftp from the
University of Colorado at Boulder (boulder.colorado.edu, 128.138.240.1)
or possibly by contacting the author.

                        Mike Kranzdorf
                        University of Colorado
                        Optoelectronic Computing Systems Center
                        Campus Box 525
                        Boulder, Colorado  80309-0525
                        
                        mikek@boulder.colorado.edu

Future versions will probably not be public domain, but will be
available from Oblio, Inc., 5942 Sugarloaf Road, Boulder, Colorado
80309.

Provided as executables for the Apple Macintosh.
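
For readers who have not met the two learning rules Mactivation
demonstrates, here is a minimal sketch of the textbook updates (my own
illustration in Python, not code from the package):

    # Illustrative only: the two weight-update rules named above.
    def hebbian_update(weights, x, y, rate=0.1):
        """Hebbian rule: strengthen weights[i][j] when input x[j] and
        output y[i] are active together."""
        for i in range(len(y)):
            for j in range(len(x)):
                weights[i][j] += rate * y[i] * x[j]

    def delta_update(weights, x, y, target, rate=0.1):
        """Delta (Widrow-Hoff) rule: change weights[i][j] in proportion
        to the error (target[i] - y[i]) and the input x[j]."""
        for i in range(len(y)):
            for j in range(len(x)):
                weights[i][j] += rate * (target[i] - y[i]) * x[j]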

++++++++++++++++++++++++

Several special purpose simulators are provided with the following
book:

McClelland, J. L., and David E. Rumelhart. _Explorations in
Parallel Distributed Processing_. Cambridge: MIT Press, 1988.

Versions exist which contain C code for the IBM PC, and a version
has recently been released for the Apple Macintosh.


++++++++++++++++++++++++
Hopfield-style Network Simulator
A Special Purpose simulator for experimentation with the Hopfield-style
network developed by the author.

Software is available by e-mail upon request from the author, Arun
Jagota (jagota@cs.buffalo.edu).  It is written in C and should be
usable on 32-bit Unix machines, and an MS-DOS version is also supplied.

++++++++++++++++++++++++

Several demonstration-type simulators have been published as source
code in various journals.  These are cited below:

Brown, Robert Jay. "An Artificial Neural Network Experiment."
        _Dr. Dobb's Journal_ (April, 1987) 16ff.

Colvin, Gregory. "Synapsys: A Neural Network." _The C Users Journal_
        (April, 1989) 59ff.

King, Todd. "Using Neural Networks for Pattern Recognition."
        _Dr. Dobb's Journal_ (January, 1989) 14ff.

Klimasauskas, Casey. "Neural Nets and Noise Filtering."
        _Dr. Dobb's Journal_ (January, 1989) 32ff.

Jones, William P., and Josiah Hoskins. "Back-Propagation." _Byte_
        (October, 1987) 155ff.

------------------------------

End of Neuron Digest [Volume 6 Issue 50]
****************************************