neuron-request@HPLMS2.HPL.HP.COM ("Neuron-Digest Moderator Peter Marvit") (10/02/90)
Neuron Digest   Monday,  1 Oct 1990                Volume 6 : Issue 57

Today's Topics:
                 Re: Marr's VISION out of date
             length restrictions on announcements
          A general NN. Constraint Satisfaction Problem.
                    A learning algorithm
         A BOOK ON CONTROL THEORY AND NEURAL NETWORKS
      Minds & Machines (articles available by anonymous ftp)
                      Technical Report
                      Call for Papers.
                     Course change news

Send submissions, questions, address maintenance and requests for old
issues to "neuron-request@hplabs.hp.com" or
"{any backbone,uunet}!hplabs!neuron-request".
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Re: Marr's VISION out of date
From:    michael k finegan <Mike.Finegan@UC.EDU>
Date:    Wed, 26 Sep 90 10:45:15 -0400

Steve -

I am working in image understanding, not ANN, but applications of BCS
to image analysis/understanding would be most interesting. I have gone
through a few of the BCS papers, but didn't want to implement the PDE
or DE solvers, etc. Can you give me an idea of how computationally
expensive (your) BCS is to use? Also, since I am working on a system
with several knowledge sources (blackboard, a la VISIONS), can your
BCS be a 'C' module? Thanks for any info, examples, or references.

- Michael Finegan
  mfinegan@uceng.UC.EDU

------------------------------

Subject: length restrictions on announcements
From:    ross@zeno.mmwb.ucsf.edu
Date:    Thu, 27 Sep 90 13:35:51 -0700

A modest proposal: all conference and journal announcements and
requests for papers should be restricted to a one-page summary
including, where relevant, time, location, price, and correspondent
email address(es), with the whole document available by anonymous ftp.
I'm tired of seeing lists of names, bureaucratic statements of purpose
(all similar), and order forms. I could live with lists of titles, and
am glad to see abstracts of individual papers, especially when the
papers are available on the network.

More ambitious: how about a group of bibliographies, anonymously
ftpable by subject, updated from the stream of incoming material, so
that only new entries would appear in mail postings unless accompanied
by comments. Comments could be included in the bibliography too.

Bill Ross

[[ Editor's Note: I sympathize with Bill's lament. It should be noted
that nearly 1/5 of Digest subscribers are on BITNET and probably cannot
use ftp. As for the general suggestion of cutting down the length of
announcements, I can only hope submitters will consider the request. As
moderator (using my copious spare time), I'm ready neither to organize
the announcement archives nor to edit incoming submissions. As regular
readers know, I try to put announcements at the end of Digests and
place discussions and comments at the beginning. At one point, I tried
segregating Digests, but timely information got lost in the shuffle and
it was a royal pain for me. I don't have a good solution for the
trade-off between providing appropriate information and over-doing it.
Given that electrons are relatively cheap, and (as a reader) I don't
have to expend any extra effort to get the recent information which the
Digest provides, the current situation will probably remain. Of course,
opposing viewpoints cheerfully considered (and published, if desired).
-PM ]]

------------------------------

Subject: A general NN. Constraint Satisfaction Problem.
From:    qian@icopen.ICO.OLIVETTI.COM (DA QUN QIAN)
Date:    Fri, 28 Sep 90 14:07:37 +0100

1. At present I am studying a new type of neural network. In this
network, every arc can simultaneously carry several values (say M
values) from the input node to the output node that it connects. If N
arcs point to an output node, the state of that node is determined by
all N*M incoming values, and the node itself also holds M values
determined by those N*M values. Can anyone offer me some references on
this type of neural network?

2. I am also studying how to apply neural networks to solve constraint
satisfaction problems. I need some examples of engineering problems
(such as chemical engineering problems) described as constraint
satisfaction problems. Can anyone offer me some references in which
examples of this kind are reported?

Thanks in advance.

Qian Da Qun
Olivetti Artificial Intelligence Center
Olivetti Nuova 3 ICO Piano
Via Jervis 77, 10015 Ivrea (TO)
Italy
Email: qian@icopen.ICO.OLIVETTI.COM
       qian%uucp.icopen%it.olivetti.ico.iconet
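[[ Editor's Note: For readers trying to picture the architecture Qian
describes, here is a minimal sketch in Python (all names and the choice
of combining function are invented) of a node whose N incoming arcs
each carry an M-vector of values. It is only one reading of the
structure described above, not an implementation of any published
model.

    import numpy as np

    # One node with N incoming arcs; each arc carries M values at once.
    # The node's own M output values are some function f of the full
    # N*M input array -- here, arbitrarily, a per-component weighted
    # sum squashed through tanh.

    N, M = 4, 3                       # N arcs in, M values per arc
    rng = np.random.default_rng(0)
    arc_values = rng.random((N, M))   # the N*M values on the arcs
    arc_weights = rng.random((N, M))  # one weight per arc per component

    def node_state(values, weights):
        """Combine N M-vectors into the node's own M-vector."""
        combined = (weights * values).sum(axis=0)   # shape (M,)
        return np.tanh(combined)

    print(node_state(arc_values, arc_weights))      # the node's M values

 ]]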
------------------------------

Subject: A learning algorithm
From:    qian@icopen.ICO.OLIVETTI.COM (DA QUN QIAN)
Date:    Sun, 30 Sep 90 12:33:31 +0100

A correction to the question I just sent. The question I am proposing
is as follows. The neural net I am studying has the form

    s(j,t') = f( W(i,j,t), S(i,t), T(i) )

where s(j,t') is the state of node x(j) at time t', S(i,t) is the state
vector of the nodes x(i) at time t, W(i,j,t) is the weight vector on
the arcs from the nodes x(i) to x(j) at time t, and T(i) is the vector
of information propagation times along the arcs from the x(i) to x(j).

Usually, learning algorithms are used to modify the values of W(i,j,t).
I intend to study a learning algorithm by which not only the weights
W(i,j,t) but also the information propagation times T(i) are modified.
Can anyone offer me some references on this type of learning algorithm?

Thanks in advance.

Qian Da Qun
Olivetti Artificial Intelligence Center
Olivetti Nuova ICO 3 Piano
Via Jervis 77, 10015 Ivrea (TO)
Italy
Email: qian@icopen.ico.olivetti.com
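[[ Editor's Note: To make the question concrete, below is a small
Python sketch (invented details throughout) of a unit whose incoming
arcs have individual integer delays T(i) as well as weights W(i), with
both treated as adjustable. The "learning rule" here -- random
perturbation, keeping changes that reduce error -- is only a
placeholder, not a published delay-adaptation algorithm.

    import numpy as np

    rng = np.random.default_rng(1)
    N_IN, STEPS = 3, 50
    W = rng.normal(size=N_IN)        # one weight per incoming arc
    T = np.array([1, 2, 3])          # one integer delay per arc

    def run(W, T, inputs):
        """Simulate s(j,t) = tanh( sum_i W[i] * s(i, t - T[i]) )."""
        out = np.zeros(STEPS)
        for t in range(STEPS):
            total = sum(W[i] * inputs[i, t - T[i]]
                        for i in range(N_IN) if t - T[i] >= 0)
            out[t] = np.tanh(total)
        return out

    inputs = rng.random((N_IN, STEPS))
    target = np.roll(inputs.mean(axis=0), 2)   # arbitrary target signal

    def error(W, T):
        return np.mean((run(W, T, inputs) - target) ** 2)

    # Perturb both the weights and the delays; keep improvements.
    for _ in range(200):
        W2 = W + 0.05 * rng.normal(size=N_IN)
        T2 = np.clip(T + rng.integers(-1, 2, size=N_IN), 1, 5)
        if error(W2, T2) < error(W, T):
            W, T = W2, T2

    print("learned delays:", T, " error:", error(W, T))

 ]]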
------------------------------

Subject: A BOOK ON CONTROL THEORY AND NEURAL NETWORKS
From:    qian@icopen.ICO.OLIVETTI.COM (DA QUN QIAN)
Date:    Mon, 01 Oct 90 13:24:35 +0100

Last month someone sent a message to this newsgroup recommending a book
on control theory and neural networks. I have lost that message. Could
anyone send it to me again? Thanks in advance.

Qian Da Qun
Email: qian@icopen.ICO.OLIVETTI.COM

------------------------------

Subject: Minds & Machines (articles available by anonymous ftp)
From:    harnad@phoenix.Princeton.EDU (Stevan Harnad)
Organization: Princeton University, Princeton, New Jersey
Date:    20 Sep 90 15:22:02 +0000

The following article is retrievable by anonymous ftp as the
(compressed) file otherminds.Z from the directory /pub/harnad on
princeton.edu (retrieve it in "binary" mode).

    Other bodies, other minds:
    A machine incarnation of an old philosophical problem
    [To appear in: Minds and Machines 1: 1991]

    Stevan Harnad
    Department of Psychology
    Princeton University
    Princeton NJ 08544

ABSTRACT: Explaining the mind by building machines with minds runs into
the other-minds problem: How can we tell whether any body other than
our own has a mind when the only way to know is by BEING the other
body? In practice we all use some form of Turing Test: If it can DO
everything a body with a mind can do such that we can't tell them
apart, we have no basis for doubting it has a mind. But what is
"everything" a body with a mind can do? Turing's original "pen-pal"
version (the TT) only tested linguistic capacity, but Searle has shown
that a mindless symbol-manipulator could pass the TT undetected. The
Total Turing Test (TTT) calls for all of our linguistic AND robotic
capacities; immune to Searle's argument, it suggests how to ground a
symbol-manipulating system in the capacity to pick out the objects its
symbols refer to. No Turing Test, however, can guarantee that a body
has a mind. Worse, nothing in the explanation of its successful
performance requires a model to have a mind at all. Minds are hence
very different from the unobservables of physics (e.g., quarks,
superstrings); and Turing Testing, though essential for machine-
modeling the mind, can really only yield an explanation of the body.

KEYWORDS: artificial intelligence; causality; cognition; computation;
explanation; mind/body problem; other-minds problem; robotics; Searle;
symbol grounding; Turing Test.

Other papers available from the same directory:

    symbol.Z         (The Symbol Grounding Problem, Physica D 1990)
    searle.Z         (Minds, Machines and Searle, J. Th. Exp. AI 1989)
    categorization.Z (Category Induction and Representation, UP 1987)

Stevan Harnad
Department of Psychology
Princeton University
harnad@clarity.princeton.edu / harnad@pucc.bitnet / srh@flash.bellcore.com
harnad@learning.siemens.com / harnad@elbereth.rutgers.edu / (609)-921-7771

------------------------------

Subject: Technical Report
From:    ANDERSON%BROWNCOG.BITNET@MITVMA.MIT.EDU
Date:    Fri, 21 Sep 90 15:00:00 -0400

A technical report is available:

    "Why, having so many neurons, do we have so few thoughts?"
    Technical Report 90-1
    Brown University
    Department of Cognitive and Linguistic Sciences

    James A. Anderson
    Department of Cognitive and Linguistic Sciences
    Box 1978, Brown University
    Providence, RI 02912

This is a chapter to appear in: Relating Theory and Data, edited by
W.E. Hockley and S. Lewandowsky. Hillsdale, NJ: Erlbaum (LEA).

Abstract

Experimental cognitive psychology often involves recording two quite
distinct kinds of data. The first is whether the computation itself is
done correctly or incorrectly, and the second is how long it took to
get an answer. Neural network computations are often loosely described
as being 'brain-like.' This suggests that it might be possible to model
experimental reaction time data simply by seeing how long it takes the
network to generate the answer, and to model error data by looking at
the computed results in the same system. Simple feedforward nets
usually do not give direct computation time data. However, network
models realizing dynamical systems can give 'reaction times' directly,
by noting the time required for the network computation to be
completed. In some cases genuine random processes are necessary to
generate differing reaction times, but in other cases deterministic,
noise-free systems can also give distributions of reaction times.

This report can be obtained by sending an email message to
LI700008@brownvm.BITNET or anderson@browncog.BITNET and asking for
Cognitive Science Technical Report 90-1 on reaction times, or by
sending a note by regular mail to the address above.
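[[ Editor's Note: The idea of reading 'reaction times' directly off a
settling dynamical network is easy to demonstrate. Below is a minimal
Python sketch (all parameters arbitrary) using a Hopfield-style
attractor net: the number of update sweeps needed to reach a fixed
point serves as the reaction time, and more heavily corrupted cues tend
to take longer to settle. This illustrates the general idea only, not
the specific models in the report.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 64
    patterns = rng.choice([-1, 1], size=(3, n))   # stored memories
    W = (patterns.T @ patterns) / n               # Hebbian weights
    np.fill_diagonal(W, 0)

    def settle(state, max_sweeps=100):
        """Update until a fixed point; return (state, sweeps used)."""
        for sweep in range(1, max_sweeps + 1):
            new = np.sign(W @ state)
            new[new == 0] = 1
            if np.array_equal(new, state):
                return state, sweep
            state = new
        return state, max_sweeps

    for noise in (0.1, 0.3):
        probe = patterns[0].copy()
        probe[rng.random(n) < noise] *= -1        # corrupt the cue
        _, rt = settle(probe)
        print("noise", noise, "-> reaction time", rt, "sweeps")

 ]]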
------------------------------

Subject: Call for Papers.
From:    /PN=JAMES.L.RASH/O=GSFCMAIL/PRMD=GSFC/ADMD=TELEMAIL/C=US/@sprint.com
Date:    01 Oct 90 21:24:00 +0000

                        Call for Papers

            1991 Goddard Conference on Space
          Applications of Artificial Intelligence

                    May 14 & 15, 1991
             NASA Goddard Space Flight Center
                  Greenbelt, Maryland

The Sixth Annual Goddard Conference on Space Applications of Artificial
Intelligence will focus on AI research and applications relevant to
space systems, space operations, and space science. Topics will
include, but are not limited to:

    o knowledge-based spacecraft command & control
    o expert system management & methodologies
    o distributed knowledge-based systems
    o intelligent database management
    o fault-tolerant rule-based systems
    o simulation-based reasoning
    o fault isolation & diagnosis
    o knowledge acquisition
    o robotics & telerobotics
    o planning & scheduling
    o neural networks
    o image analysis

Original, unpublished papers are now being solicited for the
conference. Abstracts should be no longer than one page. Five copies of
the abstract should be submitted by November 1, 1990, along with the
author's name, affiliation, address, and telephone number. Notification
of tentative acceptance will be given by November 15, 1990. Papers
should be no longer than 15 pages and must be submitted in camera-ready
form for final acceptance by February 1, 1991. Accepted papers will be
presented formally or as poster presentations, which may include
demonstrations. All accepted papers will be published in the conference
proceedings as an official NASA document, and selected papers will
appear in a special issue of the international journal Telematics and
Informatics. There will be a conference award for Best Paper. No
commercial presentations will be accepted.

Send abstracts to:
    Jonathan Hartley
    NASA/GSFC
    Code 522
    Greenbelt, MD 20771

For further info call: (301) 286-3150

------------------------------

Subject: Course change news
From:    elsberry@arrisun3.arl.utexas.edu (Wes Elsberry)
Date:    Mon, 01 Oct 90 20:22:16 -0500

Applied Neural Networks Computing course changes

The upcoming UCLA short course on Applied Neural Networks Computing
(December 3-6) has had two topics added to the course curriculum:
Biomedical Engineering Applications and Macroeconomic Applications. The
Biomedical Engineering Applications portion will include a look at an
advanced Adaptive Resonance Theory network architecture used for
identification of HIV. The Macroeconomic Applications portion will look
at non-geometric and non-parametric models of macroeconomic systems.

The course is offered by the University of California at Los Angeles
Extension, with Dr. Harold Szu of the Office of Naval Research teaching
the course. The catalog list of topics includes the following:
Nonconvex Optimization; Problem Solving by Fixed Point Learning
Systems; Solving Image Processing and Automated Pattern Recognition
Problems; Designing to Solve Particular Problems; Strategy of Neural
Nets for Human Visual System; Dynamic Reconfigurable Nets; and
Application of Neural Networks According to Underlying Principles.

As it says in the catalog, "For technical information about the course,
call Harold Szu at (202) 767-1493. For registration information, call
the Short Course program Office at (213) 825-3344."

------------------------------

End of Neuron Digest [Volume 6 Issue 57]
****************************************