
Connectionist Models

1st Edition

Proceedings of the 1990 Summer School

Editors: David S. Touretzky, Jeffrey L. Elman, Terrence J. Sejnowski
eBook ISBN: 9781483214481
Imprint: Morgan Kaufmann
Published Date: 12th May 2014
Page Count: 416


Connectionist Models contains the proceedings of the 1990 Connectionist Models Summer School held at the University of California at San Diego. The summer school provided a forum for students and faculty to assess the state of the art in connectionist modeling. Topics range from theoretical analysis of networks and empirical investigations of learning algorithms to speech and image processing, cognitive psychology, computational neuroscience, and VLSI design.

Comprising 40 chapters, this book begins with an introduction to mean field, Boltzmann, and Hopfield networks, focusing on deterministic Boltzmann learning in networks with asymmetric connectivity; contrastive Hebbian learning in the continuous Hopfield model; and energy minimization and the satisfiability of propositional logic. Mean field networks that learn to discriminate temporally distorted strings are also described. Subsequent sections are devoted to reinforcement learning and genetic learning, along with temporal processing and modularity. Cognitive modeling and symbol processing as well as VLSI implementation are also discussed.

This monograph will be of interest to students and researchers concerned with connectionist modeling.

Table of Contents

Part I Mean Field, Boltzmann, and Hopfield Networks

Deterministic Boltzmann Learning in Networks with Asymmetric Connectivity

Contrastive Hebbian Learning in the Continuous Hopfield Model

Mean Field Networks that Learn to Discriminate Temporally Distorted Strings

Energy Minimization and the Satisfiability of Propositional Logic

Part II Reinforcement Learning

On the Computational Economics of Reinforcement Learning

Reinforcement Comparison

Learning Algorithms for Networks with Internal and External Feedback

Part III Genetic Learning

Exploring Adaptive Agency I: Theory and Methods for Simulating the Evolution of Learning

The Evolution of Learning: An Experiment in Genetic Connectionism

Evolving Controls for Unstable Systems

Part IV Temporal Processing

Back-Propagation, Weight-Elimination and Time Series Prediction

Predicting the Mackey-Glass Timeseries with Cascade-Correlation Learning

Learning in Recurrent Finite Difference Networks

Temporal Backpropagation: An Efficient Algorithm for Finite Impulse Response Neural Networks

Part V Theory and Analysis

Optimal Dimensionality Reduction Using Hebbian Learning

Basis-Function Trees for Approximation in High-Dimensional Spaces

Effects of Circuit Parameters on Convergence of Trinary Update Back-Propagation

Equivalence Proofs for Multi-Layer Perceptron Classifiers and the Bayesian Discriminant Function

A Local Approach to Optimal Queries

Part VI Modularity

A Modularization Scheme for Feedforward Networks

A Compositional Connectionist Architecture

Part VII Cognitive Modeling and Symbol Processing

From Rote Learning to System Building: Acquiring Verb Morphology in Children and Connectionist Nets

Parallel Mapping Circuitry in a Phonological Model

A Modular Neural Network Model of the Acquisition of Category Names in Children

A Computational Model of Attentional Requirements in Sequence Learning

Recall of Sequences of Items by a Neural Network

Binding, Episodic Short-Term Memory, and Selective Attention, Or Why are PDP Models Poor at Symbol Manipulation?

Analogical Retrieval Within a Hybrid Spreading-Activation Network

Appropriate Uses of Hybrid Systems

Cognitive Map Construction and Use: A Parallel Distributed Processing Approach

Part VIII Speech and Vision

Unsupervised Discovery of Speech Segments Using Recurrent Networks

Feature Extraction Using an Unsupervised Neural Network

Motor Control for Speech Skills: A Connectionist Approach

Extracting Features From Faces Using Compression Networks: Face, Identity, Emotion, and Gender Recognition Using Holons

The Development of Topography and Ocular Dominance

On Modeling Some Aspects of Higher Level Vision

Part IX Biology

Modeling Cortical Area 7a Using Stochastic Real-Valued (SRV) Units

Neuronal Signal Strength is Enhanced by Rhythmic Firing

Part X VLSI Implementation

An Analog VLSI Neural Network Cocktail Party Processor

A VLSI Neural Network with On-Chip Learning



© Morgan Kaufmann 1991