Connectionist Models

Proceedings of the 1990 Summer School

1st Edition - December 1, 1990

  • Editors: David S. Touretzky, Jeffrey L. Elman, Terrence J. Sejnowski
  • eBook ISBN: 9781483214481

Description

Connectionist Models contains the proceedings of the 1990 Connectionist Models Summer School held at the University of California at San Diego. The summer school provided a forum for students and faculty to assess the state of the art in connectionist modeling. Topics covered range from theoretical analysis of networks and empirical investigations of learning algorithms to speech and image processing, cognitive psychology, computational neuroscience, and VLSI design. Comprising 40 chapters, the book begins with an introduction to mean field, Boltzmann, and Hopfield networks, focusing on deterministic Boltzmann learning in networks with asymmetric connectivity; contrastive Hebbian learning in the continuous Hopfield model; and energy minimization and the satisfiability of propositional logic. Mean field networks that learn to discriminate temporally distorted strings are also described. Subsequent sections are devoted to reinforcement learning and genetic learning, along with temporal processing and modularity. Cognitive modeling and symbol processing, as well as VLSI implementation, are also discussed. This volume will be of interest to both students and academicians concerned with connectionist modeling.

Table of Contents


  • Part I Mean Field, Boltzmann, and Hopfield Networks

    Deterministic Boltzmann Learning in Networks with Asymmetric Connectivity

    Contrastive Hebbian Learning in the Continuous Hopfield Model

    Mean Field Networks that Learn to Discriminate Temporally Distorted Strings

    Energy Minimization and the Satisfiability of Propositional Logic

  • Part II Reinforcement Learning

    On the Computational Economics of Reinforcement Learning

    Reinforcement Comparison

    Learning Algorithms for Networks with Internal and External Feedback

  • Part III Genetic Learning

    Exploring Adaptive Agency I: Theory and Methods for Simulating the Evolution of Learning

    The Evolution of Learning: An Experiment in Genetic Connectionism

    Evolving Controls for Unstable Systems

  • Part IV Temporal Processing

    Back-Propagation, Weight-Elimination and Time Series Prediction

    Predicting the Mackey-Glass Timeseries with Cascade-Correlation Learning

    Learning in Recurrent Finite Difference Networks

    Temporal Backpropagation: An Efficient Algorithm for Finite Impulse Response Neural Networks

  • Part V Theory and Analysis

    Optimal Dimensionality Reduction Using Hebbian Learning

    Basis-Function Trees for Approximation in High-Dimensional Spaces

    Effects of Circuit Parameters on Convergence of Trinary Update Back-Propagation

    Equivalence Proofs for Multi-Layer Perceptron Classifiers and the Bayesian Discriminant Function

    A Local Approach to Optimal Queries

  • Part VI Modularity

    A Modularization Scheme for Feedforward Networks

    A Compositional Connectionist Architecture

  • Part VII Cognitive Modeling and Symbol Processing

    From Rote Learning to System Building: Acquiring Verb Morphology in Children and Connectionist Nets

    Parallel Mapping Circuitry in a Phonological Model

    A Modular Neural Network Model of the Acquisition of Category Names in Children

    A Computational Model of Attentional Requirements in Sequence Learning

    Recall of Sequences of Items by a Neural Network

    Binding, Episodic Short-Term Memory, and Selective Attention, Or Why are PDP Models Poor at Symbol Manipulation?

    Analogical Retrieval Within a Hybrid Spreading-Activation Network

    Appropriate Uses of Hybrid Systems

    Cognitive Map Construction and Use: A Parallel Distributed Processing Approach

  • Part VIII Speech and Vision

    Unsupervised Discovery of Speech Segments Using Recurrent Networks

    Feature Extraction Using an Unsupervised Neural Network

    Motor Control for Speech Skills: A Connectionist Approach

    Extracting Features From Faces Using Compression Networks: Face, Identity, Emotion, and Gender Recognition Using Holons

    The Development of Topography and Ocular Dominance

    On Modeling Some Aspects of Higher Level Vision

  • Part IX Biology

    Modeling Cortical Area 7a Using Stochastic Real-Valued (SRV) Units

    Neuronal Signal Strength is Enhanced by Rhythmic Firing

  • Part X VLSI Implementation

    An Analog VLSI Neural Network Cocktail Party Processor

    A VLSI Neural Network with On-Chip Learning

  • Index

Product details

  • No. of pages: 416
  • Language: English
  • Copyright: © Morgan Kaufmann 2014
  • Published: December 1, 1990
  • Imprint: Morgan Kaufmann
  • eBook ISBN: 9781483214481

About the Editors

David S. Touretzky

Jeffrey L. Elman

Terrence J. Sejnowski
