
COLT '89

1st Edition

Proceedings of the Second Annual Workshop, UC Santa Cruz, California, July 31 - August 2, 1989

Editors: COLT
eBook ISBN: 9780080948294
Imprint: Morgan Kaufmann
Published Date: 28th June 2014
Page Count: 389


Computational Learning Theory presents the theoretical issues in machine learning and computational models of learning. This book covers a wide range of problems in concept learning, inductive inference, and pattern recognition.

Organized into three parts encompassing 32 chapters, this book begins with an overview of the inductive principle based on weak convergence of probability measures. The text then examines a framework for constructing learning algorithms. Other chapters consider the formal theory of learning, which is learning in the sense of improving computational efficiency as opposed to concept learning. The book also discusses informed parsimonious (IP) inference, which generalizes the compatibility and weighted parsimony techniques most commonly applied in biology. The final chapter deals with the construction of prediction algorithms in a situation in which a learner faces a sequence of trials, with a prediction to be given in each, and the goal of the learner is to make few mistakes.
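That trial-based prediction setting is the one addressed by the Weighted Majority Algorithm listed in the contents. As a rough illustration (not code from the book; the function name and the penalty parameter `beta` are assumptions for this sketch), the algorithm predicts by weighted vote over a pool of experts and multiplicatively penalizes the experts that err:

```python
# Minimal sketch of a Weighted Majority-style predictor for binary
# prediction with expert advice. Names and beta are illustrative,
# not taken from the book.

def weighted_majority(expert_predictions, outcomes, beta=0.5):
    """Run a sequence of trials and count the algorithm's mistakes.

    expert_predictions: list of trials, each a list of 0/1 expert votes.
    outcomes: the true 0/1 label for each trial.
    """
    n = len(expert_predictions[0])
    weights = [1.0] * n          # one weight per expert, initially equal
    mistakes = 0
    for votes, truth in zip(expert_predictions, outcomes):
        # Predict by weighted majority vote.
        w1 = sum(w for w, v in zip(weights, votes) if v == 1)
        w0 = sum(w for w, v in zip(weights, votes) if v == 0)
        prediction = 1 if w1 >= w0 else 0
        if prediction != truth:
            mistakes += 1
        # Multiplicatively penalize the experts that were wrong.
        weights = [w * (beta if v != truth else 1.0)
                   for w, v in zip(weights, votes)]
    return mistakes
```

The key property of schemes of this kind is that the algorithm's mistake count is bounded in terms of the best single expert's mistake count, without knowing in advance which expert is best.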

This book is a valuable resource for students and teachers.

Table of Contents


Invited Lecture

Inductive Principles of the Search for Empirical Dependences (Methods Based on Weak Convergence of Probability Measures)

Technical Papers

Polynomial Learnability of Semilinear Sets

Learning Nested Differences of Intersection-Closed Concept Classes

A Polynomial-Time Algorithm for Learning k-Variable Pattern Languages from Examples

On Learning from Exercises

On Approximate Truth

Informed Parsimonious Inference of Prototypical Genetic Sequences

Complexity Issues in Learning by Neural Nets

Equivalence Queries and Approximate Fingerprints

Learning Read-Once Formulas Using Membership Queries

Learning Simple Deterministic Languages

Learning in the Presence of Inaccurate Information

Convergence to Nearly Minimal Size Grammars by Vacillating Learning Machines

Inductive Inference with Bounded Number of Mind Changes

Learning Via Queries to an Oracle

Learning Structure from Data: A Survey

A Statistical Approach to Learning and Generalization in Layered Neural Networks

The Light Bulb Problem

From On-Line to Batch Learning

A Parametrization Scheme for Classifying Models of Learnability

On the Role of Search for Learning

Elementary Formal System as a Unifying Framework for Language Learning

Identification of Unions of Languages Drawn from an Identifiable Class

Induction from the General to the More General

Space-Bounded Learning and the Vapnik-Chervonenkis Dimension

Reliable and Useful Learning

Short Abstracts

The Strength of Weak Learnability

On the Complexity of Learning from Counterexamples

Generalizing the PAC Model: Sample Size Bounds From Metric Dimension-Based Uniform Convergence Results

A Theory of Learning Simple Concepts Under Simple Distributions

Learning Binary Relations and Total Orders

The Weighted Majority Algorithm

Author Index


© Morgan Kaufmann 1989
