
COLT '89

1st Edition

Proceedings of the Second Annual Workshop, UC Santa Cruz, California, July 31 - August 2, 1989

Editors: COLT
eBook ISBN: 9780080948294
Imprint: Morgan Kaufmann
Published Date: 28th June 2014
Page Count: 389

Description

Computational Learning Theory presents the theoretical issues in machine learning and computational models of learning. This book covers a wide range of problems in concept learning, inductive inference, and pattern recognition.

Organized into three parts encompassing 32 chapters, this book begins with an overview of an inductive principle based on weak convergence of probability measures, and then examines a framework for constructing learning algorithms. Other chapters consider the formal theory of learning, in which learning means improving computational efficiency, as opposed to concept learning. The book also discusses informed parsimonious (IP) inference, which generalizes the compatibility and weighted-parsimony techniques most commonly applied in biology. The final chapter deals with the construction of prediction algorithms for a setting in which a learner faces a sequence of trials, must issue a prediction in each, and aims to make few mistakes.

This book is a valuable resource for students and teachers.

Table of Contents


Foreword


Invited Lecture


Inductive Principles of the Search for Empirical Dependences (Methods Based on Weak Convergence of Probability Measures)


Technical Papers


Polynomial Learnability of Semilinear Sets


Learning Nested Differences of Intersection-Closed Concept Classes


A Polynomial-Time Algorithm for Learning k-Variable Pattern Languages from Examples


On Learning from Exercises


On Approximate Truth


Informed Parsimonious Inference of Prototypical Genetic Sequences


Complexity Issues in Learning by Neural Nets


Equivalence Queries and Approximate Fingerprints


Learning Read-Once Formulas Using Membership Queries


Learning Simple Deterministic Languages


Learning in the Presence of Inaccurate Information


Convergence to Nearly Minimal Size Grammars by Vacillating Learning Machines


Inductive Inference with Bounded Number of Mind Changes


Learning Via Queries to an Oracle


Learning Structure from Data: A Survey


A Statistical Approach to Learning and Generalization in Layered Neural Networks


The Light Bulb Problem


From On-Line to Batch Learning


A Parametrization Scheme for Classifying Models of Learnability


On the Role of Search for Learning


Elementary Formal System as a Unifying Framework for Language Learning


Identification of Unions of Languages Drawn from an Identifiable Class


Induction from the General to the More General


Space-Bounded Learning and the Vapnik-Chervonenkis Dimension


Reliable and Useful Learning


Short Abstracts


The Strength of Weak Learnability


On the Complexity of Learning from Counterexamples


Generalizing the PAC Model: Sample

Details

No. of pages: 389
Language: English
Copyright: © Morgan Kaufmann 1989
Published: 28th June 2014
Imprint: Morgan Kaufmann
eBook ISBN: 9780080948294
