The Mathematics Of Generalization
eBook - ePub

  1. 460 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android

About this book

This book presents several mathematical frameworks for addressing the problem of supervised learning. It is based on a workshop held under the auspices of the Center for Nonlinear Studies at Los Alamos and the Santa Fe Institute in the summer of 1992.

Information

Editor
David H. Wolpert
Publisher
CRC Press
Year
2018
eBook ISBN
9780429972157
David Haussler
Baskin Center for Computer Engineering and Information Sciences, University of California, Santa Cruz, CA 95064; e-mail: [email protected].
Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications
This chapter, reprinted by permission, originally appeared in Information and Computation 100(1) (1992): 78–150. Copyright © by Academic Press.
We describe a generalization of the PAC learning model that is based on statistical decision theory. In this model the learner receives randomly drawn examples, each example consisting of an instance x ∈ X and an outcome y ∈ Y, and tries to find a decision rule h: X → A, where h ∈ H, that specifies the appropriate action a ∈ A to take for each instance x, in order to minimize the expectation of a loss l(y, a). Here X, Y, and A are arbitrary sets, l is a real-valued function, and examples are generated according to an arbitrary joint distribution on X × Y. Special cases include the problem of learning a function from X into Y, the problem of learning the conditional probability distribution on Y given X (regression), and the problem of learning a distribution on X (density estimation).
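In this notation, the quantity being minimized is the expected loss (risk) of a decision rule under the unknown joint distribution on X × Y; writing it out compactly (the symbols P and R below are our labels for that distribution and the risk, not the chapter's):

\[
R(h) \;=\; \mathbb{E}_{(x,y)\sim P}\bigl[\, l\bigl(y,\, h(x)\bigr) \,\bigr], \qquad h \in H ,
\]

and the learner's goal is to find an h ∈ H whose risk R(h) is close to inf_{h′ ∈ H} R(h′).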
We give theorems on the uniform convergence of empirical loss estimates to true expected loss rates for certain decision rule spaces H, and show how this implies learnability with bounded sample size, disregarding computational complexity. As an application, we give distribution-independent upper bounds on the sample size needed for learning with feedforward neural networks. Our theorems use a generalized notion of VC dimension that applies to classes of real-valued functions, adapted from Vapnik and Pollard’s work, and a notion of capacity and metric dimension for classes of functions that map into a bounded metric space.
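Such uniform convergence results take the following generic shape (a sketch under standard i.i.d. assumptions, not the chapter's exact statement; the sample size m needed for a given ε and δ is what the generalized VC dimension and capacity arguments bound): given m independent examples (x₁, y₁), …, (x_m, y_m) drawn from the joint distribution, with probability at least 1 − δ,

\[
\sup_{h \in H}\; \Bigl|\, \frac{1}{m} \sum_{i=1}^{m} l\bigl(y_i,\, h(x_i)\bigr) \;-\; R(h) \,\Bigr| \;\le\; \varepsilon .
\]

When this holds, any rule minimizing the empirical loss has true expected loss within 2ε of the best achievable in H, which is what yields learnability with a bounded sample size.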
1. INTRODUCTION
The introduction of the Probably Approximately Correct (PAC) model [4, 86] of learning from examples has done an admirable job of ...

Table of contents

  1. Cover
  2. Half Title
  3. Title Page
  4. Copyright Page
  5. Table of Contents
  6. Preface
  7. The Status of Supervised Learning Science circa 1994—The Search for a Consensus
  8. Reflections After Refereeing Papers for NIPS
  9. The Probably Approximately Correct (PAC) and Other Learning Models
  10. Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications
  11. The Relationship Between PAC, the Statistical Physics Framework, the Bayesian Framework, and the VC Framework
  12. Statistical Physics Models of Supervised Learning
  13. On Exhaustive Learning
  14. A Study of Maximal-Coverage Learning Algorithms
  15. On Bayesian Model Selection
  16. Soft Classification, a.k.a. Risk Estimation, via Penalized Log Likelihood and Smoothing Spline Analysis of Variance
  17. Current Research
  18. Preface to Simplifying Neural Networks by Soft Weight Sharing
  19. Simplifying Neural Networks by Soft Weight Sharing
  20. Error-Correcting Output Codes: A General Method for Improving Multiclass Inductive Learning Programs
  21. Image Segmentation and Recognition
  22. Index