Corners are bent. There are stains on the outside cover and inside the book, and tears on the paperback. The spine is twisted/bowed. The cover/case has some rubbing and edge wear. Access codes, CDs, slipcovers, and other accessories may not be included. International buyers, please note: import duties, taxes, and charges are not included in the item price or shipping cost. These charges are the buyer's responsibility. Please check with your country's customs office to determine these additional costs before bidding or buying.
About this product
Product Identifiers
Publisher: MIT Press
ISBN-10: 0262111934
ISBN-13: 9780262111935
eBay Product ID (ePID): 107534
Product Key Features
Number of Pages: 222
Language: English
Publication Name: Introduction to Computational Learning Theory
Subject: Computer Science, Learning Styles
Publication Year: 1994
Type: Textbook
Author: Michael J. Kearns, Umesh Vazirani
Subject Area: Computers, Education
Format: Hardcover
Dimensions
Item Height: 0.8 in
Item Weight: 20 oz
Item Length: 9.2 in
Item Width: 7.1 in
Additional Product Features
Intended Audience: Trade
LCCN: 94-016588
Title Leading: An
Dewey Edition: 20
Grade From: College Graduate Student
Illustrated: Yes
Dewey Decimal: 006.3
Synopsis: Emphasizing issues of computational efficiency, Michael Kearns and Umesh Vazirani introduce a number of central topics in computational learning theory for researchers and students in artificial intelligence, neural networks, theoretical computer science, and statistics. Computational learning theory is a new and rapidly expanding area of research that examines formal models of induction with the goals of discovering the common methods underlying efficient learning algorithms and identifying the computational impediments to learning. Each topic in the book has been chosen to elucidate a general principle, which is explored in a precise formal setting. Intuition has been emphasized in the presentation to make the material accessible to the nontheoretician while still providing precise arguments for the specialist. This balance is the result of new proofs of established theorems and new presentations of the standard proofs. The topics covered include the motivation, definitions, and fundamental results, both positive and negative, for the widely studied L. G. Valiant model of Probably Approximately Correct Learning; Occam's Razor, which formalizes a relationship between learning and data compression; the Vapnik-Chervonenkis dimension; the equivalence of weak and strong learning; efficient learning in the presence of noise by the method of statistical queries; relationships between learning and cryptography, and the resulting computational limitations on efficient learning; reducibility between learning problems; and algorithms for learning finite automata from active experimentation.