
Feature Extraction : Foundations and Applications, Paperback by Guyon, Isabel...

US $336.84
No Interest if paid in full in 6 mo on $99+ with PayPal Credit*
Condition:
Like New
2 available
Shipping:
Free Economy Shipping.
Located in: Jessup, Maryland, United States
Delivery:
Estimated between Mon, Oct 21 and Sat, Oct 26 to 20147
Estimated delivery dates include the seller's handling time, origin ZIP Code, destination ZIP Code, and time of acceptance, and will depend on the shipping service selected and receipt of cleared payment. Delivery times may vary, especially during peak periods.
This item has an extended handling time and a delivery estimate greater than 10 business days.
Returns:
14-day returns. Buyer pays for return shipping.
Payments:
*No Interest if paid in full in 6 months on $99+ with PayPal Credit. See terms and apply for PayPal Credit.
Earn up to 5x points when you use your eBay Mastercard®.

Shop with confidence

eBay Money Back Guarantee
Get the item you ordered or your money back.
Seller assumes all responsibility for this listing.
eBay item number:387091801999
Last updated on Oct 05, 2024 15:56:26 PDT

Item specifics

Condition
Like New: A book that looks new but has been read. Cover has no visible wear, and the dust jacket ...
Book Title
Feature Extraction : Foundations and Applications
ISBN
9783662517710
Subject Area
Mathematics, Computers, Technology & Engineering
Publication Name
Feature Extraction : Foundations and Applications
Publisher
Springer Berlin / Heidelberg
Item Length
9.3 in
Subject
Engineering (General), Intelligence (AI) & Semantics, Applied, Computer Vision & Pattern Recognition
Publication Year
2017
Series
Studies in Fuzziness and Soft Computing Ser.
Type
Textbook
Format
Trade Paperback
Language
English
Author
Steve Gunn
Item Weight
57.2 Oz
Item Width
6.1 in
Number of Pages
xxiv, 778 pages

About this product

Product Identifiers

Publisher
Springer Berlin / Heidelberg
ISBN-10
366251771X
ISBN-13
9783662517710
eBay Product ID (ePID)
15038417333

Product Key Features

Number of Pages
xxiv, 778 pages
Language
English
Publication Name
Feature Extraction : Foundations and Applications
Subject
Engineering (General), Intelligence (AI) & Semantics, Applied, Computer Vision & Pattern Recognition
Publication Year
2017
Type
Textbook
Subject Area
Mathematics, Computers, Technology & Engineering
Author
Steve Gunn
Series
Studies in Fuzziness and Soft Computing Ser.
Format
Trade Paperback

Dimensions

Item Weight
57.2 Oz
Item Length
9.3 in
Item Width
6.1 in

Additional Product Features

Dewey Edition
22
Series Volume Number
207
Number of Volumes
1 vol.
Illustrated
Yes
Dewey Decimal
006.3
Table Of Content
An Introduction to Feature Extraction.- An Introduction to Feature Extraction.- Feature Extraction Fundamentals.- Learning Machines.- Assessment Methods.- Filter Methods.- Search Strategies.- Embedded Methods.- Information-Theoretic Methods.- Ensemble Learning.- Fuzzy Neural Networks.- Feature Selection Challenge.- Design and Analysis of the NIPS 2003 Challenge.- High Dimensional Classification with Bayesian Neural Networks and Dirichlet Diffusion Trees.- Ensembles of Regularized Least Squares Classifiers for High-Dimensional Problems.- Combining SVMs with Various Feature Selection Strategies.- Feature Selection with Transductive Support Vector Machines.- Variable Selection using Correlation and Single Variable Classifier Methods: Applications.- Tree-Based Ensembles with Dynamic Soft Feature Selection.- Sparse, Flexible and Efficient Modeling using L1 Regularization.- Margin Based Feature Selection and Infogain with Standard Classifiers.- Bayesian Support Vector Machines for Feature Ranking and Selection.- Nonlinear Feature Selection with the Potential Support Vector Machine.- Combining a Filter Method with SVMs.- Feature Selection via Sensitivity Analysis with Direct Kernel PLS.- Information Gain, Correlation and Support Vector Machines.- Mining for Complex Models Comprising Feature Selection and Classification.- Combining Information-Based Supervised and Unsupervised Feature Selection.- An Enhanced Selective Naïve Bayes Method with Optimal Discretization.- An Input Variable Importance Definition based on Empirical Data Probability Distribution.- New Perspectives in Feature Extraction.- Spectral Dimensionality Reduction.- Constructing Orthogonal Latent Features for Arbitrary Loss.- Large Margin Principles for Feature Selection.- Feature Extraction for Classification of Proteomic Mass Spectra: A Comparative Study.- Sequence Motifs: Highly Predictive Features of Protein Function.
Synopsis
Everyone loves a good competition. As I write this, two billion fans are eagerly anticipating the 2006 World Cup. Meanwhile, a fan base that is somewhat smaller (but presumably includes you, dear reader) is equally eager to read all about the results of the NIPS 2003 Feature Selection Challenge, contained herein. Fans of Radford Neal and Jianguo Zhang (or of Bayesian neural networks and Dirichlet diffusion trees) are gloating "I told you so" and looking for proof that their win was not a fluke. But the matter is by no means settled, and fans of SVMs are shouting "wait 'til next year!" You know this book is a bit more edgy than your standard academic treatise as soon as you see the dedication: "To our friends and foes."
Competition breeds improvement. Fifty years ago, the champion in 100m butterfly swimming was 22 percent slower than today's champion; the women's marathon champion from just 30 years ago was 26 percent slower. Who knows how much better our machine learning algorithms would be today if Turing in 1950 had proposed an effective competition rather than his elusive Test? But what makes an effective competition? The field of Speech Recognition has had NIST-run competitions since 1988; error rates have been reduced by a factor of three or more, but the field has not yet had the impact expected of it. Information Retrieval has had its TREC competition since 1992; progress has been steady and refugees from the competition have played important roles in the hundred-billion-dollar search industry. Robotics has had the DARPA Grand Challenge for only two years, but in that time we have seen the results go from complete failure to resounding success (although it may have helped that the second year's course was somewhat easier than the first's).
LC Classification Number
TA1634

Item description from the seller

Great Book Prices Store

96.5% positive feedback
1.2M items sold
Joined Feb 2017
Usually responds within 24 hours

Detailed seller ratings

Average for the last 12 months
Accurate description
4.9
Reasonable shipping cost
5.0
Shipping speed
4.9
Communication
4.8

Seller feedback (354,574)

  • 8***5 (19) - Feedback left by buyer.
    Past 6 months
    Verified purchase
    I didn't see a verbal description of this book as "hardcover" or paperback. Cover design and price were similar to both cover styles. I thought I'd left a question for the seller, but I may have done something wrong. The package was too small to match the hardcovers previously received so I returned it unopened. Seller was very efficient, and packaging offered excellent protection. Haven't checked to see if refund was processed.
  • w***t (665) - Feedback left by buyer.
    Past 6 months
    Verified purchase
    PERFECT TRANSACTION! Shipped right after payment, well packaged, arrived during the estimated time. The item is in great condition EXACTLY as described. Very Happy, very nice purchase. Excellent communication. Thank you
  • l***l (31) - Feedback left by buyer.
    Past year
    Verified purchase
    Excellent! This seller has really good prices, communication, packaging and fast shipping. The book I bought was better than described and I would definitely buy from this seller again. It was my time to have this book.🙌 A+++++

Product ratings and reviews

No ratings or reviews yet
Be the first to write a review.