
Feature Extraction: Foundations and Applications by Isabelle Guyon (English) Hardcover

US $406.48
No Interest if paid in full in 6 mo on $149+ with PayPal Credit*
Condition:
Brand New
3 available
Shipping:
Free Economy Shipping.
Located in: Fairfield, Ohio, United States
Delivery:
Estimated between Thu, Nov 21 and Tue, Nov 26 to 20147
Delivery time is estimated using our proprietary method which is based on the buyer's proximity to the item location, the shipping service selected, the seller's shipping history, and other factors. Delivery times may vary, especially during peak periods.
Includes 10 business days handling time after receipt of cleared payment.
Returns:
30 days returns. Buyer pays for return shipping.
Payments:
      
*No Interest if paid in full in 6 months on $149+. See terms and apply for PayPal Credit.
Earn up to 5x points when you use your eBay Mastercard®.

Shop with confidence

eBay Money Back Guarantee
Get the item you ordered or your money back with the eBay Money Back Guarantee.
Seller assumes all responsibility for this listing.
eBay item number:386714587556
Last updated on Nov 02, 2024 21:46:56 PDT

Item specifics

Condition
Brand New: A new, unread, unused book in perfect condition with no missing or damaged pages.
ISBN-13
9783540354871
Book Title
Feature Extraction
ISBN
9783540354871
Subject Area
Mathematics, Computers, Technology & Engineering
Publication Name
Feature Extraction : Foundations and Applications
Publisher
Springer Berlin / Heidelberg
Item Length
9.3 in
Subject
Engineering (General), Intelligence (Ai) & Semantics, Applied, Computer Vision & Pattern Recognition
Publication Year
2006
Series
Studies in Fuzziness and Soft Computing Ser.
Type
Textbook
Format
Hardcover
Language
English
Author
Steve Gunn
Item Weight
48.1 oz
Item Width
6.1 in
Number of Pages
xxiv, 778 pages

About this product

Product Identifiers

Publisher
Springer Berlin / Heidelberg
ISBN-10
3540354875
ISBN-13
9783540354871
eBay Product ID (ePID)
59048759

Product Key Features

Number of Pages
xxiv, 778 pages
Publication Name
Feature Extraction : Foundations and Applications
Language
English
Subject
Engineering (General), Intelligence (Ai) & Semantics, Applied, Computer Vision & Pattern Recognition
Publication Year
2006
Type
Textbook
Author
Steve Gunn
Subject Area
Mathematics, Computers, Technology & Engineering
Series
Studies in Fuzziness and Soft Computing Ser.
Format
Hardcover

Dimensions

Item Weight
48.1 oz
Item Length
9.3 in
Item Width
6.1 in

Additional Product Features

Intended Audience
Scholarly & Professional
LCCN
2006-928001
Dewey Edition
22
Series Volume Number
207
Number of Volumes
1 vol.
Illustrated
Yes
Dewey Decimal
006.3
Table Of Content
An Introduction to Feature Extraction.- An Introduction to Feature Extraction.- Feature Extraction Fundamentals.- Learning Machines.- Assessment Methods.- Filter Methods.- Search Strategies.- Embedded Methods.- Information-Theoretic Methods.- Ensemble Learning.- Fuzzy Neural Networks.- Feature Selection Challenge.- Design and Analysis of the NIPS 2003 Challenge.- High Dimensional Classification with Bayesian Neural Networks and Dirichlet Diffusion Trees.- Ensembles of Regularized Least Squares Classifiers for High-Dimensional Problems.- Combining SVMs with Various Feature Selection Strategies.- Feature Selection with Transductive Support Vector Machines.- Variable Selection using Correlation and Single Variable Classifier Methods: Applications.- Tree-Based Ensembles with Dynamic Soft Feature Selection.- Sparse, Flexible and Efficient Modeling using L1 Regularization.- Margin Based Feature Selection and Infogain with Standard Classifiers.- Bayesian Support Vector Machines for Feature Ranking and Selection.- Nonlinear Feature Selection with the Potential Support Vector Machine.- Combining a Filter Method with SVMs.- Feature Selection via Sensitivity Analysis with Direct Kernel PLS.- Information Gain, Correlation and Support Vector Machines.- Mining for Complex Models Comprising Feature Selection and Classification.- Combining Information-Based Supervised and Unsupervised Feature Selection.- An Enhanced Selective Naïve Bayes Method with Optimal Discretization.- An Input Variable Importance Definition based on Empirical Data Probability Distribution.- New Perspectives in Feature Extraction.- Spectral Dimensionality Reduction.- Constructing Orthogonal Latent Features for Arbitrary Loss.- Large Margin Principles for Feature Selection.- Feature Extraction for Classification of Proteomic Mass Spectra: A Comparative Study.- Sequence Motifs: Highly Predictive Features of Protein Function.
Synopsis
Everyone loves a good competition. As I write this, two billion fans are eagerly anticipating the 2006 World Cup. Meanwhile, a fan base that is somewhat smaller (but presumably includes you, dear reader) is equally eager to read all about the results of the NIPS 2003 Feature Selection Challenge, contained herein. Fans of Radford Neal and Jianguo Zhang (or of Bayesian neural networks and Dirichlet diffusion trees) are gloating "I told you so" and looking for proof that their win was not a fluke. But the matter is by no means settled, and fans of SVMs are shouting "wait 'til next year!" You know this book is a bit more edgy than your standard academic treatise as soon as you see the dedication: "To our friends and foes." Competition breeds improvement. Fifty years ago, the champion in 100m butterfly swimming was 22 percent slower than today's champion; the women's marathon champion from just 30 years ago was 26 percent slower. Who knows how much better our machine learning algorithms would be today if Turing in 1950 had proposed an effective competition rather than his elusive Test? But what makes an effective competition? The field of Speech Recognition has had NIST-run competitions since 1988; error rates have been reduced by a factor of three or more, but the field has not yet had the impact expected of it. Information Retrieval has had its TREC competition since 1992; progress has been steady and refugees from the competition have played important roles in the hundred-billion-dollar search industry. Robotics has had the DARPA Grand Challenge for only two years, but in that time we have seen the results go from complete failure to resounding success (although it may have helped that the second year's course was somewhat easier than the first's).
This book is both a reference for engineers and scientists and a teaching resource, featuring tutorial chapters and research papers on feature extraction. Its CD-ROM includes the data of the NIPS 2003 Feature Selection Challenge and sample Matlab® code. Until now there has been insufficient consideration of feature selection algorithms, no unified presentation of leading methods, and no systematic comparisons.
LC Classification Number
TA1634

Item description from the seller

About this seller

grandeagleretail

98.3% positive feedback · 2.7M items sold

Joined Sep 2010
Usually responds within 24 hours
Grand Eagle Retail is your online bookstore. We offer great books, great prices, and great service.

Detailed seller ratings

Average for the last 12 months
Accurate description
4.9
Reasonable shipping cost
5.0
Shipping speed
4.9
Communication
4.9

Seller feedback (1,036,751)

  • h***9 (3109)- Feedback left by buyer.
    Past 6 months
    Verified purchase
    🏆 SUPER STAR 🤩 AMAZING PHOTOS 🎯 ACCURATE DESCRIPTION ✏️ GENUINE PRODUCTS 💎 HIGH QUALITY 🍯 SUPER PRICES 💰 EASY TO WORK WITH 🍰 ECONOMY HANDLING ⏱️ FAST SHIPPING 🚀 BUBBLE PACKAGE 📦 ARRIVED WITHIN DAYS 🌎 EXCEPTIONAL COMMUNICATION 🎙️ OUTSTANDING CUSTOMER SERVICE 🛎️ GREAT SENSE OF HUMOR 🍿 TOTAL ASSET TO THE EBAY-ECO SYSTEM 🥇 SAVED SELLER 🎱 PROMT REPLY FOR RETURNS 🎯 WOULD BUY FROM AGAIN 🧲 UNDER PROMISES OVER DELIVERS ⛳️ MADE ME VERY HAPPY 🌈 LEFT POSITIVE FEEDBACK 🌼 THANK YOU! 😇 A+++
  • w***8 (7)- Feedback left by buyer.
    Past month
    Verified purchase
    The book came exactly as described and packaged well. The shipping time was long but not unfair. I had some issues tracking but contacted the seller and they were responsive and accommodating. Great price and quality and I will be purchasing again if possible! Cheers!
  • l***a (3611)- Feedback left by buyer.
    Past 6 months
    Verified purchase
    Excellent seller. Great customer service and communication, timely shipping, fair prices, safe packing, as described. Thank you. A+++

Product ratings and reviews

No ratings or reviews yet
Be the first to write a review.