Knowledge Discovery with Support Vector Machines (hardback)
Format: Hardback
Language: English
Number of pages: 262
Publication date: 2009-09-04
Edition: 1
Publisher: Wiley-Interscience
Contributor: Hamel
Illustrations: Illustrations
Dimensions: 229 x 157 x 20 mm
Weight: 568 g
Number of components: 1
Components: 52:B&W 6.14 x 9.21in or 234 x 156mm (Royal 8vo) Case Laminate on White w/Gloss Lam
ISBN: 9780470371923

Price: 1667 kr
An easy-to-follow introduction to support vector machines.

This book provides an in-depth, easy-to-follow introduction to support vector machines, drawing only from minimal, carefully motivated technical and mathematical background material. It begins with a cohesive discussion of machine learning and goes on to cover:

  • Knowledge discovery environments
  • Describing data mathematically
  • Linear decision surfaces and functions
  • Perceptron learning
  • Maximum margin classifiers
  • Support vector machines
  • Elements of statistical learning theory
  • Multi-class classification
  • Regression with support vector machines
  • Novelty detection

Complemented with hands-on exercises, algorithm descriptions, and data sets, Knowledge Discovery with Support Vector Machines is an invaluable textbook for advanced undergraduate and graduate courses. It is also an excellent tutorial on support vector machines for professionals pursuing research in machine learning and related areas.
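To give a flavor of the book's starting point, here is a minimal sketch of the classic perceptron learning rule on a toy, linearly separable data set. The data, learning rate, and function names are illustrative assumptions, not material from the book itself.

```python
# Perceptron learning on a toy 2-D data set (illustrative sketch).
# Labels are +1/-1; a misclassified point nudges the weights toward it.

def perceptron_train(points, labels, epochs=100, lr=1.0):
    """Learn weights w and bias b so that sign(w . x + b) matches the labels."""
    dim = len(points[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        updated = False
        for x, y in zip(points, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:          # misclassified: update w and b
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
                updated = True
        if not updated:                      # a full clean pass: converged
            break
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Two small clusters separable by a line.
X = [(2.0, 1.0), (3.0, 2.0), (-1.0, -1.0), (-2.0, -2.5)]
y = [1, 1, -1, -1]
w, b = perceptron_train(X, y)
print([predict(w, b, x) for x in X])   # → [1, 1, -1, -1]
```

The perceptron converges on separable data like this, but the separating line it finds is arbitrary; the maximum margin classifiers and support vector machines covered in later chapters pick the line with the largest margin instead.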

Other information

Lutz Hamel, PhD, teaches at the University of Rhode Island, where he founded the machine learning and data mining group. His major research interests are computational logic, machine learning, evolutionary computation, data mining, bioinformatics, and computational structures in art and literature.

Table of contents

Preface.
PART I.
1 What is Knowledge Discovery? 1.1 Machine Learning. 1.2 The Structure of the Universe X. 1.3 Inductive Learning. 1.4 Model Representations. Exercises. Bibliographic Notes.
2 Knowledge Discovery Environments. 2.1 Computational Aspects of Knowledge Discovery. 2.1.1 Data Access. 2.1.2 Visualization. 2.1.3 Data Manipulation. 2.1.4 Model Building and Evaluation. 2.1.5 Model Deployment. 2.2 Other Toolsets. Exercises. Bibliographic Notes.
3 Describing Data Mathematically. 3.1 From Data Sets to Vector Spaces. 3.1.1 Vectors. 3.1.2 Vector Spaces. 3.2 The Dot Product as a Similarity Score. 3.3 Lines, Planes, and Hyperplanes. Exercises. Bibliographic Notes.
4 Linear Decision Surfaces and Functions. 4.1 From Data Sets to Decision Functions. 4.1.1 Linear Decision Surfaces through the Origin. 4.1.2 Decision Surfaces with an Offset Term. 4.2 A Simple Learning Algorithm. 4.3 Discussion. Exercises. Bibliographic Notes.
5 Perceptron Learning. 5.1 Perceptron Architecture and Training. 5.2 Duality. 5.3 Discussion. Exercises. Bibliographic Notes.
6 Maximum Margin Classifiers. 6.1 Optimization Problems. 6.2 Maximum Margins. 6.3 Optimizing the Margin. 6.4 Quadratic Programming. 6.5 Discussion. Exercises. Bibliographic Notes.
PART II.
7 Support Vector Machines. 7.1 The Lagrangian Dual. 7.2 Dual Maximum Margin Optimization. 7.2.1 The Dual Decision Function. 7.3 Linear Support Vector Machines. 7.4 Non-Linear Support Vector Machines. 7.4.1 The Kernel Trick. 7.4.2 Feature Search. 7.4.3 A Closer Look at Kernels. 7.5 Soft-Margin Classifiers. 7.5.1 The Dual Setting for Soft-Margin Classifiers. 7.6 Tool Support. 7.6.1 WEKA. 7.6.2 R. 7.7 Discussion. Exercises. Bibliographic Notes.
8 Implementation. 8.1 Gradient Ascent. 8.1.1 The Kernel-Adatron Algorithm. 8.2 Quadratic Programming. 8.2.1 Chunking. 8.3 Sequential Minimal Optimization. 8.4 Discussion. Exercises. Bibliographic Notes.
9 Evaluating What Has Been Learned. 9.1 Performance Metrics. 9.1.1 The Confusion Matrix. 9.2 Model Evaluation. 9.2.1 The Hold-Out Method. 9.2.2 The Leave-One-Out Method. 9.2.3 N-Fold Cross-Validation. 9.3 Error Confidence Intervals. 9.3.1 Model Comparisons. 9.4 Model Evaluation in Practice. 9.4.1 WEKA. 9.4.2 R. Exercises. Bibliographic Notes.
10 Elements of Statistical Learning Theory. 10.1 The VC-Dimension and Model Complexity. 10.2 A Theoretical Setting for Machine Learning. 10.3 Empirical Risk Minimization. 10.4 VC-Confidence. 10.5 Structural Risk Minimization. 10.6 Discussion. Exercises. Bibliographic Notes.
PART III.
11 Multi-Class Classification. 11.1 One-versus-the-Rest Classification. 11.2 Pairwise Classification. 11.3 Discussion. Exercises. Bibliographic Notes.
12 Regression with Support Vector Machines. 12.1 Regression as Machine Learning. 12.2 Simple and Multiple Linear Regression. 12.3 Regression with Maximum Margin Machines. 12.4 Regression with Support Vector Machines. 12.5 Model Evaluation. 12.6 Tool Support. 12.6.1 WEKA. 12.6.2 R. Exercises. Bibliographic Notes.
13 Novelty Detection. 13.1 Maximum Margin Machines. 13.2 The Dual Setting. 13.3 Novelty Detection in R. Exercises. Bibliographic Notes.
Appendix A: Notation.
Appendix B: A Tutorial Introduction to R. B.1 Programming Constructs. B.2 Data Constructs. B.3 Basic Data Analysis. Bibliographic Notes.
References.
Index.
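The kernel trick listed under Chapter 7 can be illustrated in a few lines: a kernel function evaluates a dot product in a higher-dimensional feature space without ever constructing that space. The sketch below uses the quadratic kernel k(x, z) = (x · z + 1)², whose explicit feature map for 2-D inputs is 6-dimensional; the data points and function names are illustrative assumptions, not code from the book.

```python
import math

# Kernel trick demo: the implicit kernel evaluation and the explicit
# feature-space dot product give the same number.

def kernel(x, z):
    """Quadratic kernel k(x, z) = (x . z + 1)^2 for 2-D inputs."""
    return (x[0] * z[0] + x[1] * z[1] + 1) ** 2

def phi(x):
    """Explicit 6-D feature map whose dot product equals the kernel."""
    r2 = math.sqrt(2)
    return (x[0] ** 2, x[1] ** 2, r2 * x[0] * x[1], r2 * x[0], r2 * x[1], 1.0)

x, z = (1.0, 2.0), (3.0, -1.0)
implicit = kernel(x, z)                                # (3 - 2 + 1)^2 = 4
explicit = sum(a * b for a, b in zip(phi(x), phi(z)))  # same value, computed in 6-D
print(implicit, explicit)   # → 4.0 4.0
```

This identity is what lets a support vector machine learn a non-linear decision surface while only ever computing kernels between training points.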