- Paperback / softback
- Number of pages
- Softcover reprint of the original 1st ed. 2013
- Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
- 48 illustrations (45 in color, 3 black and white); XII, 132 pp.
- 234 x 156 x 8 mm
- Number of components
- 1 Paperback / softback
- 218 g
Dimensionality Reduction with Unsupervised Nearest Neighbors
1399
Ships within 10-15 business days.
Free shipping within Sweden for private customers.

This book is devoted to a novel approach to dimensionality reduction based on the famous nearest neighbor method, a powerful classification and regression technique. It starts with an introduction to machine learning concepts and a real-world application from the energy domain. Then, unsupervised nearest neighbors (UNN) is introduced as an efficient iterative method for dimensionality reduction. Various UNN models are developed step by step, ranging from a simple iterative strategy for discrete latent spaces to a stochastic kernel-based algorithm for learning submanifolds with independent parameterizations. Extensions that allow the embedding of incomplete and noisy patterns are introduced. Various optimization approaches are compared, from evolutionary to swarm-based heuristics. Experimental comparisons to related methodologies on artificial test data sets as well as real-world data demonstrate the behavior of UNN in practical scenarios. The book contains numerous color figures to illustrate the introduced concepts and to highlight the experimental results.
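The "simple iterative strategy for discrete latent spaces" mentioned in the description can be sketched roughly as follows: each pattern is placed, one at a time, at the discrete latent candidate position that minimizes the K-nearest-neighbor reconstruction error back in data space. This is an illustrative approximation under our own assumptions, not the book's exact algorithm; the function name `unn_embed` and its parameters are ours.

```python
import numpy as np

def unn_embed(X, n_candidates=30, k=2, seed=0):
    """Greedy UNN-style sketch: embed each pattern at the 1-D latent
    candidate that minimizes the KNN-mean reconstruction error in
    data space. Illustrative only, not the book's exact method."""
    rng = np.random.default_rng(seed)
    candidates = np.linspace(0.0, 1.0, n_candidates)  # discrete latent space
    latent, embedded = [], []      # positions / patterns placed so far
    z = np.empty(len(X))
    for idx in rng.permutation(len(X)):
        x = X[idx]
        if len(latent) < k:
            # too few neighbors yet: pick a random latent position
            z[idx] = rng.choice(candidates)
        else:
            lat = np.asarray(latent)
            emb = np.asarray(embedded)
            errs = []
            for c in candidates:
                nn = np.argsort(np.abs(lat - c))[:k]  # K nearest in latent space
                recon = emb[nn].mean(axis=0)          # KNN-mean reconstruction
                errs.append(np.sum((x - recon) ** 2))
            z[idx] = candidates[int(np.argmin(errs))]
        latent.append(z[idx])
        embedded.append(x)
    return z
```

The greedy, one-pattern-at-a-time placement is what makes the strategy iterative; the stochastic kernel-based variants the book develops replace the discrete candidate grid and the KNN-mean reconstruction with continuous, kernel-weighted counterparts.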
Reviews in the media
From the reviews: "The book provides an overview of the author's work on dimensionality reduction using unsupervised nearest neighbors. ... this book is primarily of interest to scholars who want to learn more about Prof. Kramer's research on dimensionality reduction." (Laurens van der Maaten, zbMATH, Vol. 1283, 2014)
- Part I: Foundations
- Part II: Unsupervised Nearest Neighbors
- Part III: Conclusions