Zhiyuan Luo - Books
Showing all books by the author Zhiyuan Luo. Shop with free shipping and fast delivery.
4 products
792 kr
Ships within 10-15 business days
This volume honors Alexander Gammerman on the occasion of his 80th birthday. Prof. Gammerman is one of the leading figures in the area of AI uncertainty quantification; most notably, he co-invented the Conformal Prediction algorithm, widely used by researchers, industry practitioners, and government policymakers. He began his academic career as a researcher at the Agrophysical Research Institute in St. Petersburg, followed by a lecturer position at Heriot-Watt University in Edinburgh. He joined Royal Holloway, University of London in 1993, where he served as head of the Computer Science department for 10 years and founded the Centre for Reliable Machine Learning. Prof. Gammerman's career exemplifies the transformative impact of interdisciplinary research: he has written over 250 research papers, with nearly 12,000 citations, and among his 9 books is the highly cited Algorithmic Learning in a Random World. He founded the Kolmogorov Lecture series in 2003 and the COPA conference in 2012, and he has chaired many international events on machine learning and Bayesian methods. He has also influenced future generations through his university teaching and mentorship: he was the lead supervisor for over 30 PhD students, many of whom are now also at the forefront of AI research and applications. From pioneering mathematical models of plant photoreceptors to advancing the formal treatment of uncertainty in artificial intelligence, Alexander Gammerman's work is a rare confluence of analytical precision, conceptual depth, and visionary application. The contributions in this volume recognize the breadth and depth of his intellectual influence and his long-lasting impact as a researcher, educator, and mentor.
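The blurb's central technique, conformal prediction, can be sketched in a few lines. The calibration residuals and coverage level below are invented for illustration, and the split-conformal quantile rule shown is one standard variant of the method, not necessarily the exact form presented in the book.

```python
import math

def conformal_interval(cal_residuals, alpha=0.1):
    """Return the quantile of held-out calibration residuals that gives
    a prediction interval with (1 - alpha) marginal coverage."""
    n = len(cal_residuals)
    # The conformal correction uses the ceil((n + 1) * (1 - alpha))-th
    # order statistic rather than the plain empirical quantile.
    k = math.ceil((n + 1) * (1 - alpha))
    return sorted(cal_residuals)[min(k, n) - 1]

# Toy calibration set: absolute residuals of some fitted regression
# model on held-out data (values invented for the example).
cal_residuals = [0.2, 0.5, 0.1, 0.8, 0.3, 0.4, 0.6, 0.7, 0.25, 0.35]
q = conformal_interval(cal_residuals, alpha=0.2)

# The interval around a new point prediction y_hat is y_hat +/- q,
# valid under exchangeability regardless of the underlying model.
y_hat = 3.0
interval = (y_hat - q, y_hat + q)
```

The appeal of the method, and a reason for its wide adoption, is that the coverage guarantee is distribution-free: it holds for any underlying predictor, needing only exchangeability of the calibration and test data.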
Volume 9653 - Lecture Notes in Computer Science
Conformal and Probabilistic Prediction with Applications
5th International Symposium, COPA 2016, Madrid, Spain, April 20-22, 2016, Proceedings
Paperback, English, 2016
553 kr
Ships within 10-15 business days
This book constitutes the refereed proceedings of the 5th International Symposium on Conformal and Probabilistic Prediction with Applications, COPA 2016, held in Madrid, Spain, in April 2016. The 14 revised full papers presented together with 1 invited paper were carefully reviewed and selected from 23 submissions, and cover topics on the theory of conformal prediction, applications of conformal prediction, and machine learning.
553 kr
Ships within 10-15 business days
This book honours the outstanding contributions of Vladimir Vapnik, a rare example of a scientist for whom the following statements hold true simultaneously: his work led to the inception of a new field of research, the theory of statistical learning and empirical inference; he has lived to see the field blossom; and he is still as active as ever. He started analyzing learning algorithms in the 1960s and invented the first version of the generalized portrait algorithm. He later developed one of the most successful methods in machine learning, the support vector machine (SVM). More than just an algorithm, this was a new approach to learning problems, pioneering the use of functional analysis and convex optimization in machine learning. Part I of this book contains three chapters describing and witnessing some of Vladimir Vapnik's contributions to science. In the first chapter, Léon Bottou discusses the seminal paper published in 1968 by Vapnik and Chervonenkis that laid the foundations of statistical learning theory, and the second chapter is an English-language translation of that original paper. In the third chapter, Alexey Chervonenkis presents a first-hand account of the early history of SVMs and valuable insights into the first steps in the development of the SVM in the framework of the generalized portrait method. The remaining chapters, by leading scientists in domains such as statistics, theoretical computer science, and mathematics, address substantial topics in the theory and practice of statistical learning theory, including SVMs and other kernel-based methods, boosting, PAC-Bayesian theory, online and transductive learning, loss functions, learnable function classes, notions of complexity for function classes, multitask learning, and hypothesis selection. These contributions include historical and context notes, short surveys, and comments on future research directions.
This book will be of interest to researchers, engineers, and graduate students engaged with all aspects of statistical learning.
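The maximum-margin idea behind the SVM described above can be illustrated with a toy linear classifier trained by subgradient descent on the regularized hinge loss (a Pegasos-style sketch; the data and hyperparameters are invented for the example, and this is not Vapnik's original generalized-portrait formulation).

```python
def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Approximately minimize lam*||w||^2 + mean hinge loss by
    per-sample subgradient steps (Pegasos-style sketch)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            # Regularization shrinks w every step; the hinge term only
            # fires for points inside the margin (margin < 1).
            w = [(1 - lr * lam) * wj for wj in w]
            if margin < 1:
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Linearly separable toy data: +1 class near (2, 2), -1 class near (-2, -2).
X = [(2, 2), (3, 1), (2, 3), (-2, -2), (-3, -1), (-2, -3)]
y = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(X, y)
```

The margin-versus-regularization trade-off in the update rule is the core of the approach the blurb credits to Vapnik: among all separating hyperplanes, prefer the one that keeps training points farthest from the boundary.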
537 kr
Ships within 10-15 business days