Vincent N. LaRiccia - Books
Showing all books by Vincent N. LaRiccia. Shop with free shipping and fast delivery.
4 products
2 115 kr
Ships within 10-15 business days
This book is intended for graduate students in statistics and industrial mathematics, as well as researchers and practitioners in the field. We cover both the theory and practice of nonparametric estimation. The text is novel in its use of maximum penalized likelihood estimation and the theory of convex minimization problems (fully developed in the text) to obtain convergence rates. We also use (and develop from an elementary viewpoint) discrete-parameter submartingales and exponential inequalities. A substantial effort has been made to discuss computational details and to include simulation studies and analyses of some classical data sets using fully automatic (data-driven) procedures. Some theoretical topics that appear in textbook form for the first time are definitive treatments of I.J. Good's roughness penalization, monotone and unimodal density estimation, asymptotic optimality of generalized cross-validation for spline smoothing and analogous methods for ill-posed least squares problems, and convergence proofs of EM algorithms for random sampling problems.
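As a rough illustration of the kind of fully automatic, data-driven smoothing the blurb mentions, the sketch below selects a penalty parameter by generalized cross-validation for a simple penalized least squares fit. A polynomial basis stands in for a spline basis, and all data and parameter choices are illustrative assumptions, not taken from the book:

```python
import numpy as np

# Simulated data: smooth signal plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(x.size)

# Polynomial design matrix as a stand-in for a spline basis.
X = np.vander(x, 8, increasing=True)

def gcv_score(lam):
    """GCV(lam) = m * RSS / (m - tr A)^2 with hat matrix A = X (X'X + lam I)^{-1} X'."""
    A = X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)
    resid = y - A @ y
    m = y.size
    return m * np.sum(resid**2) / (m - np.trace(A)) ** 2

# Pick the penalty minimizing the GCV score over a grid.
lams = 10.0 ** np.arange(-8, 2)
scores = [gcv_score(lam) for lam in lams]
best = lams[int(np.argmin(scores))]
print("GCV-selected penalty:", best)
```

The same recipe applies with any linear smoother: only the hat matrix changes, which is why GCV extends to spline smoothing and to ill-posed least squares problems.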
2 115 kr
Ships within 10-15 business days
2 111 kr
Ships within 10-15 business days
1 375 kr
Ships within 10-15 business days
This is the second volume of a text on the theory and practice of maximum penalized likelihood estimation. It is intended for graduate students in statistics, operations research, and applied mathematics, as well as researchers and practitioners in the field. The present volume was supposed to have a short chapter on nonparametric regression but was intended to deal mainly with inverse problems. However, the chapter on nonparametric regression kept growing to the point where it is now the only topic covered. Perhaps there will be a Volume III. It might even deal with inverse problems. But for now we are happy to have finished Volume II. The emphasis in this volume is on smoothing splines of arbitrary order, but other estimators (kernels, local and global polynomials) pass review as well. We study smoothing splines and local polynomials in the context of reproducing kernel Hilbert spaces. The connection between smoothing splines and reproducing kernels is of course well known. The new twist is that letting the inner product depend on the smoothing parameter opens up new possibilities: it leads to asymptotically equivalent reproducing kernel estimators (without qualifications) and thence, via uniform error bounds for kernel estimators, to uniform error bounds for smoothing splines and, via strong approximations, to confidence bands for the unknown regression function. It came as somewhat of a surprise that reproducing kernel Hilbert space ideas also proved useful in the study of local polynomial estimators.
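To give a flavor of the reproducing kernel viewpoint described above, here is a minimal sketch of a reproducing kernel (kernel ridge) regression estimator. The Gaussian kernel, bandwidth, and penalty level are illustrative assumptions chosen for the sketch; the book's smoothing-spline kernels and data-driven parameter choices are more refined:

```python
import numpy as np

# Simulated regression data.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 80)
y = np.cos(3 * x) + 0.2 * rng.standard_normal(x.size)

def gauss_kernel(a, b, h=0.1):
    """Gaussian reproducing kernel with bandwidth h (illustrative choice)."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * h**2))

# By the representer theorem, the penalized estimator is a kernel expansion
# f(x) = sum_i alpha_i K(x, x_i) with coefficients solving a linear system.
n = x.size
lam = 1e-3
K = gauss_kernel(x, x)
alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
fhat = K @ alpha
print("in-sample MSE:", np.mean((fhat - y) ** 2))
```

The penalty `lam` plays the same role as the smoothing parameter of a spline; letting the inner product (and hence the kernel) depend on it is the "new twist" the blurb refers to.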