Silvelyn Zwanzig - Books
Showing all books by the author Silvelyn Zwanzig. Shop with free shipping and fast delivery.
8 products
840 kr
Ships within 10-15 business days
This textbook gives an overview of statistical methods that have been developed in recent years as a result of increasing computer use, including random number generators, Monte Carlo methods, Markov Chain Monte Carlo (MCMC) methods, the bootstrap, EM algorithms, SIMEX, variable selection, density estimators, kernel estimators, orthogonal and local polynomial estimators, wavelet estimators, splines, and model assessment. Computer Intensive Methods in Statistics is written for students at graduate level, but can also be used by practitioners.

Features:
- Presents the main ideas of computer-intensive statistical methods
- Gives the algorithms for all the methods
- Uses various plots and illustrations to explain the main ideas
- Covers the theoretical background of the main methods
- Includes R code for the methods and examples

Silvelyn Zwanzig is an Associate Professor of Mathematical Statistics at Uppsala University. She studied Mathematics at the Humboldt University of Berlin. Before coming to Sweden, she was Assistant Professor at the University of Hamburg in Germany. She received her Ph.D. in Mathematics at the Academy of Sciences of the GDR. Since 1991, she has taught Statistics to undergraduate and graduate students. Her research interests have moved from theoretical statistics to computer-intensive statistics.

Behrang Mahjani is a postdoctoral fellow with a Ph.D. in Scientific Computing, with a focus on Computational Statistics, from Uppsala University, Sweden. He joined the Seaver Autism Center for Research and Treatment at the Icahn School of Medicine at Mount Sinai, New York, in September 2017, and was formerly a postdoctoral fellow at the Karolinska Institutet, Stockholm, Sweden. His research is focused on solving large-scale problems through statistical and computational methods.
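One of the computer-intensive methods listed above, the bootstrap, can be sketched in a few lines. This is an illustrative example with simulated data, not code from the book:

```python
import numpy as np

# Nonparametric bootstrap for the mean of a skewed sample (simulated data).
rng = np.random.default_rng(42)
data = rng.exponential(scale=2.0, size=100)  # true mean is 2.0

B = 5000  # number of bootstrap resamples
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(B)
])

# 95% percentile bootstrap confidence interval for the mean
lo, hi = np.percentile(boot_means, [2.5, 97.5])
```

The interval is obtained purely by resampling, without assuming a parametric form for the sampling distribution of the mean.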
2 306 kr
Ships within 10-15 business days
2 357 kr
Ships within 10-15 business days
Bayesian Inference: Theory, Methods, Computations provides comprehensive coverage of the fundamentals of Bayesian inference from all important perspectives, namely theory, methods, and computations.

All theoretical results are presented as formal theorems, corollaries, lemmas, etc., furnished with detailed proofs. The theoretical ideas are explained in simple, easily comprehensible form, supplemented with several examples. A clear reasoning on the validity, usefulness, and pragmatic approach of the Bayesian methods is provided. A large number of examples and exercises, with solutions to all exercises, help students understand the concepts through ample practice. The book is primarily aimed at first- or second-semester master's students, and parts of it can also be used at Ph.D. level or by the research community at large. The emphasis is on exact cases; however, to gain further insight into the core concepts, an entire chapter is dedicated to computer-intensive techniques. Selected chapters and sections can be used for a one-semester course on Bayesian statistics.

Key features:
- Explains the basic ideas of Bayesian statistical inference in an easily comprehensible form
- Illustrates the main ideas through sketches and plots
- Contains a large number of examples and exercises
- Provides solutions to all exercises
- Includes R code

Silvelyn Zwanzig is a Professor of Mathematical Statistics at Uppsala University. She studied Mathematics at the Humboldt University of Berlin. Before coming to Sweden, she was Assistant Professor at the University of Hamburg in Germany. She received her Ph.D. in Mathematics at the Academy of Sciences of the GDR. She has taught Statistics to undergraduate and graduate students since 1991. Her research interests include theoretical statistics and computer-intensive methods.

Rauf Ahmad is Associate Professor at the Department of Statistics, Uppsala University. He did his Ph.D. at the University of Göttingen, Germany. Before joining Uppsala University, he worked at the Division of Mathematical Statistics, Department of Mathematics, Linköping University, and at the Biometry Division, Swedish University of Agricultural Sciences, Uppsala. He has taught Statistics to undergraduate and graduate students since 1995. His research interests include high-dimensional inference, mathematical statistics, and U-statistics.
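As a tiny illustration of the kind of "exact case" such a book emphasizes (a sketch, not an excerpt from the book), conjugate Beta-Binomial updating gives the posterior in closed form:

```python
# Conjugate Beta-Binomial updating (illustrative sketch): with a Beta(a, b)
# prior on a success probability and k successes in n Bernoulli trials,
# the posterior is Beta(a + k, b + n - k) -- exactly, with no numerical
# integration required.
def beta_binomial_update(a, b, k, n):
    return a + k, b + (n - k)

# Uniform prior Beta(1, 1), then 7 successes observed in 10 trials:
a_post, b_post = beta_binomial_update(1, 1, 7, 10)
posterior_mean = a_post / (a_post + b_post)  # (1 + 7) / (2 + 10) = 2/3
```

When no such conjugate structure exists, one falls back on the computer-intensive techniques (MCMC and related methods) to which the book devotes a chapter.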
929 kr
Ships within 10-15 business days
2 760 kr
Ships within 10-15 business days
Based on the authors' lecture notes, Introduction to the Theory of Statistical Inference presents concise yet complete coverage of statistical inference theory, focusing on the fundamental classical principles. Suitable for a second-semester undergraduate course on statistical inference, the book offers proofs to support the mathematics. It illustrates core concepts using cartoons and provides solutions to all examples and problems.

Highlights:
- Basic notations and ideas of statistical inference are explained in a mathematically rigorous, but understandable, form
- Classroom-tested and designed for students of mathematical statistics
- Examples, applications of the general theory to special cases, exercises, and figures provide deeper insight into the material
- Solutions are provided for the problems formulated at the end of each chapter
- Combines the theoretical basis of statistical inference with a useful applied toolbox that includes linear models
- Theoretical, difficult, or frequently misunderstood problems are marked

The book is aimed at advanced undergraduate students, graduate students in mathematics and statistics, and theoretically interested students from other disciplines. Results are presented as theorems and corollaries. All theorems are proven, and important statements are formulated as guidelines in prose. With its multipronged and student-tested approach, this book is an excellent introduction to the theory of statistical inference.
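One concrete instance of the classical principles such a course covers is maximum likelihood. For a normal sample the estimators have closed forms; the simulation below is an illustrative sketch, not material from the book:

```python
import numpy as np

# MLEs for a normal sample: the sample mean, and the variance with divisor n
# (which is biased; the usual unbiased estimator divides by n - 1).
rng = np.random.default_rng(1)
sample = rng.normal(loc=5.0, scale=2.0, size=10_000)

mu_hat = sample.mean()                         # MLE of the mean
sigma2_hat = ((sample - mu_hat) ** 2).mean()   # MLE of the variance (divisor n)
sigma2_unbiased = sample.var(ddof=1)           # unbiased estimator (divisor n - 1)
```

The gap between the two variance estimators, a factor of (n - 1)/n, is exactly the kind of bias-versus-likelihood trade-off that a rigorous treatment of estimation theory makes precise.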
807 kr
Ships within 10-15 business days
Applications of Linear and Nonlinear Models
Fixed Effects, Random Effects, and Total Least Squares
Hardcover, English, 2022
2 329 kr
Ships within 10-15 business days
This book provides numerous examples of linear and nonlinear model applications. The first eight chapters present a nearly complete treatment of the grand universe of linear and weakly nonlinear regression models, from both an algebraic and a stochastic point of view. For example, there is an equivalence lemma between a best linear uniformly unbiased estimation (BLUUE) in a Gauss–Markov model and a least squares solution (LESS) in a system of linear equations: while BLUUE is a stochastic regression model, LESS is an algebraic solution. The first six chapters concentrate on underdetermined and overdetermined linear systems as well as systems with a datum defect, reviewing estimators and algebraic solutions of type MINOLESS, BLIMBE, BLUMBE, BLUUE, BIQUE, BLE, and total least squares. The highlight is the simultaneous determination of the first moment and the second central moment of a probability distribution in an inhomogeneous multilinear estimation by the so-called E-D correspondence, as well as its Bayes design. In addition, the book discusses continuous networks versus discrete networks, the use of Grassmann–Plücker coordinates, criterion matrices of type Taylor–Karman, and fuzzy sets. Chapter seven is a speciality in the treatment of an overjet.

This second edition adds three new chapters:

(1) Integer least squares, covering: (i) a model for positioning as a mixed integer linear model that includes integer parameters; (ii) the general integer least squares problem and the optimality of its least squares solution; (iii) the relation to the closest vector problem and the notion of a reduced lattice basis; and (iv) the famous LLL algorithm for generating a Lovász-reduced basis.

(2) Bayes methods, covering: (i) the general principle of Bayesian modeling, explaining the notions of prior and posterior distribution, with a pragmatic approach to the advantages of iterative Bayesian calculations and hierarchical modeling; (ii) Bayes methods for linear models with normally distributed errors, including noninformative priors, conjugate priors, and normal-gamma distributions; and (iii) a short overview of modern applications of Bayesian modeling, useful for nonlinear models or linear models with non-normal errors: Monte Carlo (MC), Markov chain Monte Carlo (MCMC), and approximate Bayesian computation (ABC) methods.

(3) Errors-in-variables models, covering: (i) the errors-in-variables (EIV) model and how it differs from least squares estimators (LSE); (ii) the total least squares (TLS) estimator and a summary of its properties; (iii) the idea of simulation extrapolation (SIMEX) estimators; (iv) the symmetrized SIMEX (SYMEX) estimator and its relation to TLS; and (v) a short overview of nonlinear EIV models.

The chapter on algebraic solution of nonlinear systems of equations has also been updated, in line with the new emerging field of hybrid numeric-symbolic solutions to systems of nonlinear equations, including overdetermined systems of nonlinear equations on curved manifolds. The von Mises–Fisher distribution is characteristic for circular or (hyper)spherical data. The last chapter is devoted to probabilistic regression, the special Gauss–Markov model with random effects, leading to estimators of type BLIP and VIP, including Bayesian estimation.

A great part of the work is presented in four appendices. Appendix A is a treatment of tensor algebra, namely linear algebra, matrix algebra, and multilinear algebra. Appendix B is devoted to sampling distributions and their use in terms of confidence intervals and confidence regions. Appendix C reviews the elementary notions of statistics, namely random events and stochastic processes. Appendix D introduces the basics of Groebner basis algebra, its careful definition, the Buchberger algorithm, and especially the C. F. Gauss combinatorial algorithm.
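The contrast between least squares and total least squares in the errors-in-variables setting can be sketched with simulated data. This is an illustrative example under assumed equal error variances; the simulation and variable names are not taken from the book:

```python
import numpy as np

# Errors-in-variables: when the regressor is observed with noise, the
# ordinary least squares slope is attenuated toward zero, while total
# least squares (computed from the SVD of the augmented data matrix)
# corrects this when the two error variances are equal.
rng = np.random.default_rng(0)
n = 2000
x_true = rng.uniform(0.0, 10.0, n)
beta_true = 2.0
x = x_true + rng.normal(0.0, 2.0, n)             # noisy regressor (EIV)
y = beta_true * x_true + rng.normal(0.0, 2.0, n)  # noisy response

beta_ols = (x @ y) / (x @ x)    # ordinary least squares slope (no intercept)

A = np.column_stack([x, y])
_, _, Vt = np.linalg.svd(A)     # TLS: direction of the smallest singular value
v = Vt[-1]                      # satisfies x * v[0] + y * v[1] ~ 0
beta_tls = -v[0] / v[1]         # recovered TLS slope
```

Here `beta_ols` underestimates the true slope because of the measurement noise in `x`, while `beta_tls` lands much closer to it, mirroring the LSE-versus-TLS discussion in the errors-in-variables chapter.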
Applications of Linear and Nonlinear Models
Fixed Effects, Random Effects, and Total Least Squares
Paperback, English, 2023
2 310 kr
Ships within 10-15 business days