Michael C. Mozer - Books
Showing all books by the author Michael C. Mozer. Shop with free shipping and fast delivery.
3 products
3 280 kr
Ships within 10-15 business days
Recent years have seen an explosion of new mathematical results on learning and processing in neural networks. This body of results rests on a breadth of mathematical background which even few specialists possess. In a format intermediate between a textbook and a collection of research articles, this book has been assembled to present a sample of these results, and to fill in the necessary background, in such areas as computability theory, computational complexity theory, the theory of analog computation, stochastic processes, dynamical systems, control theory, time-series analysis, Bayesian analysis, regularization theory, information theory, computational learning theory, and mathematical statistics.

Mathematical models of neural networks display an amazing richness and diversity. Neural networks can be formally modeled as computational systems, as physical or dynamical systems, and as statistical analyzers. Within each of these three broad perspectives, there are a number of particular approaches. For each of 16 particular mathematical perspectives on neural networks, the contributing authors provide introductions to the background mathematics, and address questions such as:

* Exactly what mathematical systems are used to model neural networks from the given perspective?
* What formal questions about neural networks can then be addressed?
* What are typical results that can be obtained?
* What are the outstanding open problems?

A distinctive feature of this volume is that for each perspective presented in one of the contributed chapters, the first editor has provided a moderately detailed summary of the formal results and the requisite mathematical concepts. These summaries are presented in four chapters that tie together the 16 contributed chapters: three develop a coherent view of the three general perspectives -- computational, dynamical, and statistical -- and the fourth assembles these three perspectives into a unified overview of the neural networks field.
1 952 kr
Ships within 10-15 business days
The result of the 1993 Connectionist Models Summer School, the papers in this volume exemplify the tremendous breadth and depth of research underway in the field of neural networks. Although the slant of the summer school has always leaned toward cognitive science and artificial intelligence, the diverse scientific backgrounds and research interests of accepted students and invited faculty reflect the broad spectrum of areas contributing to neural networks, including artificial intelligence, cognitive science, computer science, engineering, mathematics, neuroscience, and physics. Providing an accurate picture of the state of the art in this fast-moving field, the proceedings of this intense two-week program of lectures, workshops, and informal discussions contain timely and high-quality work by the best and the brightest in the neural networks field.
801 kr
Ships within 10-15 business days
(Description identical to the edition listed above.)