Robert M. Gray - Books
Showing all books by the author Robert M. Gray.
17 products
1 067 kr
Ships within 10-15 business days
Stochastic Image Processing provides the first thorough treatment of Markov and hidden Markov random fields and their application to image processing. Although promoted as a promising approach for over thirty years, it has only been in the past few years that the theory and algorithms have developed to the point of providing useful solutions to old and new problems in image processing. Markov random fields are a multidimensional extension of Markov chains, but the generalization is complicated by the lack of a natural ordering of pixels in multidimensional spaces. Hidden Markov fields are a natural generalization of the hidden Markov models that have proved essential to the development of modern speech recognition, but again the multidimensional nature of the signals makes them inherently more complicated to handle. This added complexity contributed to the long time required for the development of successful methods and applications. This book collects together a variety of successful approaches to a complete and useful characterization of multidimensional Markov and hidden Markov models along with applications to image analysis. The book provides a survey and comparative development of an exciting and rapidly evolving field of multidimensional Markov and hidden Markov random fields with extensive references to the literature.
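The Markov property on a pixel grid can be made concrete with a minimal sketch: one Gibbs-sampling sweep over a binary random field with an Ising-style prior. This is an illustrative toy, not code from the book; the function name and the coupling parameter `beta` are hypothetical choices.

```python
import math
import random

def gibbs_sweep(field, beta, rng):
    """One Gibbs-sampling sweep over a binary Markov random field.

    Each pixel is resampled conditionally on its 4-neighborhood only,
    which is exactly the dependence the Markov property permits.
    """
    h, w = len(field), len(field[0])
    for i in range(h):
        for j in range(w):
            # Sum the neighboring spins in {-1, +1} coding.
            s = 0
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w:
                    s += 2 * field[ni][nj] - 1
            # Conditional probability that this pixel is 1 under an Ising prior.
            p1 = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
            field[i][j] = 1 if rng.random() < p1 else 0
    return field
```

With a large `beta`, repeated sweeps smooth the field into large same-valued regions, which is why priors of this kind are useful for segmentation.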
656 kr
Ships within 7-10 business days
This book describes the essential tools and techniques of statistical signal processing. At every stage theoretical ideas are linked to specific applications in communications and signal processing using a range of carefully chosen examples. The book begins with a development of basic probability, random objects, expectation, and second order moment theory followed by a wide variety of examples of the most popular random process models and their basic uses and properties. Specific applications to the analysis of random signals and systems for communicating, estimating, detecting, modulating, and other processing of signals are interspersed throughout the book. Hundreds of homework problems are included and the book is ideal for graduate students of electrical engineering and applied mathematics. It is also a useful reference for researchers in signal processing and communications.
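The second-order moment theory mentioned above can be illustrated with a small sketch: a normalized, biased sample autocorrelation estimate in plain Python. The function name and normalization convention are illustrative choices, not taken from the book.

```python
def autocorrelation(x, max_lag):
    """Normalized biased sample autocorrelation of a discrete-time
    signal, a basic second-order moment estimate."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n  # lag-0 autocovariance
    return [
        sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k)) / (n * c0)
        for k in range(max_lag + 1)
    ]

# An alternating signal is strongly anticorrelated at lag 1:
r = autocorrelation([1, -1, 1, -1, 1, -1, 1, -1], 2)
```

Here `r[0]` is 1 by construction, and the biased estimator shrinks the higher lags toward zero by a factor of `(n - k) / n`.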
1 084 kr
Ships within 7-10 business days
Volume 571 - Springer International Series in Engineering and Computer Science
Image Segmentation and Compression Using Hidden Markov Models
Hardcover, English, 2000
1 582 kr
Ships within 10-15 business days
In an age of information technology, the issues of distributing and utilizing images efficiently and effectively are of substantial concern. Solutions to many of the problems arising from these issues are provided by techniques of image processing. Two of these, segmentation and compression, are discussed in this book. The authors present an algorithm that models the statistical dependence among image blocks by two dimensional hidden Markov models (HMMs). Formulas for estimating the model according to the maximum likelihood criterion are derived from the EM algorithm. To segment an image, optimal classes are searched jointly for all the blocks by the maximum a posteriori (MAP) rule. The 2-D HMM is extended to multiresolution so that more context information is exploited in classification and fast progressive segmentation schemes can be formed naturally. The second issue addressed in the book is the design of joint compression and classification systems using the 2-D HMM and vector quantization.
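The MAP search over block classes parallels the classical Viterbi algorithm for one-dimensional HMMs. A minimal 1-D sketch (a deliberate simplification of the book's 2-D models, with hypothetical toy parameters) looks like this:

```python
import math

def viterbi(obs, states, log_init, log_trans, log_emit):
    """MAP state sequence for a one-dimensional hidden Markov model.

    V[t][s] holds the log probability of the best state path ending in
    state s at time t; back pointers recover the MAP path at the end.
    """
    V = [{s: log_init[s] + log_emit[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        prev = V[-1]
        col, ptr = {}, {}
        for s in states:
            best = max(states, key=lambda r: prev[r] + log_trans[r][s])
            col[s] = prev[best] + log_trans[best][s] + log_emit[s][o]
            ptr[s] = best
        V.append(col)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

# Two sticky states with near-deterministic emissions (toy parameters):
L = math.log
seg = viterbi(
    [0, 0, 1, 1], ("A", "B"),
    log_init={"A": L(0.5), "B": L(0.5)},
    log_trans={"A": {"A": L(0.9), "B": L(0.1)}, "B": {"A": L(0.1), "B": L(0.9)}},
    log_emit={"A": {0: L(0.9), 1: L(0.1)}, "B": {0: L(0.1), 1: L(0.9)}},
)
```

The 2-D extension replaces the single predecessor state with transitions from both the block above and the block to the left, which is what makes the exact search exponentially harder and motivates the book's algorithms.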
Volume 83 - Springer International Series in Engineering and Computer Science
Source Coding Theory
Hardcover, English, 1989
537 kr
Ships within 10-15 business days
Source coding theory has as its goal the characterization of the optimal performance achievable in idealized communication systems which must code an information source for transmission over a digital communication or storage channel for transmission to a user. The user must decode the information into a form that is a good approximation to the original. A code is optimal within some class if it achieves the best possible fidelity given whatever constraints are imposed on the code by the available channel. In theory, the primary constraint imposed on a code by the channel is its rate or resolution, the number of bits per second or per input symbol that it can transmit from sender to receiver. In the real world, complexity may be as important as rate. The origins and the basic form of much of the theory date from Shannon's classical development of noiseless source coding and source coding subject to a fidelity criterion (also called rate-distortion theory) [73] [74]. Shannon combined a probabilistic notion of information with limit theorems from ergodic theory and a random coding technique to describe the optimal performance of systems with a constrained rate but with unconstrained complexity and delay. An alternative approach called asymptotic or high rate quantization theory based on different techniques and approximations was introduced by Bennett at approximately the same time [4]. This approach constrained the delay but allowed the rate to grow large.
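The high-rate quantization theory attributed to Bennett can be checked numerically in a few lines: for a uniform scalar quantizer with step size Δ, the mean squared error approaches Δ²/12 as the input is sampled densely. A small sketch (function name and test values are illustrative, not from the book):

```python
def uniform_quantize(x, num_levels, lo, hi):
    """Map x to the midpoint of its cell in a uniform scalar quantizer.

    The rate is log2(num_levels) bits per sample; at high rate the
    mean squared error approaches step**2 / 12 (Bennett's approximation).
    """
    step = (hi - lo) / num_levels
    idx = min(num_levels - 1, max(0, int((x - lo) / step)))
    return lo + (idx + 0.5) * step

# Empirical distortion on densely sampled inputs in [0, 1):
step = 1.0 / 8
samples = [i / 1000 for i in range(1000)]
mse = sum((x - uniform_quantize(x, 8, 0.0, 1.0)) ** 2 for x in samples) / 1000
```

With 8 levels, `step ** 2 / 12` is about 0.0013, and the empirical `mse` lands very close to it.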
Volume 159 - Springer International Series in Engineering and Computer Science
Vector Quantization and Signal Compression
Hardcover, English, 1991
1 173 kr
Ships within 10-15 business days
Herb Caen, a popular columnist for the San Francisco Chronicle, recently quoted a Voice of America press release as saying that it was reorganizing in order to "eliminate duplication and redundancy." This quote both states a goal of data compression and illustrates its common need: the removal of duplication (or redundancy) can provide a more efficient representation of data and the quoted phrase is itself a candidate for such surgery. Not only can the number of words in the quote be reduced without losing information, but the statement would actually be enhanced by such compression since it will no longer exemplify the wrong that the policy is supposed to correct. Here compression can streamline the phrase and minimize the embarrassment while improving the English style. Compression in general is intended to provide efficient representations of data while preserving the essential information contained in the data. This book is devoted to the theory and practice of signal compression, i.e., data compression applied to signals such as speech, audio, images, and video signals (excluding other data types such as financial data or general purpose computer data). The emphasis is on the conversion of analog waveforms into efficient digital representations and on the compression of digital information into the fewest possible bits. Both operations should yield the highest possible reconstruction fidelity subject to constraints on the bit rate and implementation complexity.
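The codebook-design step at the heart of vector quantization is typically the generalized Lloyd algorithm, which alternates a nearest-codeword partition of the training data with a centroid update of each cell. A scalar sketch of the two steps (names and data are illustrative, not from the book):

```python
def lloyd(points, codebook, iters=20):
    """Generalized Lloyd algorithm for (scalar) quantizer design.

    Alternates a nearest-codeword partition of the training data with
    a centroid update of each cell; distortion never increases per pass.
    """
    for _ in range(iters):
        cells = [[] for _ in codebook]
        for p in points:
            nearest = min(range(len(codebook)),
                          key=lambda i: (p - codebook[i]) ** 2)
            cells[nearest].append(p)
        codebook = [sum(c) / len(c) if c else codebook[i]
                    for i, c in enumerate(cells)]
    return sorted(codebook)

# Two well-separated clusters pull the codewords onto their centroids:
book = lloyd([0.0, 0.2, -0.2, 9.8, 10.0, 10.2], [1.0, 9.0])
```

The same two steps apply unchanged to vectors once the squared difference is replaced by a squared Euclidean distance and the centroid by a componentwise mean.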
Volume 322 - Springer International Series in Engineering and Computer Science
Fourier Transforms
An Introduction for Engineers
Hardcover, English, 1995
1 279 kr
Ships within 10-15 business days
The Fourier transform is one of the most important mathematical tools in a wide variety of science and engineering fields. Its application - as Fourier analysis or harmonic analysis - provides useful decompositions of signals into fundamental ("primitive") components, giving shortcuts in the computation of complicated sums and integrals, and often revealing hidden structure in the data. This text develops the basic definitions, properties and applications of Fourier analysis, the emphasis being on techniques for its application to linear systems, although other applications are also considered. Applications of Fourier analysis to a wide variety of signals - discrete time (or parameter), continuous time (or parameter), finite duration, and infinite duration - are discussed in the text.
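The decomposition into primitive components can be shown directly for a finite-duration discrete-time signal with the discrete Fourier transform, written here by direct O(N²) summation (an illustrative sketch; real applications would use an FFT):

```python
import cmath
import math

def dft(x):
    """Discrete Fourier transform of a finite-duration sequence.

    X[k] is the correlation of x with the k-th complex exponential,
    i.e. the weight of that "primitive" component in the signal.
    """
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

# A pure cosine at one cycle per 8 samples concentrates at bins 1 and 7:
X = dft([math.cos(2 * math.pi * t / 8) for t in range(8)])
```

For a real input of length 8, the energy of a single-frequency cosine splits evenly between the positive-frequency bin and its conjugate-symmetric partner, each with coefficient N/2 = 4.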
1 696 kr
Ships within 10-15 business days
Probability, Random Processes, and Ergodic Properties is for mathematically inclined information/communication theorists and people working in signal processing. It will also interest those working with random or stochastic processes, including mathematicians, statisticians, and economists.

Highlights: a complete tour of the book and guidelines for use are given in the Introduction, so readers can see at a glance the topics of interest; the mathematics is structured for an engineering audience, with emphasis on engineering applications.

New in the second edition: much of the material has been rearranged and revised for pedagogical reasons; the original first chapter has been split in order to allow a more thorough treatment of basic probability before tackling random processes and dynamical systems; the final chapter has been broken into two pieces to provide separate emphasis on process metrics and the ergodic decomposition of affine functionals; many classic inequalities are now incorporated into the text, along with proofs; and many citations have been added.
1 776 kr
Ships within 10-15 business days
This fully updated new edition of the classic work on information theory presents a detailed analysis of Shannon's source- and channel-coding theorems, before moving on to address sources, channels, codes and the properties of information and distortion measures.
Volume 83 - Springer International Series in Engineering and Computer Science
Source Coding Theory
Paperback, English, 2011
537 kr
Ships within 10-15 business days
1 067 kr
Ships within 10-15 business days
Stochastic Image Processing provides the first thorough treatment of Markov and hidden Markov random fields and their application to image processing. Although promoted as a promising approach for over thirty years, it has only been in the past few years that the theory and algorithms have developed to the point of providing useful solutions to old and new problems in image processing. Markov random fields are a multidimensional extension of Markov chains, but the generalization is complicated by the lack of a natural ordering of pixels in multidimensional spaces. Hidden Markov fields are a natural generalization of the hidden Markov models that have proved essential to the development of modern speech recognition, but again the multidimensional nature of the signals makes them inherently more complicated to handle. This added complexity contributed to the long time required for the development of successful methods and applications. This book collects together a variety of successful approaches to a complete and useful characterization of multidimensional Markov and hidden Markov models along with applications to image analysis. The book provides a survey and comparative development of an exciting and rapidly evolving field of multidimensional Markov and hidden Markov random fields with extensive references to the literature.
Volume 322 - Springer International Series in Engineering and Computer Science
Fourier Transforms
An Introduction for Engineers
Paperback, English, 2012
1 286 kr
Ships within 10-15 business days
The Fourier transform is one of the most important mathematical tools in a wide variety of fields in science and engineering. In the abstract it can be viewed as the transformation of a signal in one domain (typically time or space) into another domain, the frequency domain. Applications of Fourier transforms, often called Fourier analysis or harmonic analysis, provide useful decompositions of signals into fundamental or "primitive" components, provide shortcuts to the computation of complicated sums and integrals, and often reveal hidden structure in data. Fourier analysis lies at the base of many theories of science and plays a fundamental role in practical engineering design. The origins of Fourier analysis in science can be found in Ptolemy's decomposing celestial orbits into cycles and epicycles and Pythagoras' decomposing music into consonances. Its modern history began with the eighteenth century work of Bernoulli, Euler, and Gauss on what later came to be known as Fourier series. J. Fourier in his 1822 Théorie analytique de la chaleur [16] (still available as a Dover reprint) was the first to claim that arbitrary periodic functions could be expanded in a trigonometric (later called a Fourier) series, a claim that was eventually shown to be incorrect, although not too far from the truth. It is an amusing historical sidelight that this work won a prize from the French Academy, in spite of serious concerns expressed by the judges (Laplace, Lagrange, and Legendre) regarding Fourier's lack of rigor.
Volume 159 - Springer International Series in Engineering and Computer Science
Vector Quantization and Signal Compression
Paperback, English, 2012
1 173 kr
Ships within 10-15 business days
Volume 571 - Springer International Series in Engineering and Computer Science
Image Segmentation and Compression Using Hidden Markov Models
Paperback, English, 2012
1 582 kr
Ships within 10-15 business days
In the current age of information technology, the issues of distributing and utilizing images efficiently and effectively are of substantial concern. Solutions to many of the problems arising from these issues are provided by techniques of image processing, among which segmentation and compression are topics of this book.

Image segmentation is a process for dividing an image into its constituent parts. For block-based segmentation using statistical classification, an image is divided into blocks and a feature vector is formed for each block by grouping statistics of its pixel intensities. Conventional block-based segmentation algorithms classify each block separately, assuming independence of feature vectors.

Image Segmentation and Compression Using Hidden Markov Models presents a new algorithm that models the statistical dependence among image blocks by two dimensional hidden Markov models (HMMs). Formulas for estimating the model according to the maximum likelihood criterion are derived from the EM algorithm. To segment an image, optimal classes are searched jointly for all the blocks by the maximum a posteriori (MAP) rule. The 2-D HMM is extended to multiresolution so that more context information is exploited in classification and fast progressive segmentation schemes can be formed naturally.

The second issue addressed in the book is the design of joint compression and classification systems using the 2-D HMM and vector quantization. A classifier designed with the side goal of good compression often outperforms one aimed solely at classification because overfitting to training data is suppressed by vector quantization.

Image Segmentation and Compression Using Hidden Markov Models is an essential reference source for researchers and engineers working in statistical signal processing or image processing, especially those who are interested in hidden Markov models. It is also of value to those working on statistical modeling.
653 kr
Ships within 5-8 business days
1 792 kr
Ships within 10-15 business days
1 331 kr
Ships within 5-8 business days