- Hardback
- Number of pages
- 245 x 185 x 40 mm
- Number of components
- International edition available ISBN 0139083855
- 1300 g
You might also like
A Comprehensive Foundation: United States Edition
Free shipping within Sweden for private individuals.
NEW TO THIS EDITION
- NEW: New chapters now cover such areas as:
- Support vector machines.
- Reinforcement learning/neurodynamic programming.
- Dynamically driven recurrent networks.
- NEW: End-of-chapter problems revised, improved, and expanded in number.
- Extensive, state-of-the-art coverage exposes the reader to the many facets of neural networks and helps the reader appreciate the technology's capabilities and potential applications.
- Detailed analysis of back-propagation learning and multi-layer perceptrons.
- Explores the intricacies of the learning process, an essential component for understanding neural networks.
- Considers recurrent networks, such as Hopfield networks, Boltzmann machines, and mean-field theory machines, as well as modular networks, temporal processing, and neurodynamics.
- Integrates computer experiments throughout, giving the opportunity to see how neural networks are designed and perform in practice.
- Reinforces key concepts with chapter objectives, problems, worked examples, a bibliography, photographs, illustrations, and a thorough glossary.
- Includes a detailed and extensive bibliography for easy reference.
- Computer-oriented experiments distributed throughout the book.
- Uses MATLAB SE, Version 5.
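The single-layer perceptron coverage described above (Chapter 3) can be illustrated with a minimal sketch, not taken from the book, of Rosenblatt's perceptron learning rule. Here it fits a logical AND gate in NumPy; all function names and parameters are illustrative assumptions:

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=50):
    """Rosenblatt perceptron learning rule (illustrative sketch).

    Updates weights only on misclassified samples; the perceptron
    convergence theorem guarantees convergence on linearly separable data.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            err = target - pred          # -1, 0, or +1
            w += lr * err * xi           # nudge weights toward the target
            b += lr * err
    return w, b

def predict(X, w, b):
    # Hard-threshold activation: 1 if above the decision boundary, else 0
    return (X @ w + b > 0).astype(int)

# Logical AND: linearly separable, so the rule converges
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print(predict(X, w, b))  # -> [0 0 0 1]
```

Multilayer perceptrons trained with back-propagation (Chapter 4) extend this idea to non-separable problems by propagating error gradients through hidden layers.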
More books by Simon O Haykin
Adaptive Filter Theory
Simon O Haykin
Adaptive Filter Theory, 5e, is ideal for courses in Adaptive Filters. Haykin examines both the mathematical theory behind various linear adaptive filters and the elements of supervised multilayer perceptrons. In its fifth edition, this hi...
Table of contents
1. Introduction.
2. Learning Processes.
3. Single-Layer Perceptrons.
4. Multilayer Perceptrons.
5. Radial-Basis Function Networks.
6. Support Vector Machines.
7. Committee Machines.
8. Principal Components Analysis.
9. Self-Organizing Maps.
10. Information-Theoretic Models.
11. Stochastic Machines and Their Approximates Rooted in Statistical Mechanics.
12. Neurodynamic Programming.
13. Temporal Processing Using Feedforward Networks.
14. Neurodynamics.
15. Dynamically Driven Recurrent Networks.