Other formats:
- Hardcover 2229:-
This book provides theoretical and practical knowledge for the development of algorithms that infer linear and nonlinear models. It offers a methodology for inductive learning of polynomial neural network models from data. The design of such tools contributes to better statistical data modelling when addressing tasks from areas such as system identification, chaotic time-series prediction, financial forecasting and data mining. The main claim is that the model identification process involves several equally important steps: finding the model structure, estimating the model weight parameters, and tuning these weights with respect to the adopted assumptions about the underlying data distribution. When the learning process is organized according to these steps, performed together one after the other or separately, one may expect to discover models that generalize well (that is, predict well).

The book offers statisticians a shift in focus from the standard linear models toward highly nonlinear models that can be found by contemporary learning approaches. Specialists in statistical learning will read about alternative probabilistic search algorithms that discover the model architecture, and about neural network training techniques that identify accurate polynomial weights. They will be pleased to find that the discovered models can be easily interpreted, and that these models allow statistical diagnosis by standard statistical means.

By covering the three fields of evolutionary computation, neural networks and Bayesian inference, the book is oriented toward a large audience of researchers and practitioners.
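The three-step view of model identification described above can be illustrated with a deliberately simple sketch. This is not the book's method: it only shows the steps on a plain univariate polynomial, with a search over candidate structures (degrees), weight estimation by penalized least squares (consistent with a Gaussian-noise assumption), and tuning of the penalty on held-out data. All data, degrees and parameter grids below are hypothetical choices for illustration.

```python
# Minimal sketch (assumed, not from the book): structure search, weight
# estimation, and tuning illustrated on a toy univariate polynomial model.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
y = 0.5 - 1.2 * x + 2.0 * x**3 + rng.normal(scale=0.1, size=x.size)  # toy data

def design(x, degree):
    """Polynomial design matrix [1, x, x^2, ..., x^degree]."""
    return np.vander(x, degree + 1, increasing=True)

def fit_ridge(X, y, lam):
    """Weight estimation: penalized least squares (Gaussian-noise assumption)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Steps 1 and 3: search the structure (degree) and the tuning parameter
# on held-out data; step 2 (weight estimation) runs inside the loop.
train, valid = slice(0, 150), slice(150, None)
best = None
for degree in range(1, 8):            # candidate structures
    for lam in (0.0, 1e-3, 1e-1):     # candidate regularization strengths
        w = fit_ridge(design(x[train], degree), y[train], lam)
        err = np.mean((design(x[valid], degree) @ w - y[valid]) ** 2)
        if best is None or err < best[0]:
            best = (err, degree, lam, w)

err, degree, lam, w = best
print(f"selected degree={degree}, lambda={lam}, validation MSE={err:.4f}")
```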
- Format: Paperback
- ISBN: 9781441940605
- Language: English
- Number of pages: 316
- Publication date: 2011-02-11
- Publisher: Springer-Verlag New York Inc.