
Backpropagation and It's Modifications

Richa Kathuria Karthikeyan

Paperback

919 SEK


Estimated delivery time 7-12 working days

Free shipping for members on orders of at least 249 SEK

  • 72 pages
  • 2012
Gradient-based methods are among the most widely used error-minimization techniques for training backpropagation networks. The backpropagation (BP) training algorithm is a supervised learning method for multi-layered feedforward neural networks. It is essentially a gradient-descent local optimization technique that performs backward error correction of the network weights. It has several limitations, including slow convergence, getting trapped in local minima, and poor performance. To address these, various modifications are used, such as introducing momentum and bias terms or employing conjugate gradient methods. In conjugate gradient algorithms, the search is performed along conjugate directions, which generally yields faster convergence than following steepest-descent directions. In this monograph we consider the parity-bit-checking problem, solved with the conventional backpropagation method and with these modified methods. A suitable neural network must be constructed and trained properly: the training dataset is used to train a "classification engine", and the trained network is then tested for validation.
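As an illustration of the techniques mentioned in the description, the sketch below (not taken from the book) trains a small feedforward network on the 3-bit parity problem using plain backpropagation with a momentum term. The layer sizes, learning rate, momentum coefficient, and epoch count are illustrative assumptions, not the book's settings.

```python
# Minimal sketch: backpropagation with momentum on the 3-bit parity problem.
# Hyperparameters below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

# 3-bit parity dataset: target is 1 if the number of 1-bits is odd.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], dtype=float)
y = (X.sum(axis=1) % 2).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer; sizes chosen for illustration.
n_in, n_hidden, n_out = 3, 8, 1
W1 = rng.normal(0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, n_out)); b2 = np.zeros(n_out)

lr, momentum = 1.0, 0.9
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

for epoch in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the mean squared error.
    err = out - y
    d_out = err * out * (1 - out)            # delta at the output layer
    d_hid = (d_out @ W2.T) * h * (1 - h)     # delta propagated back to the hidden layer

    gW2 = h.T @ d_out / len(X); gb2 = d_out.mean(axis=0)
    gW1 = X.T @ d_hid / len(X); gb1 = d_hid.mean(axis=0)

    # Gradient descent with a momentum term to damp oscillations.
    vW2 = momentum * vW2 - lr * gW2; W2 += vW2
    vb2 = momentum * vb2 - lr * gb2; b2 += vb2
    vW1 = momentum * vW1 - lr * gW1; W1 += vW1
    vb1 = momentum * vb1 - lr * gb1; b1 += vb1

print("predictions:", np.round(out.ravel(), 2))
print("targets:    ", y.ravel())
```

The momentum term adds a fraction of the previous weight update to the current one, which is one of the simple modifications the description refers to; conjugate gradient methods would replace this update rule with a line search along conjugate directions.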
  • Author: Richa Kathuria Karthikeyan
  • Format: Paperback
  • ISBN: 9783847379355
  • Language: English
  • Number of pages: 72
  • Publication date: 2012-01-31
  • Publisher: LAP Lambert Academic Publishing