Neural networks training with optimal bounded ellipsoid algorithm

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Scopus citations

Abstract

Compared with standard learning algorithms such as backpropagation, the optimal bounded ellipsoid (OBE) algorithm has some better properties, such as faster convergence, since its structure is similar to that of the Kalman filter. OBE also has an advantage over Kalman filter training: the noise is not required to be Gaussian. In this paper the OBE algorithm is applied to train the weights of recurrent neural networks for nonlinear system identification. Both hidden-layer and output-layer weights can be updated. From a dynamic-systems point of view, such training can be useful for all neural network applications requiring real-time updating of the weights. A simple simulation demonstrates the effectiveness of the suggested algorithm.
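
The abstract describes an OBE (set-membership) style recursive update in the spirit of the Kalman filter, applied to network weights under a bounded-noise assumption. The sketch below is an illustrative, simplified version for a linear-in-parameters layer only; the weighting rule, the noise bound gamma, and the function name obe_update are assumptions for illustration and are not taken from the paper's exact formulation.

import numpy as np

# Hedged sketch: a simplified OBE/set-membership style recursive update for a
# linear-in-parameters layer, y_k ~= phi_k^T theta with bounded noise |e_k| <= gamma.
# The data-dependent weight lam below is an illustrative choice (assumption).

def obe_update(theta, P, phi, y, gamma, lam_max=0.95):
    """One recursive update of the weight estimate theta and the ellipsoid matrix P."""
    e = y - phi @ theta                 # prediction error
    g = phi @ P @ phi                   # scalar phi^T P phi
    if abs(e) <= gamma:                 # error already inside the noise bound:
        return theta, P                 # skip the update (selective updating)
    lam = min(lam_max, 1.0 - gamma / abs(e))   # simple data-dependent weight (assumption)
    K = lam * P @ phi / (1.0 + lam * g)        # gain vector, Kalman-filter-like form
    theta = theta + K * e                      # correct the weight estimate
    P = P - np.outer(K, phi @ P)               # shrink the bounding ellipsoid
    return theta, P

# Tiny usage example on a synthetic regression with uniformly bounded noise
rng = np.random.default_rng(0)
true_theta = np.array([0.8, -0.5, 0.3])
theta = np.zeros(3)
P = 100.0 * np.eye(3)
gamma = 0.05
for _ in range(200):
    phi = rng.normal(size=3)
    y = phi @ true_theta + rng.uniform(-gamma, gamma)
    theta, P = obe_update(theta, P, phi, y, gamma)
print("estimated weights:", theta)

Because the update only fires when the prediction error exceeds the noise bound, many samples are skipped, which is one reason set-membership schemes can be attractive for real-time weight updating as mentioned in the abstract.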

Original language: English
Title of host publication: Advances in Neural Networks - ISNN 2007 - 4th International Symposium on Neural Networks, ISNN 2007, Proceedings
Publisher: Springer Verlag
Pages: 1173-1182
Number of pages: 10
Edition: PART 1
ISBN (Print): 9783540723820
DOIs
State: Published - 2007
Externally published: Yes
Event: 4th International Symposium on Neural Networks, ISNN 2007 - Nanjing, China
Duration: 3 Jun 2007 - 7 Jun 2007

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 4491 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 4th International Symposium on Neural Networks, ISNN 2007
Country/Territory: China
City: Nanjing
Period: 3/06/07 - 7/06/07
