BACKPROPAGATION TRAINING ALGORITHM WITH ADAPTIVE PARAMETERS TO SOLVE DIGITAL PROBLEMS
Journal: ICTACT Journal on Soft Computing (IJSC), Vol. 1, No. 3
Publication Date: 2011-01-01
Authors: R. Saraswathi
Pages: 145-151
Keywords: Single Hidden Layer; Lyapunov Stability Theory; Adaptive Learning Parameter
Abstract
An efficient technique, backpropagation training with adaptive parameters based on Lyapunov stability theory, is proposed for training a single-hidden-layer feedforward network. A three-layer feedforward neural network architecture is used to solve the selected problems, and the network is trained in sequential mode. Lyapunov stability theory is employed to ensure fast and steady error convergence and to construct an energy surface with a single global minimum point through the adaptive adjustment of the weights and of the adaptive parameter β. The adaptive backpropagation algorithm derived from this theory therefore avoids entrapment in local minima and attains the single global minimum point. The adaptive learning parameter used in the algorithm is chosen to accelerate error convergence, and the resulting error converges asymptotically to zero in accordance with Lyapunov stability theory. The performance of the adaptive backpropagation algorithm is evaluated on the parity, half adder and full adder problems.
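The abstract does not give the exact update rule for β, so the sketch below only illustrates the basic setup it describes: a single-hidden-layer feedforward network trained in sequential (pattern-by-pattern) mode with backpropagation on the 2-bit parity (XOR) problem. The learning parameter beta is kept fixed here as a placeholder for the paper's Lyapunov-based adaptive parameter; the hidden-layer size, initialization and stopping criterion are likewise assumptions.

import numpy as np

# 2-bit parity (XOR) patterns and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
n_in, n_hid, n_out = 2, 4, 1          # three-layer architecture (hidden size assumed)
W1 = rng.normal(scale=1.0, size=(n_in, n_hid))
b1 = np.zeros(n_hid)
W2 = rng.normal(scale=1.0, size=(n_hid, n_out))
b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

beta = 0.5   # placeholder for the adaptive learning parameter described in the paper

for epoch in range(10000):
    sse = 0.0
    for x, t in zip(X, T):                 # sequential training mode
        h = sigmoid(x @ W1 + b1)           # hidden layer
        y = sigmoid(h @ W2 + b2)           # output layer
        e = t - y
        sse += float(e @ e)

        # standard backpropagation deltas for sigmoid units
        delta_out = e * y * (1 - y)
        delta_hid = (delta_out @ W2.T) * h * (1 - h)

        # weight updates scaled by beta (the paper adapts beta via Lyapunov analysis)
        W2 += beta * np.outer(h, delta_out)
        b2 += beta * delta_out
        W1 += beta * np.outer(x, delta_hid)
        b1 += beta * delta_hid

    if sse < 1e-3:                         # simple stopping criterion (assumed)
        break

print(f"stopped after {epoch + 1} epochs, sum-squared error = {sse:.5f}")

The half adder and full adder problems mentioned in the abstract fit the same template, with two- or three-bit input patterns and two output units (sum and carry).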