Neural Network Training Using MATLAB
Phuong Ngo
School of Mechanical Engineering, Purdue University

Neural Network Toolbox
Available models in MATLAB:
- Feedforward neural networks
- Adaptive neural network filters
- Perceptron neural networks
- Radial basis neural networks
- Probabilistic neural networks
- Generalized regression neural networks
- Learning vector quantization (LVQ) neural networks
- Linear neural networks
- Hopfield neural networks
ME697Y2

Feedforward Neural Network
Training functions:
trainlm  - Levenberg-Marquardt
trainbr  - Bayesian Regularization
trainbfg - BFGS Quasi-Newton
trainrp  - Resilient Backpropagation
trainscg - Scaled Conjugate Gradient
traincgb - Conjugate Gradient with Powell/Beale Restarts
traincgf - Fletcher-Powell Conjugate Gradient
traincgp - Polak-Ribière Conjugate Gradient
trainoss - One Step Secant
traingdx - Variable Learning Rate Gradient Descent
traingdm - Gradient Descent with Momentum
traingd  - Gradient Descent
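The simplest entries in this list, traingd and traingdm, update each weight by a step down the error gradient, with traingdm adding a momentum term: the new step is the previous step scaled by a momentum constant minus the learning rate times the gradient. A minimal Python sketch of that update rule, fitting a single linear neuron to y = 2x + 1 (the data, variable names, and hyperparameters here are our own illustration, not toolbox code):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50)
y = 2.0 * x + 1.0                    # target: y = 2x + 1

w, b = 0.0, 0.0                      # weight and bias
dw = db = 0.0                        # previous updates (momentum terms)
lr, mu = 0.1, 0.9                    # learning rate and momentum constant

for _ in range(200):
    e = (w * x + b) - y              # prediction error
    grad_w = 2.0 * np.mean(e * x)    # dE/dw for mean-squared error
    grad_b = 2.0 * np.mean(e)        # dE/db
    dw = mu * dw - lr * grad_w       # momentum update rule
    db = mu * db - lr * grad_b
    w += dw
    b += db

print(round(w, 2), round(b, 2))      # prints: 2.0 1.0
```

With mu = 0, the loop reduces to plain gradient descent (the traingd case); the momentum term lets the step size build up along consistently downhill directions.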

Radial Basis Function Network

Exact design (newrbe)
This function can produce a network with zero error on the training vectors. It is called in the following way:
net = newrbe(P,T,SPREAD)

More efficient design (newrb)
The function newrb iteratively creates a radial basis network one neuron at a time. Neurons are added to the network until the sum-squared error falls beneath an error goal or a maximum number of neurons has been reached. The call for this function is:
net = newrb(P,T,GOAL,SPREAD)
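The exact design achieves zero training error by centering one basis function on every training input and solving a linear system for the output weights. A minimal Python sketch of that idea, using a plain Gaussian in place of MATLAB's radbas/spread scaling (all function and variable names here are our own, not the toolbox's):

```python
import numpy as np

def rbf_exact(P, T, spread):
    """P: (n_samples, n_inputs), T: (n_samples,). Returns a predictor."""
    # One Gaussian basis function per training sample.
    d2 = ((P[:, None, :] - P[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2 * spread ** 2))   # square basis-function matrix
    w = np.linalg.solve(Phi, T)             # exact output weights

    def predict(X):
        d2 = ((X[:, None, :] - P[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * spread ** 2)) @ w

    return predict

P = np.array([[0.0], [0.5], [1.0], [1.5]])
T = np.sin(np.pi * P[:, 0])
f = rbf_exact(P, T, spread=0.5)
print(np.max(np.abs(f(P) - T)) < 1e-8)      # True: zero error on training vectors
```

Because the basis matrix is square (one neuron per sample), the system can be solved exactly, which is why newrbe interpolates the training data; newrb avoids this cost by growing the network one neuron at a time until the error goal is met.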

Fuzzy Basis Function Network
(Not included in the Neural Network Toolbox)
Training methods:
- Backpropagation algorithm
- Adaptive least squares with a genetic algorithm

Training Steps
1. Generate training and checking data.
2. Select the structure of the neural network.
3. Perform the training.
4. Verify the error with the checking data.
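The four steps above can be sketched in a language-neutral way. This minimal Python example substitutes an ordinary least-squares polynomial for the neural network; the target function sin(x), the data ranges, and all names are our own illustration, not from the course material:

```python
import numpy as np

rng = np.random.default_rng(1)

# 1. Generate training and checking data
x_tr = rng.uniform(-3, 3, 100); y_tr = np.sin(x_tr)
x_ck = rng.uniform(-3, 3, 50);  y_ck = np.sin(x_ck)

# 2. Select the structure (here: the polynomial degree)
degree = 7

# 3. Perform the training
coef = np.polyfit(x_tr, y_tr, degree)

# 4. Verify the error with the checking data
err = np.sqrt(np.mean((np.polyval(coef, x_ck) - y_ck) ** 2))
print(err < 0.05)                  # True: checking RMSE stays small
```

The checking set plays the same role as Cx/Cd in the slides that follow: it is held out of training and used only to confirm that the fitted model generalizes.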

Examples

MATLAB Code (GenerateTrainingData)

function [x,Cx,d,Cd] = GenerateTrainingData(n,m,range)
% n     - number of training samples
% m     - number of checking samples
% range - zx2 range of input (z is the number of inputs)
% x     - zxn matrix of training inputs
% Cx    - zxm matrix of checking inputs
% d     - 1xn matrix of training outputs
% Cd    - 1xm matrix of checking outputs
if nargin < 3, error('Not enough input arguments'), end
[z,~] = size(range);        % Obtain the number of system inputs
x = zeros(z,n);
Cx = zeros(z,m);
for i = 1:z
    x(i,:) = (range(i,2)-range(i,1))*rand(1,n) + range(i,1)*ones(1,n);    % Generate random training inputs
    Cx(i,:) = (range(i,2)-range(i,1))*rand(1,m) + range(i,1)*ones(1,m);   % Generate random checking inputs
end
d = zeros(1,n);             % Define matrix d as an array of training outputs
for i = 1:n
    d(i) = NonlinearFunction(x(:,i));    % Calculate d matrix
end
Cd = zeros(1,m);            % Define matrix Cd as an array of checking outputs
for i = 1:m
    Cd(i) = NonlinearFunction(Cx(:,i));  % Calculate Cd matrix
end
save('TrainingData.mat')    % Save training data into file

MATLAB Code (Main Program)

addpath('./FBFN');          % Add FBFN library
n = 900;                    % Number of training samples
m = 841;                    % Number of checking samples
InputRange = [-3 3; -3 3];  % Range of input signal
[x,Cx,d,Cd] = GenerateTrainingData(n,m,InputRange);
DP = [25,0,0];              % Specify the maximum number of fuzzy rules
warning('off');
[m_matrix,sigma_matrix,temp_w,NR,NDEI,CR] = adnfbf2(x,d,Cx,Cd,DP);
save('FBFN.mat')
plot(1:length(NDEI),NDEI)
xlabel('Number of Fuzzy Rules');
ylabel('NDEI');

adnfbf2.m

% [fismat,NR,TR,CR] = ADNEWFBF(x,d,Cx,Cd,DP)
% x  - nxN matrix of N input vectors.
% d  - 1xN vector of N target outputs.
% Cx - nxCN matrix of CN input vectors for checking.
% Cd - 1xCN vector of CN target outputs.
% DP - Design parameters (optional).
% Returns:
%   m_matrix, sigma_matrix, temp_w - parameters of the FBFN found
%   NR - the number of fuzzy basis functions used.
%   training error: NDEI
%   TR - training record: [row of errors]
%   CR - checking record: [row of errors]
%
% Design parameters are:
%   DP(1) - Maximum number of FBFs (Ms), default = N.
%   DP(2) - Root-sum-squared error goal, default = 0.0.
%   DP(3) - Spread of pseudo-FBF (sigma), default = del_x/Ms.
% Missing parameters and NaNs are replaced with defaults.

DEMO