
Mathematical Modeling Intelligent Optimization Algorithms: Neural Network Case Studies with MATLAB Code

Reading makes a full man, conference a ready man, and writing an exact man... All that one learns becomes part of one's character.

Francis Bacon (England)

Building a feedforward neural network in MATLAB mainly involves the following three functions:

newff: creates a feedforward neural network;

train: trains a neural network;

sim: simulates (evaluates) a neural network.

The usage of these three functions is briefly introduced below.

The newff function and its syntax

Syntax: net = newff(P, T, [S1 S2 … S(N-1)], {TF1 TF2 … TFN}, BTF, BLF, PF, IPF, OPF, DDF)

P: input matrix;

T: target matrix;

[S1 S2 … S(N-1)]: numbers of neurons in the first N-1 layers of the network;

{TF1 TF2 … TFN}: activation (transfer) functions of the layers; the default is 'tansig';

BTF: training algorithm used by the learning rule; the default is 'trainlm';

BLF: BP weight/bias learning function; the default is 'learngdm';

PF: performance function; the default is 'mse';

IPF: input processing function;

OPF: output processing function;

DDF: data-division (validation) function. In practice, only the first seven arguments are usually specified, and the last three are left at their system defaults.
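As a minimal usage sketch (the data here are made up purely for illustration), a network with one hidden layer of 10 neurons can be created by supplying only the first few arguments and leaving the rest at their defaults:

% Hypothetical toy data: one input variable, one target variable
P = -1:0.1:1;                                   % input row vector
T = P.^2;                                       % target row vector
% One hidden layer of 10 neurons, tansig hidden / purelin output, trained with trainlm
net = newff(P, T, 10, {'tansig','purelin'}, 'trainlm');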

Commonly used activation functions

The commonly used activation functions are:
Linear function: $f(x)=x$; its MATLAB string is 'purelin'.
Log-sigmoid function: $f(x)=\frac{1}{1+e^{-x}}$, with $0<f(x)<1$; its MATLAB string is 'logsig'.
Hyperbolic tangent sigmoid function: $f(x)=\frac{2}{1+e^{-2x}}-1$, with $-1<f(x)<1$; its MATLAB string is 'tansig'.

An S-shaped activation function should be used in the output layer only when the network output must be constrained (for example, to the interval between 0 and 1). In the usual case, the hidden layer uses the bipolar sigmoid ('tansig') activation and the output layer uses a linear activation.
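As a quick check of these ranges (this snippet is only illustrative), the three transfer functions can be evaluated and plotted directly, since each is available as a MATLAB function of the same name:

x = -3:0.1:3;
plot(x, purelin(x), x, logsig(x), x, tansig(x));
legend('purelin','logsig','tansig');
% logsig stays within (0,1), tansig within (-1,1), purelin is unbounded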

The commonly used training functions are:
trainbfg   BFGS quasi-Newton BP training function
trainbr    Bayesian regularization BP training function
traingd    gradient-descent BP training function
traingda   gradient-descent BP training function with adaptive learning rate
traingdm   gradient-descent BP training function with momentum
traingdx   gradient-descent BP training function with momentum and adaptive learning rate
trainlm    Levenberg-Marquardt BP training function
trainrp    resilient backpropagation (Rprop) training function
trains     sequential incremental BP training function
trainscg   scaled conjugate gradient BP training function
Commonly used learning functions:
learngd    gradient-descent (BP) learning rule
learngdm   BP learning rule with a momentum term
Commonly used performance functions:
mse        mean squared error performance function
msereg     regularized mean squared error performance function
Configuration parameters. Some important network configuration parameters are:
net.trainParam.goal    target training error
net.trainParam.show    interval (in epochs) for displaying intermediate results
net.trainParam.epochs  maximum number of iterations (epochs)
net.trainParam.lr      learning rate
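Putting these pieces together, a hedged configuration sketch (the data and parameter values are illustrative, not taken from the examples below) might look like this:

P = -1:0.1:1;  T = P.^2;                        % toy data
net = newff(P, T, 10, {'tansig','purelin'}, 'traingdx');  % training algorithm chosen at creation
net.trainFcn = 'trainlm';                       % ...or changed afterwards
net.performFcn = 'mse';                         % performance function
net.trainParam.goal = 1e-3;                     % target training error
net.trainParam.show = 50;                       % display interval
net.trainParam.epochs = 500;                    % maximum number of epochs
net.trainParam.lr = 0.05;                       % learning rate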

The train function is the neural network training function.

Syntax: [net, tr, Y1, E] = train(net, X, Y)

X: network input matrix; Y: target matrix; tr: training record (tracking information); Y1: actual network output; E: error matrix.

The sim function is the neural network simulation (evaluation) function.

Syntax: Y = sim(net, X)

net: the trained neural network; X: network input matrix; Y: network output matrix.
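A minimal end-to-end sketch (toy data, illustrative only) that ties newff, train, and sim together:

X = -1:0.1:1;                                   % toy input
Y = X.^3;                                       % toy target
net = newff(X, Y, 5, {'tansig','purelin'}, 'trainlm');
net.trainParam.epochs = 200;
net.trainParam.goal = 1e-4;
[net, tr] = train(net, X, Y);                   % train the network
Y1 = sim(net, X);                               % evaluate it on the training inputs
E = Y - Y1;                                     % error matrix
fprintf('MSE = %g\n', mse(E));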

MATLAB simulation examples

Use the Bayesian regularization algorithm to improve the generalization ability of a BP network: train the BP network so that it can fit sinusoidal sample data with additive white noise.

Solution: the simulation proceeds as follows.

(1) Build a three-layer BP neural network: the input layer has 1 node; the hidden layer has 20 nodes with the 'tansig' activation; the output layer has 1 node with the 'purelin' activation.

(2) Train the BP network with the Bayesian regularization algorithm 'trainbr': target error goal = 1×10^{-3}, learning rate lr = 0.05, maximum number of epochs epochs = 500. The network is fitted to the sinusoidal sample data with additive white noise; the root-mean-square error of the fit is 0.0054, and the fitted curve is shown in the figure below.

[Figure: BP network fit of a sine curve with additive white noise]

The MATLAB source code is as follows:

%%%%%%%%%%%%% Fit noisy sinusoidal sample data with a BP network %%%%%%%%%%%%%%%%
clear all;                      % clear all variables
close all;                      % close all figures
clc;                            % clear the command window
%%%%%%%%%%%%%%%%%%%%% Define the training sample vectors %%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%% P is the input vector %%%%%%%%%%%%%%%%%%%%%%%%%%%%
P = [-1:0.05:1];
%%%%%%%%%%%%%%%%%%%%%% T is the target vector %%%%%%%%%%%%%%%%%%%%%%%%%%%%
T = sin(2*pi*P)+0.1*randn(size(P));
%%%%%%%%%%%%%%%%%%%%%% Plot the sample data points %%%%%%%%%%%%%%%%%%%%%%%%%%
figure
plot(P,T,'+');
hold on;
%%%%%%%%%%%%%%%%%%%%% Plot the noise-free sine curve %%%%%%%%%%%%%%%%%%%
plot(P,sin(2*pi*P),':');
%%%%%%%%%%%%%%%%%%%%% Create the feedforward network %%%%%%%%%%%%%%%%%%%%
net=newff(P,T,20,{'tansig','purelin'});
%%%%%%%%%%%%%%%%%%% Use the Bayesian regularization algorithm TRAINBR %%%%%%%%%%%%%%%
net.trainFcn='trainbr';
%%%%%%%%%%%%%%%%%%%%%% Set the training parameters %%%%%%%%%%%%%%%%%%%%%%%%%%%%
net.trainParam.show = 50;              % interval for displaying intermediate results
net.trainParam.lr = 0.05;              % learning rate
net.trainParam.epochs = 500;           % maximum number of epochs
net.trainParam.goal = 1e-3;            % target error
net.divideFcn = ''; % disable the default division into training/validation/test sets
%%%%%%%%%%%%%%%%%%%% Train the BP network %%%%%%%%%%%%%%%%%%%%
[net,tr]=train(net,P,T);
%%%%%%%%%%%%%%%%%%%%% Simulate the BP network %%%%%%%%%%%%%%%%%%%%%%%
A = sim(net,P);
%%%%%%%%%%%%%%%%%%%%%% Compute the simulation error %%%%%%%%%%%%%%%%%%%%%%%%%%%%
E = T - A;
MSE=mse(E);
RMSE=sqrt(MSE);                        % root-mean-square error of the fit
%%%%%%%%%%%%%%%%%%%%%% Plot the fitting results %%%%%%%%%%%%%%%%%%%%%%%%
plot(P,A,P,T,'+',P,sin(2*pi*P),':');
legend('Fitted curve','Sample points','Noise-free sine curve');
           

The table below shows the monthly sales of a certain drug. A BP neural network is used to forecast the drug sales with a rolling prediction scheme: the sales of the previous three months are used to predict the sales of the fourth month. For example, the sales of months 1, 2, and 3 are taken as inputs to predict the sales of month 4; the sales of months 2, 3, and 4 are then taken as inputs to predict the sales of month 5; and so on, iterating until the required prediction accuracy is met (a sketch of this construction is given after the table below).

Month   1     2     3     4     5     6     7     8     9     10    11    12
Sales   2056  2395  2600  2298  1634  1600  1873  1478  1900  1500  2046  1556
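As a sketch of the rolling construction described above (the variable names sales, lag, p, and t are illustrative; the resulting matrices match the hard-coded p and t in the program below), the 3-month input matrix and targets can be built from the 12 monthly figures as follows:

sales = [2056 2395 2600 2298 1634 1600 1873 1478 1900 1500 2046 1556];
lag = 3;                               % use the previous 3 months for each prediction
n = numel(sales) - lag;                % number of training patterns (9)
p = zeros(lag, n);
t = zeros(1, n);
for k = 1:n
    p(:,k) = sales(k:k+lag-1)';        % months k, k+1, k+2 as inputs
    t(k)   = sales(k+lag);             % month k+3 as target
end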

Solution: the simulation proceeds as follows.

(1) Build a three-layer BP neural network to forecast the drug sales: the input layer has 3 nodes; the hidden layer has 5 nodes with the 'tansig' activation; the output layer has 1 node with the 'purelin' activation.

(2) Train the BP network with the gradient descent with momentum and adaptive learning rate algorithm 'traingdx': target error goal = 1×10^{-3}, learning rate lr = 0.01, maximum number of epochs epochs = 1000. The resulting comparison of actual and predicted sales is shown in the figure below.

[Figure: comparison of actual and predicted drug sales]

The MATLAB source code is as follows:

%%%%%%%%%%%%%%%%%%%%%% Forecast data with a BP network %%%%%%%%%%%%%%%%%%%%%%%%
clear all;                      % clear all variables
close all;                      % close all figures
clc;                            % clear the command window
%%%%%%%%%%%%%%%%%%%%%%%%%%%% Original data %%%%%%%%%%%%%%%%%%%%%%%%%%%%
p=[2056 2395 2600 2298 1634 1600 1873 1478 1900
   2395 2600 2298 1634 1600 1873 1478 1900 1500
   2600 2298 1634 1600 1873 1478 1900 1500 2046];
t=[2298 1634 1600 1873 1478 1900 1500 2046 1556];
%%%%%%%%%%%%%%%%%%%%%%%%%% Normalize the original data %%%%%%%%%%%%%%%%%%%%%%%%%
pmax=max(max(p));
pmin=min(min(p));
P=(p-pmin)./(pmax-pmin);                 % input data matrix
tmax=max(t);
tmin=min(t);
T=(t-tmin)./(tmax-tmin);                 % target data vector
%%%%%%%%%%%%%%%%%%% Create a new feedforward network %%%%%%%%%%%%%%%%%%%%%%%
net=newff(P,T,5,{'tansig','purelin'},'traingdx');
%%%%%%%%%%%%%%%%%%%%%%% Set the training parameters %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
net.trainParam.show = 50;              % interval for displaying intermediate results
net.trainParam.lr = 0.01;              % learning rate
net.trainParam.epochs = 1000;          % maximum number of epochs
net.trainParam.goal = 1e-3;            % target error
net.divideFcn = ''; % disable the default division into training/validation/test sets
%%%%%%%%%%%%%%%%% Train the BP network with the TRAINGDX algorithm %%%%%%%%%%%%%%%%%%
[net,tr]=train(net,P,T);
%%%%%%%%%%%%%%%%%%%%% Simulate the BP network %%%%%%%%%%%%%%%%%%%%%%%%%%
A = sim(net,P);
%%%%%%%%%%%%%%%%%%%%%% Recover the predictions in the original scale %%%%%%%%%%%%%%%%%%%%%%%%%
a = A.*(tmax-tmin)+tmin;
%%%%%%%%%%%%%%%%%%%%% Plot the actual and predicted values %%%%%%%%%%%%%%%%%%%%%%
x=4:12;
figure
plot(x,t,'+');
hold on;
plot(x,a,'or');
hold off
xlabel('Month')
ylabel('Sales')
legend('Actual sales','Predicted sales');
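The program above only reproduces the in-sample fit for months 4 through 12. As a follow-up sketch (predicting month 13 is not part of the original example; this simply continues from the variables defined above and applies the rolling scheme one step further), the trained network can also produce a genuine out-of-sample forecast from the last three observed months:

% Months 10-12 as input, normalized with the same statistics used in training
pnext = ([1500 2046 1556]' - pmin)./(pmax - pmin);
Anext = sim(net, pnext);                        % normalized prediction for month 13
next_sales = Anext*(tmax - tmin) + tmin         % back to the original scale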
           

The table below shows historical statistics of the highway transport capacity of a certain region. Build a corresponding BP neural network prediction model and, using the given data for 2010 and 2011, predict the corresponding highway passenger volume and freight volume.

[Table: historical highway transport statistics of the region, 1990-2009 (population, number of motor vehicles, highway area, highway passenger volume, highway freight volume); the values appear in the arrays sqrs, sqjdcs, sqglmj, glkyl, and glhyl in the program below]

Solution: the simulation proceeds as follows.

(1) Build a three-layer BP neural network to predict the region's highway transport capacity: the input layer has 3 nodes; the hidden layer has 8 nodes with the 'tansig' activation; the output layer has 2 nodes with the 'purelin' activation.

(2) Train the BP network with the gradient descent with momentum and adaptive learning rate algorithm 'traingdx': target error goal = 1×10^{-3}, learning rate lr = 0.035, maximum number of epochs epochs = 1000. The fitted curves of the historical highway passenger volume and freight volume are shown in the two figures below. The prediction results are: for 2010, a highway passenger volume of 452.77 million people and a freight volume of 222.90 million tons; for 2011, a passenger volume of 453.08 million people and a freight volume of 222.96 million tons.

[Figure: fitted curve of historical highway passenger volume]

[Figure: fitted curve of historical highway freight volume]

The MATLAB source code is as follows:

%%%%%%%%%%%%%%%%%%%%%% Forecast data with a BP network %%%%%%%%%%%%%%%%%%%%%%%%
clear all;                      % clear all variables
close all;                      % close all figures
clc;                            % clear the command window
%%%%%%%%%%%%%%%%%%%%%%%%%%%% Original data %%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%% Population (unit: 10^4 people) %%%%%%%%%%%%%%%%%%%%%%%%%
sqrs=[20.55 22.44 25.37 27.13 29.45 30.10 30.96 34.06 36.42 38.09...
    39.13 39.99 41.93 44.59 47.30 52.89 55.73 56.76 59.17 60.63];
%%%%%%%%%%%%%%%%%%%%%% Number of motor vehicles (unit: 10^4 vehicles) %%%%%%%%%%%%%%%%%%%%%%%%
sqjdcs=[0.6 0.75 0.85 0.9 1.05 1.35 1.45 1.6 1.7 1.85 2.15 2.2...
    2.25 2.35 2.5 2.6 2.7 2.85 2.95 3.1];
%%%%%%%%%%%%%%%%%%%% Highway area (unit: 10^4 square kilometers) %%%%%%%%%%%%%%%%%%%%
sqglmj=[0.09 0.11 0.11 0.14 0.20 0.23 0.23 0.32 0.32 0.34 0.36...
    0.36 0.38 0.49 0.56 0.59 0.59 0.67 0.69 0.79];
%%%%%%%%%%%%%%%%%%%%% Highway passenger volume (unit: 10^4 people) %%%%%%%%%%%%%%%%%%%%%%%
glkyl=[5126 6217 7730 9145 10460 11387 12353 15750 18304 19836 ... 
    21024 19490 20433 22598 25107 33442 36836 40548 42927 43462];
%%%%%%%%%%%%%%%%%%%%% Highway freight volume (unit: 10^4 tons) %%%%%%%%%%%%%%%%%%%%%%%
glhyl=[1237 1379 1385 1399 1663 1714 1834 4322 8132 8936 11099 ...
    11203 10524 11115 13320 16762 18673 20724 20803 21804];
%%%%%%%%%%%%%%%%%%%%%%%%%% Input data matrix %%%%%%%%%%%%%%%%%%%%%%%%%%%
p=[sqrs;sqjdcs;sqglmj]; 
%%%%%%%%%%%%%%%%%%%%%%%%%% Target data matrix %%%%%%%%%%%%%%%%%%%%%%%%%%%
t=[glkyl;glhyl];    
%%%%%%%%%%%%%%%%%%%%%%%%% Normalize the original samples %%%%%%%%%%%%%%%%%%%%%%%%%%
[P,PSp] = mapminmax(p);
[T,PSt] = mapminmax(t);
%%%%%%%%%%%%%%%%%%% Create a new feedforward network %%%%%%%%%%%%%%%%%%%%%%%
net=newff(P,T,8,{'tansig','purelin'},'traingdx');
%%%%%%%%%%%%%%%%%%%%%%% Set the training parameters %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
net.trainParam.show = 50;              % interval for displaying intermediate results
net.trainParam.lr = 0.035;             % learning rate
net.trainParam.epochs = 1000;          % maximum number of epochs
net.trainParam.goal = 1e-3;            % target error
net.divideFcn = ''; % disable the default division into training/validation/test sets
%%%%%%%%%%%%%%%%% Train the BP network with the TRAINGDX algorithm %%%%%%%%%%%%%%%%%%
[net,tr]=train(net,P,T);
%%%%%%%%%%%%%%%%%%%%% Simulate the BP network %%%%%%%%%%%%%%%%%%%%%%%%%%
A = sim(net,P);
a=mapminmax('reverse',A,PSt);
%%%%%%%%%%%%%%%%%%%% Input-layer weights and biases after training %%%%%%%%%%%%%%%%%%%%%%%
inputWeights=net.IW{1,1};
inputbias=net.b{1};
%%%%%%%%%%%%%%%%%%%% Layer weights and biases after training %%%%%%%%%%%%%%%%%%%%%%%
layerWeights=net.LW{2,1};
layerbias=net.b{2};
%%%%%%%%%%%%%%%%%%%%%%%%% Time axis %%%%%%%%%%%%%%%%%%%%%%%%%%%%%
x=1990:2009;   
%%%%%%%%%%%%%%%%%%%%%%% Network output: passenger volume %%%%%%%%%%%%%%%%%%%%%%%%%%%%  
newk=a(1,:);       
%%%%%%%%%%%%%%%%%%%%%%% Network output: freight volume %%%%%%%%%%%%%%%%%%%%%%%%%%%%
newh=a(2,:);                                      
%%%%%%%%%%%%%%%%%%%% Plot the passenger-volume comparison %%%%%%%%%%%%%%%%%%%%%%%%%%
figure
plot(x,newk,'r-o',x,glkyl,'b--+')  
legend('Network output passenger volume','Actual passenger volume');
xlabel('Year');ylabel('Passenger volume / 10^4 people');
%%%%%%%%%%%%%%%%%%%%%% Plot the freight-volume comparison %%%%%%%%%%%%%%%%%%%%%%%
figure
plot(x,newh,'r-o',x,glhyl,'b--+')     
legend('Network output freight volume','Actual freight volume');
xlabel('Year');ylabel('Freight volume / 10^4 tons');
%%%%%%%%%%%%%%%%%%%%% Use the trained network for prediction %%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%% Data for 2010 and 2011 %%%%%%%%%%%%%%%%%%%%%
pnew=[73.39 75.55;3.9 4.1;0.98 1.02];                     
SamNum=size(pnew,2);  
%%%%%%%%%%% Normalize the new data with the normalization settings of the original inputs %%%%%%%%%%% 
pnewn=mapminmax('apply',pnew,PSp);
%%%%%%%%%%%%%%%%%%%%%%% Hidden-layer output for the new inputs %%%%%%%%%%%%%%%%%%%%%%%%%
HiddenOut=tansig(inputWeights*pnewn+repmat(inputbias,1,SamNum)); 
%%%%%%%%%%%%%%%%%%%%%%% Output-layer prediction %%%%%%%%%%%%%%%%%%%%%%%%%
anewn=purelin(layerWeights*HiddenOut+repmat(layerbias,1,SamNum));  
%%%%%%%%%%%%%% Map the network predictions back to the original scale %%%%%%%%%%%%%%%
anew=mapminmax('reverse',anewn,PSt);
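One caveat, stated here as an assumption rather than a certainty: newer toolbox versions that accept the newff(P,T,…) calling form typically attach their own internal mapminmax input/output processing to the network (this can be checked with net.inputs{1}.processFcns), in which case the manual forward pass through inputWeights and layerWeights above will not exactly reproduce sim. Continuing from the variables defined above, letting sim handle the new inputs sidesteps the issue:

% Predict 2010 and 2011 with sim instead of the manual forward pass
anew_sim = mapminmax('reverse', sim(net, pnewn), PSt);
disp(anew_sim);   % row 1: passenger volume, row 2: freight volume (unit: 10^4)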
           
