1 Introduction
The biggest drawback of artificial neural networks is their long training time, which limits their range of real-time applications. In recent years, the Extreme Learning Machine (ELM) has greatly shortened the training time of feedforward neural networks. However, when the raw data are contaminated with many noise variables, or when the input dimensionality is very high, the overall performance of the ELM algorithm degrades significantly. The core of deep learning is feature mapping: it can filter noise out of the raw data and, when mapping into a lower-dimensional space, effectively reduces the data's dimensionality. We therefore use the strengths of deep learning to compensate for the weaknesses of the ELM and thereby improve its performance, yielding the deep extreme learning machine (DELM). To further improve the prediction accuracy of the DELM, this post applies the Sparrow Search Algorithm to optimize the DELM hyperparameters; simulation results show that the improved algorithm achieves higher prediction accuracy.
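For context, here is a minimal NumPy sketch of the ELM idea the introduction refers to (illustrative only; the post's own code is in MATLAB). The hidden weights are random and never trained; only the output weights are solved in closed form, which is why ELM training is so fast. All names below are ours, not from the original code:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy regression data: y = sin(x) plus a little noise
X = rng.uniform(-np.pi, np.pi, (200, 1))
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(200)

# ELM: hidden-layer weights and biases are drawn at random and kept fixed
n_hidden = 50
W = rng.standard_normal((1, n_hidden))
b = rng.standard_normal(n_hidden)
H = np.tanh(X @ W + b)  # hidden-layer output matrix

# Output weights in closed form via the Moore-Penrose pseudoinverse --
# a single least-squares solve replaces iterative backpropagation
beta_out = np.linalg.pinv(H) @ y

y_hat = H @ beta_out
mse = np.mean((y - y_hat) ** 2)
```

The entire "training" step is the one pseudoinverse line; this is the speed advantage the introduction contrasts with conventional backpropagation.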
2 Code Excerpt
%_________________________________________________________________________%
% Lion Swarm Optimizer (LSO) %
%_________________________________________________________________________%
function [Best_pos,Best_score,curve]=LSO(pop,Max_iter,lb,ub,dim,fobj)
beta = 0.5;% proportion of adult lions
Nc = round(pop*beta);% number of adult lions
Np = pop-Nc;% number of cubs
if(max(size(ub)) == 1)
ub = ub.*ones(1,dim);
lb = lb.*ones(1,dim);
end
% Initialize the population
X0=initialization(pop,dim,ub,lb);
X = X0;
% Evaluate the initial fitness values
fitness = zeros(1,pop);
for i = 1:pop
fitness(i) = fobj(X(i,:));
end
[value, index]= min(fitness);% find the minimum
GBestF = value;% global best fitness value
GBestX = X(index,:);% global best position
curve=zeros(1,Max_iter);
XhisBest = X;
fithisBest = fitness;
indexBest = index;
gbest = GBestX;
for t = 1: Max_iter
% Perturbation factor for the lionesses' movement range
stepf = 0.1*(mean(ub) - mean(lb));
alphaf = stepf*exp(-30*(t/Max_iter)^10);
% Perturbation factor for the cubs' movement range
alpha = (Max_iter - t)/Max_iter;
% Lioness position update
for i = 1:Nc
index = i;
while(index == i)
index = randi(Nc);% randomly pick a different lioness
end
X(i,:) = (X(i,:) + X(index,:)).*(1 + alphaf.*randn())./2;
end
% Cub position update
for i = Nc+1:pop
q=rand;
if q<=1/3
X(i,:) = (gbest + XhisBest(i,:)).*( 1 + alpha.*randn())/2;
elseif q>1/3&&q<2/3
indexT = i;
while indexT == i
indexT = randi(pop - Nc) + Nc;% randomly pick a different cub
end
X(i,:) = (X(indexT,:) + XhisBest(i,:)).*( 1 + alpha.*randn())/2;
else
gbestT = ub + lb - gbest;% opposition-based position
X(i,:) = (gbestT + XhisBest(i,:)).*( 1 + alpha.*randn())/2;
end
end
% Boundary control
for j = 1:pop
for a = 1: dim
if(X(j,a)>ub(a))
X(j,a) =ub(a);
end
if(X(j,a)<lb(a))
X(j,a) =lb(a);
end
end
end
% Evaluate fitness values
for j=1:pop
fitness(j) = fobj(X(j,:));
end
for j = 1:pop
if(fitness(j)<fithisBest(j))
XhisBest(j,:) = X(j,:);
fithisBest(j) = fitness(j);
end
if(fitness(j) < GBestF)
GBestF = fitness(j);
GBestX = X(j,:);
indexBest = j;
end
end
%% Lion king update
Temp = gbest.*(1 + randn().*abs(XhisBest(indexBest,:) - gbest));
Temp(Temp>ub)=ub(Temp>ub);
Temp(Temp<lb) = lb(Temp<lb);
fitTemp = fobj(Temp);
if(fitTemp<GBestF)
GBestF =fitTemp;
GBestX = Temp;
X(indexBest,:)=Temp;
fitness(indexBest) = fitTemp;
end
[value, index]= min(fitness);% find the minimum
gbest = X(index,:);% best individual of the current generation
curve(t) = GBestF;
end
Best_pos = GBestX;
Best_score = curve(end);
end
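For readers who prefer NumPy, here is a compact, slightly simplified re-sketch of the same LSO loop (it tracks only the global best, whereas the MATLAB version also keeps the per-generation best `gbest`). The function name `lso` and the sphere objective are our illustrative choices, not part of the original code:

```python
import numpy as np

def lso(fobj, dim, lb, ub, pop=30, max_iter=200, beta=0.5, seed=0):
    """Sketch of the Lion Swarm Optimizer: lioness, cub and lion-king moves."""
    rng = np.random.default_rng(seed)
    nc = round(pop * beta)                      # number of adult lions
    lb = np.full(dim, lb, float)
    ub = np.full(dim, ub, float)
    X = lb + rng.random((pop, dim)) * (ub - lb)
    fit = np.apply_along_axis(fobj, 1, X)
    hist_x, hist_f = X.copy(), fit.copy()       # per-individual historical bests
    g_idx = fit.argmin()
    gbest, gbest_f = X[g_idx].copy(), fit[g_idx]
    curve = np.empty(max_iter)
    stepf = 0.1 * (ub.mean() - lb.mean())
    for t in range(1, max_iter + 1):
        alphaf = stepf * np.exp(-30 * (t / max_iter) ** 10)
        alpha = (max_iter - t) / max_iter
        for i in range(nc):                     # lioness move: pair with another lioness
            j = i
            while j == i:
                j = int(rng.integers(nc))
            X[i] = (X[i] + X[j]) * (1 + alphaf * rng.standard_normal()) / 2
        for i in range(nc, pop):                # cub move: one of three strategies
            q = rng.random()
            if q <= 1 / 3:
                partner = gbest
            elif q < 2 / 3:
                j = i
                while j == i:
                    j = int(rng.integers(nc, pop))
                partner = X[j]
            else:
                partner = ub + lb - gbest       # opposition-based point
            X[i] = (partner + hist_x[i]) * (1 + alpha * rng.standard_normal()) / 2
        X = np.clip(X, lb, ub)                  # boundary control
        fit = np.apply_along_axis(fobj, 1, X)
        improved = fit < hist_f
        hist_x[improved], hist_f[improved] = X[improved], fit[improved]
        if fit.min() < gbest_f:
            g_idx = int(fit.argmin())
            gbest, gbest_f = X[g_idx].copy(), fit[g_idx]
        # lion-king perturbation around the current best
        temp = gbest * (1 + rng.standard_normal() * np.abs(hist_x[g_idx] - gbest))
        temp = np.clip(temp, lb, ub)
        ft = fobj(temp)
        if ft < gbest_f:
            gbest, gbest_f = temp, ft
            X[g_idx], fit[g_idx] = temp, ft
        curve[t - 1] = gbest_f
    return gbest, gbest_f, curve

best_x, best_f, curve = lso(lambda x: np.sum(x ** 2), dim=5, lb=-10, ub=10)
```

Since `curve` records the running global best, it is non-increasing by construction, which is a quick sanity check when porting the MATLAB version.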