
【DELM Classification】Flower Pollination Algorithm Optimized Deep Extreme Learning Machine for Data Classification (with MATLAB code)

1 Introduction

The biggest drawback of artificial neural networks is their long training time, which limits their use in real-time applications. In recent years, the Extreme Learning Machine (ELM) has greatly shortened the training time of feedforward neural networks. However, when the raw data are contaminated by many noise variables, or when the input dimensionality is very high, the overall performance of ELM degrades considerably. The core of deep learning is feature mapping: it can filter out noise in the raw data and, when mapping into a lower-dimensional space, it effectively reduces the data dimensionality. We therefore use the strengths of deep learning to compensate for the weaknesses of ELM, yielding the deep extreme learning machine (DELM). To further improve the prediction accuracy of DELM, this article uses the flower pollination algorithm (FPA) to optimize the DELM hyperparameters; simulation results show that the improved algorithm achieves higher prediction accuracy.
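Conceptually, the DELM stacks several ELM autoencoders to map the raw inputs to cleaner, lower-dimensional features, then trains an ordinary ELM classifier on top; the FPA only tunes hyperparameters such as layer widths and the regularization coefficient. The sketch below is a minimal, hypothetical illustration of that idea in MATLAB (the function name, interface, and hyperparameter choices are assumptions, not the code shipped with this post).

% Minimal DELM sketch (hypothetical interface): stack ELM autoencoders
% for feature mapping, then train a final ELM classifier on the features.
function model = delm_train(X, Y, hiddenSizes, C)
% X: N-by-d inputs, Y: N-by-k one-hot labels
% hiddenSizes: hidden-layer widths, e.g. [100 50]; C: ridge coefficient
H = X;
for l = 1:numel(hiddenSizes)
    L = hiddenSizes(l);
    W = rand(size(H,2), L)*2 - 1;           % random input weights in [-1,1]
    b = rand(1, L)*2 - 1;                   % random biases (implicit expansion, R2016b+)
    A = 1./(1 + exp(-(H*W + b)));           % sigmoid hidden activations
    beta = (A'*A + eye(L)/C) \ (A'*H);      % ELM-AE output weights (reconstruct H)
    model.layers{l}.beta = beta;
    H = 1./(1 + exp(-(H*beta')));           % mapped features passed to the next layer
end
model.betaOut = (H'*H + eye(size(H,2))/C) \ (H'*Y);   % final ELM classifier weights
end

In the FPA loop shown in Section 2, the fitness of a candidate hyperparameter vector would simply be the classification error such a model produces on a validation set.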


2 Code Excerpt

% --------------------------------------------------------------------%
% Flower pollination algorithm (FPA), or flower algorithm             %
% Programmed by Xin-She Yang @ May 2012                               %
% --------------------------------------------------------------------%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%Notes: This demo program contains the very basic components of      %
% the flower pollination algorithm (FPA), or flower algorithm (FA),   %
% for single objective optimization.    It usually works well for     %
% unconstrained functions only. For functions/problems with           %
% limits/bounds and constraints, constraint-handling techniques       %
% should be implemented to deal with constrained problems properly.   %
%                                                                     %
% Citation details:                                                   %
%1)Xin-She Yang, Flower pollination algorithm for global optimization,%
% Unconventional Computation and Natural Computation,                 %
% Lecture Notes in Computer Science, Vol. 7445, pp. 240-249 (2012).   %
%2)X. S. Yang, M. Karamanoglu, X. S. He, Multi-objective flower       %
% algorithm for optimization, Procedia in Computer Science,           %
% vol. 18, pp. 861-868 (2013).                                        %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
clc
clear all
close all
n=30;           % Population size, typically 10 to 25
p=0.8;           % probability switch
% Iteration parameters
N_iter=3000;            % Total number of iterations
fitnessMSE = ones(1,N_iter);
% % Dimension of the search variables Example 1
d=2;
Lb = -1*ones(1,d);
Ub = 1*ones(1,d);
% % Dimension of the search variables Example 2
% d=3;
% Lb = [-2 -1 -1];
% Ub = [2 1 1];
%
% % Dimension of the search variables Example 3
% d=3;
% Lb = [-1 -1 -1];
% Ub = [1 1 1];
%
%
% % % Dimension of the search variables Example 4
% d=9;
% Lb = -1.5*ones(1,d);
% Ub = 1.5*ones(1,d);
% Initialize the population/solutions
for i=1:n,
    Sol(i,:)=Lb+(Ub-Lb).*rand(1,d);
    % To simulate the filters use fitnessX() functions in the next line
    Fitness(i)=fitness(Sol(i,:));
end
% Find the current best
[fmin,I]=min(Fitness);
best=Sol(I,:);
S=Sol;
% Start the iterations -- Flower Algorithm
for t=1:N_iter,
    % Loop over all flowers/solutions
    for i=1:n,
        % Global pollination: pollen is carried by insects and can
        % travel over large distances.
        % The step size L is drawn from a Levy flight distribution.
        % Formula: x_i^{t+1}=x_i^t+ L (x_i^t-gbest)
        if rand>p,
            %% L=rand;
            L=Levy(d);
            dS=L.*(Sol(i,:)-best);
            S(i,:)=Sol(i,:)+dS;
            % Check if the simple limits/bounds are OK
            S(i,:)=simplebounds(S(i,:),Lb,Ub);
            % If not, then local pollination of neighbor flowers
        else
            epsilon=rand;
            % Find random flowers in the neighbourhood
            JK=randperm(n);
            % Since JK is a random permutation, its first two entries
            % are also random. If the flowers are of the same or a
            % similar species they can be pollinated; otherwise no
            % action is taken.
            % Formula: x_i^{t+1}=x_i^t+epsilon*(x_j^t-x_k^t)
            S(i,:)=S(i,:)+epsilon*(Sol(JK(1),:)-Sol(JK(2),:));
            % Check if the simple limits/bounds are OK
            S(i,:)=simplebounds(S(i,:),Lb,Ub);
        end
        % Evaluate new solutions
        % To simulate the filters use fitnessX() functions in the next
        % line
        Fnew=fitness(S(i,:));
        % If fitness improves (a better solution is found), update it
        if (Fnew<=Fitness(i)),
            Sol(i,:)=S(i,:);
            Fitness(i)=Fnew;
        end
        % Update the current global best
        if Fnew<=fmin,
            best=S(i,:);
            fmin=Fnew;
        end
    end
    % Display results every 100 iterations
    if round(t/100)==t/100,
        best
        fmin
    end
    fitnessMSE(t) = fmin;
end
%figure, plot(1:N_iter,fitnessMSE);
% Output/display
disp(['Total number of evaluations: ',num2str(N_iter*n)]);
disp(['Best solution=',num2str(best),'   fmin=',num2str(fmin)]);
figure(1)
plot( fitnessMSE)
xlabel('Iteration');
ylabel('Best score obtained so far');      
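The excerpt above calls three helper functions that are not shown. Levy and simplebounds below follow the standard routines distributed with Xin-She Yang's FPA demo; the fitness stub is only a hypothetical placeholder, since in this post it would wrap DELM training and return the validation error of a candidate hyperparameter vector (a sphere benchmark is used here so the demo runs stand-alone).

% Clamp a candidate solution to the simple box constraints [Lb, Ub]
function s=simplebounds(s,Lb,Ub)
  ns_tmp=s;
  I=ns_tmp<Lb;  ns_tmp(I)=Lb(I);   % apply the lower bounds
  J=ns_tmp>Ub;  ns_tmp(J)=Ub(J);   % apply the upper bounds
  s=ns_tmp;
end

% Draw a d-dimensional Levy-flight step (Mantegna's algorithm, beta=3/2)
function L=Levy(d)
  beta=3/2;
  sigma=(gamma(1+beta)*sin(pi*beta/2)/(gamma((1+beta)/2)*beta*2^((beta-1)/2)))^(1/beta);
  u=randn(1,d)*sigma;
  v=randn(1,d);
  L=0.01*u./abs(v).^(1/beta);
end

% Hypothetical fitness placeholder: replace with the DELM validation error
% for the candidate hyperparameter vector x to reproduce the post's setup.
function f=fitness(x)
  f=sum(x.^2);       % sphere benchmark so the script runs on its own
end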

3 Simulation Results

4 References

[1] Xin-She Yang, Flower pollination algorithm for global optimization, Unconventional Computation and Natural Computation, Lecture Notes in Computer Science, Vol. 7445, pp. 240-249 (2012).
[2] X. S. Yang, M. Karamanoglu, X. S. He, Multi-objective flower algorithm for optimization, Procedia Computer Science, Vol. 18, pp. 861-868 (2013).

About the author: experienced in MATLAB simulation of intelligent optimization algorithms, neural network prediction, signal processing, cellular automata, image processing, path planning, UAVs, and related fields; questions about the MATLAB code can be discussed via private message.

Part of the theory is cited from online literature; in case of infringement, please contact the author for removal.
