This add-in to the PSO Research Toolbox (Evers 2009) allows an artificial neural network (ANN, or simply NN) to be trained using the Particle Swarm Optimization (PSO) technique (Kennedy, Eberhart et al. 2001). The add-in acts as a bridge or interface between MATLAB's NN toolbox and the PSO Research Toolbox: MATLAB's NN functions call the NN add-in, which in turn calls the PSO Research Toolbox for NN training. This approach to training a NN by PSO treats each PSO particle as one candidate solution of weight and bias combinations for the NN (Settles and Rylander; Rui Mendes 2002; Venayagamoorthy 2003). The PSO particles therefore move about the search space aiming to minimise the output of the NN performance function (see the sketch below this entry). The author acknowledges that code for PSO training of a NN already exists (Birge 2005); however, that code was found to work only with MATLAB version 2005 and older. This NN add-in works with newer versions of MATLAB, up to version 2010a.

HELPFUL LINKS:
1. This NN add-in only works when used with the PSORT found at http://www.mathworks.com/matlabcentral/fileexchange/28291-particle-swarm-optimization-research-toolbox.
2. The author acknowledges the modification of code used in an old PSO toolbox for NN training found at http://www.mathworks.com.au/matlabcentral/fileexchange/7506.
3. User support and contact information for the author of this NN add-in can be found at http://www.tricia-rambharose.com/

ACKNOWLEDGEMENTS
The author acknowledges the support of advisors and fellow researchers who contributed in various ways to her understanding of PSO and NNs, which led to the creation of this add-in for PSO training of NNs. The acknowledged are as follows:
* Dr. Alexander Nikov - Senior Lecturer and Head of Usability Lab, UWI, St. Augustine, Trinidad, W.I. http://www2.sta.uwi.edu/~anikov/
* Dr. Sabine Graf - Assistant Professor, Athabasca University, Alberta, Canada. http://scis.athabascau.ca/scis/staff/faculty.jsp?id=sabineg
* Dr. Kinshuk - Professor, Athabasca University, Alberta, Canada. http://scis.athabascau.ca/scis/staff/faculty.jsp?id=kinshuk
* Members of the iCore group at Athabasca University, Edmonton, Alberta, Canada.
2022-01-11 12:47:47 352KB PSO algorithm; neural network
1
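As a rough illustration of the idea described in the entry above, a minimal Python sketch follows: one particle is a flat vector of weights and biases, and the swarm's fitness is the network's performance measure (here, mean squared error). This is a hypothetical illustration, not the add-in's MATLAB code; the one-hidden-layer architecture, tanh activation and MSE measure are assumptions.

# Hedged sketch (not the add-in's actual code): shows how one PSO particle --
# a flat vector of weights and biases -- can be scored against a network's
# performance function (here, mean squared error).
import numpy as np

def particle_to_network(particle, n_in, n_hidden, n_out):
    """Unpack a flat particle vector into weight matrices and bias vectors."""
    i = 0
    W1 = particle[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = particle[i:i + n_hidden];                                i += n_hidden
    W2 = particle[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = particle[i:i + n_out]
    return W1, b1, W2, b2

def fitness(particle, X, y, n_in, n_hidden, n_out):
    """Performance function the swarm minimises: MSE of an assumed 1-hidden-layer net."""
    W1, b1, W2, b2 = particle_to_network(particle, n_in, n_hidden, n_out)
    hidden = np.tanh(X @ W1 + b1)   # hidden layer with tanh activation (assumed)
    output = hidden @ W2 + b2       # linear output layer (assumed)
    return np.mean((output - y) ** 2)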
hslogic algorithm simulation - PSO particle swarm optimization: searching for the optimal values of multiple functions
2022-01-05 20:01:11 281KB PSO particle swarm optimization
PSO optimization algorithm with a constriction factor (a sketch of the corresponding velocity update follows this entry):
c1 = 2;          % learning factor 1, typically in [0, 2]
c2 = 2;          % learning factor 2, typically in [0, 2]
% c1 = 2.04344;  % learning factor 1, typically in [0, 2]
% c2 = 0.94874;  % learning factor 2, typically in [0, 2]
k1 = 0.7298;     % constriction factor
Dimension = 2;   % dimension of the search space (number of unknowns)
Popsize = 20;    % initial swarm size
MaxDT = 100;     % maximum number of iterations
DivH = 0.25;     % maximum diversity coefficient
DivL = 0.0005;   % minimum diversity coefficient
2022-01-05 20:01:10 8KB constriction factor; PSO optimization
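For context, here is a minimal Python sketch of the constriction-factor update implied by the parameters listed above: the whole velocity update is scaled by the constriction factor k1. The sphere objective and the search bounds are assumptions for illustration only, not part of the uploaded code.

# Hedged sketch of constriction-factor PSO using the c1, c2, k1 values above;
# the objective function and bounds are assumptions.
import numpy as np

c1, c2, k1 = 2.0, 2.0, 0.7298                  # values listed in the entry above
dim, pop, max_iter = 2, 20, 100

objective = lambda x: np.sum(x ** 2, axis=1)   # assumed test function (sphere)

rng = np.random.default_rng(0)
x = rng.uniform(-10, 10, (pop, dim))           # particle positions
v = np.zeros((pop, dim))                       # particle velocities
pbest, pbest_f = x.copy(), objective(x)
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(max_iter):
    r1, r2 = rng.random((pop, dim)), rng.random((pop, dim))
    # Clerc-Kennedy constriction: scale the entire velocity update by k1
    v = k1 * (v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x))
    x = x + v
    f = objective(x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()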
TSP-PSO (a sketch of the crossover step follows this entry)
%% Crossover with the personal best
c1 = round(rand*(n-2)) + 1;   % random crossover position in [1, n-1]
c2 = round(rand*(n-2)) + 1;
while c1 == c2
    c1 = round(rand*(n-2)) + 1;   % random crossover position in [1, n-1]
    c2 = round(rand*(n-2)) + 1;
end
chb1 = min(c1, c2);
chb2 = max(c1, c2);
cros = Tour_pbest(i, chb1:chb2);   % crossover segment taken from the personal best
ncros = size(cros, 2);             % number of elements in the crossover segment
2022-01-05 20:01:09 24KB TSP-PSO
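A rough Python sketch of the crossover step shown above: two distinct cut points are drawn, the corresponding segment is copied from the personal-best tour, and the remaining cities are filled in from the current tour. The fill-in strategy (order-crossover style) is an assumption; the entry itself only shows segment selection.

# Hedged sketch of a pbest crossover for TSP-style PSO; the fill-in strategy
# is an assumption, since the entry only shows how the segment is chosen.
import random

def crossover_with_pbest(tour, pbest):
    n = len(tour)
    c1, c2 = random.sample(range(n - 1), 2)      # two distinct cut points
    lo, hi = min(c1, c2), max(c1, c2)
    segment = pbest[lo:hi + 1]                   # crossover segment from pbest
    rest = [city for city in tour if city not in segment]
    return rest[:lo] + segment + rest[lo:]       # child keeps the segment in place

# Example: tours are permutations of city indices
print(crossover_with_pbest([0, 1, 2, 3, 4, 5], [5, 4, 3, 2, 1, 0]))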
Uses the maximum between-class variance method (Otsu) as the objective function, combined with the particle swarm optimization (PSO) algorithm, to obtain multiple thresholds for image segmentation; the number of thresholds is configurable and the results are good. (A sketch of the Otsu-based fitness follows this entry.)
1
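A minimal sketch of the Otsu-style fitness such a method would maximise, assuming a normalised grey-level histogram and a sorted threshold vector held by each particle; this is an illustration, not the uploaded code.

# Hedged sketch of an Otsu-style fitness for multilevel thresholding: the
# between-class variance that a swarm would maximise (or whose negative it
# would minimise).  Histogram handling and class splitting are assumptions.
import numpy as np

def between_class_variance(hist, thresholds):
    """hist: normalised grey-level histogram; thresholds: candidate threshold levels."""
    levels = np.arange(len(hist))
    mu_total = np.sum(levels * hist)
    edges = [0] + sorted(int(t) for t in thresholds) + [len(hist)]
    variance = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = hist[lo:hi].sum()                                  # class probability
        if w > 0:
            mu = np.sum(levels[lo:hi] * hist[lo:hi]) / w       # class mean
            variance += w * (mu - mu_total) ** 2
    return variance

# Each PSO particle would hold one candidate threshold vector, e.g. 3 thresholds:
# fitness(particle) = -between_class_variance(hist, particle)   # minimise the negative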
Python implementation of the PSO algorithm to optimize a function of two variables; the code begins as follows (the snippet is truncated in the source; a hedged completion follows this entry):
import numpy as np
import random
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
#---------------------- PSO parameter settings ---------------------------------
class PSO():
    def __init__(self, pN, dim, max_iter):   # initialize: number of particles, position dimension, max iterations
        #self.w = 0.8
        self.
2022-01-04 19:56:37 116KB python; python functions; python algorithms
1
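Because the snippet above is cut off inside __init__, a hedged guess at how such a class is commonly completed is sketched below. The inertia weight, learning factors, bounds, test function and attribute names are assumptions, not the original author's code.

# Hedged completion of a PSO class like the truncated one above; every value
# and attribute name here is an assumption, not the original author's code.
import numpy as np

class PSO:
    def __init__(self, pN, dim, max_iter):
        self.w, self.c1, self.c2 = 0.8, 2.0, 2.0       # inertia and learning factors (assumed)
        self.pN, self.dim, self.max_iter = pN, dim, max_iter
        self.X = np.random.uniform(-5, 5, (pN, dim))   # particle positions (assumed bounds)
        self.V = np.zeros((pN, dim))                   # particle velocities
        self.pbest = self.X.copy()
        self.p_fit = np.array([self.f(x) for x in self.X])
        self.gbest = self.pbest[np.argmin(self.p_fit)].copy()

    def f(self, x):                                    # two-variable test function (assumed)
        return x[0] ** 2 + x[1] ** 2

    def iterate(self):
        for _ in range(self.max_iter):
            r1, r2 = np.random.rand(self.pN, self.dim), np.random.rand(self.pN, self.dim)
            self.V = (self.w * self.V + self.c1 * r1 * (self.pbest - self.X)
                      + self.c2 * r2 * (self.gbest - self.X))
            self.X += self.V
            fit = np.array([self.f(x) for x in self.X])
            better = fit < self.p_fit
            self.pbest[better], self.p_fit[better] = self.X[better], fit[better]
            self.gbest = self.pbest[np.argmin(self.p_fit)].copy()
        return self.gbest, self.p_fit.min()

# Usage: best, value = PSO(pN=30, dim=2, max_iter=100).iterate()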
Python program that computes the geometric dilution of precision (GDOP) for navigation positioning and performs satellite selection based on an improved particle swarm optimization. (A sketch of the GDOP computation follows this entry.)
2022-01-03 16:18:33 4KB GDOP python PSO
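A minimal sketch of the standard GDOP computation, GDOP = sqrt(trace((H^T H)^-1)), which a PSO-based satellite selection could use as its fitness; the geometry-matrix construction is the usual textbook form, and the improved-PSO selection itself is not reproduced here.

# Hedged sketch of the GDOP fitness such a satellite-selection scheme would use;
# coordinates are placeholders, and at least 4 satellites are required.
import numpy as np

def gdop(sat_positions, receiver_position):
    """GDOP = sqrt(trace((H^T H)^-1)) for the usual geometry matrix H = [-u, 1]."""
    diffs = np.asarray(sat_positions) - np.asarray(receiver_position)   # receiver -> satellite vectors
    units = diffs / np.linalg.norm(diffs, axis=1, keepdims=True)        # unit line-of-sight vectors
    H = np.hstack([-units, np.ones((len(diffs), 1))])                   # append clock-bias column
    return np.sqrt(np.trace(np.linalg.inv(H.T @ H)))

# A PSO particle would encode a candidate subset of visible satellites, and the
# swarm would minimise gdop() over those subsets.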
hslogic algorithm simulation - MOPSO optimization algorithm obtaining the Pareto front; multi-objective optimization. (A sketch of the Pareto-dominance test follows this entry.)
2022-01-01 09:02:21 10KB Pareto front; PSO
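A minimal sketch of the Pareto-dominance test at the core of MOPSO: an external archive keeps only nondominated solutions, which together approximate the Pareto front. Archive pruning and leader selection are omitted; this is not the uploaded code.

# Hedged sketch of Pareto dominance and a nondominated archive (minimisation).
import numpy as np

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b."""
    f_a, f_b = np.asarray(f_a), np.asarray(f_b)
    return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))

def update_archive(archive, candidate):
    """Insert candidate objective vector if nondominated; drop anything it dominates."""
    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]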
This package optimizes the PID parameters kp, ki, and kd, using ITAE as the performance index (a sketch of the ITAE index follows this entry). The PSO file contains detailed parameter settings and the optimization process; GA optimization is included as a comparison to PSO. figure1 shows how the fitness function converges over the iterations, and figure2 shows how kp, ki, and kd evolve during the iterations. The ht file is used for plotting. 问题解决思路.pdf (problem-solving approach) briefly describes the particle swarm optimization process.
2021-12-30 19:00:24 250KB pso
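A minimal sketch of an ITAE-based fitness for PID tuning, assuming a unit-step setpoint, a first-order plant 1/(s+1) and Euler integration; these modelling choices are assumptions for illustration, not taken from the uploaded files.

# Hedged sketch of the ITAE index used as a PID-tuning fitness: the swarm would
# minimise itae(kp, ki, kd).  Plant, setpoint and integration scheme are assumed.
def itae(kp, ki, kd, t_end=10.0, dt=0.01):
    """ITAE = integral of t*|e(t)| dt for a PID loop around an assumed plant 1/(s+1)."""
    n = int(t_end / dt)
    y = 0.0                      # plant output
    integral = prev_e = 0.0
    cost = 0.0
    for k in range(n):
        t = k * dt
        e = 1.0 - y              # unit-step setpoint
        integral += e * dt
        derivative = (e - prev_e) / dt if k > 0 else 0.0
        u = kp * e + ki * integral + kd * derivative   # PID control law
        y += dt * (u - y)        # first-order plant: dy/dt = u - y
        prev_e = e
        cost += t * abs(e) * dt  # accumulate the ITAE integral
    return cost

# Each PSO particle holds one (kp, ki, kd) triple; a lower ITAE value means
# faster, less oscillatory setpoint tracking.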