Gradient Descent and Linear Regression with Activation Functions, with MATLAB Code (2017-05-22)

1. Single-variable linear regression with input/output normalization

Normalizing the input data greatly increases the learning rate a neural network can tolerate, and therefore speeds up convergence. At the same learning rate, training on the same data set without input normalization can diverge, so the network never converges. Activation functions such as ReLU or sigmoid restrict a neuron's output to non-negative values, so the desired (target) outputs must be non-negative as well. If the training targets contain negative values, add the absolute value of the most negative target to every target so that all of them become non-negative. If the range of the outputs is large, the outputs can be normalized too. Whenever a neuron has such an activation function, its output is automatically non-negative.

(1) Code without input normalization

```matlab
clear all
clc

% training sample data
p0 = 3;
p1 = 7;
x = 1:3;
y = p0 + p1*x;
num_sample = size(y,2);

% gradient descent
% initial parameter values
theta0 = 1;
theta1 = 3;

% learning rate
alpha = 0.33;
% if alpha is too large, the final error will be large;
% if alpha is too small, convergence will be slow
epoch = 100;

for k = 1:epoch
    v_k = k                          % display the current epoch
    h_theta_x = theta0 + theta1*x;   % hypothesis function
    Jcost(k) = ((h_theta_x(1)-y(1))^2 + (h_theta_x(2)-y(2))^2 + (h_theta_x(3)-y(3))^2)/num_sample
    theta0 = theta0 - alpha*((h_theta_x(1)-y(1)) + (h_theta_x(2)-y(2)) + (h_theta_x(3)-y(3)))/num_sample;
    theta1 = theta1 - alpha*((h_theta_x(1)-y(1))*x(1) + (h_theta_x(2)-y(2))*x(2) + (h_theta_x(3)-y(3))*x(3))/num_sample;
end
plot(Jcost)
yn = theta0 + theta1*x
```

Without input normalization, the largest workable learning rate for this code is 0.35; the output error falls to 0.0000 by about iteration 60.

(2) Code with input normalization

```matlab
clear all
clc

% training sample data
p0 = 3;
p1 = 7;
x = 1:3;

% mean normalization of the input
x_mean = mean(x)
x_max = max(x)
x_min = min(x)
xn = (x - x_mean)/(x_max - x_min)
x = xn;

y = p0 + p1*x
y = y + 0.5;    % shift the targets so that none of them is negative
num_sample = size(y,2);

% gradient descent
% initial parameter values
theta0 = 1;
theta1 = 3;

% learning rate
alpha = 0.9;
% if alpha is too large, the final error will be large;
% if alpha is too small, convergence will be slow
epoch = 100;

for k = 1:epoch
    v_k = k                          % display the current epoch
    h_theta_x = theta0 + theta1*x;   % hypothesis function
    Jcost(k) = ((h_theta_x(1)-y(1))^2 + (h_theta_x(2)-y(2))^2 + (h_theta_x(3)-y(3))^2)/num_sample
    theta0 = theta0 - alpha*((h_theta_x(1)-y(1)) + (h_theta_x(2)-y(2)) + (h_theta_x(3)-y(3)))/num_sample;
    theta1 = theta1 - alpha*((h_theta_x(1)-y(1))*x(1) + (h_theta_x(2)-y(2))*x(2) + (h_theta_x(3)-y(3))*x(3))/num_sample;
end
yn = theta0 + theta1*x;
plot(Jcost)
```

With input normalization, the largest workable learning rate rises to 0.96, and the output error falls to 0.0000 by about iteration 33. To keep the targets free of negative values, 0.5 was added to every output; this offset is absorbed by theta0, which converges to a value 0.5 larger than p0.

(3) Linear regression with a sigmoid output activation, MATLAB code

```matlab
clear all
clc

% training sample data
p0 = 2;
p1 = 9;
x = 1:3;

% mean normalization of the input
x_mean = mean(x)
x_max = max(x)
x_min = min(x)
xn = (x - x_mean)/(x_max - x_min)
x = xn;

y_temp = p0 + p1*x;
y = 1./(1 + exp(-y_temp));   % sigmoid targets
num_sample = size(y,2);

% gradient descent
% initial parameter values
theta0 = 1;
theta1 = 3;

% learning rate
alpha = 69;
% if alpha is too large, the final error will be large;
% if alpha is too small, convergence will be slow
epoch = 800;

for k = 1:epoch
    v_k = k                              % display the current epoch
    zc = theta0 + theta1*x;
    h_theta_x = 1./(1 + exp(-zc));       % hypothesis with sigmoid activation
    fz = h_theta_x.*(1 - h_theta_x);     % sigmoid derivative (chain-rule factor)
    Jcost(k) = ((h_theta_x(1)-y(1))^2 + (h_theta_x(2)-y(2))^2 + (h_theta_x(3)-y(3))^2)/num_sample;
    theta0 = theta0 - alpha*((h_theta_x(1)-y(1))*fz(1) + (h_theta_x(2)-y(2))*fz(2) + (h_theta_x(3)-y(3))*fz(3))/num_sample;
    theta1 = theta1 - alpha*((h_theta_x(1)-y(1))*x(1)*fz(1) + (h_theta_x(2)-y(2))*x(2)*fz(2) + (h_theta_x(3)-y(3))*x(3)*fz(3))/num_sample;
end
ynt = theta0 + theta1*x;
yn = 1./(1 + exp(-ynt))
plot(Jcost)
```

The training error of the code above is shown in Figure 1.

Figure 1. Training error over the course of training (plot of Jcost).

(4) Two-input linear regression with a sigmoid output activation, MATLAB code

```matlab
% double variable input with activation function
% normalization of input data
clear all
clc

% training sample data
p0 = 2;
p1 = 9;
p2 = 3;
x1 = [16128731112];   % NOTE: the element separators of this vector were lost in the source
x2 = [3791284292];    % NOTE: the element separators of this vector were lost in the source

% mean normalization of the inputs
x1_mean = mean(x1)
x1_max = max(x1)
x1_min = min(x1)
x1n = (x1 - x1_mean)/(x1_max - x1_min)
x1 = x1n;

x2_mean = mean(x2)
x2_max = max(x2)
x2_min = min(x2)
x2n = (x2 - x2_mean)/(x2_max - x2_min)
x2 = x2n;

y_temp = p0 + p1*x1 + p2*x2;
y = 1./(1 + exp(-y_temp));   % sigmoid targets
num_sample = size(y,2);

% gradient descent
% initial parameter values
theta0 = 1;
theta1 = 3;
theta2 = 8;

% learning rate
alpha = 39;
% if alpha is too large, the final error will be large;
% if alpha is too small, convergence will be slow

% regularization coefficient
% lamda = 0.0001;
lamda = 0.000001;
epoch = 29600;

for k = 1:epoch
    v_k = k                              % display the current epoch
    zc = theta0 + theta1*x1 + theta2*x2;
    h_theta_x = 1./(1 + exp(-zc));       % hypothesis with sigmoid activation
    fz = h_theta_x.*(1 - h_theta_x);     % sigmoid derivative (chain-rule factor)
    Jcost(k) = sum((h_theta_x - y).^2)/num_sample;
    r0 = sum((h_theta_x - y).*fz);
    theta0 = theta0 - alpha*r0/num_sample;
    r1 = sum(((h_theta_x - y).*x1).*fz);
    theta1 = theta1 - alpha*r1/num_sample + lamda*theta1;   % regularization term
    r2 = sum(((h_theta_x - y).*x2).*fz);
    theta2 = theta2 - alpha*r2/num_sample + lamda*theta2;   % regularization term
end
ynt = theta0 + theta1*x1 + theta2*x2;
yn = 1./(1 + exp(-ynt))
plot(Jcost)
```

(5) Three-input, two-output linear regression with a sigmoid output activation, MATLAB code

```matlab
% triple variable inputs with activation function
% double outputs
% normalization of input data
clear all
clc

% training sample data
pa0 = 2;
pa1 = 9;
pa2 = 3;
pa3 = 11;
pb0 = 3;
pb1 = 1;
pb2 = 2;
pb3 = 6;
x1 = [1612873111318273117];   % NOTE: the element separators of this vector were lost in the source
x2 = [379128492837912];       % NOTE: the element separators of this vector were lost in the source
x3 = [91792268412239226841];  % NOTE: the element separators of this vector were lost in the source

% mean normalization of the inputs
x1_mean = mean(x1)
x1_max = max(x1)
x1_min = min(x1)
x1n = (x1 - x1_mean)/(x1_max - x1_min)
x1 = x1n;

x2_mean = mean(x2)
x2_max = max(x2)
x2_min = min(x2)
x2n = (x2 - x2_mean)/(x2_max - x2_min)
x2 = x2n;

x3_mean = mean(x3)
x3_max = max(x3)
x3_min = min(x3)
x3n = (x3 - x3_mean)/(x3_max - x3_min)
x3 = x3n;

ya_temp = pa0 + pa1*x1 + pa2*x2 + pa3*x3;
ya = 1./(1 + exp(-ya_temp));
yb_temp = pb0 + pb1*x1 + pb2*x2 + pb3*x3;
```
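The mean normalization used in listings (2)–(5), xn = (x − mean)/(max − min), scales an input to zero mean and unit range. A minimal sketch in Python/NumPy (Python is used here only for illustration; the original code is MATLAB):

```python
import numpy as np

def mean_normalize(x):
    """Mean normalization: (x - mean) / (max - min), as in the MATLAB listings."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / (x.max() - x.min())

x = np.array([1.0, 2.0, 3.0])   # the x = 1:3 training inputs from the listings
xn = mean_normalize(x)          # -> array([-0.5, 0.0, 0.5])
```

The result always has zero mean and a total range (max − min) of exactly 1, which is what lets the listings raise the learning rate from 0.35 to 0.96 without diverging.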
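The per-sample sums in listing (1) can be written in vectorized form. The following NumPy sketch (an illustrative translation, not the author's code) keeps the listing's setup — p0 = 3, p1 = 7, x = 1..3, alpha = 0.33 — but runs more epochs than the listing's 100 so that the parameters converge tightly to the generating values:

```python
import numpy as np

# data generated exactly as in listing (1): y = p0 + p1*x on x = 1..3
p0_true, p1_true = 3.0, 7.0
x = np.arange(1, 4, dtype=float)
y = p0_true + p1_true * x
n = x.size

theta0, theta1 = 1.0, 3.0   # same initial values as the listing
alpha = 0.33                # the listing's rate; above ~0.36 this setup diverges
for _ in range(2000):       # more epochs than the listing's 100, for tight convergence
    h = theta0 + theta1 * x                   # hypothesis function
    err = h - y
    theta0 -= alpha * np.sum(err) / n         # dJ/dtheta0 (constant factor folded into alpha)
    theta1 -= alpha * np.sum(err * x) / n     # dJ/dtheta1

cost = np.sum((theta0 + theta1 * x - y) ** 2) / n
# theta0 -> 3, theta1 -> 7, cost -> 0
```

Run long enough, gradient descent recovers the generating parameters exactly, which is why the listing's error display reaches 0.0000.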
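The updates in listing (3) come from the chain rule: with J = Σ(h − y)²/n and h = σ(z), dJ/dθ = (2/n)·Σ(h − y)·h(1 − h)·∂z/∂θ, where the constant 2 is folded into the learning rate. A small NumPy gradient check (an illustration, not part of the original) confirms the chain-rule factor fz = h·(1 − h) against central finite differences:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta0, theta1, x, y):
    h = sigmoid(theta0 + theta1 * x)
    return np.sum((h - y) ** 2) / x.size

x = np.array([-0.5, 0.0, 0.5])   # normalized inputs, as in listing (3)
y = sigmoid(2.0 + 9.0 * x)       # targets from the generating parameters p0=2, p1=9
theta0, theta1 = 1.0, 3.0        # the listing's initial values

# analytic gradient via the chain rule, including the factor 2 from d/dh (h-y)^2
h = sigmoid(theta0 + theta1 * x)
fz = h * (1.0 - h)               # sigmoid derivative
g0 = 2.0 * np.sum((h - y) * fz) / x.size
g1 = 2.0 * np.sum((h - y) * fz * x) / x.size

# numerical gradient by central differences
eps = 1e-6
n0 = (cost(theta0 + eps, theta1, x, y) - cost(theta0 - eps, theta1, x, y)) / (2 * eps)
n1 = (cost(theta0, theta1 + eps, x, y) - cost(theta0, theta1 - eps, x, y)) / (2 * eps)
```

The analytic and numerical gradients agree to roughly eps², which is a quick way to validate hand-derived update rules like those in the listings.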
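The vectorized updates with a small weight-decay term from listing (4) can be sketched in NumPy as follows. Two caveats: the sample vectors below are hypothetical stand-ins (the element separators of the original x1 and x2 were lost in the source), and a smaller learning rate than the listing's 39 is used; the `+ lamda*theta` regularization term is kept exactly as the MATLAB listing writes it.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mean_normalize(v):
    return (v - v.mean()) / (v.max() - v.min())

# hypothetical sample vectors, standing in for the listing's garbled x1 and x2
x1 = mean_normalize(np.array([1.0, 6.0, 12.0, 8.0, 7.0, 3.0, 11.0, 12.0]))
x2 = mean_normalize(np.array([3.0, 7.0, 9.0, 12.0, 8.0, 4.0, 2.0, 9.0]))
y = sigmoid(2.0 + 9.0 * x1 + 3.0 * x2)   # sigmoid targets, as in listing (4)
n = y.size

theta0, theta1, theta2 = 1.0, 3.0, 8.0   # the listing's initial values
alpha, lamda = 1.0, 1e-6                 # smaller alpha than the listing's 39
J = []
for _ in range(5000):
    h = sigmoid(theta0 + theta1 * x1 + theta2 * x2)
    fz = h * (1.0 - h)                   # sigmoid derivative
    e = h - y
    J.append(np.sum(e ** 2) / n)
    theta0 -= alpha * np.sum(e * fz) / n
    # the '+ lamda*theta' term below follows the MATLAB listing verbatim
    theta1 = theta1 - alpha * np.sum(e * fz * x1) / n + lamda * theta1
    theta2 = theta2 - alpha * np.sum(e * fz * x2) / n + lamda * theta2
```

With lamda this small the regularization term barely perturbs the updates; its role in the listing is to keep the weights from drifting over the very long 29600-epoch run.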