Chapter 4 "Single-Variable Linear Regression" Discussion Thread #415
Hello, teacher
Yes, they should all be changed to private attributes.
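The reply above refers to making attributes private in Python. A minimal sketch of what that might look like, using double-underscore name mangling (the class and method names here are illustrative, not taken from the chapter code):

```python
class NeuralNet(object):
    def __init__(self):
        # double leading underscores make these attributes private
        # via Python's name mangling (stored as _NeuralNet__w etc.)
        self.__w = 0
        self.__b = 0

    def get_weights(self):
        # expose read-only access through a method instead of
        # letting callers touch the attributes directly
        return self.__w, self.__b

net = NeuralNet()
print(net.get_weights())   # (0, 0)
# net.__w would raise AttributeError: the name is mangled to _NeuralNet__w
```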
4.0 - Single Input, Single Output, Single Layer
fixed with some other words. |
Excuse me, where can the data be downloaded?
Hello teacher, could you help me see why the idea below is wrong?
import matplotlib.pyplot as plt

class NeuralNet(object):
    def __init__(self):
        self.w = 0
        self.b = 0
        self.x = 0

    def forward(self, x):
        self.x = x
        return self.x * self.w + self.b

    # def backward(self, z, y, eta):
    #     delta_w = -eta * (z - y) * self.x
    #     delta_b = -eta * (z - y)
    #     self.w += delta_w
    #     self.b += delta_b
    #     return delta_w, self.w, delta_b, self.b

    def backward(self, delta_z):
        delta_w = delta_z / self.x
        delta_b = delta_z
        self.w += delta_w
        self.b += delta_b
        return delta_w, self.w, delta_b, self.b

eta = 0.1

def loss(z, y):
    return pow(z - y, 2) / 2

def get_delta_z(z, y):
    return -eta * (z - y)

file_name = "D://学习//人工智能//ai-edu-master//Data//ch04.npz"

from HelperClass.DataReader_1_0 import *

if __name__ == '__main__':
    reader = DataReader_1_0(file_name)
    reader.ReadData()
    X, Y = reader.GetWholeTrainSamples()
    net = NeuralNet()
    for i in range(reader.num_train):
        # get x and y value for one sample
        xi = X[i]
        yi = Y[i]
        zi = net.forward(xi)
        delta_z = get_delta_z(zi, yi)
        delta_w, w, delta_b, b = net.backward(delta_z)
        # delta_w, w, delta_b, b = net.backward(zi, yi, eta)
        print("x=", xi, "y=", yi, "z=", zi, "loss=", loss(zi, yi), "delta_z=", delta_z, "delta_w=", delta_w, "w=", w, "delta_b=", delta_b, "b=", b)
    # draw sample data
    plt.plot(X, Y, "b.")
    plt.title("Air Conditioner Power")
    plt.xlabel("Number of Servers(K)")
    plt.ylabel("Power of Air Conditioner(KW)")
    PX = np.linspace(0, 1, 10)
    PZ = net.forward(PX)
    plt.plot(PX, PZ, "r")
    plt.show()
After training, the values of w (3096.04578468) and b (-1168.9221563) are both very large.
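For reference, with loss L = (z - y)²/2 the chain rule gives ∂L/∂w = (z - y)·x, so the weight step should multiply the scaled error by x (as in the commented-out backward above), not divide by it; dividing by a small x produces an enormous step, which would explain the huge w and b. A minimal sketch comparing the two updates on one synthetic sample (the numbers below are illustrative, not from ch04.npz):

```python
# Compare the multiplicative (gradient-descent) and divisive weight
# updates for a single sample with a small input x.
eta = 0.1
x, y = 0.01, 1.0          # small x makes the difference dramatic
w = b = 0.0

z = x * w + b             # forward pass: z = 0.0
delta_z = -eta * (z - y)  # scaled error, as in get_delta_z: 0.1

delta_w_gradient = delta_z * x   # from dL/dw = (z - y) * x  -> 0.001
delta_w_divided = delta_z / x    # the version in the question -> 10.0

print(delta_w_gradient)   # 0.001
print(delta_w_divided)    # 10.0, a 10000x larger step for this sample
```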
Your message was received; thank you for your support. Wishing you happiness!!!
ljy