Problem: Design a feedforward neural network that solves the XOR problem. The network should have two hidden-layer neurons and one output neuron, and use ReLU as the activation function.
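For reference, XOR outputs 1 exactly when its two inputs differ:

x1  x2  XOR(x1, x2)
0   0   0
0   1   1
1   0   1
1   1   0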
Solution: The network structure is shown below: the two inputs feed a hidden layer of two ReLU neurons, which in turn feeds a single linear output neuron.

[Figure: 2-2-1 feedforward network for the XOR problem]

With ReLU as the activation function, one choice of parameters that works is, for example,

$$
W^{(1)} = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}, \qquad
b^{(1)} = \begin{bmatrix} 0 \\ -1 \end{bmatrix}, \qquad
w^{(2)} = \begin{bmatrix} 1 \\ -2 \end{bmatrix}, \qquad
b^{(2)} = 0.
$$

The output is then $y = w^{(2)\top}\,\mathrm{ReLU}(W^{(1)}x + b^{(1)}) + b^{(2)}$, which gives $y(0,0)=0$, $y(0,1)=1$, $y(1,0)=1$, $y(1,1)=0$, i.e. it solves the XOR problem.
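As a quick sanity check, the hand-set parameters above can be evaluated on all four inputs with a few lines of NumPy (a minimal sketch; the variable names are illustrative, not from the original post):

import numpy as np

# hand-constructed parameters of the 2-2-1 ReLU network (values from the solution above)
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])
b1 = np.array([0.0, -1.0])
w2 = np.array([1.0, -2.0])
b2 = 0.0

X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
h = np.maximum(X @ W1.T + b1, 0.0)   # hidden layer: ReLU(W1 x + b1)
y = h @ w2 + b2                      # linear output neuron
print(y)                             # expected: [0. 1. 1. 0.]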
Code implementation:
import torch
import os
os.chdir(r"C:\Users\Eddie\Desktop")   # working directory; the models folder is expected here
# 1. Prepare dataset: the four XOR input pairs and their labels
x_data = torch.tensor([[0.0, 1.0], [1.0, 1.0], [0.0, 0.0], [1.0, 0.0]])
y_data = torch.tensor([[1.0], [0.0], [0.0], [1.0]])   # XOR(0,1)=1, XOR(1,1)=0, XOR(0,0)=0, XOR(1,0)=1
# 2. Design model: a 2-2-1 feedforward network with a ReLU hidden layer
class LogisticRegressionModel(torch.nn.Module):
    def __init__(self):
        super(LogisticRegressionModel, self).__init__()
        self.linear1 = torch.nn.Linear(2, 2, bias=True)   # input layer -> 2 hidden neurons
        self.linear2 = torch.nn.Linear(2, 1, bias=True)   # 2 hidden neurons -> 1 output neuron

    def forward(self, x):
        y_pred = torch.relu(self.linear1(x))   # ReLU activation on the hidden layer
        y_pred = self.linear2(y_pred)          # linear output
        return y_pred

model = LogisticRegressionModel()
if os.path.exists(r"models\net1.pth"):
    model.load_state_dict(torch.load(r"models\net1.pth"))   # resume from previously saved weights and biases, if present
# 3. Construct loss and optimizer
criterion = torch.nn.MSELoss(reduction='mean')
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
#-------------------------------------------------------#
# 4. Training cycle
for epoch in range(1000):
    y_pred = model(x_data)               # forward pass on all four samples
    loss = criterion(y_pred, y_data)     # mean squared error against the XOR labels
    print(epoch, loss.item())
    optimizer.zero_grad()
    loss.backward()                      # backward pass
    optimizer.step()                     # update weights
# 5. Save the trained weights
os.makedirs("models", exist_ok=True)   # make sure the target directory exists
torch.save(obj=model.state_dict(), f=r"models\net1.pth")
# 6. Print results
print(y_pred)
print(y_data)
print("weight1:", model.state_dict()['linear1.weight'])
print("bias1:", model.state_dict()['linear1.bias'])
print("weight2:", model.state_dict()['linear2.weight'])
print("bias2:", model.state_dict()['linear2.bias'])