Author: Tyan
Blog: noahsnail.com | CSDN | Jianshu
This post covers common activation functions in PyTorch.
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt

# Define the input data x
# (wrapping in Variable is no longer needed since PyTorch 0.4)
x = torch.linspace(-5, 5, 200)
np_x = x.numpy()

# Pass x through the activation functions
# (func.sigmoid/func.tanh are deprecated; use torch.sigmoid/torch.tanh)
y_relu = F.relu(x).numpy()
y_sigmoid = torch.sigmoid(x).numpy()
y_tanh = torch.tanh(x).numpy()
y_softplus = F.softplus(x).numpy()

# Plot the activation functions
plt.figure(1, figsize=(8, 6))

plt.subplot(221)
plt.plot(np_x, y_relu, c='red', label='relu')
plt.ylim((-1, 5))
plt.legend(loc='best')

plt.subplot(222)
plt.plot(np_x, y_sigmoid, c='red', label='sigmoid')
plt.ylim((0, 1))
plt.legend(loc='best')

plt.subplot(223)
plt.plot(np_x, y_tanh, c='red', label='tanh')
plt.ylim((-1, 1))
plt.legend(loc='best')

plt.subplot(224)
plt.plot(np_x, y_softplus, c='red', label='softplus')
plt.ylim((-1, 5))
plt.legend(loc='best')

plt.show()
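One point worth stressing: softplus is sometimes confused with softmax, but they are different functions. Softplus is an elementwise smooth approximation of ReLU, log(1 + exp(x)), while softmax normalizes a whole vector into a probability distribution. A minimal sketch of the difference, using PyTorch's standard definitions:

```python
import torch
import torch.nn.functional as F

x = torch.tensor([1.0, 2.0, 3.0])

# softplus acts on each element independently: log(1 + exp(x))
sp = F.softplus(x)
print(sp)

# softmax acts on the vector as a whole: exp(x_i) / sum_j exp(x_j)
p = torch.softmax(x, dim=0)
print(p)
print(p.sum())  # the probabilities sum to 1
```

Because softmax depends on every element of its input, it needs the `dim` argument specifying which axis to normalize over; softplus has no such argument.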
Output:
[Figure: curves of relu, sigmoid, tanh, and softplus over x ∈ [-5, 5]]