LeNet and AlexNet Model Implementations (PyTorch)

Author: ClarenceHoo | Published 2019-03-29 16:26

You can reuse the MNIST classification network written earlier and simply swap in the corresponding model.
1、LeNet


(Figure: LeNet architecture)

Many of the training tricks used today are not in the original paper, so no such modifications are made here.

import torch.nn as nn
import torch.nn.functional as func

class LeNet(nn.Module):
    def __init__(self):
        super(LeNet, self).__init__()
        self.conv1 = nn.Conv2d(1, 6, kernel_size=5)
        # For color images, replace the line above with
        # self.conv1 = nn.Conv2d(3, 6, kernel_size=5)
        self.conv2 = nn.Conv2d(6, 16, kernel_size=5)
        # 16 channels with a 4x4 feature map when the input is a 28x28 MNIST image
        self.fc1 = nn.Linear(16 * 4 * 4, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        x = func.relu(self.conv1(x))
        x = func.max_pool2d(x, 2)
        x = func.relu(self.conv2(x))
        x = func.max_pool2d(x, 2)
        x = x.view(x.size(0), -1)
        x = func.relu(self.fc1(x))
        x = func.relu(self.fc2(x))
        x = self.fc3(x)
        return x
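
As a quick sanity check (my addition, not part of the original post), a dummy MNIST-sized batch can be pushed through the model: the feature map is 24×24 after conv1, 12×12 after pooling, 8×8 after conv2 and 4×4 after the second pooling, which is where the 16 * 4 * 4 = 256 input size of fc1 comes from.

import torch

# Smoke test with a fake MNIST-sized batch (illustrative only).
model = LeNet()
dummy = torch.randn(2, 1, 28, 28)   # batch of 2 grayscale 28x28 images
print(model(dummy).shape)           # expected: torch.Size([2, 10])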

2、AlexNet
The original ImageNet network takes 3-channel input; since MNIST is used here, the first layer takes 1 channel. The original structure is shown below.

(Figure: AlexNet architecture)
Since the original input images are 224×224 while MNIST images are 28×28, the network is adjusted accordingly.
class AlexNet(nn.Module):
    def __init__(self, num_classes=10):
        super(AlexNet, self).__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 96, kernel_size=11, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=2),
            nn.Conv2d(96, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=2),
            nn.Conv2d(256, 384, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 384, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=2),
        )
        self.classifier = nn.Sequential(
            nn.Dropout(),
            nn.Linear(256 * 2 * 2, 4096),
            nn.ReLU(inplace=True),
            nn.Dropout(),
            nn.Linear(4096, 4096),
            nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),
        )

    def forward(self, x):
        x = self.features(x)
        x = x.view(x.size(0), 256 * 2 * 2)
        x = self.classifier(x)
        return x
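
As a sanity check of the adjusted layout (my addition, not part of the original post), tracing a 28×28 input through self.features gives 20×20 after the 11×11 convolution with padding 1, 10×10 after the first pooling, 5×5 after the second, and 2×2 after the last, which is why the classifier flattens 256 * 2 * 2 features:

import torch

# Illustrative shape trace: run a dummy 28x28 image through each feature layer.
model = AlexNet()
x = torch.randn(1, 1, 28, 28)
for layer in model.features:
    x = layer(x)
    print(type(layer).__name__, tuple(x.shape))
# The last MaxPool2d should print a (1, 256, 2, 2) output.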

Note: when reproducing these two experiments, neither converged. Adding BN did not help, and none of the several loss functions I tried converged either.
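
For reference, below is a minimal sketch of the kind of MNIST training loop mentioned at the top of the post, with the model swapped in. This is not the author's original training code; the normalization constants, optimizer, and learning rate are illustrative assumptions, and torchvision is assumed to be available.

import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Illustrative MNIST training loop (sketch only; hyperparameters are assumptions).
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.1307,), (0.3081,)),  # commonly used MNIST mean/std
])
train_set = datasets.MNIST('./data', train=True, download=True, transform=transform)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)

model = LeNet()                      # or AlexNet()
criterion = nn.CrossEntropyLoss()    # expects raw logits, as returned by both models
optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f'epoch {epoch}: last batch loss {loss.item():.4f}')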
