1. Freezing network parameters
Reprinted from https://blog.csdn.net/qq_21997625/article/details/90369838
# Freeze the first 165 parameter tensors (e.g. a pretrained backbone);
# only the remaining layers will receive gradient updates.
for i, p in enumerate(net.parameters()):
    if i < 165:
        p.requires_grad = False
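Freezing by positional index is brittle if the architecture changes. A minimal runnable sketch of the same idea, using a hypothetical toy model (not from the original post): freeze one submodule explicitly and hand the optimizer only the parameters that still require gradients.

```python
import torch
import torch.nn as nn

# Hypothetical small model standing in for `net` in the snippet above.
net = nn.Sequential(
    nn.Linear(10, 20),   # "backbone" layer we want frozen
    nn.ReLU(),
    nn.Linear(20, 2),    # "head" layer that stays trainable
)

# Freeze every parameter of the first linear layer.
for p in net[0].parameters():
    p.requires_grad = False

# Pass only the still-trainable parameters to the optimizer.
optimizer = torch.optim.SGD(
    (p for p in net.parameters() if p.requires_grad), lr=0.01
)

trainable = sum(p.numel() for p in net.parameters() if p.requires_grad)
total = sum(p.numel() for p in net.parameters())
print(trainable, total)  # → 42 262  (head: 20*2+2; all: 10*20+20+42)
```

Filtering the optimizer's parameter list is optional (frozen tensors never get gradients anyway), but it avoids wasted optimizer state such as momentum buffers.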
2. Adjusting the learning rate
http://www.spytensor.com/index.php/archives/32/
(https://zhuanlan.zhihu.com/p/93624972)
The four learning-rate decay strategies you must master in PyTorch
'''
from torch.optim.lr_scheduler import MultiStepLR

# Multiply the learning rate by gamma=0.1 when epoch 30 and epoch 80 are reached.
scheduler = MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()  # step once per epoch, after training/validation
'''
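The loop above can be run end to end; a self-contained sketch with a hypothetical one-parameter model (the `train(...)`/`validate(...)` calls are replaced by a bare `optimizer.step()`) shows the decay schedule MultiStepLR produces:

```python
import torch
from torch.optim.lr_scheduler import MultiStepLR

# Hypothetical single-parameter "model", just enough to build an optimizer.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)
scheduler = MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)

lrs = []
for epoch in range(100):
    lrs.append(optimizer.param_groups[0]["lr"])  # lr in effect this epoch
    optimizer.step()      # training/validation would normally happen here
    scheduler.step()      # decay check at the end of each epoch

# lr is 0.1 for epochs 0-29, 0.01 for 30-79, 0.001 from 80 on.
print(round(lrs[0], 6), round(lrs[30], 6), round(lrs[80], 6))  # → 0.1 0.01 0.001
```

Reading the current value from `optimizer.param_groups[0]["lr"]` is also a handy way to log the schedule during real training.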