PyTorch Troubleshooting Notes

Author: 井底蛙蛙呱呱呱 | Published 2021-01-05 13:57

1. DataLoader + h5py is very slow?

Converting h5 data to a torch tensor with torch.tensor directly is very slow. Convert the h5 dataset to a NumPy array first, and then call torch.tensor on the array.

    import h5py
    import numpy as np
    import torch
    
    # Create a new HDF5 file with a 1000x1000 float32 dataset:
    testfile = h5py.File('testfile.h5', 'w')
    testfile['data'] = torch.eye(1000)
    
    # Then load it back into a Tensor:
    >>> %timeit torch.tensor(testfile['data'])
    421 ms ± 28.2 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
    
    # It's very slow. However, if the dataset is converted into a NumPy array first, it performs much faster:
    >>> %timeit torch.tensor(np.array(testfile['data']))
    2.84 ms ± 162 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
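
The same conversion matters when the file feeds a DataLoader. Below is a minimal sketch, assuming the testfile.h5 / data dataset created above; the H5Dataset class name and the lazy per-worker file handle are just for illustration:

    import h5py
    import numpy as np
    import torch
    from torch.utils.data import Dataset, DataLoader

    class H5Dataset(Dataset):
        """Reads rows from an HDF5 dataset, going through NumPy to avoid
        the slow torch.tensor(h5py.Dataset) path."""

        def __init__(self, path, key='data'):
            self.path = path
            self.key = key
            self.h5 = None  # opened lazily so each DataLoader worker gets its own handle
            with h5py.File(path, 'r') as f:
                self.length = f[key].shape[0]

        def __len__(self):
            return self.length

        def __getitem__(self, idx):
            if self.h5 is None:
                self.h5 = h5py.File(self.path, 'r')
            # Slicing an h5py dataset returns a NumPy array, which
            # torch.from_numpy wraps without the slow per-element copy.
            row = self.h5[self.key][idx]
            return torch.from_numpy(np.asarray(row))

    loader = DataLoader(H5Dataset('testfile.h5'), batch_size=32, num_workers=2)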
    

2. How to freeze certain parameters in a network so they are not updated

Reference: https://discuss.pytorch.org/t/correct-way-to-freeze-layers/26714

    # Assumes `net` is an nn.Module with fc1, fc2 and fc3 layers.
    import torch.optim as optim
    
    # We want to freeze the fc2 layer this time: only train fc1 and fc3.
    net.fc2.weight.requires_grad = False
    net.fc2.bias.requires_grad = False
    
    # Pass the optimizer only the parameters that still require gradients.
    optimizer = optim.Adam(filter(lambda p: p.requires_grad, net.parameters()), lr=0.1)
    
    # Then run the usual loss calculation and backward pass.
    
    # Unfreeze the fc2 layer later for extra fine-tuning if needed.
    net.fc2.weight.requires_grad = True
    net.fc2.bias.requires_grad = True
    
    # Add the now-unfrozen fc2 parameters to the current optimizer.
    optimizer.add_param_group({'params': net.fc2.parameters()})
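
To confirm the freeze took effect, a quick check is to list which parameters will still be updated (a small helper for illustration, not from the linked thread):

    def trainable_parameters(module):
        """Return (name, parameter) pairs that will receive gradient updates."""
        return [(n, p) for n, p in module.named_parameters() if p.requires_grad]
    
    # While fc2 is frozen, this should print only fc1.* and fc3.* names.
    for name, _ in trainable_parameters(net):
        print(name)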
    

For example, to freeze the pretrained parameters of a BERT model:

    import torch.optim as optim
    # Assuming the Hugging Face transformers library provides the model class.
    from transformers import BertForSequenceClassification
    
    model = BertForSequenceClassification.from_pretrained('bert-base-uncased')
    
    # Freeze the whole BERT encoder; only the classification head remains trainable.
    for param in model.bert.parameters():
        param.requires_grad = False
    
    optimizer = optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=0.1)
    

More approaches are discussed at https://discuss.pytorch.org/t/correct-way-to-freeze-layers/26714; two of them are sketched below.
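
Two of those alternatives, sketched assuming model is the BertForSequenceClassification instance from above and that its classification head is registered under the name classifier:

    import torch.optim as optim
    
    # Alternative 1: flip requires_grad for an entire submodule in one call.
    model.bert.requires_grad_(False)
    
    # Alternative 2: freeze by parameter name, keeping only the classifier head trainable.
    for name, param in model.named_parameters():
        if not name.startswith('classifier'):
            param.requires_grad = False
    
    # Either way, hand the optimizer only the parameters that still require gradients.
    optimizer = optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=0.1)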
