2020-04-18: Tuning details that need attention


Author: 啊啊啊啊啊1231 | Published: 2020-04-18 17:08

    1. Tongwei's paper, KGNN-LS: g, f are

    2. The proposed model plus BPR loss, with the learning rate set to 0.01, can achieve higher performance.
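For reference, the BPR loss over paired positive/negative item scores can be sketched as below. This is a minimal NumPy sketch, not the paper's exact implementation, and the optimizer that would consume the 0.01 learning rate is left open:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bpr_loss(pos_scores, neg_scores):
    # BPR: push each positive item's score above its paired negative's;
    # the small epsilon guards the log against sigmoid underflow.
    return -np.mean(np.log(sigmoid(pos_scores - neg_scores) + 1e-10))

pos = np.array([2.0, 1.5, 0.8])   # scores for observed (positive) items
neg = np.array([0.5, 1.0, 1.2])   # scores for sampled negative items
loss = bpr_loss(pos, neg)
```

When positives score far above negatives the loss approaches zero, which is the signal the ranking objective optimizes for.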

    3. There are several cases in which the loss and parameters cease to update:

    1) The weight of the BPR loss is too small: for example, 0.1 does not train, but 1.0 does.

    2) The encoder involves many layers of dense parameters.

    3) The L2 regularizer weight is important: 1e-5 is the right magnitude; if it is changed to 1e-4, the parameters cannot update.

    4) With a larger L2 weight the loss stays effective, holding at around 0.5; but even raising the L2 regularization weight from 1e-5 to 2e-5 makes the parameters fail to update.

    Typical loss magnitudes: base_loss = 0.6, bpr_loss = 0.6, ls_loss = 0.2, l2_loss = xxx.
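As a concrete sketch of how these terms combine: the loss names and weight values below follow the notes (BPR weight 1.0, L2 weight 1e-5), while the parameter tensor and the fixed term values are hypothetical placeholders for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
params = rng.normal(size=(100, 16))    # hypothetical model parameters

base_loss = 0.6                        # e.g. the base prediction term
bpr_loss = 0.6                         # pairwise ranking term
ls_loss = 0.2                          # label-smoothness term (KGNN-LS)

bpr_weight = 1.0                       # 0.1 was too small to train; 1.0 works
l2_weight = 1e-5                       # 1e-4, or even 2e-5, stalled the updates

l2_loss = l2_weight * np.sum(params ** 2)
total_loss = base_loss + bpr_weight * bpr_loss + ls_loss + l2_loss
```

With these magnitudes the L2 term contributes only a small fraction of the total, which is consistent with 1e-5 being "the right magnitude" above.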

    5) Successful case: only the BPR loss.

    An overfitting problem occurs: at epoch 1 the test AUC is 0.67, then it decreases while the training AUC still increases. An explicit and obvious indicator of overfitting.

    What I did was add the L2 regularization loss and the layer regularization loss to overcome the overfitting problem.

    successful! oh yeah!!!
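A minimal sketch of that fix: the L2 term and a per-layer regularizer are added on top of the plain BPR loss. The exact form of the layer regularization is not spelled out in these notes, so the mean-absolute-value penalty here is an assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical dense encoder weights, one array per layer
layers = [rng.normal(size=(16, 16)) for _ in range(3)]

bpr = 0.6            # BPR-only loss level from the successful run
l2_weight = 1e-5     # the magnitude that worked above

# L2 over all parameters, plus a per-layer penalty (assumed form)
l2_term = l2_weight * sum(np.sum(w ** 2) for w in layers)
layer_reg = l2_weight * sum(np.mean(np.abs(w)) for w in layers)

total = bpr + l2_term + layer_reg
```

The regularizers nudge the encoder weights toward smaller values each step, which is what counteracts the train/test AUC divergence described above.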

    6) BPR loss + L2 regularization loss. One problem remains: it converges after about two epochs. RQ: does it fall into a local optimum? How to get out of the local optimum?

    Epoch 0 to epoch 2: train AUC increases, test AUC increases.

    Epoch 2 to epoch 6: train AUC increases, test AUC decreases (looks like overfitting).

    Epoch 6 onwards: train AUC decreases, test AUC decreases, and both oscillate.
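One standard response to this pattern is early stopping on the test/validation AUC: stop once it has not improved for a few epochs, which would freeze training around epoch 2 here. A sketch, with an AUC trajectory hard-coded to mirror the epochs above (the exact values are illustrative):

```python
def early_stop_epoch(val_auc_history, patience=2):
    """Return the epoch of the best AUC seen so far, stopping the scan
    once `patience` epochs pass without any improvement."""
    best_epoch, best_auc = 0, float("-inf")
    for epoch, auc in enumerate(val_auc_history):
        if auc > best_auc:
            best_epoch, best_auc = epoch, auc
        elif epoch - best_epoch >= patience:
            break                      # no improvement for `patience` epochs
    return best_epoch

# test AUC rising until epoch 2, then falling, as in the notes
history = [0.60, 0.65, 0.67, 0.66, 0.64, 0.63, 0.61]
stop = early_stop_epoch(history)
```

Early stopping does not answer whether the model is in a local optimum, but it at least keeps the checkpoint from the peak of the test curve.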

    problem:

    solution: 




          Link: https://www.haomeiwen.com/subject/rhevvhtx.html