Code Implementation (3): Stacked Semantic Attention

Author: 续袁 | Published 2019-07-21 12:42

1. Running the Code

Train

```
python main_nabird.py
```

Parameters

feat_model: easy or hard

Data

https://drive.google.com/file/d/1K1w4IKUOxn7sVqhZjB4RTpOUIv6vtIEn/view?usp=sharing

1.2 Data

1.3 Issues

1.3.1

Traceback (most recent call last):
  File "C:/Users/xpb/PycharmProjects/sga-master/main_nabird.py", line 33, in <module>
    model.train()
  File "C:\Users\xpb\PycharmProjects\sga-master\attention.py", line 108, in train
    sess.run(self.optimizer,feed_dict={self.seman:seman_batch,self.image:img_batch,self.seman_b:tr_seman_pro,self.label:label_batch})
  File "C:\Users\xpb\Anaconda3\lib\site-packages\tensorflow\python\client\session.py", line 929, in run
    run_metadata_ptr)
  File "C:\Users\xpb\Anaconda3\lib\site-packages\tensorflow\python\client\session.py", line 1128, in _run
    str(subfeed_t.get_shape())))
ValueError: Cannot feed value of shape (1024, 1) for Tensor 'Placeholder_2:0', which has shape '(?, 323)'

Cause: the shapes of self.label and label_batch do not match.
Fix: convert the image labels to one-hot encoding.
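
A minimal sketch of the conversion, assuming the entries of `label_batch` are integer class indices in `[0, 323)`; the helper below is illustrative and not taken from the repo:

```python
import numpy as np

def to_onehot(labels, num_classes):
    """Convert a column of integer class indices into a (batch, num_classes) one-hot matrix."""
    labels = np.asarray(labels).reshape(-1)              # flatten (1024, 1) -> (1024,)
    onehot = np.zeros((labels.shape[0], num_classes), dtype=np.float32)
    onehot[np.arange(labels.shape[0]), labels] = 1.0     # mark each sample's class with a 1
    return onehot

# demo: a (1024, 1) batch of integer labels becomes a (1024, 323) matrix,
# matching the '(?, 323)' shape expected by the self.label placeholder
demo_labels = np.random.randint(0, 323, size=(1024, 1))
print(to_onehot(demo_labels, 323).shape)                 # (1024, 323)
```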

1.3.2

Caused by op 'Loss/softmax_cross_entropy_with_logits_sg', defined at:
  File "C:/Users/xpb/PycharmProjects/sga-master/main_nabird.py", line 32, in <module>
    model = Attention(args,dataset)
  File "C:\Users\xpb\PycharmProjects\sga-master\attention.py", line 33, in __init__
    self.build_model()
  File "C:\Users\xpb\PycharmProjects\sga-master\attention.py", line 80, in build_model
    loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logit, labels=self.label))
  File "C:\Users\xpb\Anaconda3\lib\site-packages\tensorflow\python\util\deprecation.py", line 306, in new_func
    return func(*args, **kwargs)
  File "C:\Users\xpb\Anaconda3\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 1954, in softmax_cross_entropy_with_logits
    labels=labels, logits=logits, dim=dim, name=name)
  File "C:\Users\xpb\Anaconda3\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 1864, in softmax_cross_entropy_with_logits_v2
    precise_logits, labels, name=name)
  File "C:\Users\xpb\Anaconda3\lib\site-packages\tensorflow\python\ops\gen_nn_ops.py", line 7747, in softmax_cross_entropy_with_logits
    name=name)
  File "C:\Users\xpb\Anaconda3\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 787, in _apply_op_helper
    op_def=op_def)
  File "C:\Users\xpb\Anaconda3\lib\site-packages\tensorflow\python\util\deprecation.py", line 488, in new_func
    return func(*args, **kwargs)
  File "C:\Users\xpb\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 3274, in create_op
    op_def=op_def)
  File "C:\Users\xpb\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 1770, in __init__
    self._traceback = tf_stack.extract_stack()

InvalidArgumentError (see above for traceback): logits and labels must be broadcastable: logits_size=[1024,39230] labels_size=[1024,323]
     [[node Loss/softmax_cross_entropy_with_logits_sg (defined at C:\Users\xpb\PycharmProjects\sga-master\attention.py:80)  = SoftmaxCrossEntropyWithLogits[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"](Loss/softmax_cross_entropy_with_logits_sg/Reshape, Loss/softmax_cross_entropy_with_logits_sg/Reshape_1)]]

Cause: the wrong data was fed in; the per-sample semantic features were confused with the per-class semantic features.
Fix: use the class-level semantic features.
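
If only per-sample semantic vectors are at hand, one way to obtain a per-class matrix is to average the sample semantics within each class. This is a hedged sketch with a hypothetical helper, not code from attention.py; the repo may instead load the class semantics directly from the provided .mat files:

```python
import numpy as np

def class_semantic_prototypes(sample_seman, labels, num_classes):
    """Average the per-sample semantic vectors within each class.

    sample_seman: (num_samples, attr_dim) per-sample semantic features
    labels:       (num_samples,) integer class indices
    returns:      (num_classes, attr_dim) per-class semantic matrix
    """
    labels = np.asarray(labels).reshape(-1)
    proto = np.zeros((num_classes, sample_seman.shape[1]), dtype=np.float32)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():                      # skip classes with no samples in this split
            proto[c] = sample_seman[mask].mean(axis=0)
    return proto

# demo: 1000 samples with 256-dim semantics (dimension arbitrary) and 323 classes
demo_seman = np.random.rand(1000, 256).astype(np.float32)
demo_labels = np.random.randint(0, 323, size=1000)
print(class_semantic_prototypes(demo_seman, demo_labels, 323).shape)   # (323, 256)
```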

1.3.3

Test-sample labels should not be one-hot encoded.
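
A minimal illustration of why: at test time the predicted class index is usually compared against the integer label directly, so no one-hot conversion is needed (variable names below are illustrative, not from attention.py):

```python
import numpy as np

def top1_accuracy(scores, int_labels):
    """Fraction of samples whose highest-scoring class equals the integer ground-truth label."""
    preds = np.argmax(scores, axis=1)                    # predicted class index per sample
    return float(np.mean(preds == np.asarray(int_labels).reshape(-1)))

# demo with random scores over 323 classes
demo_scores = np.random.randn(100, 323)
demo_labels = np.random.randint(0, 323, size=100)
print(top1_accuracy(demo_scores, demo_labels))
```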

1.3.4

Question 2: what is the difference between seman and seman_pro?
  seman holds the per-sample semantic features; seman_pro holds the per-class semantic features.
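
In terms of shapes (inferred from the feed_dict and error messages above; the attribute dimension is arbitrary here and depends on the dataset), the two differ roughly as follows:

```python
import numpy as np

attr_dim, batch_size, num_classes = 128, 1024, 323   # attr_dim chosen only for the demo

# seman: one semantic vector per image in the batch (fed as seman_batch)
seman_batch = np.random.rand(batch_size, attr_dim).astype(np.float32)

# seman_pro: one semantic vector per class (fed as tr_seman_pro via self.seman_b)
tr_seman_pro = np.random.rand(num_classes, attr_dim).astype(np.float32)

print(seman_batch.shape, tr_seman_pro.shape)          # (1024, 128) (323, 128)
```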

2. Code Walkthrough

References

[1] [NIPS 2018 paper notes] Stacked Semantics-Guided Attention Model for Fine-Grained Zero-Shot Learning
[2] Using PCA in sklearn (Python)
[3] Implementing the PCA algorithm in Python in three ways
[4] Reading and saving .mat files in Python
[5] The get() method of Python dictionaries

Code

[1] ylytju/sga

Data

[1] EthanZhu90/ZSL_PP_CVPR17

Paper

[1] Stacked Semantics-Guided Attention Model for Fine-Grained Zero-Shot Learning (NeurIPS 2018)
