TypeError: sampled_loss() got an unexpected keyword argument 'logits' #9

Closed

rubby33 opened this issue May 28, 2018 · 2 comments

@rubby33 commented May 28, 2018

My TensorFlow version is 1.6 and the following problem occurs. Could someone please help? Thanks.

2018-05-28 11:16:27.772360: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
Creating 3 layers of 256 units.
Traceback (most recent call last):
File "/Users/jiangwei/PycharmProjects/chatbot/chatbot_master/execute.py", line 259, in <module>
train()
File "/Users/jiangwei/PycharmProjects/chatbot/chatbot_master/execute.py", line 124, in train
model = create_model(sess, False)
File "/Users/jiangwei/PycharmProjects/chatbot/chatbot_master/execute.py", line 94, in create_model
model = seq2seq_model.Seq2SeqModel( gConfig['enc_vocab_size'], gConfig['dec_vocab_size'], _buckets, gConfig['layer_size'], gConfig['num_layers'], gConfig['max_gradient_norm'], gConfig['batch_size'], gConfig['learning_rate'], gConfig['learning_rate_decay_factor'], forward_only=forward_only)
File "/Users/jiangwei/PycharmProjects/chatbot/chatbot_master/seq2seq_model.py", line 149, in __init__
softmax_loss_function=softmax_loss_function)
File "/Users/jiangwei/anaconda/envs/tensorflow-1.6-py3.5/lib/python3.5/site-packages/tensorflow/contrib/legacy_seq2seq/python/ops/seq2seq.py", line 1224, in model_with_buckets
softmax_loss_function=softmax_loss_function))
File "/Users/jiangwei/anaconda/envs/tensorflow-1.6-py3.5/lib/python3.5/site-packages/tensorflow/contrib/legacy_seq2seq/python/ops/seq2seq.py", line 1137, in sequence_loss
softmax_loss_function=softmax_loss_function))
File "/Users/jiangwei/anaconda/envs/tensorflow-1.6-py3.5/lib/python3.5/site-packages/tensorflow/contrib/legacy_seq2seq/python/ops/seq2seq.py", line 1092, in sequence_loss_by_example
crossent = softmax_loss_function(labels=target, logits=logit)
TypeError: sampled_loss() got an unexpected keyword argument 'logits'

@zhaoyingjun (Owner) commented:

This happens because some functions changed after the TF version update. Search online or check the official documentation, then update the corresponding function calls accordingly.
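For context, the traceback shows that `legacy_seq2seq` now invokes the custom loss as `softmax_loss_function(labels=target, logits=logit)`, whereas older tutorial code defined the wrapper as `sampled_loss(inputs, labels)`. A minimal sketch of the signature fix follows; the commented-out TensorFlow call and names like `local_w_t`, `local_b`, and `num_samples` follow the common TF seq2seq tutorial pattern and are assumptions, not verified against this repository:

```python
# Old wrapper from pre-1.0 tutorial code: parameter names do not match
# the keyword arguments the library now passes, so calling it with
# logits=... raises the TypeError seen in the traceback.
def sampled_loss_old(inputs, labels):
    return ...


# Fixed wrapper: accept the exact keyword names (labels, logits) that
# tf.contrib.legacy_seq2seq passes in TF >= 1.0, and forward them to
# tf.nn.sampled_softmax_loss's keyword API.
def sampled_loss(labels, logits):
    # In the real model (assumed names from the TF tutorial) this would be:
    #   labels = tf.reshape(labels, [-1, 1])
    #   return tf.nn.sampled_softmax_loss(
    #       weights=local_w_t, biases=local_b,
    #       labels=labels, inputs=logits,
    #       num_sampled=num_samples, num_classes=self.target_vocab_size)
    return ("loss", labels, logits)  # stand-in so the sketch is runnable


# The library calls the wrapper with keyword arguments, exactly as in
# sequence_loss_by_example:
result = sampled_loss(labels="target", logits="logit")
```

The essential change is only the parameter names: once the wrapper is declared as `sampled_loss(labels, logits)`, the keyword call from `sequence_loss_by_example` resolves and the `TypeError` disappears.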

@1148270327 commented:

> This happens because some functions changed after the TF version update. Search online or check the official documentation, then update the corresponding function calls accordingly.

Hi, I'm using tf 1.10.1 as required and hit the same problem; upgrading to tf 1.13 doesn't fix it either. Could you give some guidance on a solution?
