This repository has been archived by the owner on Mar 3, 2024. It is now read-only.

Error in get_model() : ValueError: too many values to unpack (expected 2) #13

Closed
1 task done
nshaud opened this issue May 29, 2019 · 8 comments

Comments


nshaud commented May 29, 2019

Describe the Bug

Creating a Transformer model fails with a tuple unpacking error (at least when eager execution is enabled). See the stack trace below; the error apparently comes from decoder_embed_layer.

Version Info

  • I'm using the latest version

Minimal Codes To Reproduce

from keras_transformer import get_model
import tensorflow as tf
tf.enable_eager_execution()

model = get_model(token_num=1000, embed_dim=32, encoder_num=2, decoder_num=2, head_num=4, hidden_dim=128, dropout_rate=0.05)

results in

    377     decoder_input = keras.layers.Input(shape=(None,), name='Decoder-Input')
--> 378     decoder_embed, decoder_embed_weights = decoder_embed_layer(decoder_input)
    379     decoder_embed = TrigPosEmbedding(
    380         mode=TrigPosEmbedding.MODE_ADD,

ValueError: too many values to unpack (expected 2)
nshaud added the bug label on May 29, 2019
CyberZHG added a commit that referenced this issue May 30, 2019
CyberZHG added a commit that referenced this issue May 30, 2019
CyberZHG (Owner) commented:

I think it's fixed. A newer version 0.25.0 has been released.

CyberZHG added and removed the bug label on May 30, 2019

stale bot commented Jun 4, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

The stale bot added the wontfix label on Jun 4, 2019
The stale bot closed this as completed on Jun 6, 2019
nshaud (Author) commented Jun 12, 2019

@CyberZHG This is not fixed, neither in 0.25.0 nor in the latest 0.28.0. I still get the same error in eager mode.

CyberZHG (Owner) commented:

There are CI tasks that run successfully in eager mode. Make sure you have the latest keras-embed-sim.
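
(For reference, a minimal sketch for checking the installed versions from Python using pip's package metadata; note that the pip distribution names use hyphens, unlike the importable module names.)

import pkg_resources

# Print the installed versions of the relevant pip packages (reads pip metadata).
for pkg in ('keras-embed-sim', 'keras-transformer'):
    print(pkg, pkg_resources.get_distribution(pkg).version)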

nshaud (Author) commented Jun 12, 2019

From pip freeze:

keras-embed-sim==0.7.0
keras-layer-normalization==0.12.0
keras-multi-head==0.20.0
keras-pos-embd==0.10.0
keras-position-wise-feed-forward==0.5.0
Keras-Preprocessing==1.0.9
keras-self-attention==0.41.0
keras-transformer==0.28.0

The environment variables TF_KERAS=1 and TF_EAGER=1 are set.
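
(For completeness, a minimal sketch of setting those variables from Python; my assumption is that they need to be set before keras_transformer and its dependencies are first imported, since the backend appears to be selected at import time.)

import os

# Select the tf.keras backend and eager mode before importing keras_transformer.
# Assumption: the backend is chosen when the package is first imported, so
# setting these afterwards may have no effect.
os.environ['TF_KERAS'] = '1'
os.environ['TF_EAGER'] = '1'

import tensorflow as tf
from keras_transformer import get_model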

Focusing on the EmbeddingRet layer:

In [1]: import tensorflow as tf
In [2]: tf.enable_eager_execution()
In [3]: from keras_embed_sim import EmbeddingRet
In [4]: input_layer = tf.keras.layers.Input(shape=(None,), name='Input')
In [5]: embed, embed_weights = EmbeddingRet(
   ...:     input_dim=20,
   ...:     output_dim=100,
   ...:     mask_zero=True,
   ...: )(input_layer)
<tensorflow stuff>
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-5-9363f7bad7cf> in <module>
      3     output_dim=100,
      4     mask_zero=True,
----> 5 )(input_layer)

ValueError: too many values to unpack (expected 2)

nshaud (Author) commented Jun 12, 2019

It looks like EmbeddingRet returns a list of three elements.

In [1]: import tensorflow as tf
In [2]: tf.enable_eager_execution()
In [3]: from keras_embed_sim import EmbeddingRet
In [4]: input_layer = tf.keras.layers.Input(shape=(None,), name='Input')
In [5]: embed = EmbeddingRet(input_dim=20, output_dim=100, mask_zero=True)(input_layer)
In [6]: embed
Out[6]:
[<DeferredTensor 'None' shape=(?, ?, 100) dtype=float32>,
 <DeferredTensor 'None' shape=(20,) dtype=float32>,
 <DeferredTensor 'None' shape=(100,) dtype=float32>]
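
(A small diagnostic sketch of what get_model expects from this call: exactly two outputs, the token embeddings and the embedding matrix. With the three-element list shown above, the unpacking fails, which matches the ValueError in the original stack trace.)

outputs = EmbeddingRet(input_dim=20, output_dim=100, mask_zero=True)(input_layer)

# get_model unpacks exactly two values from this call; on TF 1.12 three deferred
# tensors come back (see the output above), so this assertion fails.
assert len(outputs) == 2, 'EmbeddingRet returned %d outputs, expected 2' % len(outputs)
embed, embed_weights = outputs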

CyberZHG (Owner) commented Jun 12, 2019

tensorflow==1.13.1

>>> EmbeddingRet(input_dim=20, output_dim=100, mask_zero=True)(input_layer)
[<tf.Tensor 'embedding_lookup/Identity_2:0' shape=(?, ?, 100) dtype=float32>, <tf.Tensor 'Identity:0' shape=(20, 100) dtype=float32>]
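
(Based on this difference, a small guard sketch that fails early on older TensorFlow releases; the 1.13 cutoff is inferred only from the 1.12 versus 1.13.1 behaviour reported in this thread, not verified against other versions.)

from distutils.version import LooseVersion
import tensorflow as tf

# EmbeddingRet returned the expected two outputs on TF 1.13.1 but three on 1.12,
# so refuse to run on releases older than 1.13 (cutoff inferred from this thread).
if LooseVersion(tf.__version__) < LooseVersion('1.13.0'):
    raise RuntimeError('keras-transformer in eager mode appears to need TF >= 1.13, found ' + tf.__version__)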

nshaud (Author) commented Jun 12, 2019

OK, I'm stuck on TF 1.12 right now, so I guess that's the problem. Thanks.
