How to export? #113

Closed
jinamshah opened this issue Jul 3, 2019 · 11 comments
Comments

@jinamshah

I needed to know how to write the serving function to export the trained xlnet model.
I have this right now:

def serving_input_fn():
    with tf.variable_scope("model"):
        feature_spec = {
            "input_ids": tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
            "input_mask": tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
            "segment_ids": tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
            "label_ids": tf.FixedLenFeature([], tf.int64),
        }
        serialized_tf_example = tf.placeholder(dtype=tf.string,
                                               shape=[None],
                                               name='input_example_tensor')
        receiver_tensors = {'examples': serialized_tf_example}
        features = tf.parse_example(serialized_tf_example, feature_spec)
        return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

EXPORT_DIR = 'gs://{}/export/{}'.format(BUCKET, TASK_VERSION)
estimator._export_to_tpu = False # this is important
path = estimator.export_savedmodel(EXPORT_DIR, serving_input_fn)

This throws errors.
Please note: this is the same function I used for BERT, and as I'm no expert in TensorFlow, I don't understand why it won't work here.
The error is a type mismatch.

@lukemelas

Hello @jinamshah, have you resolved this issue? If not, let me know and I can help debug.

@jinamshah
Author

Hey @lukemelas, I was able to resolve this issue. I wrote my own function to do it.
Thanks!

@lukemelas

Great!

@lukemelas

I believe this issue can be closed.

@ashgorithm

I was also getting type mismatch errors while exporting the model.

TypeError: Tensors in list passed to 'values' of 'ConcatV2' Op have types [int32, int64] that don't all match.

How did you resolve this issue?

@jinamshah
Author

@ashgorithm
for name in list(features.keys()):
    t = features[name]
    if t.dtype == tf.int64:
        t = tf.cast(t, tf.int32)
    features[name] = t
I believe this should help you.
Basically, the model expects int32 inputs, but tf.parse_example produces int64, so the int64 features need to be cast down to int32.
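
Put together, a minimal sketch of the full serving function with that cast applied (TF 1.x APIs, reusing the feature_spec from the first post; depending on the XLNet task, input_mask may need to be a tf.float32 feature, as in the SQuAD-style example further down):

import tensorflow as tf  # TF 1.x

MAX_SEQ_LENGTH = 128  # must match the value used at training time

def serving_input_fn():
    with tf.variable_scope("model"):
        feature_spec = {
            "input_ids": tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
            "input_mask": tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
            "segment_ids": tf.FixedLenFeature([MAX_SEQ_LENGTH], tf.int64),
            "label_ids": tf.FixedLenFeature([], tf.int64),
        }
        serialized_tf_example = tf.placeholder(dtype=tf.string,
                                               shape=[None],
                                               name='input_example_tensor')
        receiver_tensors = {'examples': serialized_tf_example}
        features = tf.parse_example(serialized_tf_example, feature_spec)
        # tf.parse_example yields int64; the model graph works in int32.
        for name in list(features.keys()):
            t = features[name]
            if t.dtype == tf.int64:
                t = tf.cast(t, tf.int32)
            features[name] = t
        return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)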

@AndrewPelton

@jinamshah would you be able to share your solution? I am having the same problem

@hexiaoyupku

@jinamshah would you be able to share your solution? I am having the same problem

+1, please!

@kobkrit

kobkrit commented Oct 14, 2019

@jinamshah would you be able to share your solution? I am having the same problem
+1 Please!!

@kobkrit

kobkrit commented Oct 14, 2019

Hey @AndrewPelton @hexiaoyupku, I solved it :) Thanks to @jinamshah!

def serving_input_fn():
    with tf.variable_scope("foo"):
        feature_spec = {
            "unique_ids": tf.FixedLenFeature([], tf.int64),
            "input_ids": tf.FixedLenFeature([FLAGS.max_seq_length], tf.int64),
            "input_mask": tf.FixedLenFeature([FLAGS.max_seq_length], tf.float32),
            "segment_ids": tf.FixedLenFeature([FLAGS.max_seq_length], tf.int64),
            "cls_index": tf.FixedLenFeature([], tf.int64),
            "p_mask": tf.FixedLenFeature([FLAGS.max_seq_length], tf.float32),
        }
        serialized_tf_example = tf.placeholder(dtype=tf.string,
                                               shape=[FLAGS.predict_batch_size],  # or [None]
                                               name='input_example_tensor')
        receiver_tensors = {'examples': serialized_tf_example}
        features = tf.parse_example(serialized_tf_example, feature_spec)
        # The XLNet graph expects int32 inputs, but tf.parse_example yields int64.
        for name in list(features.keys()):
            t = features[name]
            if t.dtype == tf.int64:
                t = tf.cast(t, tf.int32)
            features[name] = t
        return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

if FLAGS.do_export:
    estimator._export_to_tpu = False  # this is important
    print("Estimator save model..")
    estimator.export_savedmodel('export_t', serving_input_fn)
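
To sanity-check the export locally, a minimal sketch using TF 1.x's tf.contrib.predictor (the export path is hypothetical: export_savedmodel writes to a timestamped subdirectory, and the dummy feature values only exercise the signature, not the model):

import tensorflow as tf
from tensorflow.contrib import predictor

max_seq_length = 512  # must match FLAGS.max_seq_length used at export time

# export_savedmodel() writes to a timestamped subdirectory of 'export_t'.
predict_fn = predictor.from_saved_model('export_t/1571000000')  # hypothetical timestamp

# Build one serialized tf.Example matching the feature_spec above.
example = tf.train.Example(features=tf.train.Features(feature={
    "unique_ids": tf.train.Feature(int64_list=tf.train.Int64List(value=[0])),
    "input_ids": tf.train.Feature(int64_list=tf.train.Int64List(value=[0] * max_seq_length)),
    "input_mask": tf.train.Feature(float_list=tf.train.FloatList(value=[1.0] * max_seq_length)),
    "segment_ids": tf.train.Feature(int64_list=tf.train.Int64List(value=[0] * max_seq_length)),
    "cls_index": tf.train.Feature(int64_list=tf.train.Int64List(value=[0])),
    "p_mask": tf.train.Feature(float_list=tf.train.FloatList(value=[0.0] * max_seq_length)),
}))

# The batch size must match the placeholder shape chosen at export time.
predictions = predict_fn({"examples": [example.SerializeToString()]})
print(predictions)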

@rob-nn

rob-nn commented Jan 3, 2020

Could someone please send me an example of an HTTP POST request to this serving?
Thanks!
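
For reference, a minimal sketch of such a request, assuming the SavedModel is hosted with TensorFlow Serving's REST API (the model name 'xlnet' and the port are placeholders, and the feature values are dummies matching the SQuAD-style feature_spec above):

import base64
import json
import requests
import tensorflow as tf

max_seq_length = 512  # must match the value used at export time

def int64_feature(values):
    return tf.train.Feature(int64_list=tf.train.Int64List(value=values))

def float_feature(values):
    return tf.train.Feature(float_list=tf.train.FloatList(value=values))

# One tf.Example with the same features as the export's feature_spec.
example = tf.train.Example(features=tf.train.Features(feature={
    "unique_ids": int64_feature([0]),
    "input_ids": int64_feature([0] * max_seq_length),
    "input_mask": float_feature([1.0] * max_seq_length),
    "segment_ids": int64_feature([0] * max_seq_length),
    "cls_index": int64_feature([0]),
    "p_mask": float_feature([0.0] * max_seq_length),
}))

# TF Serving's REST API encodes binary strings as base64 under a "b64" key.
payload = {
    "instances": [
        {"examples": {"b64": base64.b64encode(example.SerializeToString()).decode("utf-8")}}
    ]
}

# 8501 is TF Serving's default REST port; 'xlnet' is a placeholder model name.
resp = requests.post("http://localhost:8501/v1/models/xlnet:predict",
                     data=json.dumps(payload))
print(resp.json())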
