
not able to start server #580

Open · 4 tasks

majidhamzavi opened this issue Aug 9, 2020 · 3 comments

Comments

majidhamzavi commented Aug 9, 2020

Prerequisites

Please fill in by replacing [ ] with [x].

System information

Some of this information can be collected via this script.

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04):
  • TensorFlow installed from (source or binary):
  • TensorFlow version:
  • Python version:
  • bert-as-service version:
  • GPU model and memory:
  • CPU model and memory:

Description

Please replace YOUR_SERVER_ARGS and YOUR_CLIENT_ARGS accordingly. You can also write your own description for reproducing the issue.

I'm using this command to start the server:

bert-serving-start -model_dir /uncased_L-12_H-768_A-12/ -num_worker=2 -max_seq_len 50

/opt/anaconda3/lib/python3.8/site-packages/bert_serving/server/helper.py:175: UserWarning: Tensorflow 2.3.0 is not tested! It may or may not work. Feel free to submit an issue at https://github.com/hanxiao/bert-as-service/issues/
warnings.warn('Tensorflow %s is not tested! It may or may not work. '
usage: /opt/anaconda3/bin/bert-serving-start -model_dir /uncased_L-12_H-768_A-12/ -num_worker=2 -max_seq_len 50
                   ARG = VALUE

             ckpt_name = bert_model.ckpt
           config_name = bert_config.json
                  cors = *
                   cpu = False
            device_map = []
         do_lower_case = True
    fixed_embed_length = False
                  fp16 = False
   gpu_memory_fraction = 0.5
         graph_tmp_dir = None
      http_max_connect = 10
             http_port = None
          mask_cls_sep = False
        max_batch_size = 256
           max_seq_len = 50
             model_dir = /uncased_L-12_H-768_A-12/
no_position_embeddings = False
      no_special_token = False
            num_worker = 2
         pooling_layer = [-2]
      pooling_strategy = REDUCE_MEAN
                  port = 5555
              port_out = 5556
         prefetch_size = 10
   priority_batch_size = 16
 show_tokens_to_client = False
       tuned_model_dir = None
               verbose = False
                   xla = False

I:VENTILATOR:[__i:__i: 67]:freeze, optimize and export graph, could take a while...
/opt/anaconda3/lib/python3.8/site-packages/bert_serving/server/helper.py:175: UserWarning: Tensorflow 2.3.0 is not tested! It may or may not work. Feel free to submit an issue at https://github.com/hanxiao/bert-as-service/issues/
warnings.warn('Tensorflow %s is not tested! It may or may not work. '
E:GRAPHOPT:[gra:opt:154]:fail to optimize the graph!
Traceback (most recent call last):
File "/opt/anaconda3/lib/python3.8/site-packages/bert_serving/server/graph.py", line 42, in optimize_graph
tf = import_tf(verbose=args.verbose)
File "/opt/anaconda3/lib/python3.8/site-packages/bert_serving/server/helper.py", line 186, in import_tf
tf.logging.set_verbosity(tf.logging.DEBUG if verbose else tf.logging.ERROR)
AttributeError: module 'tensorflow' has no attribute 'logging'
Traceback (most recent call last):
File "/opt/anaconda3/bin/bert-serving-start", line 8, in <module>
sys.exit(main())
File "/opt/anaconda3/lib/python3.8/site-packages/bert_serving/server/cli/__init__.py", line 4, in main
with BertServer(get_run_args()) as server:
File "/opt/anaconda3/lib/python3.8/site-packages/bert_serving/server/__init__.py", line 71, in __init__
self.graph_path, self.bert_config = pool.apply(optimize_graph, (self.args,))
TypeError: cannot unpack non-iterable NoneType object

Then this issue shows up:

...
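The two tracebacks are one failure, not two: `optimize_graph` calls the TF 1.x API (`tf.logging`), which TensorFlow 2.x removed, so it raises, returns `None`, and the caller's tuple unpack then fails. A hypothetical pre-flight check (my sketch, not part of bert-serving-server) that mirrors this constraint:

```python
# Sketch only: a version gate reflecting what this thread implies about
# bert-serving-server's requirements. The function name and the 1.10
# lower bound come from the discussion here, not from the library itself.
def tf_is_supported(version: str) -> bool:
    """Return True for TF versions whose 1.x API bert-serving-server expects."""
    major, minor = (int(p) for p in version.split(".")[:2])
    # tf.logging (and the v1 graph-freezing tools) exist only in TF 1.x
    return major == 1 and minor >= 10

print(tf_is_supported("1.15.0"))  # True
print(tf_is_supported("2.3.0"))   # False: tf.logging was removed in TF 2.0
```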

@majidhamzavi
Author

Hi there,
I have a problem starting the server. It actually gives two errors: one with `tf.logging`, and the other "TypeError: cannot unpack non-iterable NoneType object".
Thanks

@oakieoaktree

Similar issue here; the dependency management seems to be very tricky.

The suggested fix is generally to downgrade to TensorFlow 1.x (>= 1.10).

However, older TensorFlow (<= 2.0) does not appear to be supported on Python 3.8:

Python 3.8 support requires TensorFlow 2.2

Is tensorflow >= 2.2 support planned for bert-serving-server?
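Since TF 1.x has no Python 3.8 wheels, the downgrade path in practice means a separate, older-Python environment. A sketch of one possible workaround, assuming conda is available (the environment name is arbitrary; the model path matches the one used above):

```shell
# Create an isolated Python 3.7 environment, since TF 1.15 is the newest
# 1.x release with Python 3.7 wheels (none exist for 3.8).
conda create -n bert-service python=3.7 -y
conda activate bert-service

# Install a TF 1.x that still has the tf.logging / graph-freezing API.
pip install "tensorflow==1.15.0" bert-serving-server bert-serving-client

# Start the server with the same arguments as in the report above.
bert-serving-start -model_dir /uncased_L-12_H-768_A-12/ -num_worker=2 -max_seq_len 50
```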

@skro123

skro123 commented Aug 23, 2020

I downgraded TensorFlow from 1.15 to 1.10, which solved the problem.
