
'bert-serving-start' is not recognized as an internal or external command #194

Closed
moon-home opened this issue Jan 16, 2019 · 13 comments

@moon-home

moon-home commented Jan 16, 2019

Hi,
This is probably a silly question...
I have Python 3.6.6 and TensorFlow 1.12.0, working entirely inside a conda environment on Windows 10.
I pip-installed bert-serving-server/client, and the install reports:
Successfully installed GPUtil-1.4.0 bert-serving-client-1.7.2 bert-serving-server-1.7.2 pyzmq-17.1.2
But when I run the following from the command line:
bert-serving-start -model_dir /tmp/english_L-12_H-768_A-12/ -num_worker=4
it says:
'bert-serving-start' is not recognized as an internal or external command

I found the bert-serving library is located under C:\Users\Name\Anaconda\Lib\site-packages. So I tried running bert-serving-start again from each of these three folders:

  1. site-packages
  2. site-packages\bert_serving
  3. site-packages\bert_serving_server-1.7.2.dist-info


However, the result is the same: the command is not recognized. Can anyone help me?

@astariul
Contributor

Did you try adding site-packages\bert_serving to your PATH variable?
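For what it's worth (this is my own note, not from the thread): pip installs console-script launchers such as bert-serving-start.exe into the interpreter's "scripts" directory (e.g. Anaconda\Scripts on Windows), not into site-packages, so that is the folder that needs to be on PATH. A minimal sketch to print it:

```python
# Sketch: print the directory where pip places console-script launchers.
# On Windows this is typically something like C:\Users\Name\Anaconda\Scripts;
# that folder (not site-packages) must be on PATH for bert-serving-start
# to be found from the command line.
import sysconfig

scripts_dir = sysconfig.get_path("scripts")
print(scripts_dir)
```

If the printed folder is missing from your PATH, adding it and reopening the terminal should make the command resolvable.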

@hanxiao
Member

hanxiao commented Jan 17, 2019

@sailormoon2016 please refer to step 1 of #99 (comment)

For the latest bert-as-service (>= 1.7.0), you don't need step 2 from that comment.

@moon-home
Author

@hanxiao That script worked! Thank you!

@colanim I added bert_serving to my environment variables as you suggested, but it's still not recognized. Not sure if I did something else wrong, but thank you for trying to help!

@hanxiao
Member

hanxiao commented Jan 22, 2019

FYI, this is fixed in #212; the new feature is available since 1.7.7. Please run

pip install -U bert-serving-server bert-serving-client

to update.
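A quick way to confirm the upgrade took effect is to compare the installed version against 1.7.7, the first release containing the fix from #212. A minimal sketch (the helper name has_fix is mine, not part of the package):

```python
# Sketch (helper name is illustrative): check whether the installed
# bert-serving-server is at least version 1.7.7.
from importlib.metadata import version, PackageNotFoundError

def has_fix(pkg="bert-serving-server", minimum=(1, 7, 7)):
    try:
        # Parse e.g. "1.7.7" into the tuple (1, 7, 7) for comparison
        installed = tuple(int(p) for p in version(pkg).split(".")[:3])
    except PackageNotFoundError:
        return False  # package is not installed at all
    return installed >= minimum
```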

@hanxiao
Member

hanxiao commented Apr 4, 2019

@A6Matrix Your version is too old; please run pip install -U first.

@monk1337

Create a start-bert-as-service.py with the following code:

from bert_serving.server import BertServer
from bert_serving.server.helper import get_run_args


if __name__ == '__main__':
    # get_run_args() parses the same CLI flags that bert-serving-start accepts
    args = get_run_args()
    server = BertServer(args)
    server.start()
    server.join()  # block until the server shuts down

so you can run it with the following command:
python start-bert-as-service.py -model_dir ./tmp/chinese_L-12_H-768_A-12/ -num_worker=1

@amitgupta911

> Create a start-bert-as-service.py with the following code [...]

@monk1337 Getting this error using this solution on Windows 10 with Python 3.6 and TensorFlow 2.0.0:
(screenshot of the error traceback)
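The error above is consistent with a TensorFlow version mismatch: bert-serving-server documents a TensorFlow >= 1.10 requirement, and to my knowledge it does not support TF 2.x. A small version-gate sketch (the helper name tf1_compatible is mine) to check before starting the server:

```python
# Sketch: flag TensorFlow versions outside the range bert-serving-server
# documents (>= 1.10, and not the 2.x line). Helper name is illustrative.
def tf1_compatible(version_string):
    major, minor = (int(p) for p in version_string.split(".")[:2])
    return major == 1 and minor >= 10

print(tf1_compatible("1.12.0"))  # version from the original post -> True
print(tf1_compatible("2.0.0"))   # version from this comment -> False
```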

@Giuli17G

Giuli17G commented May 1, 2020

Hi,
I ran into a different problem...
I installed bert-serving-server and bert-serving-client correctly, and I downloaded the multilingual model, which I uncompressed into the same folder as my code.
But when I run
bert-serving-start -model_dir /tmp/multi_cased_L-12_H-768_A-12/ -num_worker=1
it raises the following error:
SyntaxError: invalid token.
Could you help me?
Thank you so much

@hanxiao
Member

hanxiao commented May 11, 2020

> Stop writing shitty answers and give a solution which actually solves the problem. In my case, I don't have any executables with the name bert-serving-start.

@askaydevs You should really be ashamed of yourself!

@askaydevs

askaydevs commented May 11, 2020

@hanxiao My sincere apologies for that earlier comment. I was just starting with BERT, and at that moment there weren't many resources for learning about it; I was an amateur, and after numerous unsuccessful tries I got frustrated and wrote that. I didn't mean any of it. I completely understand you are doing an exceptional job keeping things clean and bug-free. Again, I am sorry, @hanxiao.

@hanxiao
Member

hanxiao commented May 11, 2020

(screenshot)

@askaydevs You may delete your comment and wish the internet would forget what you did. I hope you learn to respect other open-source work and take responsibility for what you write.

@ShrikanthSingh

> [quoting @Giuli17G's comment above about the "SyntaxError: invalid token" error]

Did you get any solution? I am facing the same issue.

@sideburnss

> [quoting @amitgupta911's reply above: the start-bert-as-service.py workaround fails on Windows 10 with Python 3.6 and TensorFlow 2.0.0]

I have the same error.
