
Serving opennmt_tf framework #46

Closed
julsal opened this issue Jan 23, 2019 · 12 comments

@julsal commented Jan 23, 2019

Hello, and thanks for sharing this nice piece of work.

I'd like to serve the opennmt_tf framework using docker, exposing a web service as described in the main README (POST /translate).

As an example, I downloaded the pretrained model averaged-ende-export500k. As far as I understand, after building the Docker image, I need to run it as:

docker run -p 5000:5000 -v /root/models:/path/to/averaged-ende-export500k opennmt_tf:latest serve --host 0.0.0.0 --port 5000 --config config.json

I have some questions about the configuration file.

{
    "source": "en",
    "target": "de",
    "model": "1539080952",  // (mandatory for trans, serve) Full model name as uuid64
    "imageTag": "string",  // (mandatory) Full URL of the image: url/image:tag.
    "tokenization": {
        // Vocabularies and tokenization options (from OpenNMT/Tokenizer).
        "source": {
            "vocabulary": "string"
            // other source specific tokenization options
        },
        "target": {
            "vocabulary": "string"
            // other target specific tokenization options
        }
    }
}

I removed everything marked as optional. I guess the uuid64 required in "model" is the directory name inside the downloaded averaged-ende-export500k archive.

Is everything correct up to this point?
What should I put in "imageTag" and "tokenization"?

@guillaumekln (Member) commented Jan 23, 2019

Hi,

Here are the steps to serve this pretrained model.

  1. Download and adapt the model:
wget https://s3.amazonaws.com/opennmt-models/averaged-ende-export500k.tar.gz
tar xf averaged-ende-export500k.tar.gz
mv averaged-ende-export500k/1554540232/ averaged-ende-export500k/1
  2. Create config.json in averaged-ende-export500k:
{
    "source": "en",
    "target": "de",
    "model": "averaged-ende-export500k",
    "modelType": "release",
    "tokenization": {
        "source": {
            "mode": "none",
            "sp_model_path": "${MODEL_DIR}/1/assets.extra/wmtende.model",
            "vocabulary": "${MODEL_DIR}/1/assets/wmtende.vocab"
        },
        "target": {
            "mode": "none",
            "sp_model_path": "${MODEL_DIR}/1/assets.extra/wmtende.model",
            "vocabulary": "${MODEL_DIR}/1/assets/wmtende.vocab"
        }
    }
}
  3. Run the server:
docker run -p 5000:5000 -v $PWD:/root/models nmtwizard/opennmt-tf --model averaged-ende-export500k --model_storage /root/models serve --host 0.0.0.0 --port 5000
  4. Test the server:
$ curl -X POST http://localhost:5000/translate -d '{"src":[{"text": "Hello world!"}]}'
{"tgt": [[{"text": "Hallo Welt!", "score": -0.27484220266342163}, {"text": "Hallo!", "score": -1.6019006967544556}, {"text": "Hallo, die Welt!", "score": -0.8488588333129883}, {"text": "Hallo Welten!", "score": -1.1288902759552002}]]}

Hope this helps.
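The curl test above can equally be scripted. Below is a minimal Python client sketch using only the standard library; the function names are illustrative, and only the request/response shape of POST /translate (the "src"/"text"/"tgt"/"score" keys) comes from the example above:

```python
import json
import urllib.request

def build_payload(texts):
    """Build the request body expected by POST /translate."""
    return {"src": [{"text": t} for t in texts]}

def translate(texts, host="http://localhost:5000"):
    """Send texts to the server and return the best hypothesis for each."""
    req = urllib.request.Request(
        host + "/translate",
        data=json.dumps(build_payload(texts)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The response nests a list of scored hypotheses per source sentence;
    # take the first one, as in the curl output above.
    return [hyps[0]["text"] for hyps in body["tgt"]]
```

With the server from step 3 running, `translate(["Hello world!"])` should return a one-element list with the German translation.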

@julsal (Author) commented Jan 23, 2019

Thank you very much, @guillaumekln. It worked perfectly!

@julsal closed this Jan 23, 2019

@yms9654 commented Jun 25, 2019

@guillaumekln Can I add an inference config to config.json, like this?

"options": {
        "config": {
            "infer": {
                "n_best": 3,
                "with_scores": true,
                "with_alignments": "hard"
            },
            "score": {
                "with_alignments": "hard"
            }
        }
    }

I have tried it, but it is not working.
Do you have any idea about that?

Thank you!

@danielinux7 commented Jul 10, 2019

Hello @guillaumekln,
I'm getting these errors; what am I doing wrong?
Screenshot from 2019-07-10 15-04-28

@guillaumekln (Member) commented Jul 10, 2019

Did you follow step 1. in the instructions above? Make sure the directory averaged-ende-export500k exists in $PWD.

@danielinux7 commented Jul 11, 2019

@guillaumekln
I followed step 1 inside the root/models/ folder.
(I had to change mv averaged-ende-export500k/1539080952/ averaged-ende-export500k/1
to mv averaged-ende-export500k/1554540232/ averaged-ende-export500k/1.)

I followed the rest as-is; on step 3 I'm getting the same error.
I'm not sure what you mean by:

Make sure the directory averaged-ende-export500k exists in $PWD.

@guillaumekln (Member) commented Jul 11, 2019

When you run ls $PWD, does the directory averaged-ende-export500k exist?

@danielinux7 commented Jul 11, 2019

@guillaumekln
When I run it in a terminal inside root/models/, then yes, it exists.
Screenshot from 2019-07-11 14-08-45

@guillaumekln (Member) commented Jul 11, 2019

Can you run the command from this directory?

@danielinux7 commented Jul 11, 2019

@guillaumekln Can you be clearer? I can hardly follow you!

Yes, I run ls $PWD while I'm in root/models/
Screenshot from 2019-07-11 16-34-10

@guillaumekln (Member) commented Jul 11, 2019

You should run the Docker command when you are in ~/models. Please note that the instructions above do not indicate to create a models directory or change directory between steps.
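A quick way to confirm this before launching Docker is to check the expected layout from the shell. A small sketch, assuming the paths created in steps 1 and 2 above:

```shell
# Run from the directory that will be mounted into the container
# (the $PWD in the docker run command, e.g. ~/models).
# Prints a warning for anything missing from the expected layout.
for f in averaged-ende-export500k/1 averaged-ende-export500k/config.json; do
    [ -e "$f" ] || echo "missing: $f"
done
```

If either line is printed, the bind mount `-v $PWD:/root/models` will not expose the model to the server, which produces the error shown.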

@danielinux7 commented Jul 11, 2019

@guillaumekln It's working!
I'm running the Docker command in ~/models.
I had created the root/models/ directory and executed step 1 in there.
