``` bash
tools/run_in_docker.sh bazel build tensorflow_serving/model_servers:tensorflow_model_server
```
## Serve a model containing your custom op

You can now run the ModelServer binary and start serving a model that contains
this custom op:

``` bash
tools/run_in_docker.sh -o "-p 8501:8501" \
  bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
  --rest_api_port=8501 --model_name=<model_name> --model_base_path=<model_base_path>
```
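As a concrete illustration (the model name `zero_out` and the path
`/tmp/zero_out_model` below are placeholders, not part of the original
instructions), the invocation might look like this:

``` bash
# Hypothetical example: substitute your own model name and base path.
# --model_base_path points at the directory that contains the numbered
# version subdirectories (e.g. /tmp/zero_out_model/1/), not at a single version.
tools/run_in_docker.sh -o "-p 8501:8501" \
  bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
  --rest_api_port=8501 --model_name=zero_out \
  --model_base_path=/tmp/zero_out_model
```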
## Send an inference request to test the op manually

You can now send an inference request to the model server to test your custom
op:

``` bash
curl http://localhost:8501/v1/models/<model_name>:predict -X POST \
  -d '{"inputs": [[1,2], [3,4]]}'
```
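If the request fails, it is worth confirming that the server actually loaded
the model before debugging the op itself. TensorFlow Serving's REST API also
exposes status and metadata endpoints (using the same `<model_name>`
placeholder as above):

``` bash
# Check that the model loaded and is in state "AVAILABLE".
curl http://localhost:8501/v1/models/<model_name>

# Inspect the serving signatures, including the input names the graph expects.
curl http://localhost:8501/v1/models/<model_name>/metadata
```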
[This page](https://www.tensorflow.org/tfx/serving/api_rest#top_of_page)
documents the full REST API for sending requests to the model server.