diff --git a/docs/tutorials/integrations/kserve-mm/blue-green.md b/docs/tutorials/integrations/kserve-mm/blue-green.md
index b284e61c..eb02ff47 100644
--- a/docs/tutorials/integrations/kserve-mm/blue-green.md
+++ b/docs/tutorials/integrations/kserve-mm/blue-green.md
@@ -103,7 +103,6 @@ To send inference requests to the model:
 
 3. Make inference requests:
 ```shell
-cd demo
 cat wisdom.sh
 . wisdom.sh
 ```
@@ -116,8 +115,8 @@ To send inference requests to the model:
 
 2. Download the proto file and a sample input:
 ```shell
-curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.1/samples/modelmesh-serving/kserve.proto
-curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.1/samples/modelmesh-serving/grpc_input.json
+curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.2/samples/modelmesh-serving/kserve.proto
+curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.2/samples/modelmesh-serving/grpc_input.json
 ```
 
 3. Send inference requests:
diff --git a/docs/tutorials/integrations/kserve-mm/canary.md b/docs/tutorials/integrations/kserve-mm/canary.md
index c483c0e8..69ea4537 100644
--- a/docs/tutorials/integrations/kserve-mm/canary.md
+++ b/docs/tutorials/integrations/kserve-mm/canary.md
@@ -103,7 +103,6 @@ To send inference requests to the model:
 
 3. Make inference requests:
 ```shell
-cd demo
 cat wisdom.sh
 . wisdom.sh
 ```
@@ -121,8 +120,8 @@ To send inference requests to the model:
 
 2. Download the proto file and a sample input:
 ```shell
-curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.1/samples/modelmesh-serving/kserve.proto
-curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.1/samples/modelmesh-serving/grpc_input.json
+curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.2/samples/modelmesh-serving/kserve.proto
+curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.2/samples/modelmesh-serving/grpc_input.json
 ```
 
 3. Send inference requests:
diff --git a/docs/tutorials/integrations/kserve/blue-green.md b/docs/tutorials/integrations/kserve/blue-green.md
index 24d0488f..95796c4a 100644
--- a/docs/tutorials/integrations/kserve/blue-green.md
+++ b/docs/tutorials/integrations/kserve/blue-green.md
@@ -90,7 +90,7 @@ To send inference requests to the model:
 === "From within the cluster"
     1. Create a "sleep" pod in the cluster from which requests can be made:
     ```shell
-    curl -s https://raw.githubusercontent.com/iter8-tools/docs/v0.15.1/samples/kserve-serving/sleep.sh | sh -
+    curl -s https://raw.githubusercontent.com/iter8-tools/docs/v0.15.2/samples/kserve-serving/sleep.sh | sh -
     ```
 
     2. exec into the sleep pod:
@@ -112,7 +112,7 @@ To send inference requests to the model:
 
 2. Download the sample input:
 ```shell
-curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.1/samples/kserve-serving/input.json
+curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.2/samples/kserve-serving/input.json
 ```
 
 3. Send inference requests:
diff --git a/docs/tutorials/integrations/kserve/canary.md b/docs/tutorials/integrations/kserve/canary.md
index 04c689aa..91a67e51 100644
--- a/docs/tutorials/integrations/kserve/canary.md
+++ b/docs/tutorials/integrations/kserve/canary.md
@@ -90,7 +90,7 @@ To send inference requests to the model:
 === "From within the cluster"
     1. Create a "sleep" pod in the cluster from which requests can be made:
     ```shell
-    curl -s https://raw.githubusercontent.com/iter8-tools/docs/v0.15.1/samples/kserve-serving/sleep.sh | sh -
+    curl -s https://raw.githubusercontent.com/iter8-tools/docs/v0.15.2/samples/kserve-serving/sleep.sh | sh -
    ```
 
     2. exec into the sleep pod:
@@ -117,7 +117,7 @@ To send inference requests to the model:
 
 2. Download the sample input:
 ```shell
-curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.1/samples/kserve-serving/input.json
+curl -sO https://raw.githubusercontent.com/iter8-tools/docs/v0.15.2/samples/kserve-serving/input.json
 ```
 
 3. Send inference requests: