doc: fix port-forward command in run-inference.md #232
Conversation
Signed-off-by: Christian Kadner <ckadner@us.ibm.com>
chinhuang007 left a comment:

Looks good, thanks, @ckadner!
[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: chinhuang007, ckadner

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files. Approvers can indicate their approval by writing /approve in a comment.
/lgtm

/lgtm

Not sure why the bot isn't merging these... I'll do it manually.

I think the bot was waiting for both
release: Update image tags for v0.11.1 (kserve#440)
Signed-off-by: konflux-internal-p02 <170854209+konflux-internal-p02[bot]@users.noreply.github.com> Co-authored-by: konflux-internal-p02[bot] <170854209+konflux-internal-p02[bot]@users.noreply.github.com>
Motivation
The code snippet to port-forward the modelmesh-serving service did not work:

```
$ kubectl port-forward modelmesh-serving 8033:8033
Error from server (NotFound): pods "modelmesh-serving" not found
```

Modifications

Update the code snippet to specify the resource type service:

Result
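With the resource type prefixed, kubectl resolves a Service instead of looking for a Pod of that name. A minimal sketch of the corrected snippet, assuming the Service is named modelmesh-serving and lives in the current namespace:

```sh
# Forward local port 8033 to port 8033 of the modelmesh-serving Service;
# the "service/" prefix tells kubectl to target a Service rather than a Pod.
kubectl port-forward service/modelmesh-serving 8033:8033
```

Once the forward is established, inference requests can be sent to localhost:8033 as described in run-inference.md.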
/cc @njhill
/cc @chinhuang007