OLS-2367: Documentation note about filtering/redaction with introspection #103658
Conversation
@rh-tokeefe: This pull request references OLS-2367 which is a valid jira issue.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the openshift-eng/jira-lifecycle-plugin repository.
🤖 Tue Dec 16 12:52:54 - Prow CI generated the docs preview.
Force-push: 9e8154d to 993dd46
[NOTE]
====
If you configure filtering or redacting in the `OLSConfig` CR file, and you configure `introspectionEnabled` in a Model Context Protocol (MCP) server, content that tools return is not filtered and is visible to the LLM.
====
introspectionEnabled isn't configured "in an MCP server" -- that's what enables the OCP MCP server.
done
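For context, the two settings discussed in this thread live in the same custom resource. The following is a minimal, hypothetical sketch of an `OLSConfig` CR combining a redaction filter with introspection; the field names and values are assumptions for illustration and are not verified against the lightspeed-operator CRD:

```yaml
# Hypothetical OLSConfig sketch -- field names are assumptions,
# not verified against the lightspeed-operator CRD.
apiVersion: ols.openshift.io/v1alpha1
kind: OLSConfig
metadata:
  name: cluster
spec:
  ols:
    # Redacts matching text from user questions before they reach the LLM.
    queryFilters:
      - name: ip-address
        pattern: '((25[0-5]|(2[0-4]|1\d|[1-9]|)\d)\.?\b){4}'
        replaceWith: '<IP_ADDRESS>'
    # Enables the cluster interaction feature. Per the NOTE above, content
    # that MCP tools return through this path is NOT passed through
    # queryFilters and is visible to the LLM unfiltered.
    introspectionEnabled: true
```

This illustrates why the documentation note matters: the filters apply to user input, not to tool output returned via introspection.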
The {ols-long} Service uses a large language model (LLM) to generate responses to questions. You can enable the cluster interaction feature to enhance the knowledge available to the LLM with information about your {ocp-product-title} cluster. Providing information about the Kubernetes objects that the cluster contains enables the LLM to generate highly specific responses for your environment.

- The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to an LLM. Using the protocol, an MCP server offers a standardized way for an LLM to increase context by requesting and receiving real-time information from external resources.
+ The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to an LLM. Using the protocol, a MCP server offers a standardized way for an LLM to increase context by requesting and receiving real-time information from external resources.
"an LLM"
"a MCP"
I understand we're just following company agreements but weird consistency
Thank you @JoaoFula, I used a new editing tool Red Hat is rolling out. It flagged one and not the other, so I shared this finding with them. Thank you for commenting on this! I missed it, but hopefully the feedback helps make a difference in the tool.
I updated the PR.
LGTM
gabriel-rh left a comment:
LGTM - I would use "an" in front of "LLM", "MCP", etc
The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to an LLM. Using the protocol, an MCP server offers a standardized way for an LLM to increase context by requesting and receiving real-time information from external resources.
I would leave all these as "an" - is there a style rule that decrees otherwise?
@gabriel-rh The CEA tool flagged it, so I changed it to comply with the tool.
@gabriel-rh come to think of it, the tool is Beta. I think it should be "an" as well, like it was. So, I pushed the change to the PR.
@rh-tokeefe: all tests passed! Full PR test history. Your PR dashboard.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.
/cherrypick lightspeed-docs-1.0
@rh-tokeefe: new pull request created: #103964
Affects:
lightspeed-main
lightspeed-docs-1.0
The PR must be cherry-picked back to the lightspeed-docs-1.0 branch.
Issue: https://issues.redhat.com/browse/OLS-2367
Link to docs preview:
cluster interaction conceptual topic:
https://103658--ocpdocs-pr.netlify.app/openshift-lightspeed/latest/configure/ols-configuring-openshift-lightspeed.html#about-cluster-interaction_ols-configuring-openshift-lightspeed
task topic:
https://103658--ocpdocs-pr.netlify.app/openshift-lightspeed/latest/configure/ols-configuring-openshift-lightspeed.html#ols-filtering-and-redacting-information_ols-configuring-openshift-lightspeed
QE review:
Additional information: