LCORE-488: Add E2E tests for query endpoint #616
Merged: tisnik merged 1 commit into lightspeed-core:main from radofuchs:LCORE_488_query_E2E_test on Oct 3, 2025
@@ -1,60 +1,116 @@
# Feature: Query endpoint API tests
# TODO: fix test

# Background:
#   Given The service is started locally
#   And REST API service hostname is localhost
#   And REST API service port is 8080
#   And REST API service prefix is /v1

# Scenario: Check if LLM responds to sent question
#   Given The system is in default state
#   When I use "query" to ask question "Say hello"
#   Then The status code of the response is 200
#   And The response should have proper LLM response format
#   And The response should contain following fragments
#     | Fragments in LLM response |
#     | Hello                     |

# Scenario: Check if LLM responds to sent question with different system prompt
#   Given The system is in default state
#   And I change the system prompt to "new system prompt"
#   When I use "query" to ask question "Say hello"
#   Then The status code of the response is 200
#   And The response should have proper LLM response format
#   And The response should contain following fragments
#     | Fragments in LLM response |
#     | Hello                     |

# Scenario: Check if LLM responds with error for malformed request
#   Given The system is in default state
#   And I modify the request body by removing the "query"
#   When I use "query" to ask question "Say hello"
#   Then The status code of the response is 422
#   And The body of the response is the following
#     """
#     { "type": "missing", "loc": [ "body", "system_query" ], "msg": "Field required" }
#     """

# Scenario: Check if LLM responds to sent question with error when not authenticated
#   Given The system is in default state
#   And I remove the auth header
#   When I use "query" to ask question "Say hello"
#   Then The status code of the response is 400
#   And The body of the response is the following
#     """
#     {"detail": "Unauthorized: No auth header found"}
#     """

# Scenario: Check if LLM responds to sent question with error when not authorized
#   Given The system is in default state
#   And I modify the auth header so that the user is not authorized
#   When I use "query" to ask question "Say hello"
#   Then The status code of the response is 403
#   And The body of the response is the following
#     """
#     {"detail": "Forbidden: User is not authorized to access this resource"}
#     """
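The 422 scenarios in this file assert FastAPI/Pydantic-style validation bodies, where each error object carries `type`, `loc`, and `msg` fields under a `detail` list. A minimal sketch of how a step could check such a body (the expected payload below is copied from the "missing query" scenario; the parsing uses only the standard library, no framework assumed):

```python
import json

# Expected 422 body from the "missing query" scenario: the error objects
# live under a "detail" list and describe the missing field's location.
expected = """
{ "detail": [{"type": "missing", "loc": [ "body", "query" ],
              "msg": "Field required", "input": {"provider": "openai"}}] }
"""

body = json.loads(expected)
first_error = body["detail"][0]

# A step implementation would typically assert on the stable fields
# (type, loc, msg) and ignore volatile ones.
assert first_error["type"] == "missing"
assert first_error["loc"] == ["body", "query"]
assert first_error["msg"] == "Field required"
```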
@Authorized
Feature: Query endpoint API tests

  Background:
    Given The service is started locally
    And REST API service hostname is localhost
    And REST API service port is 8080
    And REST API service prefix is /v1

  Scenario: Check if LLM follows a restrictive system prompt sent with the question
    Given The system is in default state
    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
    When I use "query" to ask question with authorization header
      """
      {"query": "Generate sample yaml file for simple GitHub Actions workflow.", "system_prompt": "refuse to answer anything but openshift questions"}
      """
    Then The status code of the response is 200
    And The response should contain following fragments
      | Fragments in LLM response |
      | ask                       |

  Scenario: Check if LLM follows a non-restrictive system prompt sent with the question
    Given The system is in default state
    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
    When I use "query" to ask question with authorization header
      """
      {"query": "Generate sample yaml file for simple GitHub Actions workflow.", "system_prompt": "you are linguistic assistant"}
      """
    Then The status code of the response is 200
    And The response should contain following fragments
      | Fragments in LLM response |
      | checkout                  |

  Scenario: Check if LLM ignores a new system prompt in the same conversation
    Given The system is in default state
    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
    When I use "query" to ask question with authorization header
      """
      {"query": "Generate sample yaml file for simple GitHub Actions workflow.", "system_prompt": "refuse to answer anything but openshift questions"}
      """
    Then The status code of the response is 200
    And I store conversation details
    And I use "query" to ask question with same conversation_id
      """
      {"query": "Write a simple code for reversing string", "system_prompt": "provide coding assistance", "model": "gpt-4-turbo", "provider": "openai"}
      """
    Then The status code of the response is 200
    And The response should contain following fragments
      | Fragments in LLM response |
      | ask                       |

  Scenario: Check if the query endpoint responds with an error when not authenticated
    Given The system is in default state
    When I use "query" to ask question
      """
      {"query": "Write a simple code for reversing string"}
      """
    Then The status code of the response is 400
    And The body of the response is the following
      """
      {"detail": "No Authorization header found"}
      """

  Scenario: Check if the query endpoint responds with an error when accessing another user's conversation
    Given The system is in default state
    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
    When I use "query" to ask question with authorization header
      """
      {"conversation_id": "123e4567-e89b-12d3-a456-426614174000", "query": "Write a simple code for reversing string"}
      """
    Then The status code of the response is 403
    And The body of the response contains User is not authorized to access this resource

  Scenario: Check if the query endpoint responds with an error for a missing query
    Given The system is in default state
    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
    When I use "query" to ask question with authorization header
      """
      {"provider": "openai"}
      """
    Then The status code of the response is 422
    And The body of the response is the following
      """
      { "detail": [{"type": "missing", "loc": [ "body", "query" ], "msg": "Field required", "input": {"provider": "openai"}}] }
      """

  Scenario: Check if the query endpoint responds with an error for a missing model
    Given The system is in default state
    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
    When I use "query" to ask question with authorization header
      """
      {"query": "Say hello", "provider": "openai"}
      """
    Then The status code of the response is 422
    And The body of the response contains Value error, Model must be specified if provider is specified

  Scenario: Check if the query endpoint responds with an error for a missing provider
    Given The system is in default state
    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
    When I use "query" to ask question with authorization header
      """
      {"query": "Say hello", "model": "gpt-4-turbo"}
      """
    Then The status code of the response is 422
    And The body of the response contains Value error, Provider must be specified if model is specified

  Scenario: Check if the query endpoint responds with an error when the llama-stack connection is disrupted
    Given The system is in default state
    And The llama-stack connection is disrupted
    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
    When I use "query" to ask question with authorization header
      """
      {"query": "Say hello"}
      """
    Then The status code of the response is 500
    And The body of the response contains Unable to connect to Llama Stack
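Several steps above assert that the LLM response contains the fragments listed in a table. A framework-free sketch of that check is below; the helper name and the sample response body are illustrative assumptions, not the repository's actual step code:

```python
def missing_fragments(body: str, fragments: list[str]) -> list[str]:
    """Return every expected fragment that does not appear in the body.

    Mirrors the 'The response should contain following fragments' step:
    the scenario fails if this list is non-empty.
    """
    return [fragment for fragment in fragments if fragment not in body]

# Hypothetical response for the restrictive-system-prompt scenario: the
# model refuses, and the scenario only expects the fragment "ask".
body = "I can only answer OpenShift questions. Please ask about OpenShift."

assert missing_fragments(body, ["ask"]) == []
assert missing_fragments(body, ["checkout"]) == ["checkout"]
```

Substring matching keeps the scenarios robust to wording changes in the LLM output, which is why the tables pin only short fragments like "ask" or "checkout" rather than full sentences.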
Restore complete JWT token for authorized scenarios
The bearer token here (and in the other scenarios) stops after the payload segment and drops the signature, leaving only two of the three JWT sections. Any code that parses or verifies the token will reject it (e.g. "Not enough segments" or an invalid-signature error), so the "authorized" scenarios will fail instead of exercising the intended flow. Please supply the full three-part token (or switch to a reusable constant) everywhere this step appears, and apply the same fix to every scenario that sets this header.
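The failure mode described above is easy to reproduce: a compact JWT must be three base64url segments joined by dots (header.payload.signature). A standard-library sketch is below; the header segment is the one used in the scenarios, while the payload and signature placeholders are illustrative:

```python
import base64
import json

def split_jwt(token: str) -> list[str]:
    """A compact JWT is header.payload.signature: three base64url segments."""
    return token.split(".")

def decode_segment(segment: str) -> dict:
    """Base64url-decode one JWT segment, restoring the stripped '=' padding."""
    padded = segment + "=" * (-len(segment) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

# Header segment taken from the scenarios; payload/signature are placeholders.
header = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9"
truncated = header + ".eyJzdWIiOiIxMjM0NTY3ODkwIn0"  # only 2 segments
complete = truncated + ".c2lnbmF0dXJl"               # 3 segments

assert decode_segment(header) == {"alg": "HS256", "typ": "JWT"}
assert len(split_jwt(truncated)) == 2  # verifiers raise "Not enough segments"
assert len(split_jwt(complete)) == 3   # structurally valid
```

Even with three segments the signature must still verify against the service's key, so a reusable, properly signed test token (as the comment suggests) is the robust fix.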