
feat: OpenAI Python library Integration for API Collection Testing and Coverage Reporting #2655

Open
4 of 7 tasks
Tracked by #629
Van-QA opened this issue Apr 9, 2024 · 1 comment · Fixed by #2778
Assignees
Labels
P1: important Important feature / fix type: feature request A new feature

Comments

@Van-QA
Contributor

Van-QA commented Apr 9, 2024

Problem:
Currently, there is no seamless way to run checks on our API collection using OpenAI's tools and sync the output to a Report Portal for coverage analysis.

Success Criteria:

  • Allow us to integrate OpenAI's tools, specifically the openai-python library (https://github.com/openai/openai-python), to run checks on our API collection via the pipeline (see the sketch after this list).
  • Enable us to automatically sync the output from these checks to a Report Portal for easy analysis and coverage tracking.
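
A minimal sketch of what the first criterion could look like, assuming the collection is stored as a JSON list of recorded requests/responses and that a chat-completion prompt judges each response against its documented schema. The collection format, prompt, model choice, and the `check_collection_item` helper are illustrative assumptions, not existing code:

```python
# Hypothetical sketch: use openai-python to review recorded API collection responses.
# The collection schema and prompt are assumptions, not an existing format.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def check_collection_item(item: dict) -> dict:
    """Ask the model whether a recorded response matches the documented schema."""
    prompt = (
        f"Endpoint: {item['method']} {item['path']}\n"
        f"Expected schema: {json.dumps(item['expected_schema'])}\n"
        f"Actual response: {json.dumps(item['actual_response'])}\n"
        "Answer PASS or FAIL, then explain briefly."
    )
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    verdict = completion.choices[0].message.content
    return {"name": f"{item['method']} {item['path']}", "verdict": verdict}

if __name__ == "__main__":
    with open("api_collection.json") as f:
        collection = json.load(f)
    results = [check_collection_item(item) for item in collection]
    print(json.dumps(results, indent=2))
```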

Additional Context:
The proposed feature would allow us to streamline our testing and analysis by automating the process of running checks on our API collection using OpenAI's tools. Additionally, the ability to sync the output from these checks to a Report Portal would enable us to easily track and analyze coverage, making it easier to identify areas for improvement and optimize our API collection.

This feature would be particularly useful for our CI/CD pipeline, as it would allow us to automatically run checks on our API collection as part of our testing process, and ensure that all changes and updates to the API collection are thoroughly tested and analyzed for coverage.
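For the pipeline and Report Portal sync, one possible (not yet implemented) approach is to wrap each collection item in a parametrized pytest case, so an agent such as the pytest-reportportal plugin can push per-item pass/fail results to Report Portal from CI without extra glue code. The `check_collection` module name and `api_collection.json` path below are assumptions carried over from the sketch above:

```python
# Hypothetical pytest wrapper: each collection item becomes a test case whose
# result a ReportPortal agent could forward. Module and file names are assumed.
import json
import pytest
from check_collection import check_collection_item  # the sketch above

with open("api_collection.json") as f:
    COLLECTION = json.load(f)

@pytest.mark.parametrize(
    "item", COLLECTION, ids=lambda i: f"{i['method']} {i['path']}"
)
def test_api_collection_item(item):
    result = check_collection_item(item)
    assert result["verdict"].startswith("PASS"), result["verdict"]
```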

Overall, this feature would greatly improve our ability to maintain and optimize our API collection, and ensure that it meets the standard of the OpenAI API.

Sub tasks:

(screenshot: sub-task checklist)
  • Relocate test runner to Cortex (blocked until the refactor from the dev team)
@Van-QA Van-QA added the type: feature request A new feature label Apr 9, 2024
@Van-QA Van-QA added the P1: important Important feature / fix label Apr 10, 2024
@0xSage
Contributor

0xSage commented Apr 15, 2024

Thanks for this. Would be great to have a publicly viewable dashboard as well 🙏

@Van-QA Van-QA self-assigned this Apr 19, 2024
@Van-QA Van-QA reopened this Apr 25, 2024
Projects
Status: In Progress