
VCR shouldn't record 500, 408, 429, when running challenges #4461

Closed
1 task done
waynehamadi opened this issue May 29, 2023 · 2 comments

Comments

@waynehamadi
Contributor

Duplicates

  • I have searched the existing issues

Summary 💡

Reminder: VCR records HTTP interactions so that when the same HTTP request is made again, the recorded response is replayed instead of hitting the network.

This makes tests cheaper and faster, with few drawbacks apart from maintaining the cassettes ;-)

Currently VCR records 500, 408, and 429 responses.
This means that if we get rate-limited while recording a challenge, the cassette will replay that same error on every subsequent run.

TODO: there is a challenge decorator; find a way to change VCR's behavior so that it stops recording anything that isn't a 200 when a function uses this decorator.

Not to do: do not ignore all non-200 errors in VCR globally; some of our integration tests need the 500 responses.
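As a minimal sketch, vcrpy supports a `before_record_response` hook that can drop an interaction by returning `None`. The status codes and function name below match this issue; wiring it up only for decorated challenges is the open question:

```python
# Sketch of a vcrpy before_record_response hook that skips recording
# transient error responses (500, 408, 429) so they never end up in a
# cassette. In vcrpy's serialized format, the response is a dict whose
# status code lives under response["status"]["code"].

TRANSIENT_STATUS_CODES = {500, 408, 429}

def drop_transient_errors(response):
    """Return None for transient errors so VCR discards the interaction."""
    if response["status"]["code"] in TRANSIENT_STATUS_CODES:
        return None  # VCR skips recording this interaction
    return response
```

This keeps the filtering scoped: tests that deliberately exercise 500 handling simply don't install the hook.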

Examples 🌈

No response

Motivation 🔦

No response

@erik-megarad
Contributor

I opened a PR, #4469, which implements this change.

Unfortunately, due to the way request filtering in pytest-vcr works, we don't have access to the pytest context, so we can't tell whether the challenge marker is set. Instead, this applies the filtering to all tests in the challenges module. I think this is the only way to do it.
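A sketch of what that module-wide approach could look like: pytest-vcr looks up a fixture named `vcr_config` and passes the returned dict as keyword arguments to `vcr.VCR()`, so placing it in a `conftest.py` under the challenges directory applies the filter to every test there (the fixture name is pytest-vcr's convention; the filter function and scope are assumptions for illustration):

```python
# Hypothetical conftest.py for the challenges module. Because the
# before_record_response hook can't see pytest markers, scoping is done
# by file location: every test collected under this conftest gets the
# filter.

import pytest

def drop_transient_errors(response):
    """Skip recording 500/408/429 responses for challenge cassettes."""
    if response["status"]["code"] in {500, 408, 429}:
        return None
    return response

@pytest.fixture(scope="module")
def vcr_config():
    # pytest-vcr forwards this dict to vcr.VCR(...)
    return {"before_record_response": drop_transient_errors}
```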

erik-megarad added a commit to erik-megarad/Auto-GPT that referenced this issue May 29, 2023
@waynehamadi
Contributor Author

@erik-megarad Great thank you!
