C1 not validating and other race conditions #40

Closed
nickevansuk opened this issue Mar 2, 2020 · 1 comment
Labels: bug (Something isn't working), proj-2-continued (Work continuing on from Project 2 Contract)

Comments

@nickevansuk
Collaborator

nickevansuk commented Mar 2, 2020

If you run npm test -- test/flows/book-only/ against the reference implementation on master now, you'll see the issue with C1 validation - it's still not running. Very strange!

And if you try pointing it at https://everyoneactivebookingfacade-test.azurewebsites.net/, then many of the endpoints don't validate.

Perhaps there's some kind of race condition or similar happening here?

I think I might also have identified the cause of one of the "race conditions". It looks like the flows are lacking some defensive programming for cases where a prerequisite call fails - specifically the calls that hit the microservice and wait for the feed, getMatch and getOrder. I've made a quick fix for getMatch, but we need something similar for getOrder (see the sketch below).
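
Something along these lines is what I have in mind - the helper name and the `state` shape here are just illustrative, not the actual test-suite structure:

```js
// Sketch only: `getMatch` stands in for the existing call that hits the
// microservice and waits for the opportunity to appear in the feed. If it
// fails or times out, record that on the flow state instead of letting later
// steps run against an undefined opportunity.
async function getMatchWithGuard(state, opportunityId) {
  try {
    const response = await getMatch(opportunityId);
    if (!response || !response.body) {
      state.opportunityFeedError = new Error(`No feed match for ${opportunityId}`);
      return null;
    }
    state.opportunity = response.body;
    return response;
  } catch (err) {
    state.opportunityFeedError = err;
    return null;
  }
}
```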

It's also probably worth stopping all the main calls (C1, C2, B, etc.) from executing if the opportunity couldn't be retrieved from the feed for some reason, as otherwise after the timeout it'll just hit the booking system with loads of broken requests (which may cause more race conditions?). I've tried to do this here but haven't quite managed it: ec96f67#diff-7ec38b867eb5885e1977c92e06a66053
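
Roughly what I was aiming for in that commit (again, `doC1Request`, `doC2Request` and `doBRequest` are illustrative stand-ins for the real calls):

```js
// Sketch only: bail out of the main booking calls (C1, C2, B) when the
// opportunity never came back from the feed, rather than firing broken
// requests at the booking system after the timeout.
async function runBookOnlyFlow(state) {
  if (state.opportunityFeedError || !state.opportunity) {
    // Surface one clear failure and stop the flow here.
    throw new Error(`Skipping C1/C2/B: opportunity not retrieved from feed (${state.opportunityFeedError})`);
  }
  const c1Response = await doC1Request(state.opportunity);
  const c2Response = await doC2Request(state.opportunity, c1Response);
  return doBRequest(state.opportunity, c2Response);
}
```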

When this issue is closed the tests should be robust regardless of the response times of the endpoints.

nickevansuk added the bug and proj-2-continued labels Mar 2, 2020
@ylt
Member

ylt commented Mar 4, 2020

"I think I've also might have identified the cause of one of the "race conditions". It looks like the flows are lacking some defensive programming for cases where a prerequisite call fails - specifically those calls that hit the microservice and wait for the feed "

Yeah, that looks like the case. Usually you'd rely on exceptions for this, but chakram itself doesn't raise exceptions for non-200s.
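
A thin wrapper could turn unexpected status codes into exceptions so the flow aborts early. This is just a sketch - the wrapper name is made up, and I believe chakram exposes the underlying status via `response.response.statusCode`, but worth double-checking:

```js
const chakram = require('chakram');

// Sketch only: chakram resolves its promise even for non-2xx responses, so a
// wrapper can throw on unexpected status codes instead of letting later steps
// run against a bad response.
async function getOrFail(url, expectedStatus = 200) {
  const apiResponse = await chakram.get(url);
  const actualStatus = apiResponse.response.statusCode;
  if (actualStatus !== expectedStatus) {
    throw new Error(`GET ${url} returned ${actualStatus}, expected ${expectedStatus}`);
  }
  return apiResponse;
}
```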
