
Cache calls in S3 and return the last result if there is a 500 - API #2749

Closed · 4 of 5 tasks
jwchumley opened this issue Nov 14, 2017 · 5 comments
Comments

jwchumley (Contributor) commented Nov 14, 2017

Build more resiliency into the API (for a temporary Elasticsearch outage, for example) to minimize the impact on users.

Use a Flask after-request hook, plus a hook for custom 500 error handling. After a successful request, save the response to S3. On a 500 error, return the last successful response and send an alarm.

  • Background research on how hooks work in Flask.
  • Read up on decorators.
  • After that, check back with 18F on how hooks would work here with S3.
  • Make the hook and save responses to S3.
  • Retrieve the cached responses.

(Save error handling for next sprint)
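The core pattern described above (remember the last successful result, fall back to it when the upstream call fails) can be sketched as a plain Python decorator, independent of Flask or S3. This is a minimal illustration, not the project's implementation; `search` and the `UPSTREAM_DOWN` toggle are hypothetical stand-ins for an Elasticsearch-backed endpoint and an outage.

```python
import functools

# Hypothetical toggle standing in for an upstream outage.
UPSTREAM_DOWN = False

def cache_last_success(func):
    """Remember the last successful result per argument set and fall
    back to it when the wrapped call raises (stand-in for a 500)."""
    cache = {}

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        key = (args, frozenset(kwargs.items()))
        try:
            result = func(*args, **kwargs)
            cache[key] = result
            return result
        except Exception:
            if key in cache:
                # A real implementation would also send an alarm here.
                return cache[key]
            raise

    return wrapper

@cache_last_success
def search(query):
    """Hypothetical stand-in for an Elasticsearch-backed endpoint."""
    if UPSTREAM_DOWN:
        raise RuntimeError("upstream 500")
    return {"query": query, "hits": 3}

print(search("fec"))   # live result, now cached
UPSTREAM_DOWN = True
print(search("fec"))   # upstream "down": cached result is returned
```

In the real service the cache would live in S3 rather than in memory, and the fallback would be wired into Flask's error handling, but the control flow is the same.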

@PaulClark2 PaulClark2 added this to the Sprint 4.4 milestone Nov 14, 2017
@AmyKort AmyKort modified the milestones: Sprint 4.4, RBS 1 (Reliability, stability and bugs) Nov 21, 2017
pkfec (Contributor) commented Dec 21, 2017

Documenting the steps for caching web requests and uploading the responses to S3:

  • Extend the after-request hook to print the results of the request (remove later).
  • Use json.dumps if the response is not already JSON.
  • Save that file locally in a tmp folder (remove later).
  • Replace the save function with a boto3 function that can be written in webservices/util.
  • All the values should be environment variables (they should already exist, so look at the existing code).
  • Turn this into a function to call instead of writing to local disk.
  • Name the file with the folder prefix cached-calls/.
  • s3.upload_file("example.txt", BUCKET_NAME, "example-cn.txt")
  • Try that out locally with the dev S3 bucket; look up the vars and see if you can post.
  • Name files something deterministic (there should be an example in downloads).
  • When that is successful, add them to the files that don't get deleted; that is the clear_bucket() function in webservices/tasks/download.py.
  • We will want to think about how long to keep the files ("expiration date" in AWS).
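The deterministic naming and upload steps above can be sketched as two small helpers. This is an illustrative sketch, not the merged code: the hash-based key scheme and the `AWS_PUBLIC_BUCKET` environment variable name are assumptions, and `s3_client` is any boto3 S3 client passed in by the caller.

```python
import hashlib
import json
import os

def cache_key(path, params):
    """Deterministic object name under the cached-calls/ prefix:
    hash the request path plus the sorted query parameters so the
    same request always maps to the same S3 key."""
    canonical = path + "?" + "&".join(
        "%s=%s" % (k, v) for k, v in sorted(params.items()))
    digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
    return "cached-calls/%s.json" % digest

def save_response(s3_client, path, params, payload):
    """Serialize the response with json.dumps and upload it instead of
    writing to local disk. The bucket name comes from an environment
    variable (AWS_PUBLIC_BUCKET is an assumed name)."""
    body = json.dumps(payload)
    s3_client.put_object(
        Bucket=os.environ["AWS_PUBLIC_BUCKET"],
        Key=cache_key(path, params),
        Body=body.encode("utf-8"),
    )
```

Sorting the parameters before hashing is what makes the name deterministic: `?cycle=2016&page=1` and `?page=1&cycle=2016` produce the same key.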

pkfec (Contributor) commented Dec 21, 2017

Completed the upload section of caching calls (all web requests) and uploading them to S3.
I have submitted a PR for @LindsayYoung to review. Started working on retrieving the uploaded cached calls for the error codes below:

  • 500 Internal Server Error
  • 502 Bad Gateway
  • 503 Service Unavailable
  • 504 Gateway Timeout
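The retrieval side for those four status codes can be sketched as a small dispatch function. This is a hedged illustration, not the project's code: `fetch_cached` stands in for an S3 lookup (e.g. a `get_object` wrapper returning the body or None), and the `X-Cached-Response` header name is an assumption.

```python
# The 5xx codes for which we fall back to the cached copy.
RETRYABLE = {500, 502, 503, 504}

def serve_with_fallback(status, key, fetch_cached):
    """If the upstream status is a retryable 5xx and a cached copy
    exists for this request key, serve the cached body with a 200 and
    a header flagging it as stale; otherwise pass the original status
    through. fetch_cached is any callable returning the cached body
    or None."""
    if status in RETRYABLE:
        cached = fetch_cached(key)
        if cached is not None:
            return 200, cached, {"X-Cached-Response": "stale"}
    return status, None, {}

# Dict-backed stand-in for the S3 lookup:
store = {"cached-calls/abc.json": '{"hits": 3}'}
print(serve_with_fallback(502, "cached-calls/abc.json", store.get))
print(serve_with_fallback(404, "cached-calls/abc.json", store.get))
```

In Flask this logic would hang off error handlers for the four codes; a 404 (or any non-retryable status) passes through untouched so clients still see genuine client errors.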

AmyKort commented Feb 15, 2018

What's the status of this issue?

AmyKort commented Feb 16, 2018

We've completed the work described in this ticket. Work on this issue picks up at #2913.
