
Create large data set for testing purposes #46

Open
bernardbaker opened this issue Apr 16, 2020 · 2 comments


bernardbaker commented Apr 16, 2020

@un7c0rn I need a large data set for testing purposes.

Can you write a 🐍 script that I could run?

The reason is that I didn't use from concurrent.futures import ThreadPoolExecutor as you did in the Python script firestore.py.

So we need to run tests to see how well the API copes under load, and to measure usage per user with different-sized data sets.

E.g. 1 user makes a request to the API, which processes 1,000 entries in the Firestore DB.
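A minimal sketch of the kind of seeding script I have in mind, assuming the google-cloud-firestore client library; the collection name "entries" and the field names are placeholders for illustration only:

```python
# Sketch only: assumes the google-cloud-firestore client library and a
# placeholder collection name "entries"; field names are invented.
import uuid
from concurrent.futures import ThreadPoolExecutor

from google.cloud import firestore

db = firestore.Client()

def write_entry(i):
    # One synthetic document per call, spread across a few fake users.
    db.collection("entries").document(str(uuid.uuid4())).set({
        "index": i,
        "user": f"test-user-{i % 10}",
        "payload": "x" * 256,  # fixed-size filler data
    })

def seed(n=1000, workers=16):
    # Parallelise writes with ThreadPoolExecutor, as firestore.py does.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(write_entry, range(n)))

if __name__ == "__main__":
    seed(1000)
```

Calling seed() with larger values (10,000, 100,000, ...) would give the different-sized data sets for the per-user measurements.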

Some information on the Netlify quota for serverless functions:

  • Requests per month: 125,000
  • Runtime per month: 100 hours

By default, all serverless functions are deployed with:

  • us-east-1 AWS Lambda region
  • 1024MB of memory
  • 10 second execution limit
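Given the 10 second execution limit, it would also be worth timing a single request at each data-set size before worrying about the monthly quota. A rough sketch, where the function URL and the "limit" query parameter are placeholders for the real endpoint:

```python
# Sketch only: API_URL and the "limit" parameter are placeholders for the
# real Netlify function endpoint and its query interface.
import time
import requests

API_URL = "https://example.netlify.app/.netlify/functions/process"

def time_request(entries):
    # Time one end-to-end request and fail loudly on non-2xx responses.
    start = time.perf_counter()
    resp = requests.get(API_URL, params={"limit": entries}, timeout=30)
    resp.raise_for_status()
    return time.perf_counter() - start

if __name__ == "__main__":
    for n in (100, 500, 1000):
        print(f"{n} entries: {time_request(n):.2f}s")
```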

I know that Firestore is blazingly fast at processing.
But it would be good to have a dashboard if one isn't already available.

🌞

I think this could help web workers. I'll look at it tomorrow.

bernardbaker added the testing label on Apr 16, 2020
bernardbaker added this to the version 1 milestone on Apr 16, 2020

un7c0rn commented Apr 17, 2020

@bernardbaker Sounds good. Can you provide a quick example of the format you need the test data to be in? I'm a little concerned about the Netlify quota. I'm assuming we can scale up requests for a fee as necessary? If we use GCP to host the JS lambda functions we get 2M invocations / month free: https://cloud.google.com/functions/pricing

That said, I'm also in favor of getting a working end-to-end testable MVP done and worrying about pricing optimizations as a second step.


bernardbaker commented Apr 17, 2020 via email
