
Genie v1 prod with own renderers #1

Closed
wants to merge 5 commits into from

Conversation

essenciary

This setup produced the best performance on my machine.

@StatisticalMice
Owner

I thought about the logging. In my opinion, logging is necessary for a production server, so it should be retained. If it makes this slower, then it is slower.

@StatisticalMice
Owner

I found it easier to combine some of your changes with my local changes and commit them separately.

@essenciary
Author

essenciary commented Jun 27, 2021

Of course, re: logging. What matters is that:
a) there is choice (people can disable logging, log to file, log to services, etc)
b) when comparing various frameworks, they should run in similar conditions (e.g. number of threads, production settings if available, logging, etc.). Taking things further, Genie already provides many more features out of the box: for example, html(...) and json(...) both output the correct Content-Type, while HTTP.jl seems to output text for both endpoints. So the HTTP.jl test should be extended to include the correct headers (these should not be optional/omitted when returning HTML or JSON responses).
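A minimal sketch (not code from this PR) of what the extended HTTP.jl endpoint could look like, with the Content-Type set explicitly per response the way Genie's html(...) and json(...) do it; the handler, route paths, and response bodies are all hypothetical:

```julia
# Hypothetical HTTP.jl handler: set explicit Content-Type headers so the
# HTTP.jl endpoints match what Genie sets out of the box.
using HTTP, JSON3

function handler(req::HTTP.Request)
    if req.target == "/json"
        # application/json, as Genie's json(...) would send
        return HTTP.Response(200, ["Content-Type" => "application/json"],
                             JSON3.write((message = "Hello, World!",)))
    else
        # text/html, as Genie's html(...) would send
        return HTTP.Response(200, ["Content-Type" => "text/html; charset=utf-8"],
                             "<h1>Hello, World!</h1>")
    end
end

# HTTP.serve(handler, "127.0.0.1", 8080)  # start the server (commented out here)
```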

@StatisticalMice
Owner

I created a ticket about the difference between Genie.jl and HTTP.jl.

@StatisticalMice
Owner

I'm not sure what to do about the logging in tests. Maybe it should be disabled.

@essenciary
Author

Oh, and another thing re: logging: in production Genie logs to file (which is the right thing to do). So logging needs to be similar between frameworks (either disable file logging for Genie or enable it for the others).

Overall the best approach would be to start off with the requirements for the tests, and make sure every framework meets them. These could be:
1/ running in production environment (as offered by each framework)
2/ HTML, JSON and text responses (with proper content-type headers)
3/ Logging to the same backend (console and/or text file)
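For the third requirement, a minimal sketch of pointing Julia's standard logger at a shared text file (the file name is made up for illustration; the non-Julia frameworks would need an equivalent configuration):

```julia
# Hypothetical setup: route all Julia-side log records to one text file,
# so every Julia framework in the benchmark logs to the same backend.
using Logging

io = open("benchmark.log", "a")
global_logger(SimpleLogger(io, Logging.Info))

@info "server started"   # written to benchmark.log, not the console
flush(io)
```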

@essenciary
Author

essenciary commented Jun 27, 2021

While set up similarly (same output), logging should not make any difference, given that both frameworks use Julia's logger. Disabling should be fine (it will remove noise from both frameworks). Flask should run in the same conditions, though, in terms of logging.

If any framework makes optimizations for logging (e.g. batching writes or something), those would then be relevant, as they'd employ different/smarter strategies (so maybe Flask?)
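Disabling, as discussed, can be done with the standard library's NullLogger, which rejects every record before any formatting or I/O happens; a minimal sketch (the benchmark wiring around it is assumed, not from this PR):

```julia
# Hypothetical benchmark setup: drop all Julia-side log records so neither
# framework pays logging overhead during the run.
using Logging

global_logger(NullLogger())
@info "this record is dropped"   # no output, no I/O

# NullLogger's minimum enabled level sits above every real level, so
# records are rejected by the cheap level check up front.
```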

@StatisticalMice
Owner

Removing the logging sounds good, at least for the moment.

As to Flask, it's not a good test target. I only spent 15 minutes coding it, and that was to debug problems when running on a Mac. Flask would also be the one benefiting most from an nginx in front, and nginx won't happen for a while at least.

@StatisticalMice
Owner

I removed logging.

My plan is to always run the official benchmarks on Google Cloud. I created terraform/ansible scripts that set up the environment. (They don't quite work yet; they install things, but communication doesn't work, possibly due to firewall rules.)

Then I'd have shell scripts that start one of the servers, or the goose attack client. They worked locally, but not yet in the cloud.

I'll close this PR as it's no longer serving a useful purpose, but further PRs are welcome.
