
use chunked encoding for streaming responses with HTTP/1.1 #2091

Merged
davidism merged 1 commit into pallets:main from rawler:chunked-encoding on Jan 28, 2022

Conversation

@rawler (Contributor) commented Apr 19, 2021

This approach differs a bit from the one in PR#1328 in that it is completely contained in the server, without any change in the application (as per the RFC, there is no reliance on hop-by-hop headers). It is triggered when HTTP/1.1 is configured (for example by specifying a subclass of request_handler), at which point the Connection: close behavior is always replaced with Transfer-Encoding: chunked. The rationale is that any spec-compliant HTTP client must support chunked encoding, so this should not cause unjustifiable regressions for clients. From what I've seen, it also matches the behavior of other WSGI servers, which helps apps behave consistently between test and production.
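For context, here's a minimal sketch of the kind of application this affects: a streaming WSGI response with no Content-Length, served over HTTP/1.1 by passing a request_handler subclass to run_simple. The handler class and app below are illustrative only, not part of this PR:

from werkzeug.serving import WSGIRequestHandler, run_simple


class HTTP11RequestHandler(WSGIRequestHandler):
    # Opting in to HTTP/1.1; with this change the dev server frames
    # streaming responses with Transfer-Encoding: chunked instead of
    # relying on Connection: close to mark the end of the body.
    protocol_version = "HTTP/1.1"


def app(environ, start_response):
    # A streaming response: chunks are produced one at a time and no
    # Content-Length is set, so the server must delimit the body itself.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return (f"line {i}\n".encode() for i in range(5))


if __name__ == "__main__":
    run_simple("localhost", 5000, app, request_handler=HTTP11RequestHandler)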

Checklist:

  • Add tests that demonstrate the correct behavior of the change. Tests should fail without the change.
  • Add or update relevant docs, in the docs folder and in code.
  • Add an entry in CHANGES.rst summarizing the change and linking to the issue.
  • Add .. versionchanged:: entries in any relevant code docs.
  • Run pre-commit hooks and fix any issues.
  • Run pytest and tox, no tests failed.

@rawler (Contributor, Author) commented Apr 19, 2021

I'm not sure whether any docs need to change as a result of this. If so, I'd appreciate a pointer to the right place.

@rawler rawler force-pushed the chunked-encoding branch 2 times, most recently from 9b059f7 to 2f252e2 on April 19, 2021 at 19:42
@davidism davidism closed this Apr 24, 2021
@davidism davidism reopened this Apr 24, 2021
@davidism (Member) commented Jan 28, 2022

While playing around with this and making HTTP/1.1 the default, I've noticed that the debugger page can be inconsistent about loading all of its resources in HTTP/1.1 keep-alive mode: every 3 reloads or so it gets stuck loading the font file. As soon as I go back to HTTP/1.0 mode, where every connection is closed after a single request, everything loads. The same is true if I enable the threaded server.

This did lead me to notice that the debugger could use send_file and set ETags for the resource files; once they're cached, there are no issues with the page load. Additionally, the initial debugger page could be modified to return a Content-Length header so it doesn't act like a streaming response for no reason.
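As a rough sketch of that idea (the helper name and resource path below are hypothetical, not the debugger's actual code), werkzeug.utils.send_file sets Content-Length and an ETag by default and answers conditional requests:

import os

from werkzeug.utils import send_file


def serve_debugger_resource(environ, filename):
    # Hypothetical helper: send_file sets Content-Length and an ETag by
    # default and handles conditional requests, so once the browser has
    # cached a resource it can be revalidated cheaply on later loads.
    path = os.path.join("shared", filename)  # illustrative location
    return send_file(path, environ)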

In my own projects I've been leaning towards using Gunicorn or another WSGI server in development, since it generally supports more features than the Werkzeug dev server ever will. You can even enable Werkzeug's debugger; here's an example with Flask:

from flask import Flask
from werkzeug.debug import DebuggedApplication

app = Flask(__name__)

if app.env == "development":
    app.wsgi_app = DebuggedApplication(app, evalex=True)

$ export FLASK_ENV=development
$ gunicorn -b localhost:5000 --reload example:app

@davidism (Member) commented

OK, after a bit more experimenting, I think it's safe to enable HTTP/1.1 if threaded or processes is used.
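For reference, a minimal sketch of what that looks like with the dev server; the trivial app, host, and port below are placeholders:

from werkzeug.serving import run_simple
from werkzeug.wrappers import Response


def app(environ, start_response):
    # Trivial WSGI app standing in for the real application.
    return Response("hello")(environ, start_response)


# threaded=True handles each request in its own thread; processes=N is
# the fork-based alternative. Either provides the concurrency under
# which HTTP/1.1 keep-alive behaved well in the testing above.
run_simple("localhost", 5000, app, threaded=True)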

For HTTP/1.0, the only way to stream content is to rely on the closing of the connection to convey the end of the response. This can lead to silently truncated content in HTTP client implementations.

For HTTP/1.1, the better way is to use `Transfer-Encoding: chunked`, which has a built-in mechanism for terminating the response with a 0-sized chunk, marking the end of the stream.
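For illustration, the chunked framing itself is simple: each chunk is its size in hex, then CRLF, the payload, and another CRLF, and a zero-sized chunk terminates the body. A small sketch (these helpers are mine, not Werkzeug's API):

def encode_chunk(data: bytes) -> bytes:
    # One chunk on the wire: size in hex, CRLF, payload, CRLF.
    return b"%x\r\n" % len(data) + data + b"\r\n"


def last_chunk() -> bytes:
    # A zero-sized chunk ends the body, so the client knows the response
    # is complete without waiting for the connection to close.
    return b"0\r\n\r\n"


# b"5\r\nhello\r\n0\r\n\r\n"
wire = encode_chunk(b"hello") + last_chunk()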
@davidism davidism added this to the 2.1.0 milestone Jan 28, 2022
@davidism (Member) commented

I changed

if self.server.passthrough_errors or (status_sent and chunk_response):
    raise

to

if self.server.passthrough_errors:
    raise

if status_sent is not None and chunk_response:
    self.close_connection = True

This way the dev server's traceback logging is used instead of raising the error up to the base server.

@davidism davidism changed the title from "serving: Use chunked-encoding for progressive content under HTTP/1.1" to "use chunked encoding for streaming responses with HTTP/1.1" on Jan 28, 2022
@davidism davidism merged commit 6631088 into pallets:main Jan 28, 2022
@davidism davidism mentioned this pull request Jan 29, 2022
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Feb 12, 2022
@rawler rawler deleted the chunked-encoding branch May 4, 2023 14:25
Successfully merging this pull request may close these issues:

  • Chunked response support for werkzeug.serving
  • Add support for chunked responses on test server