Support chunked transfer encoding #1198
Thanks for working on this!
There were some issues with our tests on Travis; you'll need to merge in master to fix the unrelated failing tests.
CONTRIBUTING.rst
@@ -54,7 +54,7 @@ Install Werkzeug in editable mode::

 Install the minimal test requirements::

-    pip install pytest pytest-xprocess requests
+    pip install pytest pytest-xprocess requests six
Generally looks good. Needs some fixes for the test failures, and I will look into what currently uses the content length.
- Try to read multiple chunks until the buffer is filled
- Add comments to make control flow more clear
- Add doc comment for DechunkedInput
Required by RFC 2616, Section 4.4. See: https://www.greenbytes.de/tech/webdav/rfc2616.html#rfc.section.4.4
Pushed some updates:
werkzeug/formparser.py
@@ -38,7 +38,7 @@
 def default_stream_factory(total_content_length, filename, content_type,
                            content_length=None):
     """The stream factory that is used per default."""
-    if total_content_length > 1024 * 500:
+    if total_content_length is None or total_content_length > 1024 * 500:
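The `is None` guard matters because a chunked request carries no Content-Length, so the total length arrives as `None`, and on Python 3 comparing `None` with an integer raises `TypeError`. A standalone sketch of the same decision (the function name is illustrative, not Werkzeug's actual code):

```python
import tempfile
from io import BytesIO

def make_stream(total_content_length):
    # Chunked requests have no Content-Length, so the length is None.
    # On Python 3, ``None > 1024 * 500`` raises TypeError, hence the
    # explicit None check; an unknown size is treated like a large
    # body and spooled to a temporary file to be safe.
    if total_content_length is None or total_content_length > 1024 * 500:
        return tempfile.TemporaryFile("wb+")
    return BytesIO()
```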
* master:
  Revert "Allow chunk request"
  fix flake8 errors
  codecov needs argparse on 2.6
  Fix redis tests
Alright, here you go. FWIW, I also ran some manual tests. Just one thing I'm still wondering about is whether we need to take care of infinite streams.
This shouldn't affect other WSGI server support of chunked encoding, but thanks for testing it. #1126 was supposed to handle infinite streams by setting a hard limit on the content read in, but that was based on some incorrect assumptions from me. I can't recall any special handling in Gunicorn from when I was looking a while ago, but might be good to check.
body in send instead of endheaders
I merged master again and fixed some 2.6 and style issues. Travis should pass now.
Whoop, it's all green! 🎉
Would you mind looking at how this behaves with infinite streams (not chunked, but no content length) versus how Gunicorn behaves?
Sure 👍 Please let me know if you or @mitsuhiko have a perspective on this, though.
I will merge this in the current state now. I think we can look at the edge cases once we see how others are doing it. This is basically just the local dev setup anyways.
This PR introduces support for requests with chunked transfer encoding to `WSGIRequestHandler` by wrapping the input in a `DechunkedInput`. Chunks are unrolled in the following way: once a chunk of size zero is encountered, a final newline is read, and then the input is marked as `done`. All further requests to `readinto` will exit early.

There is only a test for the success case. It needs to issue a manual HTTP request to achieve a proper chunked multipart payload.

Note: For some reason, Werkzeug crashes when parsing the response body if a `Content-Length` header is present. If you would like this fixed in this PR too, let me know. RFC 2616, Section 4.4 states that the `Content-Length` header must be ignored with any `Transfer-Encoding` other than `identity`.

Fixes #1149
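The unrolling described above can be sketched as a raw I/O wrapper. This is a hypothetical simplification, not the PR's actual `DechunkedInput`; the real class also needs to guard against malformed chunk sizes:

```python
import io

class DechunkedInput(io.RawIOBase):
    """Sketch of a wrapper that decodes a chunked request body on read.

    Simplified illustration: chunk extensions and trailers are ignored,
    and malformed sizes are not handled.
    """

    def __init__(self, rfile):
        self._rfile = rfile
        self._done = False
        self._remaining = 0  # unread bytes left in the current chunk

    def readable(self):
        return True

    def readinto(self, buf):
        read = 0
        while not self._done and read < len(buf):
            if self._remaining == 0:
                # Start of a new chunk: parse the hex size line.
                line = self._rfile.readline().strip()
                self._remaining = int(line.split(b";")[0], 16)
                if self._remaining == 0:
                    # Zero-size chunk: consume the final newline and
                    # mark the input as done; later reads exit early.
                    self._rfile.readline()
                    self._done = True
                    break
            # Copy as much of the current chunk as fits in the buffer.
            n = min(self._remaining, len(buf) - read)
            data = self._rfile.read(n)
            buf[read:read + len(data)] = data
            read += len(data)
            self._remaining -= len(data)
            if self._remaining == 0:
                self._rfile.readline()  # CRLF terminating the chunk
        return read
```

Because `readinto` keeps looping until the buffer is filled or the terminating chunk is seen, a single `read()` call can span multiple chunks.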