
POST multipart upload out of memory #510

@sasha-stratoscale

Description


Hi, I have an issue with OOM (out of memory) while using multipart upload of huge binary files. This is due to the

def make_line_iter(stream, limit=None, buffer_size=10 * 1024):

code in werkzeug/werkzeug/wsgi.py.

Steps to reproduce:

Server-side code:

from flask import Flask, request

app = Flask(__name__)

@app.route('/', methods=['POST'])
def upload_file():
    file = request.files['file']
    return 'OK'

Client-side code (using requests-toolbelt):

from requests_toolbelt import MultipartEncoder
import requests

# Open in binary mode so the raw bytes are streamed unmodified.
m = MultipartEncoder(
    fields={'file': ('filename',
                     open('/var/tmp/sasha_test/input.bin', 'rb'),
                     'application/octet-stream')}
)

r = requests.post('http://localhost:5000/', data=m,
                  headers={'Content-Type': m.content_type})

print("Done!!! %s" % r)

input.bin is a large (1 GiB) binary file created like this:

dd if=/dev/zero of=input.bin bs=1MiB count=1K

What happens is that the code searches for the "\r\n" sequence in order to find the multipart-part terminator, and it keeps the last two lines in memory. Because a binary file like this one contains no line breaks at all, those "lines" grow to the size of the whole payload, which causes an OOM while uploading large files.
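The buffering behaviour can be illustrated with a simplified sketch. This is not Werkzeug's actual `make_line_iter` implementation, just a minimal model of line-splitting on a stream, showing why newline-free binary data accumulates in memory:

```python
import io

# Simplified sketch (not Werkzeug's real code) of a line iterator
# that splits a stream on newline boundaries.
def make_line_iter(stream, buffer_size=10 * 1024):
    buffer = b""
    while True:
        chunk = stream.read(buffer_size)
        if not chunk:
            if buffer:
                yield buffer
            return
        buffer += chunk
        # Emit complete lines; anything without a newline stays buffered.
        while b"\n" in buffer:
            line, buffer = buffer.split(b"\n", 1)
            yield line + b"\n"
        # For newline-free binary data the inner loop never runs,
        # so `buffer` grows until it holds the entire payload.

# 1 MiB of zero bytes, like the dd-generated file: no "\r\n" anywhere.
stream = io.BytesIO(b"\x00" * (1024 * 1024))
lines = list(make_line_iter(stream))
# The whole payload comes back as a single giant "line" held in memory.
```

With a 1 GiB upload, the same pattern means roughly the entire request body ends up buffered at once instead of being consumed in 10 KiB chunks.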

Thanks.
