#35415: Adding content_type to StreamingHttpResponse on Linux causes memory
error after streaming around 1GB-2GB of data.
-----------------------------------------+------------------------
               Reporter:  LouisB12345    |          Owner:  nobody
                   Type:  Bug            |         Status:  new
              Component:  HTTP handling  |        Version:  5.0
               Severity:  Normal         |       Keywords:
           Triage Stage:  Unreviewed     |      Has patch:  0
    Needs documentation:  0              |    Needs tests:  0
Patch needs improvement:  0              |  Easy pickings:  0
                  UI/UX:  0              |
-----------------------------------------+------------------------
This bug took a few days to work out and was extremely annoying.
I'm running Django under ASGI and was trying to stream an on-the-fly zip
file using StreamingHttpResponse. Note: I don't know if this also occurs
under WSGI.
I develop on a Windows machine, and after I deemed the code functional I
tried it on the Linux VM I have set up. I noticed that the download would
fail almost every time. The cause was that memory usage kept increasing
over time, usually after around 1-2 GB had been streamed. After
eliminating multiple factors, I came to the conclusion that the bug
occurs when I pass content_type= to StreamingHttpResponse.
You can replicate the bug on Linux with the code below; if you remove the
content_type it works as expected, but with it the bug occurs.
{{{
import asyncio
import logging
import threading
from os.path import basename

import aiofiles
from django.contrib.auth.mixins import LoginRequiredMixin
from django.http import StreamingHttpResponse
from django.views import View
from guppy import hpy

H = hpy()
LOGGER = logging.getLogger(__name__)


class DownloadSelectedFiles(LoginRequiredMixin, View):
    def get(self, request) -> StreamingHttpResponse:
        file_name = "f.txt"
        response = StreamingHttpResponse(
            file_data(file_name),
            content_type="application/octet-stream",
        )
        response["Content-Disposition"] = (
            f'attachment; filename="{basename(file_name)}"'
        )
        return response


async def file_data(file_path):
    async with aiofiles.open(file_path, "rb") as f:
        LOGGER.info(
            f"Current threads are {threading.active_count()} "
            f"opening file {file_path}\n{H.heap()}"
        )
        teller = 0
        while chunk := await f.read(65536):
            teller += 1
            await asyncio.sleep(0)
            if teller % 1000 == 0:
                LOGGER.info(
                    f"Current threads are {threading.active_count()} "
                    f"yielding chunk nr.{teller}\n{H.heap()}"
                )
            yield chunk
}}}
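For anyone reproducing this who doesn't want to pull in guppy, a lighter
way to watch process memory growth while streaming is the stdlib
`resource` module (Linux/macOS only; this helper is my own sketch, not
part of the original report):

```python
import resource


def rss_kib():
    # Peak resident set size of the current process.
    # On Linux ru_maxrss is reported in KiB; on macOS it is in bytes.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss


# Log this periodically from the streaming generator, e.g. every
# 1000 chunks, and compare runs with and without content_type set.
before = rss_kib()
buffers = [bytes(65536) for _ in range(100)]  # allocate ~6.4 MB
after = rss_kib()
print(f"peak RSS grew by roughly {after - before} KiB")
```

If the leak described above is present, the reported peak RSS should keep
climbing with the number of chunks yielded instead of plateauing.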
I have some images of the log output to show the difference.
--
Ticket URL: <https://code.djangoproject.com/ticket/35415>
Django <https://code.djangoproject.com/>
The Web framework for perfectionists with deadlines.
--