Streaming APIs #919
@oliwarner that's already possible, something like this (pseudo code):

```python
import time

from django.http import StreamingHttpResponse


@api.get("/sse")
def sse_stream(request):
    def event_stream():
        while True:
            # Simulate data updates; you can replace this with actual data
            data = f"data: {time.time()}\n\n"
            yield data
            time.sleep(1)

    response = StreamingHttpResponse(event_stream(), content_type="text/event-stream")
    response['Cache-Control'] = 'no-cache'
    return response
```

Client:

```html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>SSE Client</title>
</head>
<body>
    <h1>SSE Updates:</h1>
    <div id="sse-data"></div>
    <script>
        const sseData = document.getElementById('sse-data');
        const eventSource = new EventSource('/api/sse');
        eventSource.onmessage = (event) => {
            sseData.innerHTML += event.data + '<br>';
        };
        eventSource.onerror = (error) => {
            console.error('EventSource failed:', error);
            eventSource.close();
        };
    </script>
</body>
</html>
```
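For completeness, a hedged sketch of the same idea with an async generator: Django 4.2+ accepts async iterators in `StreamingHttpResponse`, so under ASGI the worker is not blocked while the connection hangs. The `sse_event` framing helper and the commented endpoint wiring are illustrations, not an existing Ninja API.

```python
import asyncio
import json
import time


def sse_event(data, event=None):
    """Frame a payload as one Server-Sent Events message."""
    head = f"event: {event}\n" if event else ""
    return f"{head}data: {json.dumps(data)}\n\n"


async def event_stream():
    # Emit one timestamped event per second; swap in real data as needed.
    while True:
        yield sse_event({"ts": time.time()})
        await asyncio.sleep(1)


# Under Django 4.2+/ASGI the async generator can be passed straight to
# StreamingHttpResponse, e.g.:
#
#     @api.get("/sse")
#     async def sse_stream(request):
#         response = StreamingHttpResponse(
#             event_stream(), content_type="text/event-stream")
#         response["Cache-Control"] = "no-cache"
#         return response
```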
It'd be cool to have a
@vitalik Your code makes sense, but isn't that just a Django request/response? I think @OtherBarry has the right idea, but I'd go further and bake all that in. If we limited this to Server-Sent Events, I'd want JSON encoding using the same schema system Ninja uses. Something like:

That's packing a lot of magic into a decorator. This gets a whole lot more useful with Django 5 (which can have async signal receivers), so something like this could hook deeply into standard model flow in a few lines. It is absolutely something I could do without Ninja. But where's the fun in that?
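The snippet alluded to above is missing from the thread, but as a hedged stand-in, schema-driven SSE encoding could look roughly like this. `Notification` and `to_sse` are hypothetical names, and a stdlib `dataclass` stands in for a Ninja `Schema` (which is pydantic-based) so the example is self-contained:

```python
import json
from dataclasses import asdict, dataclass


# Hypothetical typed event; in Ninja this would be a Schema subclass,
# giving validation and documented field types for free.
@dataclass
class Notification:
    user_id: int
    message: str


def to_sse(obj):
    # Serialize the typed payload into one SSE data frame.
    return f"data: {json.dumps(asdict(obj))}\n\n"
```

The design point is that the wire format stays plain SSE while the payload shape is pinned down by the same schema machinery that documents the rest of the API.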
Well, the main problem here is that OpenAPI does not want to support SSE in the schema (OAI/OpenAPI-Specification#396) and we would like to be fully OpenAPI compatible. But at the end of the day, yeah, it's only an HTTP response.
Oh well. That's fair enough. Thanks for considering it.
Allow long-lived connections to API endpoints which periodically emit data and need low-latency UI updates. Chat, gaming, monitoring, etc, etc.
I realise there are a few ways to handle this. I'm polling at the moment and that works but when you're working on the frontend, and you're seeing hundreds of polling requests tick up the devtools console, it feels kinda wrong; claggy. If you lower the rate, latency suffers. If you use websockets/Channels, you lose pydantic's data handling. You could combine the two (notifications through Channels, updates through REST) but now I want to throw my computer out the window. I like simple.
I've only had a cursory look at the code, so I'm sure it's harder than I'm expecting, but this feels like it might be something fairly easy to sub in. Django has ASGI handlers. Ninja does everything it can to be async. The hard bit seems to be swapping `JsonResponse` for a JSON-friendly version of `StreamingHttpResponse` and letting the connection hang.

I think this would be a valuable alternative to Channels for streaming data.
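That "JSON-friendly version of `StreamingHttpResponse`" could be as simple as streaming newline-delimited JSON, where each chunk is one standalone JSON document the client parses as it arrives. A minimal sketch under that assumption (the generator name and wiring are illustrative, not an existing Ninja feature):

```python
import json


def ndjson_stream(rows):
    # Emit one JSON document per line (NDJSON) so the client can
    # parse each chunk as soon as it arrives, without waiting for
    # the whole response body.
    for row in rows:
        yield json.dumps(row) + "\n"


# With Django this would be wrapped roughly as:
#     StreamingHttpResponse(ndjson_stream(rows),
#                           content_type="application/x-ndjson")
```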