Streaming APIs #919

Closed
oliwarner opened this issue Nov 13, 2023 · 5 comments

Comments

@oliwarner

oliwarner commented Nov 13, 2023

Allow long-lived connections to API endpoints which periodically emit data and need low-latency UI updates. Chat, gaming, monitoring, etc, etc.

I realise there are a few ways to handle this. I'm polling at the moment and that works but when you're working on the frontend, and you're seeing hundreds of polling requests tick up the devtools console, it feels kinda wrong; claggy. If you lower the rate, latency suffers. If you use websockets/Channels, you lose pydantic's data handling. You could combine the two (notifications through Channels, updates through REST) but now I want to throw my computer out the window. I like simple.

I've only had a cursory look at the code, so I'm sure it's harder than I'm expecting, but this feels like it might be something fairly easy to sub in. Django has ASGI handlers. Ninja does everything it can to be async. The hard bit seems to be swapping JsonResponse for a JSON-friendly version of StreamingHttpResponse and letting the connection hang.

I think this would be a valuable alternative to Channels for streaming data.

@vitalik
Owner

vitalik commented Nov 13, 2023

@oliwarner that's already possible

something like this (pseudo code):

import time

from django.http import StreamingHttpResponse


@api.get("/sse")
def sse_stream(request):
    def event_stream():
        while True:
            # Simulate data updates; you can replace this with actual data
            data = f"data: {time.time()}\n\n"
            yield data
            time.sleep(1)

    response = StreamingHttpResponse(event_stream(), content_type="text/event-stream")
    response['Cache-Control'] = 'no-cache'
    return response
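For ASGI deployments the same pattern can avoid tying up a worker: since Django 4.2, StreamingHttpResponse also accepts an async iterator. A framework-free sketch of the generator side (`ticker` and `collect` are hypothetical names; a real endpoint would loop forever rather than take a `limit`):

```python
import asyncio
import time
from typing import AsyncIterator


async def ticker(interval: float, limit: int) -> AsyncIterator[str]:
    """Async take on event_stream(): yields SSE frames without blocking
    the event loop. On Django 4.2+ this iterator could be passed straight
    to StreamingHttpResponse(..., content_type="text/event-stream")."""
    for _ in range(limit):  # a real endpoint would use `while True`
        yield f"data: {time.time()}\n\n"
        await asyncio.sleep(interval)  # non-blocking pause between events


async def collect(stream: AsyncIterator[str]) -> list[str]:
    """Drain the stream into a list (for demonstration only)."""
    return [frame async for frame in stream]


frames = asyncio.run(collect(ticker(0.01, 3)))
```

Each yield is a complete `data: ...\n\n` frame, so an EventSource client sees one message per yield.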

Client:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>SSE Client</title>
</head>
<body>
    <h1>SSE Updates:</h1>
    <div id="sse-data"></div>

    <script>
        const sseData = document.getElementById('sse-data');
        const eventSource = new EventSource('/api/sse');

        eventSource.onmessage = (event) => {
            sseData.innerHTML += event.data + '<br>';
        };

        eventSource.onerror = (error) => {
            console.error('EventSource failed:', error);
            eventSource.close();
        };
    </script>
</body>
</html>

@OtherBarry
Contributor

It'd be cool to have a NinjaStreamingHttpResponse or something, where your generator can return objects/querysets/whatever and have them automatically converted to JSON, the same way a normal Ninja endpoint works.
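No NinjaStreamingHttpResponse exists today, but the conversion step is small to sketch framework-free; here plain `json.dumps` stands in for Ninja's schema serialisation, and `sse_encode` is a hypothetical name:

```python
import json
from typing import Any, Iterable, Iterator


def sse_encode(items: Iterable[Any]) -> Iterator[str]:
    """Frame each JSON-serialisable object as a Server-Sent Event."""
    for item in items:
        yield f"data: {json.dumps(item)}\n\n"


# A generator of dicts (or schema-dumped models) becomes an SSE stream:
frames = list(sse_encode([{"id": 1}, {"id": 2}]))
# frames[0] == 'data: {"id": 1}\n\n'
```

Feeding `sse_encode(some_generator())` to StreamingHttpResponse would give roughly this behaviour; the missing piece is running each item through the endpoint's response schema first.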

@oliwarner
Author

oliwarner commented Nov 14, 2023

@vitalik Your code makes sense, but isn't that just a Django request/response? I think @OtherBarry has the right idea, but I'd go further and bake all that into the @api.… decorator. This would wrap a function that returns an async generator that yields objects.

If we limited this to Server-Sent Events, I'd want JSON encoding using the same schema system Ninja uses. Something like:

@api.sse('/api/signups', response=Stream[Event, UserSchema])
async def signups(request):
    async def subscribe():
        pass  # ... async event loop that yields out objects ...
    return subscribe()  # Ninja churns objects into JSON-encoded events

That's packing a lot of magic into a decorator. This gets a whole lot more useful with Django 5 (which can have async signal receivers) so something like this could hook deeply into standard model flow in a few lines.

It is absolutely something I could do without Ninja. But where's the fun in that?
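Most of the magic such a decorator would need is an adapter from "async generator of objects" to "async generator of encoded events". A hedged sketch (`sse_events` is a hypothetical helper; the `encode` callable stands in for a Ninja schema's serialiser):

```python
import asyncio
import json
from typing import Any, AsyncIterator, Callable


async def sse_events(
    source: AsyncIterator[Any],
    encode: Callable[[Any], Any] = lambda obj: obj,
) -> AsyncIterator[str]:
    """What an @api.sse wrapper might do internally: serialise each
    yielded object and frame it as an SSE event."""
    async for obj in source:
        yield f"data: {json.dumps(encode(obj))}\n\n"


async def demo() -> list[str]:
    async def subscribe():  # stands in for the user's event loop
        for i in range(2):
            yield {"user": i}

    return [frame async for frame in sse_events(subscribe())]


frames = asyncio.run(demo())
```

The decorator itself would then just wrap `sse_events(...)` in a StreamingHttpResponse with the `text/event-stream` content type.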

@vitalik
Owner

vitalik commented Nov 14, 2023

@oliwarner @OtherBarry

well the main problem here is that OpenAPI does not want to support SSE in schema - OAI/OpenAPI-Specification#396

and we would like to be fully OpenAPI compatible

but at the end of the day - yeah - it's only an HTTP response

@oliwarner
Author

Oh well. That's fair enough. Thanks for considering it.

@oliwarner oliwarner closed this as not planned Nov 15, 2023