How to handle canceled requests? #658

Closed
danechitoaie opened this issue Apr 5, 2022 · 11 comments · Fixed by fastify/fastify#3914
Labels
help wanted (Extra attention is needed)

Comments

@danechitoaie

Hi,
Any idea how to properly handle requests aborted by clients, and how to abort other requests further down the line?
I have an endpoint in my Fastify app that acts as a middleware: users make requests to it, and it fetches data from an external API. When a user aborts the request they made to my middleware, I need to catch that so that I can also abort the request I'm making to the other API.

I have the following code, but when I hit my 127.0.0.1:8000/data endpoint with curl / Postman I always get a 204 response instantly.

Any idea what's wrong with the code below? I would expect to reach request.raw.on("close", () => { only if I close the request from the client before the response arrives, but somehow it always gets there instantly and my external fetch request never finishes executing.

import { FastifyInstance, FastifyPluginOptions, FastifyReply, FastifyRequest } from "fastify";

// Note: reply.internalServerError() below is provided by the @fastify/sensible plugin.
export default async (app: FastifyInstance, options: FastifyPluginOptions) => {
    app.post("/data", { schema }, async (request: FastifyRequest, reply: FastifyReply) => {
        const abortController = new AbortController();

        request.raw
            .on("close", () => {
                abortController.abort();
            })
            .on("error", () => {
                abortController.abort();
            });

        const {
            body: { data },
        } = request;
        try {
            const res = await fetch("https://....", { signal: abortController.signal });
            return await res.json();
        } catch (err: any) {
            if (abortController.signal.aborted) {
                reply.status(204).send();
                return;
            }

            reply.internalServerError();
            return;
        }
    });
};
@danechitoaie added the help wanted label on Apr 5, 2022
@metcoder95
Member

Hey! I was actually playing around a little and came up with this approach:

const { once } = require('events');
const { setTimeout } = require('timers/promises');
const fastify = require('fastify');

const app = fastify({ logger: true });

app.get('/', async (request, reply) => {
  const controller = new AbortController();

  // Simulates an API that supports AbortSignal
  async function apiCall({ signal }) {
    try {
      await Promise.race([once(signal, 'abort'), setTimeout(5000)]);
      return 'hello!';
    } catch (err) {
      return null;
    }
  }

  async function wrapper(controller, req) {
    try {
      // Will reject, as the socket will be closed abruptly, causing an ECONNRESET error
      await once(req.raw, 'close');
    } catch (err) {}

    controller.abort();
  }

  request.log.info('start');
  const res = await Promise.race([
    wrapper(controller, request),
    apiCall({ signal: controller.signal }),
  ]);

  if (res != null) reply.status(200).send('fulfilled');
  //  Fastify doesn't support an async function that returns undefined even when the
  //  incoming connection is already closed
  // Fastify doesn't handle a connection that was closed abruptly
  else reply.status(400).send('');

  request.log.info('end');
});

app.listen(3000);

In your example, my assumption is that you're running a race condition against the fetch request, which will most likely always resolve faster than you can cancel it by hand.
I'd suggest two things:

  1. Always use ee.once to avoid possible memory leaks
  2. Take advantage of the fact that you're already wrapping everything in a Promise (by using async/await) and use the promisified versions of the APIs (see the sketch below)
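To make that concrete, here is a minimal sketch of the original /data handler rewritten along those lines. The upstream URL is a placeholder, fetch is assumed to be available (Node 18+ or undici), the schema option is omitted, and, as it turns out further down this thread, the reliable event to wait for is the socket's "close", so that is what the sketch listens on.

import { once } from "events";
import { FastifyInstance, FastifyPluginOptions, FastifyReply, FastifyRequest } from "fastify";

export default async (app: FastifyInstance, options: FastifyPluginOptions) => {
    app.post("/data", async (request: FastifyRequest, reply: FastifyReply) => {
        const controller = new AbortController();

        // Resolves once the client's connection closes, then propagates the abort.
        const clientGone = once(request.raw.socket, "close").then(() => {
            controller.abort();
            return null;
        });

        // The upstream call shares the same AbortSignal, so aborting the controller
        // cancels it as well. Errors (including the AbortError) are mapped to null to
        // keep the sketch short; real code should distinguish an abort from a genuine
        // upstream failure.
        const upstream = fetch("https://example.com/upstream", { signal: controller.signal })
            .then((res) => res.json())
            .catch(() => null);

        const result = await Promise.race([clientGone, upstream]);
        if (result === null) {
            return reply.status(204).send();
        }
        return result;
    });
};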

@mcollina
Member

mcollina commented Apr 6, 2022

@metcoder95 this would be a pretty good plugin.

@metcoder95
Member

Sure thing, I'll prepare one soon. Is there anything else to consider for the API of the plugin or the plugin core itself?

cc: @danechitoaie
Happy to hear your thoughts as well, since you ran into the need for this case 🙂

@danechitoaie
Author

@metcoder95 Sure. Just need a bit of time to test this and I'll come back with feedback. Thank you.

@danechitoaie
Author

I still have trouble getting this to work.
Here is my updated code based on your example:

import { once } from "events";
import { FastifyInstance, FastifyPluginOptions, FastifyReply, FastifyRequest } from "fastify";

// CompletionRequest / CompletionResponse / CompletionResponseStatus are app-specific types;
// app.httpErrors below is provided by the @fastify/sensible plugin.
export default async (app: FastifyInstance, options: FastifyPluginOptions) => {
    async function waitForClientSocketClose(
        controller: AbortController,
        request: FastifyRequest
    ): Promise<CompletionResponse> {
        console.log("Waiting for client socket close...");

        try {
            await once(request.raw, "close");
        } catch (err: any) {
            console.error(err);
        }

        controller.abort();

        console.log("Client socket closed!");

        return {
            status: CompletionResponseStatus.ABORTED,
        };
    }

    async function waitForCompletionResponse(
        controller: AbortController,
        request: CompletionRequest
    ): Promise<CompletionResponse> {
        console.log("Starting req...");

        try {
            const res = await fetch("https://www.google.com", {
                method: "GET",
                signal: controller.signal,
            }).then((r) => r.text());

            console.log("Got req response!");

            return {
                status: CompletionResponseStatus.OK,
                choices: [],
            };
        } catch (err: any) {
            console.log("Got req error!", err);

            if (controller.signal.aborted) {
                return {
                    status: CompletionResponseStatus.ABORTED,
                };
            }

            return {
                status: CompletionResponseStatus.ERROR,
            };
        }
    }

    app.post("/data", { schema }, async (request: FastifyRequest, reply: FastifyReply) => {
        const abortController = new AbortController();

        const result = await Promise.race([
            waitForClientSocketClose(abortController, request),
            waitForCompletionResponse(abortController, request),
        ]);

        console.log(result);

        if (result.status === CompletionResponseStatus.ERROR) {
            throw app.httpErrors.internalServerError();
        }

        if (result.status === CompletionResponseStatus.ABORTED) {
            reply.code(204);
            return "";
        }

        return {
            choices: result.choices,
        };
    });
};

When I make a curl request to this endpoint

curl 'http://127.0.0.1:8000/api/data' -H 'Content-Type: application/json' -d '{"hello":"world"}' -v
*   Trying 127.0.0.1:8000...
* Connected to 127.0.0.1 (127.0.0.1) port 8000 (#0)
> POST /api/data HTTP/1.1
> Host: 127.0.0.1:8000
> User-Agent: curl/7.79.1
> Accept: */*
> Content-Type: application/json
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 204 No Content
< Content-Security-Policy: default-src 'self';base-uri 'self';block-all-mixed-content;font-src 'self' https: data:;form-action 'self';frame-ancestors 'self';img-src 'self' data:;object-src 'none';script-src 'self' 'nonce-c883885c149031cce633d216d93780c5';script-src-attr 'none';style-src 'self' https: 'unsafe-inline' 'nonce-6f9146da613c8dffdbbab4c8793e0f3d';upgrade-insecure-requests
< Cross-Origin-Embedder-Policy: require-corp
< Cross-Origin-Opener-Policy: same-origin
< Cross-Origin-Resource-Policy: same-origin
< X-DNS-Prefetch-Control: off
< Expect-CT: max-age=0
< X-Frame-Options: SAMEORIGIN
< Strict-Transport-Security: max-age=15552000; includeSubDomains
< X-Download-Options: noopen
< X-Content-Type-Options: nosniff
< Origin-Agent-Cluster: ?1
< X-Permitted-Cross-Domain-Policies: none
< Referrer-Policy: no-referrer
< X-XSS-Protection: 0
< content-type: text/plain; charset=utf-8
< vary: accept-encoding
< content-length: 0
< Date: Thu, 07 Apr 2022 10:50:45 GMT
< Connection: keep-alive
< Keep-Alive: timeout=5
<
* Connection #0 to host 127.0.0.1 left intact

And the console log is:

Waiting for client socket close...
Starting req...
Client socket closed!
{ status: 'ABORTED' }
Got req error! AbortError: GET https://www.google.com aborted

So somehow, with await once(request.raw, "close");, the socket is reported as closed instantly? It doesn't even wait for a response?
No idea what's going on.

@danechitoaie
Author

Seems like we need to listen for await once(request.raw.socket, "close"); instead of await once(request.raw, "close");
This way it seems to be working.
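In other words, a minimal sketch of the difference (the reason seems to be that on recent Node versions the request stream's own "close" fires once the body has been fully consumed, while the socket's "close" only fires when the client actually disconnects):

// Fires almost immediately: the IncomingMessage's "close" is emitted once the
// request has been fully read, not when the client goes away.
await once(request.raw, "close");

// Fires only when the underlying TCP connection actually closes, which is the
// event we want before aborting the upstream call.
await once(request.raw.socket, "close");
controller.abort();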

@metcoder95
Member

Hmm, I'm wondering if it's not because of this specific part:

        try {
            await once(request.raw, "close");
        } catch (err: any) {
            console.error(err);
        }

        controller.abort();

        console.log("Client socket closed!");

        return {
            status: CompletionResponseStatus.ABORTED,
        };

But that's OK, I can take care of the details in the plugin; the approach seems to work quite well. Is there anything specific that you think would be helpful for the plugin, e.g. settings, options, etc.?

@danechitoaie
Author

danechitoaie commented Apr 7, 2022

Because I'm returning?

return {
    status: CompletionResponseStatus.ABORTED,
};

Shouldn't the code never get there? It should keep awaiting await once(request.raw, "close"); while the other promise gets a response, and since that one returns first, it is the promise that Promise.race resolves with.

I can't think of any other options. Basically, I've now implemented this as a request decorator.

app.decorateRequest("waitForSocketClose", async function (controller: AbortController) {
    try {
        await once(this.raw.socket, "close");
    } catch (err: any) {}

    controller.abort();
});

And in my route I have Promise.race([request.waitForSocketClose(controller), myOtherFunction()])
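Roughly, the whole thing wires together like this. This is a self-contained sketch: the upstream URL and the body of myOtherFunction are placeholders, and the declare module block is just the extra piece a TypeScript project needs so the decorator shows up on FastifyRequest.

import { once } from "events";
import fastify, { FastifyRequest } from "fastify";

// Tell TypeScript about the decorator added below.
declare module "fastify" {
    interface FastifyRequest {
        waitForSocketClose(controller: AbortController): Promise<void>;
    }
}

const app = fastify({ logger: true });

app.decorateRequest("waitForSocketClose", async function (this: FastifyRequest, controller: AbortController) {
    try {
        await once(this.raw.socket, "close");
    } catch {}

    controller.abort();
});

app.post("/data", async (request, reply) => {
    const controller = new AbortController();

    // Placeholder for the real upstream call; it honours the shared AbortSignal.
    const myOtherFunction = () =>
        fetch("https://example.com/upstream", { signal: controller.signal })
            .then((r) => r.text())
            // Swallow the AbortError so the losing branch never turns into an unhandled rejection.
            .catch(() => null);

    const result = await Promise.race([request.waitForSocketClose(controller), myOtherFunction()]);
    if (result == null) return reply.status(204).send();
    return result;
});

app.listen(3000);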

@metcoder95
Member

Shouldn't the code never get there? It should keep awaiting await once(request.raw, "close"); while the other promise gets a response, and since that one returns first, it is the promise that Promise.race resolves with.

Can you put together a reproducible example without TS? I think your implementation has something that I might be missing 🤔

And in my route I have Promise.race([request.waitForSocketClose(controller), myOtherFunction()])

Got it, thanks. I have a starting point!

@metcoder95
Member

I'm making some progress in the form of a plugin. I'm planning to expose it as a Promise that rejects when the request has been canceled, and does nothing otherwise.

Planning to finish it by this weekend 🙂
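Purely to illustrate that shape (nothing here is a real API yet; request.cancellation and the URL are made-up names for the sketch):

app.post("/data", async (request, reply) => {
    const controller = new AbortController();

    // Hypothetical plugin API: a per-request promise that rejects when the client
    // cancels the request and otherwise never settles. Turn that rejection into an
    // abort of the upstream call.
    request.cancellation.catch(() => controller.abort());

    try {
        const res = await fetch("https://example.com/upstream", { signal: controller.signal });
        return await res.json();
    } catch (err) {
        if (controller.signal.aborted) return reply.status(204).send();
        throw err;
    }
});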

@metcoder95
Member

Hi, sorry for the delay, a couple of things came up in the meantime.
I'm about to finish the first version of the plugin; here's the PR, please take a look, any feedback is welcome. I'll aim for a proper release once more tests are added. Thanks 🙂
