
Throughput of proxying compressed data with fetch and Deno.serve has deteriorated #25798

Closed
magurotuna opened this issue Sep 22, 2024 · 0 comments · Fixed by #25806

@magurotuna
Member

Description

Let's take a look at the following simple script:

// PORT and UPSTREAM_PORT are placeholder values assumed for illustration.
const PORT = 8000;
const UPSTREAM_PORT = 8001;

Deno.serve({ port: PORT }, async (_req) => {
  const url = `http://localhost:${UPSTREAM_PORT}`;
  const resp = await fetch(url);
  return new Response(resp.body, { headers: resp.headers });
});

This script fetches data from the upstream server and simply forwards it to the end client.

The problem is that the throughput degrades significantly only when the upstream server serves data in a compressed format (e.g. brotli). This regression can be observed in v1.45.3 and later.

[Benchmark chart: throughput when the upstream serves brotli-compressed data]

If the upstream server serves uncompressed data, the throughput has not degraded.

[Benchmark chart: throughput when the upstream serves uncompressed text data]
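For context, the regression is reported only when the upstream response is compressed (e.g. Content-Encoding: br). A minimal upstream that exercises this path might look like the sketch below; it is not part of the linked reproducer, and the port, payload, and Content-Type are assumptions for illustration.

// Sketch of an upstream server that always serves a brotli-compressed body.
// UPSTREAM_PORT and the payload are placeholder values, not taken from the reproducer.
import { brotliCompressSync } from "node:zlib";

const UPSTREAM_PORT = 8001;

// Compress the payload once at startup; every response is served with Content-Encoding: br.
const payload = new TextEncoder().encode("hello world ".repeat(100_000));
const compressed = brotliCompressSync(payload);

Deno.serve({ port: UPSTREAM_PORT }, () =>
  new Response(compressed, {
    headers: {
      "Content-Type": "text/plain",
      "Content-Encoding": "br",
    },
  }));

Pointing the proxy script above at this server reproduces the compressed case; serving the same payload without compression (and without the Content-Encoding header) gives the unaffected baseline.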

Reproducer

I have created a repository which contains several components to measure the throughput in this specific setting.

https://github.com/magurotuna/deno_fetch_decompression_throughput
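The repository contains its own components for driving load; as a quick sanity check one could also stream the proxied response from a small client and divide the bytes received by the elapsed time. The sketch below is not the repository's method, and the proxy URL is a placeholder matching the script above.

// Rough client-side throughput check (illustrative only, not the reproducer's tooling).
const PROXY_URL = "http://localhost:8000"; // placeholder: the proxy from the script above

const start = performance.now();
const resp = await fetch(PROXY_URL);
let bytes = 0;
for await (const chunk of resp.body!) {
  bytes += chunk.length;
}
const seconds = (performance.now() - start) / 1000;
console.log(`${(bytes / 1024 / 1024 / seconds).toFixed(2)} MiB/s`);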

Additional info

Version: Deno 1.45.3 and later
