download of compressed files on arm-platforms #110
I'm also having this issue; check out the original issue (nodejs/node#37053) I created, which includes a basic setup for reproducing it :)
Thanks to the undici issue nodejs/undici#803, I finally know what's going on there. There is a truncating conversion of `uint64_t` field values to `size_t`.
On 32-bit platforms `size_t` is essentially `uint32_t` (or at times even a meager `uint16_t`). Loading a `uint64_t` field value into `size_t` on these platforms truncates the high bits, leaving only the low 32 (16) bits in place. This leads to various interesting errors in downstream modules. See:
- nodejs/llhttp#110
- nodejs/undici#803
Fix: nodejs/llparse#44
From the fix's commit message:

> This patch makes all field loads go into their respective types. Truncation doesn't happen in this case because C coercion rules will cast both types to the largest necessary datatype to hold either of them.

PR-URL: #44
Reviewed-By: Matteo Collina <matteo.collina@gmail.com>
Reviewed-By: Daniele Belardi <dwon.dnl@gmail.com>
Reviewed-By: Robert Nagy <ronagy@icloud.com>
Reviewed-By: Rich Trott <rtrott@gmail.com>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
PR for Node.js: nodejs/node#38665
For my specific problem, the issue is not fixed. I tested with node@12.22.1, node@14.x, and node@16.3.0.
Can you provide some demo code or the link it happens with? I'd love to try it out, given that it worked for me with a 12+ GB file on Node 16.3.0...
You can use the code from my first post. Because the file is non-public, I could send an email with the URL. Is this OK for you?
Maybe you could try reducing your code further. Is there a way to use that Azure library to get a (temporary) download URL for that file? If yes, you could try downloading the file from that URL using the built-in client. That is, unless you already tried that ^^
Hi, I have a problem using tools like @azure/storage-blob or node-fetch on armv7 devices with any Node version that uses llhttp instead of the legacy parser. When downloading a compressed file, the response is always decompressed, which causes a data-corruption failure. I can reproduce the issue using either @azure/storage-blob or node-fetch directly.
The interesting thing is that the issue only occurs on arm(v7) platforms. Running the same code on an x64/x86 platform doesn't cause any issues.
Changing the Node version to 11.x or 12.x with "http-parser=legacy" also doesn't cause any issues.
code sample:
result on armv7:
result on x64: