Other fields lost when uploading a large file #252
Comments
Can you provide steps to reproduce? We need a full, runnable example. |
Here is the sample:

const fastify = require('fastify')()
const fs = require('fs')
const util = require('util')
const path = require('path')
const { pipeline } = require('stream')
const pump = util.promisify(pipeline)
fastify.register(require('fastify-multipart'))
fastify.post('/', async function (req, reply) {
const data = await req.file()
const fields = data.fields
console.log(fields)
console.log(fields.k1)
console.log(fields.k2)
await pump(data.file, fs.createWriteStream('tmp_file'))
reply.send()
})
fastify.listen(3000, err => {
if (err) throw err
console.log(`server listening on ${fastify.server.address().port}`)
})

We also create two files, one small and one big:

$ fallocate -l 10KB SMALL_FILE
$ fallocate -l 100MB BIG_FILE

When we upload

$ curl -vF 'file=@SMALL_FILE' -F 'k1=v1' -F 'k2=v2' localhost:3000

the console output is:

# ======== console.log(fields) ==============
<ref *1> {
file: {
fieldname: 'file',
filename: 'SMALL_FILE',
encoding: '7bit',
mimetype: 'application/octet-stream',
file: FileStream {
_readableState: [ReadableState],
_events: [Object: null prototype],
_eventsCount: 2,
_maxListeners: undefined,
truncated: false,
_read: [Function (anonymous)],
[Symbol(kCapture)]: false
},
fields: [Circular *1],
_buf: null,
toBuffer: [AsyncFunction: toBuffer]
},
k1: {
fieldname: 'k1',
value: 'v1',
fieldnameTruncated: false,
valueTruncated: false,
fields: [Circular *1]
},
k2: {
fieldname: 'k2',
value: 'v2',
fieldnameTruncated: false,
valueTruncated: false,
fields: [Circular *1]
}
}
# ======== console.log(fields.k1) ==============
<ref *2> {
fieldname: 'k1',
value: 'v1',
fieldnameTruncated: false,
valueTruncated: false,
fields: <ref *1> {
file: {
fieldname: 'file',
filename: 'SMALL_FILE',
encoding: '7bit',
mimetype: 'application/octet-stream',
file: [FileStream],
fields: [Circular *1],
_buf: null,
toBuffer: [AsyncFunction: toBuffer]
},
k1: [Circular *2],
k2: {
fieldname: 'k2',
value: 'v2',
fieldnameTruncated: false,
valueTruncated: false,
fields: [Circular *1]
}
}
}
# ======== console.log(fields.k2) ==============
<ref *2> {
fieldname: 'k2',
value: 'v2',
fieldnameTruncated: false,
valueTruncated: false,
fields: <ref *1> {
file: {
fieldname: 'file',
filename: 'SMALL_FILE',
encoding: '7bit',
mimetype: 'application/octet-stream',
file: [FileStream],
fields: [Circular *1],
_buf: null,
toBuffer: [AsyncFunction: toBuffer]
},
k1: {
fieldname: 'k1',
value: 'v1',
fieldnameTruncated: false,
valueTruncated: false,
fields: [Circular *1]
},
k2: [Circular *2]
}
}

But when we upload

$ curl -vF 'file=@BIG_FILE' -F 'k1=v1' -F 'k2=v2' localhost:3000

the console output is:

# ======== console.log(fields) ==============
<ref *1> {
file: {
fieldname: 'file',
filename: 'BIG_FILE',
encoding: '7bit',
mimetype: 'application/octet-stream',
file: FileStream {
_readableState: [ReadableState],
_events: [Object: null prototype],
_eventsCount: 2,
_maxListeners: undefined,
truncated: false,
_read: [Function (anonymous)],
[Symbol(kCapture)]: false
},
fields: [Circular *1],
_buf: null,
toBuffer: [AsyncFunction: toBuffer]
}
}
# ======== console.log(fields.k1) ==============
undefined
# ======== console.log(fields.k2) ==============
undefined |
@StarpTech could you take a look? |
Hi @abcfy2, I did some tests and the multipart handling does not lose the fields. In your example code the console.log runs before the file stream has been consumed, so the plugin is still processing the remaining parts. Because of the file's size, the handler reaches those lines before parsing has finished, which is why the "data" variable does not yet have the fields. I ran your code with the console.log calls moved after await pump, and the fields are there:

const fastify = require('fastify')()
const fs = require('fs')
const util = require('util')
const path = require('path')
const { pipeline } = require('stream')
const pump = util.promisify(pipeline)
fastify.register(require('fastify-multipart'))
fastify.post('/', async function (req, reply) {
const data = await req.file()
await pump(data.file, fs.createWriteStream('tmp_file'))
const fields = data.fields
console.log(fields)
console.log(fields.k1)
console.log(fields.k2)
reply.send()
})
fastify.listen(3000, err => {
if (err) throw err
console.log(`server listening on ${fastify.server.address().port}`)
}) |
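For illustration, buffering the file in memory also consumes the stream, so the fields become available afterwards as well. A minimal sketch, assuming the same fastify instance and fastify-multipart registration as above, and using the toBuffer() helper quoted from the README later in this thread (be careful, the whole file is accumulated in memory):

fastify.post('/buffered', async function (req, reply) {
  const data = await req.file()
  // toBuffer() reads the whole file part into memory, which consumes the stream
  const buf = await data.toBuffer()
  // once the stream has been consumed, the remaining parts have been parsed
  console.log(buf.length, data.fields.k1, data.fields.k2)
  reply.send()
})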
Oh thanks. And I found another bug. If we do not call await pump:

const data = await req.file()
// comment out await pump
// await pump(data.file, fs.createWriteStream('tmp_file'))
const fields = data.fields
console.log(fields)
console.log(fields.k1)
console.log(fields.k2)
reply.send()

# hangs forever
curl -vF 'file=@BIG_FILE' -F 'k1=v1' -F 'k2=v2' localhost:3000

node console output:

server listening on 3000
<ref *1> {
file: {
fieldname: 'file',
filename: 'BIG_FILE',
encoding: '7bit',
mimetype: 'application/octet-stream',
file: FileStream {
_readableState: [ReadableState],
_events: [Object: null prototype],
_eventsCount: 2,
_maxListeners: undefined,
truncated: false,
_read: [Function (anonymous)],
[Symbol(kCapture)]: false
},
fields: [Circular *1],
_buf: null,
toBuffer: [AsyncFunction: toBuffer]
}
}
undefined
undefined

I think this should be mentioned in the documentation, and a better solution found. |
@abcfy2 It is not a bug; the readable stream must be consumed before any other action. It is waiting for your action, so it only looks like it is hanging there. |
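To make that concrete, here is a minimal sketch, assuming the fastify setup from the examples above, that consumes the file stream without keeping the file. Draining the stream both unblocks the request and lets the remaining fields be parsed:

const { pipeline, Writable } = require('stream')
const util = require('util')
const pump = util.promisify(pipeline)

fastify.post('/discard', async function (req, reply) {
  const data = await req.file()
  // a writable sink that discards every chunk
  const devNull = new Writable({ write (chunk, enc, cb) { cb() } })
  await pump(data.file, devNull)
  console.log(data.fields.k1, data.fields.k2) // populated once the stream is consumed
  reply.send()
})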
@mcollina I believe this issue is not a bug and can be closed; what do you think about that? |
Some additional doc would be nice! |
I ran into the same problem and I would like to provide a PR with a small change to the docs. The current README reads:

const data = await req.file()
data.file // stream
data.fields // other parsed parts
data.fieldname
data.filename
data.encoding
data.mimetype
// to accumulate the file in memory! Be careful!
//
// await data.toBuffer() // Buffer
//
// or
await pump(data.file, fs.createWriteStream(data.filename))

and this lets you think that you can read data.fields right away. I would simply move that line after the pump call:

await pump(data.file, fs.createWriteStream(data.filename))
// be careful of permission issues on disk and not overwrite
// sensitive files that could cause security risks
// also, consider that if the file stream is not consumed, the promise will never fulfill
data.fields // other parsed parts (available only AFTER the stream is consumed)
reply.send() |
Go for it 👍 🚀 |
Your sentence is not quite correct: the stream does not necessarily have to be consumed before any field goes into data.fields. It is the order of the parts in the multipart body that matters. |
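If the ordering of the parts is the point, the plugin's parts() iterator (documented in the fastify-multipart README) makes it visible. A sketch, again assuming the fastify setup from the examples above: parts are yielded in the order they appear in the request body, so fields placed before the file are seen before the file stream itself.

const fs = require('fs')
const util = require('util')
const { pipeline } = require('stream')
const pump = util.promisify(pipeline)

fastify.post('/ordered', async function (req, reply) {
  for await (const part of req.parts()) {
    if (part.file) {
      // the file part: consume the stream before moving on to later parts
      await pump(part.file, fs.createWriteStream('tmp_file'))
    } else {
      // a non-file field, available as soon as it is reached
      console.log(part.fieldname, '=', part.value)
    }
  }
  reply.send()
})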
Here it is: #320 |
Prerequisites
Fastify version: 3.8.1
Plugin version: 4.0.7
Node.js version: 14.17.1
Operating system: Linux
Operating system version (i.e. 20.04, 11.3, 10): Manjaro 21
Description
When uploading a large file, the fields will be lost.
Steps to Reproduce
Here is the sample code:
When uploading a large file like:
The output is:
There is only the file field; others like name and description are lost.

Expected Behavior
No response