CarrierWave should support storing and retrieving large files without exhausting the available memory.
Currently CarrierWave retrieves a file from remote storage by reading it wholesale into memory. This can quickly exhaust available memory, especially when a deployment caps the memory available to each Rails process (e.g. a small four-core server running four 256 MB processes, one per core, with multiple attached files in the tens to hundreds of MB).
CarrierWave should offer an additional streaming API that retrieves the remote file in chunks of a certain size and immediately streams the bytes to disk, possibly into a Tempfile-backed SanitizedFile. At any given moment, only one chunk's worth of memory would be consumed.
Likewise for writes: CarrierWave should offer an additional streaming API that writes a remote file from a Tempfile-backed SanitizedFile in chunks of a certain size.
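A possible shape for the API (all names below, such as streaming_chunk_size, stream_to_tempfile, and stream_from_tempfile, are illustrative rather than existing CarrierWave API):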
CarrierWave.configure do |config|
config.streaming_chunk_size = 4 * (1024 ** 2) # 4 MB
end
class MyModel
include Mongoid::Document # or any other supported ORM
mount_uploader :attachment, AttachmentUploader
end
my_tempfile = MyModel.find(73).attachment.stream_to_tempfile(*tempfile_open_args)
do_something_with(my_tempfile)
MyModel.find(73).attachment.stream_from_tempfile(my_tempfile)
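Internally, the read path could reduce to a fixed-size copy loop. A minimal sketch, assuming the storage backend can expose the remote body as an IO-like object responding to read(length); none of this is existing CarrierWave API:

require "tempfile"

# Copy an IO-like source to disk in fixed-size chunks, so at most
# chunk_size bytes of file data are held in memory at any time.
def stream_io_to_tempfile(source_io, chunk_size, *tempfile_open_args)
  tempfile = Tempfile.new(*tempfile_open_args)
  tempfile.binmode
  # IO#read(length) returns nil at EOF, which ends the loop.
  while (chunk = source_io.read(chunk_size))
    tempfile.write(chunk)
  end
  tempfile.rewind
  tempfile
end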
There is already a CarrierWave::Uploader::Cache module, and the actual API for streaming reads/writes from/to remote files could simply be the Cache API, with the Cache internals updated for streaming reads/writes.
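The write path would be the mirror image. A sketch under the assumption that a storage backend offers some chunk-wise write hook; write_chunk below is hypothetical, and a real adapter would map it to something like an S3 multipart upload part:

require "tempfile"

# Push a Tempfile's contents to remote storage in fixed-size chunks.
# `storage` and its `write_chunk` method are hypothetical stand-ins
# for whatever chunked-write hook a given backend provides.
def stream_from_tempfile(tempfile, storage, chunk_size)
  tempfile.rewind
  while (chunk = tempfile.read(chunk_size))
    storage.write_chunk(chunk)
  end
end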