Big file issue #634

Open
lyandr opened this issue Jan 8, 2020 · 0 comments
lyandr commented Jan 8, 2020

Hello,

I am trying to copy files from S3 to the local filesystem and vice versa using KnpGaufrette in a Symfony project.

My code runs in a container on OpenShift and works fine for small files, but with a large file (1 GB) I run into memory issues; my container is limited to 1.5 GB. I don't understand why memory usage grows so high. Maybe I misunderstand the concept of streams in PHP, but I thought a streamed file wasn't loaded into memory all at once, just read chunk by chunk.
With my code I can see that as soon as I do `$srcStream = $this->fs_s3->createStream($filename); $srcStream->open(new StreamMode('rb+'));`, memory usage grows with the size of the file.
I also tried `copy('gaufrette://s3/'.$filename, 'gaufrette://nfs/'.$filename);` through the stream wrapper, but the result is the same.
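
For reference, here is roughly what I expected the streaming copy to look like, as a minimal sketch (the `fs_nfs` property name and the 1 MiB chunk size are only illustrative, and whether this actually keeps memory flat depends on whether the adapter provides a real stream or falls back to Gaufrette's in-memory buffer):

```php
use Gaufrette\StreamMode;

// Minimal sketch of a chunked copy between two Gaufrette filesystems.
// $this->fs_s3 is the filesystem from my code above; $this->fs_nfs and the
// 1 MiB chunk size are only illustrative. This only stays flat in memory if
// the adapter implements Gaufrette\Adapter\StreamFactory; otherwise
// createStream() returns an in-memory buffer holding the whole file.
$src = $this->fs_s3->createStream($filename);
$dst = $this->fs_nfs->createStream($filename);

$src->open(new StreamMode('rb')); // read-only is enough for the source
$dst->open(new StreamMode('wb'));

while (!$src->eof()) {
    $chunk = $src->read(1024 * 1024); // read up to 1 MiB per iteration
    if ($chunk === '' || $chunk === false) {
        break;
    }
    $dst->write($chunk);
}

$dst->flush();
$dst->close();
$src->close();
```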

Am I using streams in the wrong way? Any advice?

Thank you in advance for your help.
Regards
