Badly formed files can cause `create_sample_table()` to OOM (#395)
Comments
Looks like #394 is semi-related, and the limit it uses could also apply here. I'm throwing up a proof of concept for that file size check now in a WIP pull request.
Perhaps a less complicated way to prevent this issue is for the API to provide the table in batches, which seems like it may be a better general solution. However, it would require more Gecko changes to get that working, since the MP4 demuxer currently assumes that the entire indice is immediately available.
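As a rough illustration of the batching idea, the sample table could be exposed in fixed-size chunks rather than one vector sized by the declared sample count. The function name and API shape below are hypothetical, not mp4parse's actual interface:

```rust
// Hypothetical sketch: hand out the sample table as index ranges of a
// fixed batch size, so the caller never forces a single allocation
// proportional to the sample count a malformed file declares.
fn sample_table_batches(
    total_samples: usize,
    batch_size: usize, // must be > 0, or step_by panics
) -> impl Iterator<Item = std::ops::Range<usize>> {
    (0..total_samples)
        .step_by(batch_size)
        .map(move |start| start..(start + batch_size).min(total_samples))
}
```

A consumer like the demuxer would then materialize only one batch of indices at a time instead of the full table.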
Sorry for the delay! Thanks for filing the issue and providing the WIP fix. My fix in #394 was a bit of a band-aid, as this similar issue arising in a different piece of code demonstrates. I've got an idea for a better approach to #394, which would also handle the case here and other cases where an invalid file could trivially trigger an OOM. I'll post an updated patch on #394 in a day or so and would appreciate any feedback.
Sounds good, I'll follow that PR to see when you update it.
@kinetiknz Have you had a chance to revisit #394? This will help with fuzzing performance.
Sorry for the delay, I'm still working on this but it's been delayed by other priorities - I hope to finish quite soon.
See Gecko Bug 1661368 and more recently Bug 1814791.
There is currently no validation that a file with an `stsc` box declaring a large sample count actually references offsets that fall within the file, which allows `create_sample_table()` to allocate an arbitrarily large vector and OOM.

We could just set a configurable hard limit on the number of samples it can allocate, but I don't know if that's the best solution.

We could also fail early when the relevant sample boxes have mismatched sample counts, but I'm not sure whether there are files that currently parse and break that rule, and as tnikkel pointed out, the fuzzer may still find a way to create a matching sample count that causes an OOM. It might still be worth implementing this if it makes sense to fail that way.
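A minimal sketch of the configurable hard-limit option, assuming a hypothetical `SAMPLE_LIMIT` constant and `build_sample_table` entry point (neither is the real mp4parse API):

```rust
// Illustrative cap on how many samples we will allocate for. Refusing
// up front means a malformed stsc count fails cleanly instead of
// letting Vec::with_capacity attempt an arbitrarily large allocation.
const SAMPLE_LIMIT: usize = 1_000_000;

#[derive(Debug, PartialEq)]
enum TableError {
    TooManySamples,
}

fn build_sample_table(declared_samples: usize) -> Result<Vec<u64>, TableError> {
    if declared_samples > SAMPLE_LIMIT {
        // Fail early rather than OOM on an attacker-controlled count.
        return Err(TableError::TooManySamples);
    }
    Ok(Vec::with_capacity(declared_samples))
}
```

The drawback, as noted above, is that any fixed limit is arbitrary and may reject legitimate long files unless it is configurable.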
I think the most ideal solution is instead to make `create_sample_table()` aware of how much of the file's data is currently available to the caller, and use the `stsz` box to determine how many samples the vector needs to allocate to reach the end of that data. For cases where the full size is not known beforehand, API users would have to call `get_indices` multiple times and be aware that their pointers will be invalidated, but that doesn't seem like a difficult problem.

What I wonder is whether the available data is something that should be accessible from other parts of mp4parse as well, for example to allow the box parsing to indicate that it hit the end of the available data but can continue parsing once more data is available.
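The availability-aware idea could look roughly like the following, using the per-sample sizes from the `stsz` box to stop counting at the end of the available data. All names are illustrative, not mp4parse's actual code, and the sketch assumes samples are laid out contiguously from a single start offset:

```rust
// Hypothetical helper: count how many samples fit entirely within the
// bytes the caller currently has, so the sample table allocation is
// bounded by real input rather than by a declared sample count.
fn samples_within_available(
    sample_sizes: &[u64], // per-sample sizes from the stsz box
    first_offset: u64,    // file offset of the first sample
    available_len: u64,   // bytes of the file available to the caller
) -> usize {
    let mut end = first_offset;
    let mut count = 0;
    for &size in sample_sizes {
        // Stop at the first sample that would extend past the available
        // data; the caller can resume once more data arrives.
        match end.checked_add(size) {
            Some(next) if next <= available_len => {
                end = next;
                count += 1;
            }
            _ => break,
        }
    }
    count
}
```

With something like this, repeated `get_indices` calls would each size their vector from the data actually present, so a truncated or lying file can never force an allocation larger than the input.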
Feel free to let me know if this idea doesn't make sense; we can use this issue to brainstorm more solutions if necessary.