"Tarball has too many files" error causing search results to not update #215
This was a protection we added to the system. There were some packages that were saturating the I/O because they had a lot of files in them. I'm not sure what we could do here, //cc @bcoe
@satazor Saturating in what sense? Too many open file descriptors? Too CPU intensive? Too expensive to analyze?
Too many file descriptors, as well as filesystem writes/entries. For instance, it's usually faster to write a single large file than many small files that total the same size. What happens is that the I/O isn't fast enough, and it causes the whole system to lag.
Moreover, a tarball could contain an almost unlimited number of small files. This would be an attack vector, because a well-crafted tarball could exhaust the filesystem's inodes. We can revisit the threshold for the maximum number of files, but it was already quite generous.
The current limit on the total number of files is 32000. We may increase it; what value do you think would be reasonable?
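For context, a check like this can be done in a streaming pass over the tarball's entry list, before anything is extracted, so an oversized package is rejected without writing a single file (avoiding the inode-exhaustion vector described above). The following is only a minimal sketch of that idea, not the analyzer's actual code; it assumes the node-tar package (v6 `onentry` option), and `MAX_FILES` mirrors the 32000 threshold mentioned in this thread.

```ts
import * as tar from "tar";

// Mirrors the 32000-file threshold discussed in this thread (an assumption,
// not necessarily the analyzer's exact configuration).
const MAX_FILES = 32000;

// Count the entries in a tarball without extracting anything to disk.
async function countTarballEntries(tarballPath: string): Promise<number> {
  let count = 0;
  await tar.t({
    file: tarballPath,
    onentry: () => {
      count += 1;
    },
  });
  return count;
}

// Reject tarballs that exceed the limit before extraction begins.
async function assertWithinFileLimit(tarballPath: string): Promise<void> {
  const count = await countTarballEntries(tarballPath);
  if (count > MAX_FILES) {
    throw new Error(`Tarball has too many files: ${count} > ${MAX_FILES}`);
  }
}
```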
If a tarball has too many files, the analyzer fails with this error and never updates the package.
See: https://github.com/Microsoft/TypeScript
(50537 files)
This is, unfortunately, preventing TypeScript from updating in search results:
see: https://npms.io/search?q=typescript
or: https://www.npmjs.com/search?q=typescript
vs. https://www.npmjs.com/package/typescript