It gets stuck when encoding a large amount of text. #43

Open
Alokkumar8 opened this issue Nov 22, 2023 · 0 comments

Comments

Alokkumar8 commented Nov 22, 2023

I tried to encode the text of a large PDF (11 MB); the text is likely more than 100k tokens. gpt-3-encoder fails to process this amount of text without throwing any error — the program just hangs forever on this line:

const encoded = encode(textOfDocument);

How can this be solved?
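A common workaround (not confirmed by this thread) is to encode the document in smaller chunks instead of one huge call, so the tokenizer works on bounded inputs. A minimal sketch below; `encodeStub` is a hypothetical stand-in for gpt-3-encoder's `encode` (in a real project you would use `const { encode } = require('gpt-3-encoder')`), and the chunk size is an arbitrary assumption:

```javascript
// Stand-in tokenizer: one "token" per character, only to exercise the
// chunking logic. Replace with gpt-3-encoder's encode() in real use.
function encodeStub(text) {
  return Array.from(text, (ch) => ch.charCodeAt(0));
}

// Encode `text` in fixed-size slices and concatenate the token arrays,
// so no single encode() call sees the whole document at once.
function encodeInChunks(text, encode, chunkSize = 10000) {
  const tokens = [];
  for (let i = 0; i < text.length; i += chunkSize) {
    const chunk = text.slice(i, i + chunkSize);
    for (const t of encode(chunk)) tokens.push(t);
  }
  return tokens;
}

const doc = 'a'.repeat(25000);
const tokens = encodeInChunks(doc, encodeStub);
console.log(tokens.length);
```

Caveat: splitting at arbitrary character offsets can cut a BPE merge across a chunk boundary, so the total token count may differ slightly from a single-pass encode; splitting on whitespace or paragraph breaks reduces that risk.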
