I tried to encode the text of a large PDF (about 11 MB, likely well over 100k tokens), but gpt-3-encoder fails to process that amount of text without throwing any error. The program hangs forever on this line:
const encoded = encode(textOfDocument);
How can this be solved?
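One possible workaround, sketched below under the assumption that the hang comes from a single encode() call having to churn through megabytes of text at once: split the document into smaller chunks, encode each chunk separately, and concatenate the resulting token arrays. The chunkAndEncode helper and the maxChunkChars value are illustrative, not part of gpt-3-encoder, and splitting can change how tokens merge right at chunk boundaries, so the total count may differ slightly from encoding the whole string in one call; breaking chunks on whitespace keeps that difference small.

const { encode } = require('gpt-3-encoder');

// Illustrative helper (not part of gpt-3-encoder): encode very large text in
// pieces so no single encode() call has to process the entire document.
function chunkAndEncode(text, maxChunkChars = 50000) {
  const tokens = [];
  let start = 0;
  while (start < text.length) {
    let end = Math.min(start + maxChunkChars, text.length);
    // Back up to the last whitespace so a word is not split across chunks.
    if (end < text.length) {
      const lastSpace = text.lastIndexOf(' ', end);
      if (lastSpace > start) end = lastSpace;
    }
    // Encode this chunk and append its tokens to the running result.
    const chunkTokens = encode(text.slice(start, end));
    for (const t of chunkTokens) tokens.push(t);
    start = end;
  }
  return tokens;
}

const encoded = chunkAndEncode(textOfDocument);
console.log(`Encoded ${encoded.length} tokens`);

If you only need an approximate token count rather than the exact token ids, counting per chunk this way also keeps memory use down, since you can sum chunkTokens.length instead of keeping every id.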