memory error #12
I have text training data that is ~4000 KB, and it takes up about 20 GB of memory. (The average low-end computer has 6-8 GB nowadays.) And my data is smaller than yours... I'd suggest getting more RAM or using a smaller amount of training data.
~~Well, the weird thing is that when I tried it on a different (and older) computer it worked, despite both having 4 GB of RAM (I don't know why yours needs so much RAM). But still it will not run on my own PC.~~
I'm trying to train on a Chinese text. The text file is ~6 MB.
The resulting data matrix is 2399086 × 5466.
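For what it's worth, that shape alone explains the error: `np.zeros` allocates float64 by default, so a dense 2399086 × 5466 one-hot matrix needs roughly 98 GiB. (The same arithmetic would explain ~20 GB for 4000 KB of text with a vocabulary of a few hundred distinct characters.) A quick back-of-the-envelope check, with variable names chosen just for illustration:

```python
# np.zeros((len(data_), len(vocab))) allocates float64 by default.
num_chars = 2399086                          # rows: characters in the corpus
vocab_size = 5466                            # columns: distinct characters
bytes_needed = num_chars * vocab_size * 8    # 8 bytes per float64 element
print(f"{bytes_needed / 1024**3:.1f} GiB")   # -> 97.7 GiB
```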
When I want to generate text, I get a memory error:
```
Traceback (most recent call last):
  File "rnn_tf.py", line 300, in <module>
    main()
  File "rnn_tf.py", line 221, in main
    data, vocab = load_data(args.input_file)
  File "rnn_tf.py", line 174, in load_data
    data = embed_to_vocab(data_, vocab)
  File "rnn_tf.py", line 152, in embed_to_vocab
    data = np.zeros((len(data_), len(vocab)))
MemoryError
```
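The failing line builds a dense one-hot matrix for the entire corpus at once: `len(data_)` rows × `len(vocab)` columns of float64. A common workaround is to store the corpus as integer indices and expand to one-hot only one batch at a time. Below is a minimal sketch of that idea; the function names `encode_as_indices` and `one_hot_batch` are illustrative, not part of `rnn_tf.py`:

```python
import numpy as np

def encode_as_indices(text, vocab):
    """Store the corpus as int32 indices: ~4 bytes per character
    instead of vocab_size * 8 bytes for a dense one-hot row."""
    char_to_idx = {ch: i for i, ch in enumerate(vocab)}
    return np.array([char_to_idx[ch] for ch in text], dtype=np.int32)

def one_hot_batch(indices, vocab_size, dtype=np.float32):
    """One-hot encode only the batch about to be fed to the network,
    so peak memory is batch_size * vocab_size, not corpus_size * vocab_size."""
    out = np.zeros((len(indices), vocab_size), dtype=dtype)
    out[np.arange(len(indices)), indices] = 1.0
    return out
```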
The text file is ~6000 KB in size, but that shouldn't be a problem, because I can train with this text.
I am running 64-bit Python.
Please help!