runGenePeakcorr Memory Issues #44
Hi there! Thanks for your interest in using our package. A few things I noticed that I'm wondering if you could look into prior to re-testing:
You want to make sure these are raw peak x cell counts, and nothing else. Can you show me the original call to the main function?
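A quick sanity check for the raw-counts requirement might look like the following. This is only a sketch: the object name `ATAC.se` is hypothetical, and it assumes the ATAC data is stored as a `SummarizedExperiment` with the peak x cell matrix in its first assay.

```r
# Sketch: verify the ATAC input holds raw (non-negative, integer) counts,
# not z-scored or otherwise normalized values. `ATAC.se` is a placeholder
# for your own SummarizedExperiment object.
library(SummarizedExperiment)

counts <- assay(ATAC.se)                   # peak x cell matrix
stopifnot(all(counts >= 0))                # raw counts are non-negative
stopifnot(all(counts == floor(counts)))    # ...and integer-valued
```

If either check fails (e.g. because of prior z-scoring), you'd want to go back to the unprocessed count matrix before calling the main function.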
This is a strange error given the data size; it shouldn't really have to do with memory (thank you for testing appropriately via downsampling etc. - I'd say it's also worth testing on a single core, in case something strange is happening with the parallelization that might be harder to debug here). FigR does not save any abnormally large output at this step either. Let's see if we can debug this based on the above output?
Hi, thanks for the reply @vkartha! I adjusted the RNA and ATAC to be raw counts (I believe they were scaled by z-score previously). I am still getting the same memory/core-related error, and I have tried running with a single core to rule out parallelization. Any information or suggestions you have are appreciated!
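For reference, a single-core debugging run might be set up as below. This is a hedged sketch: the argument names follow the FigR documentation for `runGenePeakcorr` as I understand it, but please check them against your installed version, and `ATAC.se`, `RNAmat`, and the genome build are assumptions standing in for your actual inputs.

```r
# Sketch of a single-core call to runGenePeakcorr to rule out
# parallelization problems. Object names and genome are placeholders.
library(FigR)

cisCorr <- runGenePeakcorr(ATAC.se = ATAC.se,  # raw peak x cell counts (SummarizedExperiment)
                           RNAmat  = RNAmat,   # raw gene x cell counts
                           genome  = "hg38",   # assumed genome build; use yours
                           nCores  = 1,        # single core for easier debugging
                           n_bg    = 100)      # number of background peaks
```

Running with `nCores = 1` removes the worker-process layer, so any error that still appears should come with a more informative traceback.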
Hello,
I am getting the following error when running `runGenePeakcorr`.
Whether I run 250 or 3 background peaks, I get this error. I have tried running this in different environments with different numbers of features (filtering the ATAC peaks from 93,000 to 10,000 and the RNA genes from 16,000 to 10,000) for the 1,800 cells. I don't believe the output should be large enough to be causing this error. I have approximately 200 GB of memory and 19 cores available.
Is this an error that others are running into? Or is the output file truly larger than 200 GB?!
Thanks!