Hi,

I ran run_numbat using the following code:

allele <- read.table(paste0(outdir, the.sample, "_allele_counts.tsv.gz"), sep = "\t", header = TRUE)
seu <- readRDS(file.path(data.dir, the.sample, "clustered_seurat_.rds"))
counts <- GetAssayData(seu, slot = "counts", assay = "RNA")
out <- run_numbat(
  count_mat = counts,
  lambdas_ref = ref_hca,
  df_allele = allele,
  genome = "hg38",
  t = 1e-5,
  ncores = 16,
  plot = TRUE,
  out_dir = outdir
)

It ran for 3 hours and generated the files bulk_clones_1.tsv, bulk_subtrees_1.tsv, gex_roll_wide.tsv, hc.rds, sc_refs.rds, and segs_consensus_1.tsv, but then it stopped and gave the following error:

Running HMMs on 3 cell groups..
Testing for multi-allelic CNVs ..
1 multi-allelic CNVs found: 11c
Evaluating CNV per cell ..
Mem used: 10.2Gb
All cells succeeded
Expanding allelic states..
Error: 'separate_longer_delim' is not an exported object from 'namespace:tidyr'
In addition: Warning message:
In asMethod(object) :
  sparse->dense coercion: allocating vector of size 1.1 GiB
Execution halted
srun: error: node01: task 0: Exited with exit code 1

Any idea what the reason is and how I can address the issue?
I managed to debug it, and it works perfectly after setting multi_allelic = FALSE.

The reason is: when multi_allelic = TRUE (even in a run that logs '0 multi-allelic CNVs found'), it still runs test_multi_allelic, which executes the line n_states = ifelse(cnv_state == 'neu', 0, 1). That means every n_states value in segs_consensus is either 0 or 1. This is how my segs_consensus looks:
> table(segs_consensus$n_states)

 0  1
25 37
So by the time it gets to exp_post = expand_states(exp_post, segs_consensus), there are no segments with n_states > 1. Inside the expand_states function, segs_consensus %>% filter(n_states > 1) therefore returns zero rows, and the function breaks with Error: 'separate_longer_delim' is not an exported object from 'namespace:tidyr'. Because every n_states has been set to 0 or 1, there is no way to get past this step unless you set multi_allelic = FALSE. It seems to me the code is written in a way that, with multi_allelic == TRUE, a run can only succeed if the data actually contains some multi-allelic CNVs.
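The chain described above can be reproduced with a minimal sketch. The column names (seg, cnv_state, n_states) follow the Numbat objects discussed in this thread; the segment values themselves are made up for illustration:

```r
library(dplyr)

# Hypothetical consensus segments; with multi_allelic = TRUE the
# test_multi_allelic path assigns n_states = ifelse(cnv_state == 'neu', 0, 1)
segs_consensus <- data.frame(
  seg       = c("1a", "2b", "11c", "17d"),
  cnv_state = c("neu", "del", "amp", "neu")
) %>%
  mutate(n_states = ifelse(cnv_state == "neu", 0, 1))

table(segs_consensus$n_states)  # only 0s and 1s, never anything greater

# The subset that expand_states() tries to expand is therefore empty,
# so the downstream tidyr call has nothing valid to operate on
multi_allelic_segs <- segs_consensus %>% filter(n_states > 1)
nrow(multi_allelic_segs)  # 0
```

Since ifelse() can only ever emit 0 or 1 here, the filter(n_states > 1) subset is empty by construction, matching the behavior observed above.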