broadinstitute / Drop-seq

Java tools for analyzing Drop-seq data


DigitalExpression: java.lang.OutOfMemoryError: GC overhead limit exceeded Dumping heap to java_

vagabond12 opened this issue · comments

INFO 2019-12-19 21:58:14 UMIIterator Processed 113,000,000 records. Elapsed time: 00:24:33s. Time for last 1,000,000: 14s. Last read position: X:153,696,275
INFO 2019-12-19 21:58:41 SortingCollection Creating merging iterator from 227 files
INFO 2019-12-19 21:58:41 UMIIterator Sorting finished.
java.lang.OutOfMemoryError: GC overhead limit exceeded
Dumping heap to java_pid20873.hprof ...
Heap dump file created [4561212591 bytes in 21.727 secs]
[Thu Dec 19 22:03:28 CST 2019] org.broadinstitute.dropseqrna.barnyard.DigitalExpression done. Elapsed time: 35.43 minutes.
Runtime.totalMemory()=3817865216
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at htsjdk.samtools.util.StringUtil.bytesToString(StringUtil.java:301)
at htsjdk.samtools.util.StringUtil.bytesToString(StringUtil.java:288)
at htsjdk.samtools.BinaryTagCodec.readNullTerminatedString(BinaryTagCodec.java:423)
at htsjdk.samtools.BinaryTagCodec.readSingleValue(BinaryTagCodec.java:318)
at htsjdk.samtools.BinaryTagCodec.readTags(BinaryTagCodec.java:282)
at htsjdk.samtools.BAMRecord.decodeAttributes(BAMRecord.java:415)
at htsjdk.samtools.BAMRecord.getAttribute(BAMRecord.java:394)
at org.broadinstitute.dropseqrna.utils.StringTagComparator.compare(StringTagComparator.java:44)
at org.broadinstitute.dropseqrna.utils.StringTagComparator.compare(StringTagComparator.java:35)
at org.broadinstitute.dropseqrna.utils.MultiComparator.compare(MultiComparator.java:49)
at htsjdk.samtools.util.SortingCollection$PeekFileRecordIteratorComparator.compare(SortingCollection.java:653)
at htsjdk.samtools.util.SortingCollection$PeekFileRecordIteratorComparator.compare(SortingCollection.java:648)
at java.util.TreeMap.put(TreeMap.java:552)
at java.util.TreeSet.add(TreeSet.java:255)
at htsjdk.samtools.util.SortingCollection$MergingIterator.next(SortingCollection.java:562)
at htsjdk.samtools.util.PeekableIterator.advance(PeekableIterator.java:71)
at htsjdk.samtools.util.PeekableIterator.next(PeekableIterator.java:57)
at org.broadinstitute.dropseqrna.utils.GroupingIterator.next(GroupingIterator.java:63)
at org.broadinstitute.dropseqrna.utils.readiterators.UMIIterator.next(UMIIterator.java:151)
at org.broadinstitute.dropseqrna.barnyard.DigitalExpression.digitalExpression(DigitalExpression.java:202)
at org.broadinstitute.dropseqrna.barnyard.DigitalExpression.doWork(DigitalExpression.java:154)
at picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:295)
at picard.cmdline.PicardCommandLine.instanceMain(PicardCommandLine.java:103)
at org.broadinstitute.dropseqrna.cmdline.DropSeqMain.main(DropSeqMain.java:42)

How to solve the problem with DigitalExpression?

commented

You need to increase the Java heap size. Send the command line you invoked DigitalExpression with, and I'll tell you how to fix it.

/home/soft/Drop-seq_tools-2.0.0/DigitalExpression
I=function_tagged.bam
O=unalignedbarcode.txt.gz
SUMMARY=unaligned_out_gene_exon_tagged.dge96.summary.txt
NUM_CORE_BARCODES=96
CELL_BC_FILE=/sdb/REF/barcode.csv
LOCUS_FUNCTION_LIST=INTRONIC

That's it. Thanks for your help!

commented

Hi @vagabond12 ,

It turns out there is a bug in the DigitalExpression script: it doesn't show how to set the heap size when you pass the -h option. That will be fixed in the next release. In any case, the default heap size is 4g. Try increasing it to 8g; if you still get the same error, increase it further. You do this as follows:

/home/soft/Drop-seq_tools-2.0.0/DigitalExpression -m 8g
I=function_tagged.bam
O=unalignedbarcode.txt.gz
SUMMARY=unaligned_out_gene_exon_tagged.dge96.summary.txt
NUM_CORE_BARCODES=96
CELL_BC_FILE=/sdb/REF/barcode.csv
LOCUS_FUNCTION_LIST=INTRONIC
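
A quick sketch of why the log points at the default heap being the culprit: the line Runtime.totalMemory()=3817865216 in the output above corresponds to roughly 3.56 GiB, i.e. the JVM was already at its ~4g default ceiling when the merge phase ran out of memory. Converting the logged byte count:

```java
// Minimal sketch: convert the Runtime.totalMemory() value from the log
// above into GiB to show the JVM was near its ~4g default heap limit.
public class HeapCheck {
    public static void main(String[] args) {
        long totalBytes = 3_817_865_216L; // value printed in the error log
        double gib = totalBytes / (1024.0 * 1024.0 * 1024.0);
        System.out.printf("%.2f GiB%n", gib); // about 3.56 GiB, just under 4g
    }
}
```

This is why bumping the heap with -m 8g (which the wrapper script passes through to the JVM) resolves the GC-overhead error.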

Regards, Alec

Hi @alecw,

I did what you suggested, and it worked. You've solved a problem that had been confusing me for a long time. Thank you very much.