Question: Large datasets and memory usage
Asked 4 weeks ago by nd10910:

Hi, my dataset (17 paired-end libraries) has a total size of 87 GB. I want to run a Trinity de novo assembly, but because I have to preprocess the raw files, the number of files keeps growing and I have almost used up all my space. My preprocessing steps are:

FastQC > Trimmomatic > FastQC > FASTQ interlacer > FASTQ de-interlacer > concatenate > Trinity

So, as a solution, can I delete my raw FASTQ files and other intermediate files in the history to free up space to run the tool? Will that interfere with my run?

Thanks, Nihar

Answered 4 weeks ago by Jennifer Hillman Jackson (Galaxy team, United States):


You can permanently delete any datasets that will not be used as inputs again. Download and save a copy of any that you might want to access or use again later.
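The same "delete once consumed" pattern can be sketched for a command-line version of the pipeline above. This is a hypothetical illustration, not the exact Galaxy workflow: the real tool invocations (FastQC, Trimmomatic, Trinity) are shown only as comments, and all file names are made up; the point is that each intermediate is removed as soon as the next step has produced its output.

```shell
#!/bin/sh
# Sketch: run preprocessing step by step, deleting each intermediate
# once it has been consumed, so disk usage stays close to one copy of
# the data instead of one copy per step. Tool commands are placeholders.
set -e
mkdir -p work

# Stand-ins for the raw paired-end FASTQ files (in reality ~87 GB total).
printf '@read1\nACGT\n+\nIIII\n' > work/sample_R1.fastq
printf '@read1\nTGCA\n+\nIIII\n' > work/sample_R2.fastq

# 1. FastQC on the raw reads (reports only, negligible disk usage):
#    fastqc work/sample_R1.fastq work/sample_R2.fastq

# 2. Trimmomatic (a plain copy stands in for the real command here):
#    trimmomatic PE work/sample_R1.fastq work/sample_R2.fastq ...
cp work/sample_R1.fastq work/sample_R1.trimmed.fastq
cp work/sample_R2.fastq work/sample_R2.trimmed.fastq

# The raw files are no longer inputs to any downstream step, so they
# can be deleted now -- after saving a backup copy elsewhere if you
# might ever need the originals again.
rm work/sample_R1.fastq work/sample_R2.fastq

# 3-4. Interlace/de-interlace and concatenate the trimmed pairs
#      (simple concatenation stands in for the real tools):
cat work/sample_R1.trimmed.fastq work/sample_R2.trimmed.fastq \
    > work/all_reads.fastq
rm work/sample_R1.trimmed.fastq work/sample_R2.trimmed.fastq

# 5. Trinity on the concatenated reads (placeholder):
#    Trinity --seqType fq --single work/all_reads.fastq ...
echo "remaining files:"
ls work
```

At the end, only `all_reads.fastq` (the actual Trinity input) remains on disk. In Galaxy the equivalent is deleting the upstream history datasets permanently ("purge") once the downstream dataset is green, since a merely deleted-but-not-purged dataset still counts against your quota.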


Thanks! Jen, Galaxy team

