Question: Large datasets and memory usage
nd1091 wrote, 6 months ago:

Hi, my dataset (17 paired-end samples) has a total size of 87 GB. I want to run a Trinity de novo assembly, but because I have to preprocess the raw files, the number of files keeps growing and I have almost used up all my space. My preprocessing steps are:

FastQC > Trimmomatic > FastQC > FASTQ interlacer > FASTQ de-interlacer > concatenate > Trinity

So, as a solution, can I delete my raw FASTQ files and other intermediate files from the history to free up space to run the tool? Will that interfere with my run?
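To see why the history fills up so quickly, here is a rough back-of-the-envelope sketch of the cumulative disk footprint of the steps above. The per-step size ratios are illustrative assumptions (trimming typically shrinks the data a little; interlacing, de-interlacing, and concatenation each produce a roughly same-sized copy), not measurements from this post:

```python
# Rough disk-footprint estimate for the chain
# Trimmomatic > interlacer > de-interlacer > concatenate.
# Ratios below are illustrative guesses, not measured values.

RAW_GB = 87.0  # total size of the raw paired-end FASTQ files (from the post)

# (step name, output size as a fraction of the raw input)
steps = [
    ("Trimmomatic (trimmed FASTQ)", 0.9),      # trimming drops some reads/bases
    ("FASTQ interlacer (interleaved copy)", 1.0),
    ("FASTQ de-interlacer (split copy)", 1.0),
    ("concatenate (single combined file)", 1.0),
]

total_gb = RAW_GB  # raw files stay on disk too
size = RAW_GB
for name, ratio in steps:
    size *= ratio
    total_gb += size
    print(f"{name}: +{size:.1f} GB (running total {total_gb:.1f} GB)")
```

Under these assumptions the history peaks at roughly 4-5x the raw data size, which is why deleting intermediates (or the raw files, once safely backed up) is usually necessary on a quota-limited account.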

Thanks, Nihar

Jennifer Hillman Jackson wrote, 6 months ago:


You can permanently delete any datasets that will not be used as inputs again. Download and save a copy of any that you might want to access or use again later.
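Before purging, it is worth confirming that each downloaded copy is intact, since a purged Galaxy dataset cannot be recovered. A minimal sketch, using checksums (the filename here is a hypothetical stand-in for a file saved through the history's download link):

```shell
# Sketch: verify a downloaded backup before permanently deleting the
# dataset in Galaxy. The filename is a hypothetical example; substitute
# the files you actually saved from your history.

# Stand-in for a file downloaded from the Galaxy history:
printf '@read1\nACGTACGT\n+\nIIIIIIII\n' > raw_reads_R1.fastq

# Record a checksum right after downloading...
md5sum raw_reads_R1.fastq > raw_reads_R1.fastq.md5

# ...and confirm the copy is intact before purging the original.
md5sum -c raw_reads_R1.fastq.md5 && echo "backup verified - safe to purge"
```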


Thanks! Jen, Galaxy team


