Question: Large datasets and memory usage
nd1091 wrote, 10 months ago:

Hi, my dataset (17 paired-end samples) has a total size of 87 GB. I want to run a Trinity de novo assembly, but because I have to preprocess the raw files, the number of files keeps growing and I have almost used up all my space. My preprocessing steps are:

FastQC > Trimmomatic > FastQC > FASTQ interlacer > FASTQ de-interlacer > Concatenate > Trinity

So, as a solution, can I delete my raw FASTQ files and other intermediate files in the history to free up space to run the tool? Will that interfere with my run?

Thanks, Nihar

Jennifer Hillman Jackson wrote, 10 months ago:

Hello,

You can permanently delete any datasets that will not be used as inputs again. Download and save a copy of any that you might want to access or use again later.
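
If you prefer to script the backup-and-purge step, the sketch below uses the bioblend API client to list dataset sizes in a history, download a local copy of selected datasets, and then purge them. It is only a sketch: the Galaxy URL, API key, history name, and dataset IDs are placeholders, and the 'file_size' field assumes the standard dataset details returned by the API.

    # Sketch only: back up selected datasets, then purge them to reclaim quota.
    # Assumes the bioblend package is installed and you have a Galaxy API key.
    # The URL, key, history name, and dataset IDs are placeholders.
    import os
    from bioblend.galaxy import GalaxyInstance

    gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")

    # Pick the history that holds the raw reads and intermediates (placeholder name).
    history = gi.histories.get_histories(name="Trinity preprocessing")[0]

    # List datasets with their sizes to decide what can safely be removed.
    for item in gi.histories.show_history(history["id"], contents=True):
        if item.get("history_content_type") != "dataset":
            continue  # skip collections in this simple sketch
        info = gi.datasets.show_dataset(item["id"])
        size_gb = (info.get("file_size") or 0) / 1e9  # file_size is reported in bytes
        print(f"{item['hid']:>4}  {item['name']:<40}  {size_gb:6.2f} GB")

    # Datasets to keep a local copy of but purge from the history (placeholder IDs).
    backup_and_purge = ["<dataset_id_1>", "<dataset_id_2>"]
    os.makedirs("backups", exist_ok=True)

    for dataset_id in backup_and_purge:
        # Save a local copy first ...
        gi.datasets.download_dataset(dataset_id, file_path="backups/",
                                     use_default_filename=True)
        # ... then permanently delete (purge) it so the disk space is actually freed.
        gi.histories.delete_dataset(history["id"], dataset_id, purge=True)

Note that only datasets that are purged (permanently deleted), not merely deleted, free up disk quota.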

FAQs: https://galaxyproject.org/support/

Thanks! Jen, Galaxy team
