Question: Is There a Size Limit on Datasets for Running Tophat?
Asked 5.7 years ago by Du, Jianguang:
Hi All, Is there a size limit on datasets for running Tophat at Galaxy? If there is, what is the limit in number of reads? Thanks. Jianguang
Answered 5.7 years ago by Jennifer Hillman Jackson (Galaxy team):
Hi Jianguang,

The limit for Tophat will most likely not be the number of reads but the total processing time when using the public Galaxy instance. Currently, a job has 72 hours to complete, assuming a memory problem does not occur before that time limit is reached.

There are, however, some size limitations. The initial upload file must be 50G or less, and output files must be 200G or less. There must also be room in your account for the output, or further work will not be possible until the account is brought back under quota (250G). These wikis contain the same information and more: http://wiki.galaxyproject.org/Main, which links to http://wiki.galaxyproject.org/Learn/Managing%20Datasets#Data_size_and_disk_Quotas

For large or batch processing, the cloud option is the best recommendation, since it allows you to customize resources as needed: http://usegalaxy.org/cloud

Thanks!
Jen, Galaxy team

--
Jennifer Hillman-Jackson
Galaxy Support and Training
http://galaxyproject.org
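Since the practical constraints here are the 50G initial-upload cap and the account quota rather than a read-count ceiling, a quick local pre-check can save a failed transfer. The sketch below is a hypothetical helper, not part of Galaxy or Tophat; the file path is a placeholder, and the 50G constant simply mirrors the limit quoted in the answer, assuming a plain or gzipped FASTQ input.

```python
#!/usr/bin/env python
"""Local pre-check before uploading reads to the public Galaxy instance.

Hypothetical helper (not a Galaxy tool): reports the on-disk size of a
FASTQ file and counts its reads, so you can see whether the file fits
under the 50G initial-upload limit before starting a long transfer.
"""
import gzip
import os
import sys

UPLOAD_LIMIT_BYTES = 50 * 1024 ** 3  # 50G initial-upload limit (public Galaxy)


def count_fastq_reads(path):
    """Count reads by counting FASTQ records (4 lines per read)."""
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rt") as handle:
        line_count = sum(1 for _ in handle)
    return line_count // 4


def main(path):
    size = os.path.getsize(path)
    reads = count_fastq_reads(path)
    print("file size : %.1f GB" % (size / float(1024 ** 3)))
    print("reads     : %d" % reads)
    if size > UPLOAD_LIMIT_BYTES:
        print("WARNING: file exceeds the 50G initial-upload limit")
    else:
        print("OK: under the 50G initial-upload limit")


if __name__ == "__main__":
    # e.g. python check_reads.py sample_R1.fastq.gz (placeholder file name)
    main(sys.argv[1])
```

Note that this only checks the upload size; total processing time and the 250G account quota still apply once the job is running on Galaxy.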