Question: Export To File
Dave Corney wrote, 6.1 years ago:
Hi list,

Is there currently a known problem with the "export to file" function? I'm trying to migrate some data from the public Galaxy to a private one. The export function worked well with a small (~100 MB) dataset, but it has not been working with larger datasets (>2 GB), and I get the error:

Server Error. An error occurred. See the error logs for more information. (Turn debug on to display exception reports here).

Is there a limit on the file size of the export? If so, what is it?

Thanks in advance,
Dave
modified 6.1 years ago by Jennifer Hillman Jackson • written 6.1 years ago by Dave Corney
Jennifer Hillman Jackson (United States) wrote, 6.1 years ago:
Hi Dave,

To export larger files, you can use a different method. Open a terminal window on your computer and type at the prompt ($):

$ curl '<file_link>' > name_the_output

where <file_link> can be obtained by right-clicking the disc icon for the dataset and selecting "Copy link location".

If you are going to import into a local Galaxy, exporting an entire history, or a history comprised of datasets that you have copied/grouped together, may be a quicker alternative. From the history panel, use Options (gear icon) -> Export to File to generate a link, then use curl again to perform the download. The "Import from File" function (in the same menu) can be used in your local Galaxy to incorporate the history and the datasets it contains.

Hopefully this helps, but please let us know if you have more questions,
Jen
Galaxy team
--
Jennifer Jackson
http://galaxyproject.org
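As a concrete sketch of the curl step above: the URL and output filename here are hypothetical placeholders, not real Galaxy links; substitute the link copied from the dataset's disc icon. The sketch only builds and prints the command rather than executing it.

```shell
# Hedged sketch of the curl-based export described above.
# FILE_LINK is a hypothetical placeholder; obtain the real link by
# right-clicking the dataset's disc icon and choosing "Copy link location".
FILE_LINK='https://usegalaxy.org/datasets/example-id/display'

# -L follows redirects; --retry guards against transient network drops,
# which matter on multi-gigabyte transfers; -o names the local file.
DOWNLOAD_CMD="curl -L --retry 3 -o exported_dataset.dat '$FILE_LINK'"

# Print the command instead of running it, so the sketch is safe to test.
echo "$DOWNLOAD_CMD"
```

Running the printed command (or the curl line directly) performs the actual download; the retry flag is a defensive choice for large files, not something the original post requires.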
written 6.1 years ago by Jennifer Hillman Jackson
Hi,

I have a local install of Galaxy on my desktop. I performed the following steps:

1. Created a new Library
2. Selected "Upload directory of files"
3. Chose the "Server Directory"
4. Selected "Copy files into Galaxy"

However, when I execute I get the following error. Any ideas?

AttributeError: 'Bunch' object has no attribute 'multifiles'
URL: http://140.253.78.44/galaxy/library_common/upload_library_dataset
Module weberror.evalexception.middleware:364 in respond
Module paste.debug.prints:98 in __call__
Module paste.wsgilib:539 in intercept_output
Module paste.recursive:80 in __call__
Module paste.httpexceptions:632 in __call__
Module galaxy.web.framework.base:160 in __call__
Module galaxy.web.controllers.library_common:855 in upload_library_dataset
Module galaxy.web.controllers.library_common:1055 in upload_dataset
Module galaxy.tools.actions.upload_common:342 in create_paramfile
AttributeError: 'Bunch' object has no attribute 'multifiles'

Do I need to set something else?

Thanks,
Neil
reply written 6.1 years ago by Neil.Burdett@csiro.au
Jennifer Hillman Jackson (United States) wrote, 6.1 years ago:
Hi Dave,

Yes, if your Galaxy instance is on the internet, then for entire-history transfer you can skip the curl download and just enter the URL from the public Main Galaxy server into your Galaxy directly.

To load large local data over 2 GB (datasets, not history archives), you can use the data library option. The idea is to load the data into a library, then move datasets from the library into histories as needed. Help is in our wiki here:

http://wiki.g2.bx.psu.edu/Admin/Data%20Libraries/Libraries
http://wiki.g2.bx.psu.edu/Admin/Data%20Libraries/Uploading%20Library%20Files

Take care,
Jen
Galaxy team
--
Jennifer Jackson
http://galaxyproject.org
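For the server-directory upload route mentioned in the wiki links, the local Galaxy needs an import directory configured. A minimal sketch, assuming the universe_wsgi.ini configuration file used by Galaxy installs of that era; the option name should be verified against your own install, and the path is a hypothetical example:

```ini
; universe_wsgi.ini -- hedged sketch, not a complete config.
[app:main]
; Directory on the Galaxy server whose contents the library
; "Upload directory of files" option can import from.
; Hypothetical example path:
library_import_dir = /data/galaxy/library_import
```

With this set and the server restarted, the "Server Directory" choice in the library upload form lists subdirectories of that path.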
written 6.1 years ago by Jennifer Hillman Jackson
Dave,

There's likely something problematic about your history that is causing the error. Can you share with me the history that's generating it? To do so, from the history options menu --> Share/Publish --> Share with a User --> my email address.

Thanks,
J.
reply written 6.1 years ago by Jeremy Goecks
Hi Jeremy,

Thanks for your offer of help. By the time I got your email I had already added many new jobs to the history, which are either running now or waiting to run. Since I read somewhere that there are problems exporting a history while it is running, I shared a clone of the history with you. The clone should be identical to the history that I was having problems with yesterday. I can share the original history with you once the jobs have finished running (but it might take a while).

Thanks,
Dave
reply written 6.1 years ago by Dave Corney
I've reworked the code to handle large history export files in -central changeset afc8e9345268, and this should solve your issue. The change should make it out to our public server this coming week.

Best,
J.
reply written 6.1 years ago by Jeremy Goecks
Hi Jeremy,

That's really wonderful - thanks so much for taking the time and effort to do this! When you say large history, is there a size limit that I should be aware of, or will it handle anything that my quota can accept?

Thanks,
Dave
reply written 6.1 years ago by Dave Corney
It will handle anything your quota can accept.

Best,
J.
reply written 6.1 years ago by Jeremy Goecks
Powered by Biostar version 16.09