Question: How To Transfer Files Between Two Galaxy Instances
5.8 years ago by
United States
shamsher jagat wrote:
I have uploaded files to the Cistrome and Gunnar Rätsch lab Galaxy instances, which allow users to use their tools. I want to either share a workflow from these instances or at least transfer FASTQ files to the Penn State public Galaxy server. Is this possible or not? I have another question in this regard: when I try to upload FASTQ files via web link or FTP, the job never completes. I have tried a couple of times, and the problem has persisted for the last couple of months. Have some changes been implemented recently that are preventing me from uploading files? Indeed, over the last month or so I have seen many messages reporting that a Tophat job is stuck, a job never completes, or a file cannot be uploaded. I am not sure whether all these problems are related (e.g., storage) or not. Can someone from the Galaxy team advise?
galaxy • 1.5k views
modified 5.8 years ago by Jennifer Hillman Jackson • written 5.8 years ago by shamsher jagat
5.8 years ago by
United States
Jennifer Hillman Jackson wrote:
Hello,

It would not be possible to share a workflow between two Galaxy instances using the built-in "Share or Publish" methods. However, you should be able to download a workflow from one Galaxy instance and then "Upload or import workflow" into the public main Galaxy instance (this button is on the "Workflow" home page). Please note that any tools included in your workflow that were available in the other instance, but that are not available on the public main Galaxy instance, will not be functional, so the workflow will likely need to be edited before use.

Moving FASTQ datasets from other instances should also be straightforward: download them, then upload them to the public main Galaxy instance using FTP. If you are having trouble downloading workflows or datasets, the host Galaxy instance should be contacted. At this time, there are no known issues with upload functions.

If you are having problems with a URL data transfer from another Galaxy instance (or really, any third-party source), you can try to download the data locally to your desktop in a terminal prompt (as a test) with the command:

% curl -O 'copied_link_location_from_dataset_history_or_workflow_or_other_URL'

If this is successful, then a load into Galaxy should also be successful. If this is unsuccessful, first contact the source to check whether the data is publicly accessible, as they may have advice about requirements concerning user/passwords in the URL. Once that is worked out, if a URL load still fails, please submit a bug report from the problematic load (leaving the error dataset undeleted in your history) and we will examine it.

The main public instance occasionally has brief server/cluster issues that impact jobs, but these are a very small portion of the overall volume of jobs processed, and a re-run is the solution. We do our best to respond quickly as soon as a problem is detected. Reported delays due to high usage volume are a different matter.
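The curl test above can be extended with a quick sanity check on the downloaded file before re-uploading it. This is a minimal sketch, not part of Galaxy itself: the URL is a placeholder, and the sample file stands in for the actual download so the checks can run anywhere.

```shell
# Placeholder for the link copied from the source history (not a real URL):
DATASET_URL='copied_link_location_from_dataset_history'
# Probe the link first (-I header-only, -f fail on HTTP errors), then fetch:
#   curl -sfI "$DATASET_URL" && curl -sfO "$DATASET_URL"

# Stand-in for the downloaded file, so the checks below are runnable:
printf '@read1\nACGT\n+\nIIII\n' > sample.fastq

# A plausible FASTQ starts with '@' and has a line count divisible by 4:
first=$(head -c 1 sample.fastq)
lines=$(wc -l < sample.fastq)
if [ "$first" = "@" ] && [ $((lines % 4)) -eq 0 ]; then
  echo "looks like FASTQ"
else
  echo "not FASTQ - check what the link actually returned"
fi
```

If the file fails this check, the link often returned an HTML error or login page rather than the dataset, which matches the "contact the source about access" advice above.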
The public main Galaxy instance has substantial dedicated resources, but there are still times when we get very busy. If your analysis needs are urgent or your jobs are numerous, then a local or cloud instance is the recommended alternative.

To be clear: you had FTP upload processes that were started but never completed? If so, the connection may have been interrupted. When this occurs, an FTP client will allow you to restart an upload so that it begins again where it left off. To do this, re-establish the connection first, then resume the transfer.

Hopefully this addresses your concerns,
Jen, Galaxy team

-- Jennifer Jackson
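The resume behaviour described above can also be done with curl, whose `-C -` flag asks the server how many bytes already arrived and continues from that offset. The host, credentials, and byte counts below are placeholders; the command itself is shown commented out as a sketch, with the offset arithmetic made runnable.

```shell
# Resuming an interrupted FTP upload with curl (placeholders, not real creds):
#   curl -T mydata.fastq ftp://usegalaxy.org/ --user you@example.org:PASSWORD -C -
# -C -  means "query the server for the current size and continue from there".

# The offset logic, with stand-in numbers:
printf '0123456789' > local.fastq   # 10-byte stand-in for the upload
remote_bytes=6                      # pretend 6 bytes arrived before the drop
local_bytes=$(wc -c < local.fastq)
remaining=$((local_bytes - remote_bytes))
echo "resume at offset $remote_bytes, $remaining bytes left to send"
```

Graphical FTP clients (FileZilla and similar) expose the same resume operation in their transfer queue.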
written 5.8 years ago by Jennifer Hillman Jackson
Hi again,

I have an important update: I was incorrect about workflow transfer between Galaxy instances and about where to edit. In summary, it is important to do the workflow editing up front, before the export, instead of after the import. I'll explain why.

While it is now possible to import workflows from other Galaxy instances (any), once the workflow is actually in the new instance, if it contains tools that are not in the target instance, there is nothing that can be done with it (editing or running will result in a server error). This is a known open issue with the workflow developers.

So, when you plan to import into the public main Galaxy instance, you will need to modify the workflow in the source instance before exporting it, to make certain that it contains only tools that are present in the public main Galaxy instance. If a server error results from an imported workflow, an unavailable tool in the workflow is the most likely root cause of the problem.

If you plan to import into a local or cloud instance, then you have the choice of either modifying the workflow and/or adding all of the workflow's tools to your local/cloud instance, and then importing the workflow. Do not import the workflow first, as this will result in an error; if it occurs, delete the workflow and import it again after the required tools are added.

Good question, and I apologize for getting this detail incorrect in the original reply.

Best,
Jen, Galaxy team

-- Jennifer Jackson
written 5.8 years ago by Jennifer Hillman Jackson
Thanks Jen for the update. I tried the following: Go to the Rätsch Galaxy instance > Workflow > make workflow accessible via link. Then go to the Galaxy Penn State server: Workflow > Import workflow > URL > paste the Galaxy URL. The error is: "The data content does not appear to be a Galaxy workflow. Exception: No JSON object could be decoded: line 1 column 0 (char 0)". I also downloaded the file from the Rätsch server, saved it on my computer, and used the "Choose file" option under import workflow. It imported the file after a while, but when I opened the workflow there was no data; only the steps of the workflow were there. Do you have any suggestions about where I am doing something wrong? Thanks
written 5.8 years ago by shamsher jagat
Hi, just wanted to add a few clarifications here. It definitely *is* currently possible to transfer a workflow from one instance to another instance that does not have some (or all) of the tools for a particular workflow.

The error you're running into, "No JSON object", means that you likely have the wrong link to your workflow. The one you want is accessible via the workflow context menu -> Download or Export -> "URL for importing into another galaxy". Or you can simply download the raw file and upload that, as you figured out. The correct URL contains "for_direct_import" in the string.

As a correction to what was said previously, I would not recommend stripping tools out of an existing workflow prior to export. When you upload the workflow to a new instance, any tool that isn't available is flagged as not found when you edit the workflow. At that point the unrecognized tools can be installed (if it's your own Galaxy server) or, if you wish, removed from the workflow via the editor. This must be done before the workflow is usable.

Lastly, workflows don't contain any data, just the organization and parameters of the steps of a process. What it sounds like you're looking for (to get your data there as well) is a history export, which is available through the menu at the top of your history as "Export to File".

-Dannon
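A quick way to diagnose the "No JSON object" error yourself: a Galaxy workflow (.ga) file is plain JSON, so anything that fails a JSON parse (typically an HTML login or error page returned by the wrong link) will fail the import too. A minimal sketch; the file content below stands in for what a wrong link commonly returns.

```shell
# Stand-in for a "workflow" fetched from the wrong link (an HTML page):
printf '<html><body>login required</body></html>' > workflow.ga

# A real .ga export parses as JSON; python3 -m json.tool is a handy checker:
if python3 -m json.tool workflow.ga > /dev/null 2>&1; then
  status="valid workflow JSON"
else
  status="not JSON: wrong link, use the URL for importing into another galaxy"
fi
echo "$status"
```

If the file passes the JSON check but the import still fails, the problem is more likely missing tools on the destination instance, as described above.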
written 5.7 years ago by Dannon Baker
OK, perhaps I am not understanding the process. I am not making any headway in transferring the data from one Galaxy instance to the other. I have uploaded some files to the Rätsch lab Galaxy instance and have the URL b324ccf, but nothing is happening. Could someone from the Galaxy team please list the steps for transferring files (data) from one Galaxy instance to another, treating me as a beginner? Thanks, and sorry for pushing this question. Vasu
written 5.7 years ago by shamsher jagat
The data is not part of the workflow; what you want is a history export. Once the data is ready (the link works), go to the destination Galaxy instance and select "Import from File" in the history menu, then paste in your link. This should migrate the entire history.
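One more self-check for this step: a Galaxy history export is served as a gzipped tar archive, so if the export link returns something else (again, usually an HTML page), the import will stall. A minimal sketch; the two magic bytes written below stand in for a real downloaded archive.

```shell
# Stand-in for the downloaded history export: gzip files start with 0x1f 0x8b.
printf '\037\213' > history_export.tar.gz

# Read the first two bytes as hex and compare against the gzip magic number:
magic=$(od -An -tx1 -N2 history_export.tar.gz | tr -d ' ')
if [ "$magic" = "1f8b" ]; then
  echo "gzip archive: looks like a real history export"
else
  echo "not an archive: the link may point to an HTML page instead"
fi
```

Waiting until the export link actually serves the archive matters: large histories can take a while to package on the source instance before the link works.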
written 5.7 years ago by Dannon Baker


Powered by Biostar version 16.09