Question: Download of large files stops at 1GB
b1t0 wrote (9 months ago):

On my local Galaxy installation I cannot download files bigger than 1 GB from the history. The download stalls at ~1 GB and fails after some time. I tried curl and wget as well, and neither worked.

b1t0 wrote (9 months ago):

Follow-up: I resolved the issue by adding the line mentioned here to galaxy.ini:

nginx_x_accel_redirect_base = /_x_accel_redirect

My nginx was already configured properly. Maybe this helps others who run into similar issues.
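For reference, the nginx counterpart that this setting relies on looks roughly like the block below (a minimal sketch following the standard Galaxy nginx setup; the location name must match the value above, and the alias path may need adjusting for your install):

    location /_x_accel_redirect/ {
        internal;
        alias /;
    }

With this pair in place, Galaxy answers download requests with an X-Accel-Redirect header instead of streaming the file through the application, and nginx serves the file from disk directly.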


Great that you figured out the issue and thanks for posting it back! Jen

(written 9 months ago by Jennifer Hillman Jackson)

Hmm, I'm having this exact issue as well, but this didn't fix it for me. Are you using a Docker container like I am?

(written 8 months ago by alex)

Hello, the developers who work on the Docker Galaxy releases should be able to help. I started a conversation here; please follow it for comments and help, or they may post back in this thread. You can also join in to clarify your use case: https://gitter.im/galaxy-iuc/iuc?at=5ab945a735dd17022ea1bd57

(written 8 months ago by Jennifer Hillman Jackson)

Interestingly, this is only failing for me for files in data libraries: I can't download more than 1 GB from a data library, but I can download >1 GB from histories just fine.

Should I carry the conversation forward in tools-iuc?
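One possibly related setting (an untested guess, not something confirmed in this thread): galaxy.ini has a separate nginx option for data-library archive downloads, which depends on the nginx mod_zip module:

    nginx_x_archive_files_base = /_x_archive_files

If only library downloads truncate, it may be worth checking whether this is set and whether nginx handles that path, analogous to the X-Accel-Redirect setup above.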

(written 8 months ago by alex)
Jennifer Hillman Jackson (United States) wrote (9 months ago):

Hello,

Double-check that you have curl and/or wget installed correctly, then follow the instructions here: https://galaxyproject.org/support/download-data/

If that does not work, please post back your command line and the resulting error as a comment.

Example of a successful download from a local Galaxy:

$ curl -o outfile --insecure 'http://127.0.0.1:8080/datasets/0c5ffef6d88a1e97/display?to_ext=fastq'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                  Dload  Upload   Total   Spent    Left  Speed
100   692  100   692    0     0   9054      0 --:--:-- --:--:-- --:--:--  9105
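If wget is the preferred tool, the equivalent request would look like this (--no-check-certificate mirrors curl's --insecure; the dataset id is the same placeholder as above):

$ wget -O outfile --no-check-certificate 'http://127.0.0.1:8080/datasets/0c5ffef6d88a1e97/display?to_ext=fastq'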

It would be OK to leave the dataset link intact (so we can check the basic format) if your local Galaxy is not hosted publicly. If it is hosted publicly, you can redact the numerical portion like this:

$ curl -o outfile --insecure 'http://127.0.0.1:8080/datasets/XXX/display?to_ext=fastq'

Thanks! Jen, Galaxy team


I think I have to clarify my situation: I installed Galaxy on a local server, and I am using it from a different PC.

When I try to download files on the server where Galaxy is installed, I get this output:

$ curl -o outfile --insecure 'http://127.0.0.1/datasets/4cb8b1160aa5351a/display?to_ext=fastqsanger.gz'                                                             
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                               Dload  Upload   Total   Spent    Left  Speed
  8 36.8G    8 3293M    0     0   333M      0  0:01:53  0:00:09  0:01:44  119Mcurl: (23) Failed writing body (4517 != 16384)

When I try to download files from another PC, I get this output:

$ curl -o outfile --insecure 'http://<Galaxy Server IP>/datasets/d5e06bff7285d86c/display?to_ext=fasta'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
 79 1299M   79 1033M    0     0  2804k      0  0:07:54  0:06:17  0:01:37     0

curl: (18) transfer closed with 278827119 bytes remaining to read

And when I try to resume the download, there is another error:

$ curl -C - -o outfile --insecure 'http://<Galaxy Server IP>/datasets/d5e06bff7285d86c/display?to_ext=fasta'
** Resuming transfer from byte position 1083960963
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0 1299M    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
curl: (33) HTTP server doesn't seem to support byte ranges. Cannot resume.

I think these errors might occur due to the network or some settings in nginx, as noted here, but I have no clue what to change, since the changes in that comment did not help me.
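A hedged hypothesis on why the ~1 GB mark keeps coming up: when nginx proxies a response from Galaxy without X-Accel-Redirect, it buffers the body to a temporary file capped by proxy_max_temp_file_size, whose default is 1024m; once the cap is hit, the rest of the response is passed synchronously, and in combination with proxy timeouts this is a commonly reported way for large proxied downloads to die. A minimal sketch of where that knob lives, assuming a typical proxying location for Galaxy:

    location / {
        # upstream name is assumed; use whatever your config proxies to
        proxy_pass http://galaxy;
        # 0 disables the temp-file cap; a size value raises the 1024m default
        proxy_max_temp_file_size 0;
    }

This would also fit the failed resume: a response streamed through the proxy only supports ranges if the backend honors the Range header, which Galaxy's built-in server here evidently does not, whereas a file served directly by nginx via X-Accel-Redirect supports byte ranges and can be resumed with curl -C -.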

(written and modified 9 months ago by b1t0)