Hi,
I am having a lot of difficulty uploading some large gzipped fastqs (~10 GB) to the public server. I have tried both FTP and "pulling" by HTTP URL. The upload succeeds, but I then get an error when the server tries to gunzip the file. I have tried more than ten times now and have succeeded once. The files themselves are correct and complete, and they gunzip cleanly locally.
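For what it's worth, my local check is basically just streaming each file through Python's gzip module, along these lines (a rough sketch; the file name is only a placeholder):

    # sanity check I run locally before uploading; "reads.fastq.gz" is a placeholder
    import gzip

    total = 0
    with gzip.open("reads.fastq.gz", "rb") as handle:
        for chunk in iter(lambda: handle.read(1024 * 1024), b""):
            total += len(chunk)
    print("decompressed cleanly, %d bytes" % total)

Every file gets through that (and through plain gunzip) without errors.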
The error shown is usually this:
empty
format: txt, database: ?
Problem decompressing gzipped data
However, on two occasions (both FTP uploads) I got the traceback below. Am I missing some obvious trick? I searched the archives and found references to problems with large gzipped files, but no solutions.

Thanks,
Jim
Traceback (most recent call last):
  File "/galaxy/home/g2main/galaxy_main/tools/data_source/upload.py", line 384, in <module>
    __main__()
  File "/galaxy/home/g2main/galaxy_main/tools/data_source/upload.py", line 373, in __main__
    add_file( dataset, registry, json_file, output_path )
  File "/galaxy/home/g2main/galaxy_main/tools/data_source/upload.py", line 270, in add_file
    line_count, converted_path = sniff.convert_newlines( dataset.path, in_place=in_place )
  File "/galaxy/home/g2main/galaxy_main/lib/galaxy/datatypes/sniff.py", line 106, in convert_newlines
    shutil.move( temp_name, fname )
  File "/usr/lib/python2.7/shutil.py", line 299, in move
    copy2(src, real_dst)
  File "/usr/lib/python2.7/shutil.py", line 128, in copy2
    copyfile(src, dst)
  File "/usr/lib/python2.7/shutil.py", line 84, in copyfile
    copyfileobj(fsrc, fdst)
  File "/usr/lib/python2.7/shutil.py", line 49, in copyfileobj
    buf = fsrc.read(length)
IOError: [Errno 5] Input/output error