Question: Import history from file not working - error
mariawoer wrote (17 months ago):

Hi all,

We are currently migrating data from an older Galaxy instance to a new, up-to-date one on a different server. Unfortunately, the "import from file" function for importing user histories throws this error:

galaxy.jobs ERROR 2017-07-06 20:05:11,263 Unable to cleanup job 51
Traceback (most recent call last):
  File "/opt/galaxy/lib/galaxy/jobs/__init__.py", line 1442, in cleanup
    galaxy.tools.imp_exp.JobImportHistoryArchiveWrapper( self.app, self.job_id ).cleanup_after_job()
  File "/opt/galaxy/lib/galaxy/tools/imp_exp/__init__.py", line 160, in cleanup_after_job
    raise MalformedContents( "Invalid dataset path: %s" % temp_dataset_file_name )
MalformedContents: Invalid dataset path: /var/galaxydata/database/tmp/tmp_80OPt/datasets/test.fasta.fasta

I've tried it with different histories and different datasets/files; it always fails while importing the first dataset/file. I have also tried to work out what galaxy.tools.imp_exp.JobImportHistoryArchiveWrapper( self.app, self.job_id ).cleanup_after_job() does and where it goes wrong, but without success.
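
In case it is useful, this is roughly how the exported archive can be inspected locally before importing it (a minimal sketch, assuming the export from the old server is saved as Galaxy-History-testcopy.tar.gz):

    import tarfile

    # List everything in the exported history archive; the dataset files are
    # expected to sit inside a datasets/ directory within the archive.
    with tarfile.open("Galaxy-History-testcopy.tar.gz", "r:gz") as archive:
        for name in archive.getnames():
            print(name)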

Does anyone know what could be causing this, or have an idea of how to fix it? This is a major hold-up at the moment, and any advice or idea is very much appreciated.

Maria

Jeremy Goecks wrote (16 months ago):

The job of galaxy.tools.imp_exp.JobImportHistoryArchiveWrapper( self.app, self.job_id ).cleanup_after_job() is to create the history and import its datasets. It appears that the function cannot find the datasets. Can you share a link to the history archive that is causing problems?
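
For context, the MalformedContents error in your traceback comes from a sanity check on each dataset listed in the unpacked archive: the path recorded for a dataset has to resolve to a file inside the temporary directory the archive was extracted into. A rough sketch of that kind of check (not the exact Galaxy code, just the idea) is:

    import os

    class MalformedContents(Exception):
        pass

    def check_dataset_path(archive_dir, file_name):
        # Resolve the dataset's recorded file name against the directory the
        # archive was unpacked into, and refuse anything that escapes it.
        temp_dataset_file_name = os.path.realpath(os.path.join(archive_dir, file_name))
        real_archive_dir = os.path.realpath(archive_dir)
        if not temp_dataset_file_name.startswith(real_archive_dir + os.sep):
            raise MalformedContents("Invalid dataset path: %s" % temp_dataset_file_name)
        return temp_dataset_file_name

One thing to keep in mind: if the two sides of that comparison are not resolved the same way (for example when the temp directory is reached through a symlink or an NFS mount), a perfectly valid dataset path can end up looking like it is outside the archive directory.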

Thanks, J.

mariawoer wrote (16 months ago):

Hi Jeremy, thanks for your reply.

One of our guesses was that it might be a compatibility issue between the two Galaxy versions, since the older one is from 1-2 years ago and has not been updated since.

The problem is that the link is only reachable from inside our network. This is the archive that was causing problems: https://www.dropbox.com/s/e83ffizbymqwqbc/Galaxy-History-testcopy.tar.gz?dl=0

Does that help in any way?

Thanks!

Maria

Jeremy Goecks wrote (16 months ago):

Hi Maria,

I was able to import this history using the latest Galaxy release, so it does not appear to be a compatibility issue between your versions of Galaxy. One thing to check: histories must be accessible by link or published so that they are visible to the Galaxy server you are importing them into. If a history is not accessible/published, you'll get a failure that looks like this in the logs:

galaxy.jobs.output_checker DEBUG 2017-07-10 08:30:55,187 Tool produced standard error failing job - [Error unpacking tar/gz archive: not a gzip file]
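
If you want to rule that out quickly, you can check from the importing server that the export link actually returns a gzipped archive rather than, say, an HTML login page. A minimal sketch (the URL below is a placeholder for your own export link):

    import urllib.request

    # Placeholder export URL; replace it with the accessible/published link
    # of the history you are trying to import.
    url = "https://your-old-galaxy.example.org/history/export_archive?id=..."
    with urllib.request.urlopen(url) as response:
        head = response.read(2)
    # A gzipped tar archive starts with the magic bytes 0x1f 0x8b; anything
    # else will produce the "not a gzip file" failure shown above.
    print("looks like gzip" if head == b"\x1f\x8b" else "not gzip: %r" % head)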

Are you able to share the full log when the job is run?

Thanks, J.

mariawoer wrote (16 months ago):

Ah, that is good to know! We actually had the error you mentioned, but we seem to have fixed it. This is the log after trying to import the history:

galaxy.jobs DEBUG 2017-07-10 18:32:06,677 (59) Working directory for job is: /opt/galaxy/database/jobs_directory/000/59 
galaxy.jobs.handler DEBUG 2017-07-10 18:32:06,683 (59) Dispatching to local runner 
galaxy.jobs DEBUG 2017-07-10 18:32:06,709 (59) Persisting job destination (destination id: local:///) 
galaxy.jobs.runners DEBUG 2017-07-10 18:32:06,714 Job [59] queued (31.229 ms) 
galaxy.jobs.handler INFO 2017-07-10 18:32:06,719 (59) Job dispatched 
galaxy.jobs.command_factory INFO 2017-07-10 18:32:06,842 Built script [/opt/galaxy/database/jobs_directory/000/59/tool_script.sh] for tool command [python /opt/galaxy/lib/galaxy/tools/imp_exp/unpack_tar_gz_archive.py "IGh0dHBzOi8vZ2FsYXh5LmhwYy5wYWxtdWMub3JnL2hpc3RvcnkvZXhwb3J0X2FyY2hpdmU/aWQ9 ZTQxZGQ0ZmRkOWE1ODZlZg== " "L29wdC9nYWxheHkvZGF0YWJhc2UvdG1wL3RtcEU0UUMyTg== " --url --encoded] 
galaxy.jobs.runners DEBUG 2017-07-10 18:32:06,875 (59) command is: rm
-rf working; mkdir -p working; cd working; /opt/galaxy/database/jobs_directory/000/59/tool_script.sh; return_code=$?; cd '/opt/galaxy/database/jobs_directory/000/59';  if [ "$GALAXY_LIB" != "None" ]; then
    if [ -n "$PYTHONPATH" ]; then
        PYTHONPATH="$GALAXY_LIB:$PYTHONPATH"
    else
        PYTHONPATH="$GALAXY_LIB"
    fi
    export PYTHONPATH fi if [ "$GALAXY_VIRTUAL_ENV" != "None" -a -z "$VIRTUAL_ENV"      -a -f "$GALAXY_VIRTUAL_ENV/bin/activate" ]; then
    . "$GALAXY_VIRTUAL_ENV/bin/activate" fi GALAXY_PYTHON=`command -v python` python "/opt/galaxy/database/jobs_directory/000/59/set_metadata_bxTooG.py" "/opt/galaxy/database/tmp/tmpraDd3s" "/opt/galaxy/database/jobs_directory/000/59/working/galaxy.json"  5242880; sh -c "exit $return_code" 
galaxy.jobs.runners.local DEBUG 2017-07-10 18:32:06,907 (59) executing job script: /opt/galaxy/database/jobs_directory/000/59/galaxy_59.sh 
galaxy.jobs DEBUG 2017-07-10 18:32:06,913 (59) Persisting job destination (destination id: local:///) 
galaxy.jobs.runners.local DEBUG 2017-07-10 18:32:09,624 execution finished: /opt/galaxy/database/jobs_directory/000/59/galaxy_59.sh 
galaxy.jobs INFO 2017-07-10 18:32:09,732 Collecting metrics for Job 59 
galaxy.jobs DEBUG 2017-07-10 18:32:09,745 job 59 ended (finish() executed in (120.586 ms)) 
galaxy.model.metadata DEBUG 2017-07-10 18:32:09,753 Cleaning up external metadata files 
galaxy.jobs ERROR 2017-07-10 18:32:09,883 Unable to cleanup job 59 
Traceback (most recent call last):   
  File "/opt/galaxy/lib/galaxy/jobs/__init__.py", line 1442, in cleanup
     galaxy.tools.imp_exp.JobImportHistoryArchiveWrapper( self.app, self.job_id ).cleanup_after_job()   
  File "/opt/galaxy/lib/galaxy/tools/imp_exp/__init__.py", line 160, in cleanup_after_job
     raise MalformedContents( "Invalid dataset path: %s" % temp_dataset_file_name ) 
MalformedContents: Invalid dataset path: /var/galaxydata/database/tmp/tmpE4QC2N/datasets/test.fasta.fasta

The test.fasta actually looks like it has been imported (see image), but it seems that Galaxy has not linked it to the actual file. The size of the history is reported as (empty), and when I try to view the data I get:

Not Found

The resource could not be found. 
File Not Found (/opt/galaxy/database/files/000/dataset_48.dat).
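
For reference, the two --encoded arguments passed to unpack_tar_gz_archive.py in the command log above are base64-encoded, so they can be decoded to see exactly which URL was fetched and which temporary directory the job unpacked into:

    import base64

    # The two --encoded arguments from the tool command in the log above
    # (the space inside the first one is just line wrapping in the log).
    encoded_url = "IGh0dHBzOi8vZ2FsYXh5LmhwYy5wYWxtdWMub3JnL2hpc3RvcnkvZXhwb3J0X2FyY2hpdmU/aWQ9ZTQxZGQ0ZmRkOWE1ODZlZg=="
    encoded_tmp = "L29wdC9nYWxheHkvZGF0YWJhc2UvdG1wL3RtcEU0UUMyTg=="

    print(base64.b64decode(encoded_url).decode().strip())
    print(base64.b64decode(encoded_tmp).decode())

Decoded, the temporary directory is /opt/galaxy/database/tmp/tmpE4QC2N, while the error above reports the same tmpE4QC2N directory under /var/galaxydata/database/tmp, so the two prefixes appear to be different views (e.g. a symlink or mount) of the same location.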
Jeremy Goecks wrote (16 months ago):

This is tough because I can't reproduce the behavior that you're seeing. My next suggestion is to look at the temp directory where the unpacked history should be, which in this case is /var/galaxydata/database/tmp/tmpE4QC2N/

Do you see what's expected in this directory, including history_attrs.txt, a datasets directory, and more? And in the datasets directory do you see test.fasta.fasta?
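
If the directory is still there (it may get cleaned up once the job has finished), something like this will show its contents (a minimal sketch, using the path from the error message above):

    import os

    # Temp directory reported in the MalformedContents error above.
    archive_dir = "/var/galaxydata/database/tmp/tmpE4QC2N"

    # The unpacked export should contain history_attrs.txt and a datasets/
    # directory holding the actual dataset files (e.g. test.fasta.fasta).
    print(sorted(os.listdir(archive_dir)))
    datasets_dir = os.path.join(archive_dir, "datasets")
    if os.path.isdir(datasets_dir):
        print(sorted(os.listdir(datasets_dir)))
    else:
        print("no datasets/ directory found")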

mariawoer wrote (16 months ago):

I was able to resolve the issue with the help of our system administrator; it turned out to be a problem with our setup. I think there was a conflict with the NFS setup. Thanks!

Jeremy Goecks wrote (16 months ago):

Great to hear that you solved your problem. If you're willing, it would be nice to post the underlying problem and your solution so that others can learn from your experience. Thanks!
