Ah, that is good to know! We actually had the error you mentioned but seemed to have fixed it. This is the log after trying to import the history:
galaxy.jobs DEBUG 2017-07-10 18:32:06,677 (59) Working directory for job is: /opt/galaxy/database/jobs_directory/000/59
galaxy.jobs.handler DEBUG 2017-07-10 18:32:06,683 (59) Dispatching to local runner
galaxy.jobs DEBUG 2017-07-10 18:32:06,709 (59) Persisting job destination (destination id: local:///)
galaxy.jobs.runners DEBUG 2017-07-10 18:32:06,714 Job [59] queued (31.229 ms)
galaxy.jobs.handler INFO 2017-07-10 18:32:06,719 (59) Job dispatched
galaxy.jobs.command_factory INFO 2017-07-10 18:32:06,842 Built script [/opt/galaxy/database/jobs_directory/000/59/tool_script.sh] for tool command [python /opt/galaxy/lib/galaxy/tools/imp_exp/unpack_tar_gz_archive.py "IGh0dHBzOi8vZ2FsYXh5LmhwYy5wYWxtdWMub3JnL2hpc3RvcnkvZXhwb3J0X2FyY2hpdmU/aWQ9ZTQxZGQ0ZmRkOWE1ODZlZg==" "L29wdC9nYWxheHkvZGF0YWJhc2UvdG1wL3RtcEU0UUMyTg==" --url --encoded]
galaxy.jobs.runners DEBUG 2017-07-10 18:32:06,875 (59) command is: rm -rf working; mkdir -p working; cd working; /opt/galaxy/database/jobs_directory/000/59/tool_script.sh; return_code=$?; cd '/opt/galaxy/database/jobs_directory/000/59';
if [ "$GALAXY_LIB" != "None" ]; then
    if [ -n "$PYTHONPATH" ]; then
        PYTHONPATH="$GALAXY_LIB:$PYTHONPATH"
    else
        PYTHONPATH="$GALAXY_LIB"
    fi
    export PYTHONPATH
fi
if [ "$GALAXY_VIRTUAL_ENV" != "None" -a -z "$VIRTUAL_ENV" -a -f "$GALAXY_VIRTUAL_ENV/bin/activate" ]; then
    . "$GALAXY_VIRTUAL_ENV/bin/activate"
fi
GALAXY_PYTHON=`command -v python` python "/opt/galaxy/database/jobs_directory/000/59/set_metadata_bxTooG.py" "/opt/galaxy/database/tmp/tmpraDd3s" "/opt/galaxy/database/jobs_directory/000/59/working/galaxy.json" 5242880; sh -c "exit $return_code"
galaxy.jobs.runners.local DEBUG 2017-07-10 18:32:06,907 (59) executing job script: /opt/galaxy/database/jobs_directory/000/59/galaxy_59.sh
galaxy.jobs DEBUG 2017-07-10 18:32:06,913 (59) Persisting job destination (destination id: local:///)
galaxy.jobs.runners.local DEBUG 2017-07-10 18:32:09,624 execution finished: /opt/galaxy/database/jobs_directory/000/59/galaxy_59.sh
galaxy.jobs INFO 2017-07-10 18:32:09,732 Collecting metrics for Job 59
galaxy.jobs DEBUG 2017-07-10 18:32:09,745 job 59 ended (finish() executed in (120.586 ms))
galaxy.model.metadata DEBUG 2017-07-10 18:32:09,753 Cleaning up external metadata files
galaxy.jobs ERROR 2017-07-10 18:32:09,883 Unable to cleanup job 59
Traceback (most recent call last):
File "/opt/galaxy/lib/galaxy/jobs/__init__.py", line 1442, in cleanup
galaxy.tools.imp_exp.JobImportHistoryArchiveWrapper( self.app, self.job_id ).cleanup_after_job()
File "/opt/galaxy/lib/galaxy/tools/imp_exp/__init__.py", line 160, in cleanup_after_job
raise MalformedContents( "Invalid dataset path: %s" % temp_dataset_file_name )
MalformedContents: Invalid dataset path: /var/galaxydata/database/tmp/tmpE4QC2N/datasets/test.fasta.fasta
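As far as I can tell, this error comes from a path-containment check during import cleanup: the dataset path recorded in the archive must resolve inside the temporary unpack directory before Galaxy will copy it into the file store. Note that our archive's path starts with /var/galaxydata/... while the unpack directory is under /opt/galaxy/..., so the check fails. A minimal sketch of that kind of check (the function name here is illustrative, not Galaxy's actual API):

```python
import os

class MalformedContents(Exception):
    """Raised when an archived dataset path escapes the unpack directory."""

def check_dataset_path(temp_dir, dataset_file_name):
    """Resolve dataset_file_name against temp_dir and reject it if the
    result falls outside temp_dir (e.g. an absolute path recorded by a
    Galaxy instance with a different directory layout)."""
    # Normalise both paths so absolute paths and "../" components
    # cannot slip a file from outside the unpack directory past the check.
    temp_dir = os.path.realpath(temp_dir)
    temp_dataset_file_name = os.path.realpath(
        os.path.join(temp_dir, dataset_file_name))
    if not temp_dataset_file_name.startswith(temp_dir + os.sep):
        raise MalformedContents(
            "Invalid dataset path: %s" % temp_dataset_file_name)
    return temp_dataset_file_name
```

With this logic, a relative path such as "datasets/test.fasta" passes, while an absolute path like the /var/galaxydata/... one from our archive raises MalformedContents, which matches the traceback above.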
The test.fasta dataset actually looks like it has been imported (see image), but it seems that Galaxy has not linked it to the actual file. The size of the history is reported as (empty), and when I try to view the data I get:
Not Found
The resource could not be found.
File Not Found (/opt/galaxy/database/files/000/dataset_48.dat).
written 16 months ago by mariawoer