Question: Job wrapper fails because of missing file
manuel.pasieka wrote:

Hi,

I have difficulties running the Picard tool "MarkDuplicates on data": the job wrapper script complains about a metadata_temp file being missing, although the file exists in the job directory and contains data.

We are using Galaxy v16.10 on a separate host, submitting to a cluster running PBS Pro; data is shared through sshfs mounts, and dependencies are resolved using the module system.

Any ideas? Thx

galaxy.objectstore CRITICAL 2017-03-28 10:04:44,926 Error copying /lustre/scratch/projects/csf_biocomp_common/galaxy_shared/jobs_directory/000/221/metadata_temp_file_xccoop to /home/GMI/biocomp.pacbio/galaxy/galaxy_shared/file_database/_metadata_files/000/metadata_26.dat: [Errno 2] No such file or directory: u'/lustre/scratch/projects/csf_biocomp_common/galaxy_shared/jobs_directory/000/221/metadata_temp_file_xccoop'
galaxy.jobs.runners ERROR 2017-03-28 10:04:44,926 (221/1146406.pbs0.ice.gmi.oeaw.ac.at) Job wrapper finish method failed
Traceback (most recent call last):
  File "/home/GMI/biocomp.pacbio/galaxy/galaxy/lib/galaxy/jobs/runners/__init__.py", line 611, in finish_job
    job_state.job_wrapper.finish( stdout, stderr, exit_code )
  File "/home/GMI/biocomp.pacbio/galaxy/galaxy/lib/galaxy/jobs/__init__.py", line 1303, in finish
    dataset.metadata.from_JSON_dict( output_filename, path_rewriter=path_rewriter )
  File "/home/GMI/biocomp.pacbio/galaxy/galaxy/lib/galaxy/model/metadata.py", line 168, in from_JSON_dict
    dataset._metadata[ name ] = param.from_external_value( external_value, dataset, **from_ext_kwds )
  File "/home/GMI/biocomp.pacbio/galaxy/galaxy/lib/galaxy/model/metadata.py", line 605, in from_external_value
    alt_name=os.path.basename(mf.file_name) )
  File "/home/GMI/biocomp.pacbio/galaxy/galaxy/lib/galaxy/objectstore/__init__.py", line 430, in update_from_file
    raise ex
IOError: [Errno 2] No such file or directory: u'/lustre/scratch/projects/csf_biocomp_common/galaxy_shared/jobs_directory/000/221/metadata_temp_file_xccoop'
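
A quick way to narrow this down is to check, on both the Galaxy host and a cluster node, whether the path from the traceback exists and what it really resolves to. This is a small standalone diagnostic sketch, not part of Galaxy; the path is copied from the log above:

# Standalone diagnostic sketch (not part of Galaxy): run it on the Galaxy
# host and on a cluster node and compare the output of both sides.
import os

# Path taken from the traceback above.
path = "/lustre/scratch/projects/csf_biocomp_common/galaxy_shared/jobs_directory/000/221"

print("exists:  ", os.path.exists(path))
# realpath resolves symlinks; if the two hosts print different results,
# absolute paths written on one side will not resolve on the other.
print("realpath:", os.path.realpath(path))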
manuel.pasieka wrote:

Hi,

It took me some time (I should have read the error message more carefully).

The error happens because the path on the cluster is not identical to the path on the machine where Galaxy is running. The folder structures on the VM and the cluster are different, but I set some soft links in the home directories of both systems so that they appear to share the same path.

This seems to work for most tools, but for some reason with this one, Galaxy seems to pass the resolved ("real") path used on the cluster back to the Galaxy VM.

So my solution is simply to replicate the identical path on the machine where Galaxy is running (still using a soft link in the home directory), as sketched below.
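
As a rough sketch of that workaround (illustrative only, not part of Galaxy; the paths are the ones from the log above, and creating the top-level directory on the VM usually has to be done once as root):

# Illustrative sketch (not part of Galaxy): recreate the cluster-side path
# on the Galaxy VM as a symlink to the sshfs-mounted shared directory, so
# that absolute paths written by jobs on the cluster also resolve on the VM.
import os

cluster_path = "/lustre/scratch/projects/csf_biocomp_common/galaxy_shared"
vm_path = "/home/GMI/biocomp.pacbio/galaxy/galaxy_shared"  # sshfs mount on the VM

parent = os.path.dirname(cluster_path)
if not os.path.isdir(parent):
    os.makedirs(parent)  # creating /lustre/... on the VM needs root
if not os.path.lexists(cluster_path):
    os.symlink(vm_path, cluster_path)

After this, the jobs_directory paths written on the cluster resolve on the Galaxy VM as well.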
