Question: (Closed) Personalised tool not running in local galaxy (always yellow but not sending command)
13 days ago by migueljuliamolina 0 wrote:


I am creating a wrapper to run a personalised tool in my local Galaxy installation. I have tested that the script works on my machine, that Galaxy works with tools from the Tool Shed, and that the dummy example of how to add a personalised tool also works.

Now, this one does not work from Galaxy. Galaxy recognises the tool, allows me to select the input files and submit the job. The job goes to the queue, creates the working folders correctly, creates the execution script and starts running, but in the middle of the script there is a command where it gets stuck and does not execute it. It gives no error and does not stop the job; it just keeps "running" forever without executing that line or the following ones. This is the Galaxy log corresponding to what I just described:

DEBUG 2018-02-07 12:59:19,961 Validated and populated state for tool request (87.177 ms)
INFO 2018-02-07 12:59:20,109 Handled output named zip_folder for tool IRMA_FLU_PairedEnd (38.973 ms)
INFO 2018-02-07 12:59:20,123 Added output datasets to history (13.645 ms)
INFO 2018-02-07 12:59:20,216 Verified access to datasets for Job[unflushed,tool_id=IRMA_FLU_PairedEnd] (57.842 ms)
INFO 2018-02-07 12:59:20,218 Setup for job Job[unflushed,tool_id=IRMA_FLU_PairedEnd] complete, ready to flush (95.017 ms)
INFO 2018-02-07 12:59:20,270 Flushed transaction for job Job[id=34,tool_id=IRMA_FLU_PairedEnd] (52.051 ms)
DEBUG 2018-02-07 12:59:20,271 Tool [IRMA_FLU_PairedEnd] created job [34] (239.457 ms)
DEBUG 2018-02-07 12:59:20,291 Executed 1 job(s) for tool IRMA_FLU_PairedEnd request: (327.768 ms)
- - [07/Feb/2018:12:59:19 +0200] "POST /api/tools HTTP/1.0" 200 - "" "Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0"
DEBUG 2018-02-07 12:59:20,450 (34) Working directory for job is: /home/galaxy_user/galaxy/database/jobs_directory/000/34
- - [07/Feb/2018:12:59:20 +0200] "GET /api/webhooks/tool HTTP/1.0" 200 - "" "Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0"
DEBUG 2018-02-07 12:59:20,525 (34) Dispatching to local runner
- - [07/Feb/2018:12:59:20 +0200] "GET /api/histories/f597429621d6eb2b/contents?details=0a248a1f62a0cc04%2C03501d7626bd192f%2C3f5830403180d620%2Ce85a3be143d5905b%2Cc9468fdb6dc5c5f1%2C2a56795cad3c7db3%2Cf09437b8822035f7%2Cfb85969571388350%2Cb472e2eb553fa0d1%2Ca7db2fac67043c7e%2C4ff6f47412c3e65e%2Ce89067bb68bee7a0%2C0c5ffef6d88a1e97%2C911dde3ddb677bcd%2Cff5476bcf6c921fa%2C79966582feb6c081%2C5564089c81cf7fe8&order=hid&v=dev&q=update_time-ge&q=deleted&q=purged&qv=2018-02-07T11%3A59%3A02.000Z&qv=False&qv=False HTTP/1.0" 200 - "" "Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0"
DEBUG 2018-02-07 12:59:20,693 (34) Persisting job destination (destination id: local:///)
DEBUG 2018-02-07 12:59:20,709 Job [34] queued (183.805 ms)
INFO 2018-02-07 12:59:20,733 (34) Job dispatched
INFO 2018-02-07 12:59:21,293 Built script [/home/galaxy_user/galaxy/database/jobs_directory/000/34/] for tool command [ln -s '/home/galaxy_user/galaxy/database/files/000/dataset_21.dat' 'VRP001_S1_L001_R1_001_shorted.fastq' && ln -s '/home/galaxy_user/galaxy/database/files/000/dataset_22.dat' 'VRP001_S1_L001_R2_001_shorted.fastq' && IRMA FLU 'VRP001_S1_L001_R1_001_shorted.fastq' 'VRP001_S1_L001_R2_001_shorted.fastq' 'test' && zip -r '' 'test' && rm -rf 'test' && rm -f 'VRP001_S1_L001_R1_001_shorted.fastq' && rm -f 'VRP001_S1_L001_R2_001_shorted.fastq']
DEBUG 2018-02-07 12:59:21,624 (34) command is: rm -rf working; mkdir -p working; cd working; /home/galaxy_user/galaxy/database/jobs_directory/000/34/; return_code=$?; if [ -f /home/galaxy_user/galaxy/database/jobs_directory/000/34/working/${output_folder}.zip ] ; then cp /home/galaxy_user/galaxy/database/jobs_directory/000/34/working/${output_folder}.zip /home/galaxy_user/galaxy/database/files/000/dataset_36.dat ; fi; cd '/home/galaxy_user/galaxy/database/jobs_directory/000/34'; [ "$GALAXY_VIRTUAL_ENV" = "None" ] && GALAXY_VIRTUAL_ENV="$_GALAXY_VIRTUAL_ENV"; _galaxy_setup_environment True python "/home/galaxy_user/galaxy/database/jobs_directory/000/34/" "/home/galaxy_user/galaxy/database/jobs_directory/000/34/registry.xml" "/home/galaxy_user/galaxy/database/jobs_directory/000/34/working/galaxy.json" "/home/galaxy_user/galaxy/database/jobs_directory/000/34/metadata_in_HistoryDatasetAssociation_36_yDTNLL,/home/galaxy_user/galaxy/database/jobs_directory/000/34/metadata_kwds_HistoryDatasetAssociation_36_wd1nHs,/home/galaxy_user/galaxy/database/jobs_directory/000/34/metadata_out_HistoryDatasetAssociation_36_eWUSnc,/home/galaxy_user/galaxy/database/jobs_directory/000/34/metadata_results_HistoryDatasetAssociation_36_dR7Ggc,/home/galaxy_user/galaxy/database/files/000/dataset_36.dat,/home/galaxy_user/galaxy/database/jobs_directory/000/34/metadata_override_HistoryDatasetAssociation_36_dUFKBt" 5242880; sh -c "exit $return_code"
DEBUG 2018-02-07 12:59:21,671 (34) executing job script: /home/galaxy_user/galaxy/database/jobs_directory/000/34/
DEBUG 2018-02-07 12:59:21,715 (34) Persisting job destination (destination id: local:///)
- - [07/Feb/2018:12:59:24 +0200] "GET /api/histories/f597429621d6eb2b/contents?details=0a248a1f62a0cc04%2C03501d7626bd192f%2C3f5830403180d620%2Ce85a3be143d5905b%2Cc9468fdb6dc5c5f1%2C2a56795cad3c7db3%2Cf09437b8822035f7%2Cfb85969571388350%2Cb472e2eb553fa0d1%2Ca7db2fac67043c7e%2C4ff6f47412c3e65e%2Ce89067bb68bee7a0%2C0c5ffef6d88a1e97%2C911dde3ddb677bcd%2Cff5476bcf6c921fa%2C79966582feb6c081%2C5564089c81cf7fe8&order=hid&v=dev&q=update_time-ge&q=deleted&q=purged&qv=2018-02-07T11%3A59%3A20.000Z&qv=False&qv=False HTTP/1.0" 200 - "" "Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0"
- - [07/Feb/2018:12:59:24 +0200] "POST /api/tools/IRMA_FLU_PairedEnd/build HTTP/1.0" 200 - "" "Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0"
- - [07/Feb/2018:12:59:28 +0200] "GET /api/histories/f597429621d6eb2b/contents?details=0a248a1f62a0cc04%2C03501d7626bd192f%2C3f5830403180d620%2Ce85a3be143d5905b%2Cc9468fdb6dc5c5f1%2C2a56795cad3c7db3%2Cf09437b8822035f7%2Cfb85969571388350%2Cb472e2eb553fa0d1%2Ca7db2fac67043c7e%2C4ff6f47412c3e65e%2Ce89067bb68bee7a0%2C0c5ffef6d88a1e97%2C911dde3ddb677bcd%2Cff5476bcf6c921fa%2C79966582feb6c081%2C5564089c81cf7fe8&order=hid&v=dev&q=update_time-ge&q=deleted&q=purged&qv=2018-02-07T11%3A59%3A24.000Z&qv=False&qv=False HTTP/1.0" 200 - "" "Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0"

So, summarising, the command which is supposed to be executed is:

ln -s '/home/galaxy_user/galaxy/database/files/000/dataset_21.dat' 'VRP001_S1_L001_R1_001_shorted.fastq' && ln -s '/home/galaxy_user/galaxy/database/files/000/dataset_22.dat' 'VRP001_S1_L001_R2_001_shorted.fastq' &&  IRMA FLU 'VRP001_S1_L001_R1_001_shorted.fastq' 'VRP001_S1_L001_R2_001_shorted.fastq' 'test' &&  zip -r '' 'test' &&  rm -rf 'test' && rm -f 'VRP001_S1_L001_R1_001_shorted.fastq' && rm -f 'VRP001_S1_L001_R2_001_shorted.fastq'

But when it reaches "IRMA FLU 'VRP001_S1_L001_R1_001_shorted.fastq' 'VRP001_S1_L001_R2_001_shorted.fastq' 'test'" it stops working without exiting the job or writing anything to stdout or stderr; it just keeps waiting. If I go to the working directory and execute the command myself, it works. If I add or remove commands before that one, those work. So Galaxy stops executing my pipeline at that point without any message.

Any idea why this is happening?



galaxy • 127 views
modified 13 days ago by Jennifer Hillman Jackson 24k • written 13 days ago by migueljuliamolina 0

I am going to add my XML file as extra info, just in case the problem is there:

<tool id="IRMA_FLU_PairedEnd" name="IRMA" version="0.6.7">
<description>FLU 0.6.7</description>

<command detect_errors="exit_code"><![CDATA[
#import re
#set input_R1 = re.sub('[^\w\-\s\.]', '_', str($sample_R1.element_identifier))
#set input_R2 = re.sub('[^\w\-\s\.]', '_', str($sample_R2.element_identifier))
#set output_folder = re.sub('[^\w\-\s\.]', '_', str($sample_name))

#set sample_R1_sl = $input_R1
#set sample_R2_sl = $input_R2   

ln -s '${sample_R1}' '${sample_R1_sl}' &&
ln -s '${sample_R2}' '${sample_R2_sl}' &&

IRMA FLU '$sample_R1_sl' '$sample_R2_sl' '$output_folder' &&

zip -r '${output_folder}.zip' '$output_folder' &&

rm -rf '$output_folder' &&
rm -f '${sample_R1_sl}' &&
rm -f '${sample_R2_sl}'
]]></command>

<inputs>
    <param format="fastq,fastq.gz" name="sample_R1" type="data" label="R1.fastq.gz/R1.fastq"/>
    <param format="fastq,fastq.gz" optional="true" name="sample_R2" type="data" label="R2.fastq.gz/R2.fastq"/>
    <param name="sample_name" type="text" label="sample_name"/>
</inputs>

<outputs>
    <data format="zip" name="zip_folder" from_work_dir="${output_folder}.zip" label="${} on ${on_string}: Zipped Results" />
</outputs>
</tool>
modified 12 days ago • written 12 days ago by migueljuliamolina 0

I'm not sure whether from_work_dir supports the variable lookup you are trying (I'm guessing it doesn't, and that's why Galaxy is waiting for output to appear in the wrong place), but it is also unnecessary. You can just use a fixed file name in your zip command, e.g. zip -r '$output_folder', then use from_work_dir="", because you really shouldn't be interested in the name of the temporary file in the working directory (the same holds true for the names of the symlinks, which I wouldn't determine dynamically either). As an additional remark, you do not have to do the file cleanup yourself; you can let Galaxy handle it for you (so you can remove all the rms).
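For illustration, the advice above could look roughly like this; this is a sketch, and the fixed names results.zip, irma_out and input_R*.fastq are my own placeholders, not names from the original tool:

```xml
<!-- Sketch only: every file name in the working directory is static.
     "results.zip", "irma_out" and the input_R* link names are
     illustrative assumptions, not from the original post. -->
<command detect_errors="exit_code"><![CDATA[
    ln -s '$sample_R1' 'input_R1.fastq' &&
    ln -s '$sample_R2' 'input_R2.fastq' &&
    IRMA FLU 'input_R1.fastq' 'input_R2.fastq' 'irma_out' &&
    zip -r 'results.zip' 'irma_out'
]]></command>

<outputs>
    <!-- from_work_dir now points at a known, static name; Galaxy
         copies it out and cleans up the working directory itself -->
    <data format="zip" name="zip_folder" from_work_dir="results.zip"
          label="IRMA on ${on_string}: Zipped Results"/>
</outputs>
```

With static names there is nothing for from_work_dir to resolve dynamically, and the rm cleanup lines can be dropped entirely.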

written 11 days ago by Wolfgang Maier 440

Thanks for the help, but the problem is still there.

The job is submitted but it never starts running the "IRMA" command. I have tried replacing it with a simple "mkdir $output_folder" to check, and that works, so there has to be a reason why that particular command does not start running. If I execute the "" script that is created for the job in jobs_directory from a terminal, as the same user Galaxy runs under, it works perfectly.

The only difference from the previous XML is that now the dataset stays gray instead of yellow.

modified 8 days ago • written 8 days ago by migueljuliamolina 0

Problem solved!

It was due to launching the job through the Galaxy job manager. For some reason it has problems when a bash script reads from /dev/urandom: the script gets hung there forever. I replaced that line in my script with another way of generating random strings and now it works.
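The original script's exact line isn't shown, so as a hedged sketch: a common pattern that can hang like this is an open-ended pipe from /dev/urandom, which may block forever if the runner's environment leaves SIGPIPE ignored; reading a bounded number of bytes instead always terminates:

```shell
#!/bin/sh
# Pattern that can hang under some job runners (assumption: this is the
# kind of line the script used; `head` exiting no longer terminates the
# never-ending `tr` reader when SIGPIPE is ignored):
#   rand=$(tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 8)

# Bounded alternative: `head -c 256 /dev/urandom` reads a fixed number
# of bytes and exits on its own, so nothing in the pipe can block.
rand=$(head -c 256 /dev/urandom | tr -dc 'A-Za-z0-9' | cut -c1-8)
echo "$rand"
```

256 random bytes yield roughly 62 alphanumeric characters on average, so taking the first 8 is safe in practice.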

written 8 days ago by migueljuliamolina 0

Hello migueljuliamolina!

We believe that this post does not fit the main topic of this site.

Problem solved

For this reason we have closed your question. This allows us to keep the site focused on the topics that the community can help with.

If you disagree please tell us why in a reply below, we'll be happy to talk about it.


written 8 days ago by migueljuliamolina 0
The thread is closed. No new answers may be added.


Powered by Biostar version 16.09