Question: Suggestion For Multithreading
Louise-Amélie Schmitt wrote:
Hello everyone,
I'm using TORQUE with Galaxy, and we noticed that if a tool is
multithreaded, the number of cores it needs is not communicated to PBS,
leading to job crashes if the required resources are not available when
the job is submitted.
Therefore I modified the code slightly in
lib/galaxy/jobs/runners/pbs.py, as follows:
# define PBS job options
attrs.append( dict( name = pbs.ATTR_N, value = str( "%s_%s_%s" % ( job_wrapper.job_id, job_wrapper.tool.id, job_wrapper.user ) ) ) )
mt_file = open('tool-data/multithreading.csv', 'r')
for l in mt_file:
    l = string.split(l)
    if ( l[0] == job_wrapper.tool.id ):
        attrs.append( dict( name = pbs.ATTR_l, resource = 'nodes', value = '1:ppn='+str(l[1]) ) )
        attrs.append( dict( name = pbs.ATTR_l, resource = 'mem', value = str(l[2]) ) )
        break
mt_file.close()
job_attrs = pbs.new_attropl( len( attrs ) + len( pbs_options ) )
The csv file contains a list of the multithreaded tools, one tool per
line, with the fields separated by tabs:
<tool id>\t<number of threads>\t<memory needed>\n
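As a purely made-up example (the tool id and values below are
placeholders, not real entries), a line of tool-data/multithreading.csv
could look like this, with the three fields separated by tabs:

some_tool_id	4	8gb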
It works fine, and the jobs wait their turn properly, but the
information is duplicated. Perhaps something similar could be included
in Galaxy's original code (if that is not already the case; I may not
be up to date) without duplicating data.
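In case it's useful, here is a rough sketch of how the same lookup
could be factored into a small helper using Python's csv module; the
function name, keyword argument, and default path are only
placeholders, assuming the tab-separated layout described above:

import csv

def lookup_tool_resources(tool_id, path='tool-data/multithreading.csv'):
    # Return (threads, memory) for tool_id, or None if the tool is not listed.
    with open(path, 'r') as handle:
        for row in csv.reader(handle, delimiter='\t'):
            if row and row[0] == tool_id:
                return row[1], row[2]
    return None

# Sketch of how it could be called from the runner, building the same
# PBS attributes as in the snippet above:
# resources = lookup_tool_resources(job_wrapper.tool.id)
# if resources is not None:
#     ppn, mem = resources
#     attrs.append(dict(name=pbs.ATTR_l, resource='nodes', value='1:ppn=' + ppn))
#     attrs.append(dict(name=pbs.ATTR_l, resource='mem', value=mem))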
I hope that helps :)
Best regards,
L-A