Hi, I'm running a local Galaxy server and using a tried-and-tested workflow to do some analysis on my data. My problem is that it keeps crashing at the same step of the workflow, even when I use the data and workflow provided by the paper itself. It starts fine and completes steps 1 to 4, but then it just stops with the following error:
Traceback (most recent call last):
File "/home/nermze/shed_tools/toolshed.g2.bx.psu.edu/repos/yufei-luo/s_mart/d96f6c9a39e0/s_mart/SMART/galaxy/../Java/Python/clusterize.py", line 203, in <module> c.run() File "/home/nermze/shed_tools/toolshed.g2.bx.psu.edu/repos/yufei-luo/s_mart/d96f6c9a39e0/s_mart/SMART/galaxy/../Java/Python/clusterize.py", line 172, in run self._sortFiles() File "/home/nermze/shed_tools/toolshed.g2.bx.psu.edu/repos/yufei-luo/s_mart/d96f6c9a39e0/s_mart/SMART/galaxy/../Java/Python/clusterize.py", line 102, in _sortFiles fs.sort() File "/home/nermze/shed_tools/toolshed.g2.bx.psu.edu/repos/yufei-luo/s_mart/d96f6c9a39e0/s_mart/SMART/Java/Python/ncList/FileSorter.py", line 78, in sort self._batchSort() File "/home/nermze/shed_tools/toolshed.g2.bx.psu.edu/repos/yufei-luo/s_mart/d96f6c9a39e0/s_mart/SMART/Java/Python/ncList/FileSorter.py", line 204, in _batchSort self._merge(self._chunks[chromosome], chromosome, outputHandle) File "/home/nermze/shed_tools/toolshed.g2.bx.psu.edu/repos/yufei-luo/s_mart/d96f6c9a39e0/s_mart/SMART/Java/Python/ncList/FileSorter.py", line 133, in _merge self._mergeParts(chunks, outputHandle) File "/home/nermze/shed_tools/toolshed.g2.bx.psu.edu/repos/yufei-luo/s_mart/d96f6c9a39e0/s_mart/SMART/Java/Python/ncList/FileSorter.py", line 156, in _mergeParts transcript = pickle.load(chunk) ValueError: I/O operation on closed file
Does anyone have any idea what's going on? Is it a problem with the data, or with Python, or something else?
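For what it's worth, the error itself means that pickle.load() is being called on a chunk file handle that is no longer open, so the merge step can't read a sorted chunk back. Here is a minimal sketch that reproduces the same kind of failure (this is a hypothetical illustration of the error pattern, not the actual S-MART FileSorter code):

import pickle
import tempfile

# Hypothetical reproduction: FileSorter._mergeParts calls pickle.load()
# on one of its sorted chunk files; if that handle has already been
# closed, the load fails with the same ValueError.
with tempfile.NamedTemporaryFile(suffix=".pkl", delete=False) as tmp:
    pickle.dump(("chr1", 100, 200), tmp)   # stand-in for one sorted chunk
    chunk_path = tmp.name

chunk = open(chunk_path, "rb")
chunk.close()                              # handle closed too early

try:
    transcript = pickle.load(chunk)        # same call as FileSorter.py line 156
except ValueError as err:
    # Python 2 (which this S-MART version runs on) reports
    # "I/O operation on closed file"; Python 3 wording differs slightly.
    print(err)

So the crash happens while the tool is reading back its own temporary chunk files, which makes me suspect the tool's file handling rather than the input data, but I'd appreciate confirmation from someone who knows S-MART.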