Fix rerunning a tool via API #7333
Conversation
Do we know when this was introduced? Does this happen to all tools? Should we add a test case if we don't have one?
@mvdbeek I think this API call was not used except from Trackster, and it may still work when used from Trackster. I was surprised that we don't have a tool rerun method in BioBlend and started to look at the Galaxy API. This is all we have at the moment, which I think is missing some of the features available through the UI, but it worked for the simplest case with this small modification to BioBlend:

```diff
diff --git a/bioblend/galaxy/tools/__init__.py b/bioblend/galaxy/tools/__init__.py
index 4915593..2ccf22f 100644
--- a/bioblend/galaxy/tools/__init__.py
+++ b/bioblend/galaxy/tools/__init__.py
@@ -96,7 +96,7 @@ class ToolClient(Client):
             params['link_details'] = link_details
         return self._get(id=tool_id, params=params)
 
-    def run_tool(self, history_id, tool_id, tool_inputs):
+    def run_tool(self, history_id, tool_id, tool_inputs, **kwargs):
         """
         Runs tool specified by ``tool_id`` in history indicated
         by ``history_id`` with inputs from ``dict`` ``tool_inputs``.
@@ -119,6 +119,14 @@ class ToolClient(Client):
         payload = {}
         payload["history_id"] = history_id
         payload["tool_id"] = tool_id
+        action = kwargs.get('action')
+        if action:
+            payload['action'] = action
+            if action == 'rerun':
+                target_dataset_id = kwargs.get('target_dataset_id')
+                if not target_dataset_id:
+                    raise ValueError("target_dataset_id must be specified when action='rerun'")
+                payload['target_dataset_id'] = target_dataset_id
         try:
             payload["inputs"] = tool_inputs.to_dict()
         except AttributeError:
```

So, to answer your questions:

- I think this never worked outside of Trackster.
- Yes, I can try to add one.
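To make the payload logic of the proposed BioBlend change concrete, here is a standalone sketch that re-implements just the payload construction (for illustration only; `build_rerun_payload` is a hypothetical helper, not part of BioBlend, and the IDs used below are placeholders):

```python
def build_rerun_payload(history_id, tool_id, tool_inputs, **kwargs):
    """Build the POST /api/tools payload as the patched run_tool would."""
    payload = {"history_id": history_id, "tool_id": tool_id}
    action = kwargs.get("action")
    if action:
        payload["action"] = action
        if action == "rerun":
            target_dataset_id = kwargs.get("target_dataset_id")
            if not target_dataset_id:
                raise ValueError("target_dataset_id must be specified when action='rerun'")
            payload["target_dataset_id"] = target_dataset_id
    # The real method also serializes tool_inputs via .to_dict() when available.
    payload["inputs"] = tool_inputs
    return payload

# Example: request a rerun targeting a (placeholder) dataset ID.
payload = build_rerun_payload("hist1", "cat1", {}, action="rerun", target_dataset_id="ds42")
```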
When sending an API request `POST /api/tools` with payload:

```
{
    'action': 'rerun',
    'history_id': history_id,
    'target_dataset_id': target_dataset_id,
    'tool_id': tool_id,
}
```

the following exception was raised:

```
Traceback (most recent call last):
  File "lib/galaxy/web/framework/decorators.py", line 283, in decorator
    rval = func(self, trans, *args, **kwargs)
  File "lib/galaxy/webapps/galaxy/api/tools.py", line 456, in create
    return self._create(trans, payload, **kwd)
  File "lib/galaxy/webapps/galaxy/api/tools.py", line 462, in _create
    return self._rerun_tool(trans, payload, **kwd)
  File "lib/galaxy/webapps/galaxy/api/tools.py", line 710, in _rerun_tool
    for action in tool.trackster_conf.actions:
AttributeError: 'NoneType' object has no attribute 'actions'
```

and after fixing this, another exception was raised:

```
Traceback (most recent call last):
  File "lib/galaxy/web/framework/decorators.py", line 283, in decorator
    rval = func(self, trans, *args, **kwargs)
  File "lib/galaxy/webapps/galaxy/api/tools.py", line 456, in create
    return self._create(trans, payload, **kwd)
  File "lib/galaxy/webapps/galaxy/api/tools.py", line 462, in _create
    return self._rerun_tool(trans, payload, **kwd)
  File "lib/galaxy/webapps/galaxy/api/tools.py", line 822, in _rerun_tool
    dataset_dict['track_config'] = self.get_new_track_config(trans, output_dataset)
  File "lib/galaxy/web/base/controller.py", line 1017, in get_new_track_config
    "filters": {'filters' : track_data_provider.get_filters()},
AttributeError: 'NoneType' object has no attribute 'get_filters'
```

Also add a basic `get_filters()` method to the `BaseDataProvider` class.
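A "basic `get_filters()` method" that resolves the second `AttributeError` could look roughly like this minimal sketch (an assumption about the intended fix; the class below is a stand-in, not Galaxy's actual `BaseDataProvider` from the data providers module):

```python
class BaseDataProvider:
    """Illustrative stand-in for Galaxy's data provider base class."""

    def __init__(self, original_dataset=None):
        self.original_dataset = original_dataset

    def get_filters(self):
        # Default: no filters. Subclasses can override this to expose
        # column-based filters for Trackster track configuration, so
        # callers like get_new_track_config() get a list instead of
        # hitting a missing attribute.
        return []

provider = BaseDataProvider()
```

With a default like this, `get_new_track_config()` can call `get_filters()` on any provider without special-casing ones that define no filters.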
yeah, the UI first fetches the parameters by
🥇
Remove check that was blocking re-running of datasets in non-`ok` state.
Force-pushed from dfb4bd5 to 174bfbc.
Not sure what the right way to go is here; maybe @jmchilton or @guerler have a better-informed opinion.
Added.
@jgoecks gave me permission to remove
@jmchilton That's fine by me; the present API is in any case missing some features I'd like to see, namely the option of replacing the failed datasets and collections in the history and restarting jobs that were waiting on them. Also, it returns a single dataset instead of the re-run job info.
@nsoranzo @jmchilton should we get something similar in before we have the new API?
I think there was a consensus to just have a new, better API endpoint; both @jmchilton and @mvdbeek offered to work on that 😉
Oops, this bit me again.
This should be possible with #9802