BaseRestartWorkChain: Factor out attachment of outputs (#5983)

When a work chain step returns an exit code, the work chain execution is
aborted. A common use of the process handlers of the `BaseRestartWorkChain`
is exactly this: to stop the work chain when a particular problem or
situation is detected.

The downside is that no further steps of the work chain implementation are
called, for example the `results` step, which would still attach any
(partial) results. An implementation could of course copy the body of the
`results` method into the handler to do so, but that code would have to be
duplicated in every handler that wants to attach the outputs.

Here, the actual attaching of the outputs is factored out of the
`results` method into the `_attach_outputs` method. This method can now
easily be called inside a process handler that wants to attach outputs
before returning an exit code to stop the work chain.
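
To illustrate the intended usage (this sketch is not part of the commit and the names are made up for the example), a handler in a hypothetical subclass could look roughly like the following, assuming the `ArithmeticAddCalculation` that ships with `aiida-core` and a custom `ERROR_FATAL_PROBLEM` exit code defined for the example:

    from aiida.common import AttributeDict
    from aiida.engine import BaseRestartWorkChain, ProcessHandlerReport, process_handler, while_
    from aiida.plugins import CalculationFactory

    ArithmeticAddCalculation = CalculationFactory('core.arithmetic.add')


    class ExampleBaseWorkChain(BaseRestartWorkChain):
        """Hypothetical subclass whose handler attaches (partial) outputs before aborting."""

        _process_class = ArithmeticAddCalculation

        @classmethod
        def define(cls, spec):
            super().define(spec)
            spec.expose_inputs(ArithmeticAddCalculation, namespace='add')
            spec.expose_outputs(ArithmeticAddCalculation)
            spec.outline(
                cls.setup,
                while_(cls.should_run_process)(
                    cls.run_process,
                    cls.inspect_process,
                ),
                cls.results,
            )
            spec.exit_code(510, 'ERROR_FATAL_PROBLEM', message='A fatal problem was detected; aborting.')

        def setup(self):
            """Prepare the inputs for the calculations launched by the work chain."""
            super().setup()
            self.ctx.inputs = AttributeDict(self.exposed_inputs(ArithmeticAddCalculation, 'add'))

        @process_handler(priority=500)
        def handle_fatal_problem(self, node):
            """Attach whatever outputs the calculation produced, then abort the work chain."""
            if node.is_failed:
                # The work chain stops here, so `results` never runs: attach the outputs now.
                self._attach_outputs(node)
                return ProcessHandlerReport(True, self.exit_codes.ERROR_FATAL_PROBLEM)
            return None

When `handle_fatal_problem` fires, the work chain stops with `ERROR_FATAL_PROBLEM`, but any outputs the last calculation did produce are still attached to the work chain node.
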
sphuber authored Sep 7, 2023
1 parent 4e0e7d8 commit d6093d1
Showing 2 changed files with 27 additions and 3 deletions.
aiida/engine/processes/workchains/restart.py (16 additions, 3 deletions)

@@ -15,6 +15,7 @@

 from aiida import orm
 from aiida.common import AttributeDict
+from aiida.common.links import LinkType
 from aiida.common.warnings import warn_deprecation

 from .context import ToContext, append_
@@ -320,8 +321,17 @@ def results(self) -> Optional['ExitCode']:
             return self.exit_codes.ERROR_MAXIMUM_ITERATIONS_EXCEEDED  # pylint: disable=no-member

         self.report(f'work chain completed after {self.ctx.iteration} iterations')
+        self._attach_outputs(node)
+        return None

+    def _attach_outputs(self, node) -> Mapping[str, orm.Node]:
+        """Attach the outputs of the given calculation job to the work chain.
+
+        :param node: The ``CalcJobNode`` whose outputs to attach.
+        :returns: The mapping of output nodes that were attached.
+        """
+        outputs = self.get_outputs(node)
+        existing_outputs = self.node.base.links.get_outgoing(link_type=LinkType.RETURN).all_link_labels()

         for name, port in self.spec().outputs.items():

@@ -330,13 +340,16 @@ def results(self) -> Optional['ExitCode']:
             except KeyError:
                 if port.required:
                     self.report(
-                        f'required output \'{name}\' was not an output of {self.ctx.process_name}<{node.pk}> '
+                        f'required output `{name}` was not an output of {self.ctx.process_name}<{node.pk}> '
                         f'(or an incorrect class/output is being exposed).'
                     )
             else:
-                self.out(name, output)
+                if name in existing_outputs:
+                    self.logger.info(f'output `{name}` was already attached, skipping.')
+                else:
+                    self.out(name, output)

-        return None
+        return outputs

     def __init__(self, *args, **kwargs) -> None:
         """Construct the instance."""

docs/source/howto/workchains_restart.rst (11 additions, 0 deletions)

@@ -272,6 +272,17 @@ It is also possible to update the contents of one of the outputs returned by the
 In this case, it is important to go through a ``calcfunction``, as always, as to not lose any provenance.


+Attaching outputs
+=================
+
+In a normal run, the ``results`` method is the last step in the outline of the ``BaseRestartWorkChain``.
+In this step, the outputs of the last completed calculation job are "attached" to the work chain itself.
+The attaching of the outputs is implemented by the :meth:`~aiida.engine.processes.workchains.restart.BaseRestartWorkChain._attach_outputs` method.
+If the outputs need to be attached at a point in the workflow other than the ``results`` step, this method can be called manually.
+An example would be to call it in a process handler that is about to abort the work chain.
+In that case the work chain is stopped immediately and the ``results`` step is never called.
+
+
 Error handling
 ==============

