Improved workflow sets #625
Conversation
These workflows have improved logic and interfaces compared to the existing ones. I have also added ways to set up calculations using existing input sets, and calculations/workflows can now be dry-run locally for error checking and parameter tweaking (for parallelisation). These workflows are temporarily named with the prefix `v2` in the entrypoint names. The existing workflows can be renamed `v1` and deprecated later if needed.
I got lots of
running
That is strange. What about running
Great additions and extensions @zhubonan. Thanks for adding them. I will add a few specific comments that we can continue to discuss.
I think we should try to make these not dependent on
Yes, this is certainly a good option. I think we can also do remote dryruns now, as we can monitor a remote running calculation with the recent additions in aiida-core. I have not tried that, but maybe we should look into it, so that a remote code can be used here as well.
And a general comment: what is the take on the release with respect to these additions? We are most likely breaking a few workchain interfaces here, so it would maybe make sense to just add them now, as the necessary docs updates will not be done for at least a week or so. Your opinions on this would be great. I suspect most of us will not work much until next week, so we should also consider that.
It also shows up; the errors are pasted below.
I will look into this, although I think our priority is to get it working for VASP. It should be easy to generalise to other codes by refactoring the methods (updating the cutoff, the k-spacing, and getting energies/forces/stresses).
I see. I have not tried that either. The dryrun here is just to run locally before submitting the workchain (to get the number of kpoints, for example). It is also possible to add this into the workchain, e.g. to have some kind of auto-parallelisation, but I have not written a working logic for setting KPAR/NCORE, as it also depends on the hardware of the remote cluster.
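To make the discussion concrete, a minimal sketch of how a KPAR heuristic could look, given the k-point count obtained from a dryrun. The function name and the rule (largest KPAR that divides the MPI rank count without exceeding the number of k-points) are illustrative assumptions, not logic from this PR:

```python
def suggest_kpar(nkpts: int, nranks: int) -> int:
    """Pick the largest KPAR that divides the MPI rank count evenly
    and does not exceed the number of irreducible k-points.

    Heuristic sketch only: a real choice would also weigh NBANDS,
    memory per rank and the remote cluster's hardware.
    """
    best = 1
    for kpar in range(1, min(nkpts, nranks) + 1):
        if nranks % kpar == 0:
            best = kpar
    return best
```

For example, with 10 irreducible k-points on 48 ranks this picks `KPAR = 8` (the largest divisor of 48 not exceeding 10), leaving 6 ranks per k-point group.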
What I intended is to have these additions as "non-breaking" changes, at least for now. Feel free to have the new release ("3.x"?) before/after the PR is merged. What I have in mind is to have:
This PR needs to be updated to reflect changes in the aiida-user-addons code (should be minor) as well as those in the
I think it is good to have it merged soon. The missing things are:
Codecov Report

Attention: Patch coverage is

```
@@            Coverage Diff             @@
##           develop     #625       +/-  ##
============================================
- Coverage    65.44%   53.62%   -11.82%
============================================
  Files           58       74       +16
  Lines         7325     9903     +2578
============================================
+ Hits          4793     5309      +516
- Misses        2532     4594     +2062
```

Flags with carried forward coverage won't be shown.
LGTM.
Please check the applicable boxes, thank you:

I, the author, consider this PR
Interactions with issues / other PRs
supersedes:
#516
blocks:
is blocked by:
None of the above but is still related to the following:
Description
Formal PR for merging the workchain stack that I use into the main code base. I call the set `v2` for now in order to distinguish them from the existing ones.

A brief summary of the improvements:

### VaspWorkChain

- Added `ldau_mapping` to make setting up LDA+U calculations easier.
- Added a `kpoints` spacing input to make setting up the KPOINTS mesh easier, although the unit follows the CASTEP convention, i.e. 2π Å⁻¹ instead of the Å⁻¹ that VASP uses.

### VaspConvergeWorkChain
This workchain is simplified from the original one, and the calculations are all carried out in parallel.
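As a side note on the CASTEP-style spacing mentioned under `VaspWorkChain`: since that convention carries an extra factor of 2π, converting a spacing value to the equivalent VASP `KSPACING` (and estimating the resulting mesh for a simple orthogonal cell) can be sketched as below. The function names are illustrative and not part of the package:

```python
import math

def castep_to_vasp_kspacing(spacing: float) -> float:
    """Convert a CASTEP-convention spacing (units of 2*pi / Angstrom)
    to the equivalent VASP KSPACING value (units of 1 / Angstrom)."""
    return spacing * 2.0 * math.pi

def mesh_for_orthogonal_cell(spacing: float, lengths: tuple) -> tuple:
    """Subdivisions along each axis of an orthogonal cell for a
    CASTEP-convention spacing: N_i = ceil(1 / (spacing * a_i))."""
    return tuple(max(1, math.ceil(1.0 / (spacing * a))) for a in lengths)
```

For instance, a CASTEP-style spacing of 0.05 corresponds to a VASP `KSPACING` of about 0.314, and gives a 2×4×8 mesh for a 10×5×2.5 Å orthogonal cell.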
### VaspBandsWorkChain

This workchain includes an optional relaxation phase. Multiple path schemes are supported via sumo.
An alternative workchain `VaspHybridBandsWorkChain` is designed for performing band structure calculations using hybrid functionals (it can be used for standard GGA as well). This workchain uses the "zero-weighted" kpoints approach and splits the band path into multiple segments.

### VaspRelaxWorkChain

This workchain is similar to the original relaxation workchain, but the settings are supplied in a single `Dict` node under the `relax_settings` port (and validated before workchain submission). Additional checks are also performed for the final singlepoint calculation to see if the structure is indeed relaxed (VASP can sometimes end up in a different local minimum in this final singlepoint calculation, resulting in inconsistent energies).
It also allows the parameters of the final static calculation to be overridden, say if one wants to use a different INCAR tag such as `ISMEAR`.

### OptionContainer

This is the extendable class powering options specified as `Dict`. It allows keys to be pre-declared with defaults, and also provides automatic conversion and validation integrated with `spec.input` in the `define` part of the `WorkChain`.
### VaspInputSet

A class to load and apply existing input sets to a `ProcessBuilder`, simplifying the tedious work of mapping magnetic moments and passing LDA+U keys, `LMAXMIX`, etc.

### Dryrun
Allow dryrun calculations using a VASP executable installed on the local machine. This runs VASP for up to several seconds or until a certain line is printed in the outputs. The `OUTCAR` is then parsed to check the lengths of the different dimensions of the wavefunction, e.g. the number of kpoints, bands and wave vectors, which can be used to determine the optimum parallelisation settings.

### TODO

- Add `v1` entrypoints to the existing workchains, and issue deprecation warnings when applicable.

Comments are welcome @atztogo @espenfl @JPchico
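The dryrun's `OUTCAR` parsing step could be sketched roughly as follows. The sample text mimics the "Dimension of arrays" section of a real `OUTCAR`, but the exact fragment and the function name here are illustrative assumptions, not code from this PR:

```python
import re

# Hypothetical OUTCAR fragment; a real file contains a
# "Dimension of arrays:" section with fields like these.
SAMPLE = """
 Dimension of arrays:
   k-points           NKPTS =     10   k-points in BZ     NKDIM =     10   number of bands    NBANDS=     96
   number of dos      NEDOS =    301   number of ions     NIONS =      4
"""

def parse_outcar_dimensions(text: str) -> dict:
    """Pull selected array-dimension fields out of OUTCAR text.

    These sizes (number of k-points, bands, ions) are what would
    drive the KPAR/NCORE choice for the full run.
    """
    dims = {}
    for key in ("NKPTS", "NBANDS", "NIONS"):
        match = re.search(rf"{key}\s*=\s*(\d+)", text)
        if match:
            dims[key] = int(match.group(1))
    return dims
```

Running `parse_outcar_dimensions(SAMPLE)` on the fragment above yields `{"NKPTS": 10, "NBANDS": 96, "NIONS": 4}`.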