Std/Conf/Std+Conf jobs cannot be run twice as a lib #2165
Labels: bug (Something isn't working), Conformance (Conformance Job affected), priority: low (Nice to have), Standardization (Standardization Job affected)
Comments
dk1844 added the bug (Something isn't working) and priority: undecided (Undecided priority to be assigned after discussion) labels on Jan 18, 2023.
dk1844 added a commit that referenced this issue on Jan 18, 2023:
…the end of each Std/Conf/Std+Conf spark job - so that users are able to chain spark jobs. `prepareStandardization()` and `finishJob()` are now paired in this sense - unit test added
I am not sure it's a bug; it's rather an improvement (after all, it was never intended to run this way), but we can keep the designation... 😉
benedeki added the good first issue (Good for newcomers), Conformance (Conformance Job affected), Standardization (Standardization Job affected), and priority: low (Nice to have) labels and removed the priority: undecided (Undecided priority to be assigned after discussion) label on Jan 18, 2023.
dk1844 added a commit that referenced this issue on Jan 20, 2023.
dk1844 added a commit that referenced this issue on Jan 23, 2023:
* #2165 Atum's `spark.disableControlMeasuresTracking()` is now called at the end of each Std/Conf/Std+Conf spark job - so that users are able to chain spark jobs. `prepareStandardization()` and `finishJob()` are now paired in this sense - unit test added - review update: commons-TempDirectory used instead of nio-Files
Co-authored-by: David Benedeki <14905969+benedeki@users.noreply.github.com>
benedeki added a commit that referenced this issue on Jan 26, 2023:
Co-authored-by: Daniel Kavan <dk1844@gmail.com>
Describe the bug/problem
The current implementation of Enceladus Spark jobs uses Atum's initialization internally and does not explicitly disable Atum's control-measurement tracking, because it relies on the implicit disable routine that runs when the Spark session ends. However, if one of these Spark jobs is run from other code (library-like), Atum's disabling of the CM tracking is never called, and a "Control framework tracking is already initialized." exception is raised.
To Reproduce
Steps to reproduce the behavior OR commands run: call `StandardizationJob.main()` from other code more than once. The second call fails with "Control framework tracking is already initialized."
Expected behavior
Running Spark jobs multiple times from other code should work.
Additional context
If this way of using the jobs is to be supported, an explicit `spark.disableControlMeasuresTracking()` call for Atum must be made at the end of all Enceladus Spark jobs.
Temporary Workaround
Until this is fixed and released, when using Enceladus Spark jobs in this as-a-library fashion, one can explicitly call `spark.disableControlMeasuresTracking()` between individual jobs.
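The failure mode and the workaround can be illustrated with a minimal, self-contained sketch. Note this is a mock, not the real Atum or Enceladus API: `TrackingMock` and `runJob` are hypothetical stand-ins that only model the "initialize once per session" invariant described above.

```scala
// Mock of the tracking invariant (NOT the real Atum API): enabling tracking
// twice without disabling it in between raises the error from the issue.
object TrackingMock {
  private var initialized = false

  def enableControlMeasuresTracking(): Unit = {
    // Models Atum refusing to initialize tracking twice in one Spark session.
    require(!initialized, "Control framework tracking is already initialized.")
    initialized = true
  }

  def disableControlMeasuresTracking(): Unit = {
    initialized = false
  }
}

object ChainedJobsDemo {
  // Hypothetical stand-in for invoking e.g. StandardizationJob.main()
  // from other code, library-style.
  def runJob(): Unit = {
    TrackingMock.enableControlMeasuresTracking()
    // ... standardization/conformance work would happen here ...
  }

  def main(args: Array[String]): Unit = {
    runJob()
    // Workaround: disable tracking between chained runs (the fix does the
    // equivalent automatically at the end of each Std/Conf/Std+Conf job).
    TrackingMock.disableControlMeasuresTracking()
    runJob() // the second run now succeeds instead of throwing
  }
}
```

Without the `disableControlMeasuresTracking()` call between the two `runJob()` invocations, the second invocation would fail with the exception quoted above, which is exactly the library-use scenario this issue describes.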