check workflow config for 'dmrpp' key if collection doesn't have it #25
Conversation
This fixes the test added in the previous commit.

Before this change, multiple calls to `CumulusLogger(name)` would construct a `CumulusLogger` instance with the same underlying `Logger` instance, but each construction would also create and attach a new handler, resulting in duplicate messages, e.g.:

```
> python
>>> from cumulus_logger import CumulusLogger
>>> logger = CumulusLogger('test')
>>> logger.info('hello there')
{"message": "hello there", "timestamp": "2021-12-21T16:18:09.315309", "level": "info"}
>>>
>>> logger2 = CumulusLogger('test')
>>> logger2.info('hello there')
{"message": "hello there", "timestamp": "2021-12-21T16:18:16.845070", "level": "info"}
{"message": "hello there", "timestamp": "2021-12-21T16:18:16.845070", "level": "info"}
>>>
>>> logger.info('hello there')
{"message": "hello there", "timestamp": "2021-12-21T16:18:18.165564", "level": "info"}
{"message": "hello there", "timestamp": "2021-12-21T16:18:18.165564", "level": "info"}
```

See also the screenshots on ghrcdaac/dmrpp-file-generator-docker#25.

With this change, a new handler is only constructed if the underlying `Logger` does not already have any handlers, preventing messages from being handled more than once:

```
> python
>>> from cumulus_logger import CumulusLogger
>>> logger = CumulusLogger('test')
>>> logger.info('hello there')
{"message": "hello there", "timestamp": "2021-12-21T16:33:56.735453", "level": "info"}
>>>
>>> logger2 = CumulusLogger('test')
>>> logger2.info('hello there')
{"message": "hello there", "timestamp": "2021-12-21T16:34:03.887972", "level": "info"}
>>>
>>> logger.info('hello there')
{"message": "hello there", "timestamp": "2021-12-21T16:34:07.808415", "level": "info"}
```
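The guard described above can be sketched as follows. This is a minimal illustration of the pattern, not the actual `CumulusLogger` implementation; the class name `JsonLogger` and its internals are hypothetical.

```python
import logging


class JsonLogger:
    """Minimal sketch of the handler guard described above (illustrative
    only; not the actual CumulusLogger code)."""

    def __init__(self, name):
        # logging.getLogger(name) returns the same Logger object for the
        # same name, so repeated construction shares one underlying Logger.
        self.logger = logging.getLogger(name)
        self.logger.setLevel(logging.INFO)
        # Only attach a handler if the underlying Logger has none, so that
        # constructing the wrapper twice cannot duplicate log output.
        if not self.logger.handlers:
            self.logger.addHandler(logging.StreamHandler())

    def info(self, msg):
        self.logger.info(msg)
```

With this guard in place, constructing the wrapper any number of times for the same name leaves exactly one handler on the shared `Logger`.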
Can you please add an example of adding
@amarouane-ABDELHAK I've added example configs to the two READMEs in dbd7b9c and ghrcdaac/dmrpp-generator#16. Thanks!
@amarouane-ABDELHAK another question: if you don't think we'll be able to get this PR merged before PI Planning, would you be able to create a Jira ticket for this? NSIDC needs this functionality for NDCUM-683, and our scrum master wants to link our ticket to a GHRC ticket so the dependency is visible for planning. Thanks!
A welcome PR
It is merged. I will create a release including it soon.
--
Abdelhak Marouane
* test_logger setup: allow kwargs to be forwarded to CumulusLogger constructor
* add failing test, each logger should only have one handler
* CumulusLogger: only create new handler if one is not present

Co-authored-by: Michael Brandt <michaelbrandt5@gmail.com>
This allows putting the `dmrpp` config in the workflow, so the same config can be shared across many collections, while any collection that needs different settings can still provide its own configuration.
As a quick test, I added a bit more logging in nsidc@8fe1af6 to dump out the different possible `dmrpp` configs and show which one the code chooses. The CloudWatch screenshots show that the correct config is picked* (the third one logged in each of these screenshots).
*note: they also show some log messages being duplicated when I ran the next trial quickly; it seems that the underlying `Logger` instance from the `CumulusLogger` persists between runs, and when the `DMRPPGenerator` instance is created, the `CumulusLogger` creation grabs the same underlying `Logger` and adds another handler to it, resulting in duplicate messages. I'm surprised the `Logger` manages to hang around, but I suspect this duplication could be removed by adding a check in `CumulusLogger.__init__` to only add a handler if one isn't already present (edit to add: I have put in a PR to CMA to address this: nasa/cumulus-message-adapter-python#43)

Screenshots
- with no `dmrpp` key anywhere
- collection has `dmrpp` key
- workflow has `dmrpp` key
- both collection and workflow have a `dmrpp` key, collection config is used
- workflow has `dmrpp` key and value, collection has `dmrpp` key and value of `{}`

`"dmrpp": {}` in the collection overrides whatever `dmrpp` was in the workflow; note the difference between this case and the case where the collection simply doesn't have a `dmrpp` key, in which case the workflow config is used.
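The precedence rules listed above can be sketched as a small lookup function. This is an illustrative reconstruction of the behavior described in the screenshots, not the actual `DMRPPGenerator` code; the function and argument names are hypothetical.

```python
def resolve_dmrpp_config(collection_config, workflow_config):
    """Sketch of the lookup order described above (names are hypothetical).

    The collection config wins whenever it contains a 'dmrpp' key -- even an
    empty {} -- and the workflow config is only consulted when the key is
    entirely absent from the collection.
    """
    if 'dmrpp' in collection_config:
        return collection_config['dmrpp']
    return workflow_config.get('dmrpp', {})
```

For example, a collection with `"dmrpp": {}` yields the empty config even when the workflow defines one, whereas a collection with no `dmrpp` key at all falls back to the workflow's value.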