[override] DF: events processing progress data. #359

Closed
wants to merge 14 commits

Conversation

mgolosova
Collaborator

@mgolosova mgolosova commented Jun 2, 2020

Overridden by multiple PRs related to the last item in the ToDo list:
#365, #366, #368, #369, #371 (#380), #374 (#381)
Some of these PRs will require: #176 (done)

Get events processing progress data to the DKB storage.

ToDo:


Trello: https://trello.com/c/5BgrJPal

Add filters:
 - only production task jobs (`prodsourcelabel: managed`);
 - only jobs with `nevents > 0`.
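As a sketch, the two filters above could be expressed as an Elasticsearch bool query (written here as a Python dict; the actual stage code may express this differently):

```python
# Hypothetical sketch of the filters described above, as an ES bool query.
# Field names follow the description; clause structure is illustrative.
query = {
    "bool": {
        "filter": [
            # only production task jobs
            {"term": {"prodsourcelabel": "managed"}},
            # only jobs that actually processed events
            {"range": {"nevents": {"gt": 0}}},
        ]
    }
}
```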
When we get a date field from the UC ES, it comes in this format by
default. We can, of course, specify the format we want and/or convert
what we get to the required format -- but instead we may simply support
loading values in this format.
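Supporting an extra input format can be sketched as a date-field mapping fragment that declares several accepted formats, so values can be indexed as received. The field name and the format list here are illustrative assumptions, not taken from the actual mapping:

```python
# Hypothetical ES mapping fragment for a date field that accepts more
# than one input format ('||'-separated, per ES mapping syntax), so
# incoming values need no conversion before loading.
date_field_mapping = {
    "type": "date",
    "format": "yyyy-MM-dd HH:mm:ss||epoch_millis",
}
```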
To avoid extracting task IDs for every campaign step and then querying
progress statistics by these IDs, parameters that allow detecting a
task's affiliation with a campaign and its step are added to the
progress info documents.

In theory, a campaign can be identified by a hashtag, and a step -- by a
combination of the AMI tags chain and the output data format.

For now, however, we use MC steps -- or the last (current) AMI tag
together with the output format. So all three kinds of "steps" are added
to the mapping.
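An illustrative progress document carrying all three kinds of "step" markers might look like this (field names and values are assumptions for the sketch, not taken from the actual mapping):

```python
# Hypothetical progress info document: campaign/step markers let tasks
# be matched to a campaign step without listing task IDs explicitly.
progress_doc = {
    "taskid": 12345678,              # illustrative task ID
    "hashtag_list": ["MC16a"],       # campaign identified by hashtag
    "step_name": "simul",            # MC step
    "ctag": "s1234",                 # last (current) AMI tag
    "output_formats": ["HITS"],      # output data format
}
```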
This stage takes data from stage 010 and extends the documents with
information from the DKB ES.

Currently it works with one message at a time, but the internal
functionality supports processing multiple messages.
If an input message is somehow "incorrect" (e.g. has no "taskid" field)
or no information was found in the DKB ES, the input message will be
passed through as-is -- for it is not up to this stage to filter
anything out, right? However, it can mark a message as "incomplete",
just in case.
If the return value were a list of documents, merging it with the list
of original messages in `process()` (when processing multiple messages)
would be painful.
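The idea can be sketched as follows: keying the enrichment results by task ID makes merging them back into the original message list trivial, while a plain list would force fragile positional matching. Function and field names here are illustrative, not the actual stage code:

```python
def merge(messages, enriched_by_taskid):
    """Merge enrichment data (a dict keyed by taskid) into messages.

    Messages with no enrichment found are passed through as-is, only
    marked as incomplete -- filtering is not this stage's job.
    """
    result = []
    for msg in messages:
        extra = enriched_by_taskid.get(msg.get("taskid"))
        if extra:
            msg = dict(msg, **extra)
        else:
            msg = dict(msg, _incomplete=True)
        result.append(msg)
    return result
```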
The trick was copy-and-pasted from some 'data4es' dataflow stage; but
for this dataflow the location of the library is different. So this is
the first stage that assumes pyDKB is available at the system level
(or, generally, in a virtual environment).
In the mapping for task metadata we already have "hashtag_list", and
supposedly these fields are to be used in similar queries -- so naming
them differently would only complicate the request logic.

Although "hashtag_list" does not look very good, let's keep things
consistent. Should we change the naming, we'll change it in both
indices simultaneously.
'YYYY' and 'yyyy' are pretty much the same, but 'DD' and 'dd' are very
much different: the first is day of year, and the second -- day of
month.
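The difference can be demonstrated with Python's `strftime`, where `%j` (day of year) and `%d` (day of month) correspond to the Joda-style 'DD' and 'dd' patterns:

```python
from datetime import datetime

d = datetime(2020, 2, 3)
# Joda-style 'DD' (day of year) vs 'dd' (day of month);
# Python's strftime equivalents are %j and %d respectively.
print(d.strftime("%j"))  # 034 -- 34th day of the year
print(d.strftime("%d"))  # 03  -- 3rd day of the month
```

Confusing the two silently produces valid-looking but wrong dates, which is why the patterns are worth double-checking in the mapping.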
@mgolosova mgolosova self-assigned this Jun 2, 2020
@mgolosova mgolosova marked this pull request as draft June 2, 2020 11:57
@mgolosova mgolosova changed the title [WIP] DF: events processing progress data. [override] DF: events processing progress data. Jun 18, 2020
@mgolosova mgolosova closed this Aug 5, 2020