[override] DF: events processing progress data. #359
Closed
Add filters:
- only production task jobs (`prodsourcelabel: managed`);
- only jobs with `nevents > 0`.
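A sketch of what this selection might look like as an Elasticsearch bool query; only the two filter conditions come from the change itself, the surrounding query body is illustrative:

```python
# Sketch of the job-selection filter as an Elasticsearch bool query.
# Only the two conditions are taken from the change description; the rest
# of the query structure is an assumption for illustration.
jobs_filter_query = {
    "query": {
        "bool": {
            "filter": [
                # production task jobs only
                {"term": {"prodsourcelabel": "managed"}},
                # only jobs that actually processed some events
                {"range": {"nevents": {"gt": 0}}}
            ]
        }
    }
}
```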
When we get a date field from the UC ES, it comes in this format by default. We can, of course, specify the format we want and/or convert what we get to the required format -- but instead we may simply support loading values in this format.
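One possible reading of "support loading values in this format" on the storage side is to let the date field accept several patterns at once; a minimal sketch, assuming Elasticsearch-style (Joda) patterns and a made-up field name:

```python
# Sketch only: the field name and the concrete patterns are assumptions.
# Elasticsearch lets a date field accept several formats joined with '||',
# so values may be loaded as they come instead of being converted first.
date_field_mapping = {
    "start_time": {
        "type": "date",
        "format": "dd-MM-yyyy HH:mm:ss||yyyy-MM-dd HH:mm:ss||epoch_millis"
    }
}
```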
To avoid extracting task IDs for every campaign step and then querying progress statistics by these IDs, parameters that allow detecting a task's affiliation with a campaign and its step are added to the progress info documents. In theory, a campaign can be told by a hashtag, and a step -- by a combination of the AMI tags chain and the output data format. For now, however, we use MC steps -- or the last (current) AMI tag with the format. So all three kinds of "steps" are added to the mapping.
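To illustrate the idea only (all field names except `hashtag_list` are hypothetical, not the actual mapping): a progress document carries its own campaign/step markers, so progress statistics can be aggregated directly, without first resolving task IDs for every campaign step.

```python
# Hypothetical example of a progress info document with campaign/step markers.
# Only 'hashtag_list' is known to exist in the tasks mapping; the other field
# names are placeholders for illustration.
progress_doc = {
    "taskid": 12345678,
    "nevents": 1000000,
    # campaign detection: hashtags assigned to the task
    "hashtag_list": ["MC16a"],
    # three flavours of the "step":
    "step": "simul",                            # MC step name
    "ctag_format_step": "s1234:HITS",           # last (current) AMI tag + output format
    "ami_tags_format_step": "e5678_s1234:HITS"  # full AMI tags chain + output format
}
```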
This stage takes data from stage 010 and extends the documents with information from the DKB ES. Currently it works with one message at a time, but the internal functionality supports processing multiple messages.
If an input message is somehow "incorrect" (e.g. has no "taskid" field) or no information was found in the DKB ES, the input message is passed through as-is -- for it is not up to this stage to filter anything out, right? However, it can mark the message as "incomplete", just in case.
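A minimal sketch of this behaviour, assuming a plain dict lookup in place of the real DKB ES query; the `_incomplete` marker name is an assumption, not taken from the actual code:

```python
def extend_messages(messages, task_metadata):
    """Extend input messages with task info from the DKB ES (sketch).

    'task_metadata' is a plain dict standing in for the real DKB ES lookup;
    the '_incomplete' marker name is an assumption.
    """
    result = []
    for msg in messages:
        extra = task_metadata.get(msg.get('taskid'))
        if extra:
            # information found: extend the message
            msg = dict(msg, **extra)
        else:
            # no 'taskid' or nothing found: pass the message through as-is
            # (filtering is not this stage's business), only mark it
            msg = dict(msg, _incomplete=True)
        result.append(msg)
    return result


# Usage: the second message has no 'taskid' and gets marked, not dropped.
msgs = [{'taskid': 123, 'nevents': 10}, {'nevents': 5}]
meta = {123: {'hashtag_list': ['MC16a']}}
print(extend_messages(msgs, meta))
```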
If the return value were a list of documents, merging it with the list of original messages in `process()` (in the case of multiple messages processing) would be painful.
The trick was copy-and-pasted from one of the 'data4es' dataflow stages; but for this dataflow the location of the library is different. So this is the first stage that assumes pyDKB is available at the system level (or, more generally, in a virtual environment).
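For reference, the pattern in question is the usual "try the installed package, fall back to the in-repo copy" import; a sketch (the fallback path is an assumption, and in this dataflow it simply does not exist, so only the first branch effectively works):

```python
import os
import sys

try:
    # preferred: pyDKB installed system-wide or in the active virtualenv
    import pyDKB
except ImportError:
    # fallback copied from a data4es stage; the relative path below is an
    # assumption and points nowhere in this dataflow, hence the requirement
    # that pyDKB be properly installed
    base_dir = os.path.dirname(os.path.abspath(__file__))
    sys.path.append(os.path.join(base_dir, '..', '..', 'lib'))
    import pyDKB
```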
In the mapping for tasks metadata we already have "hashtag_list", and supposedly these fields are to be used in similar queries, so naming them differently would only complicate the request logic. Although "hashtag_list" does not look very good, let's keep things consistent: should we change the naming, we'll change it in both indices simultaneously.
'YYYY' and 'yyyy' are pretty much the same, but 'DD' and 'dd' are very much different: the first is the day of year, and the second is the day of month.
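To make the difference concrete (Python stand-ins for the Joda tokens):

```python
from datetime import date

d = date(2020, 12, 25)
print(d.day)                  # 25  -- what 'dd' (day of month) would give
print(d.timetuple().tm_yday)  # 360 -- what 'DD' (day of year) would give
```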
mgolosova changed the title from "[WIP] DF: events processing progress data." to "[override] DF: events processing progress data." on Jun 18, 2020.
Overridden by multiple PRs related to the last item in the ToDo list: #365, #366, #368, #369, #371 (#380), #374 (#381). Some of these PRs will require: #176 (done).

Get events processing progress data to the DKB storage.

ToDo:
- data4es (…);
- progress4es (…);

Trello: https://trello.com/c/5BgrJPal