Spike: metadata completeness #39
Investigate how the analytics measures are generated in DataHub, and whether there are any reliability issues.

Some of the GraphQL queries for analytics are here. These are for the charts, though; they are missing the info in the top tiles.
We want to know how the analytics in the top tiles on the DataHub Analytics page are generated. These tiles are populated by the `getHighlights` query:

```graphql
query getHighlights {
  getHighlights {
    value
    title
    body
    __typename
  }
}
```
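As a quick way to inspect what those tiles return, the same query can be issued directly against a DataHub instance's GraphQL endpoint. A minimal sketch; the endpoint path (`/api/graphql`), host/port, and bearer-token auth are assumptions about the deployment, not verified details:

```python
import json
import urllib.request

# The same query the Analytics page uses for the top tiles.
HIGHLIGHTS_QUERY = """
query getHighlights {
  getHighlights {
    value
    title
    body
    __typename
  }
}
"""


def build_request(base_url: str, token: str = "") -> urllib.request.Request:
    """Build a POST request for DataHub's GraphQL endpoint (path assumed)."""
    payload = json.dumps({"query": HIGHLIGHTS_QUERY}).encode()
    headers = {"Content-Type": "application/json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    return urllib.request.Request(
        f"{base_url}/api/graphql", data=payload, headers=headers
    )


# To actually send it (requires a running DataHub instance):
#   with urllib.request.urlopen(build_request("http://localhost:9002")) as resp:
#       print(json.load(resp)["data"]["getHighlights"])
```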
`/** TODO: Config Driven Charts Instead of Hardcoded. */`
There was interest in tracking frontend activity with Google Analytics. This is possible with DataHub as well, but currently only if we maintain a fork of DataHub.
The 'Datasets' count widget includes the following text:
The description number doesn't match the descriptions we're seeing on the platform, since far fewer than 100% of the entities have populated descriptions. The numerator for this percentage comes from here in the analytics derivation code. According to a response in the DataHub Slack:
If we want to view the count for the in-catalogue description, we need to inspect a different property: the
It may be that more complex logic is required to calculate the documentation numerator, as there doesn't appear to be an equivalent
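To sanity-check the percentage, the calculation can be reproduced offline against a dump of entities. This is a sketch only: the `description` key below is a placeholder, not DataHub's actual aspect/field name, which is exactly what the investigation above is trying to pin down:

```python
def description_completeness(entities: list) -> float:
    """Fraction of entities whose description field is non-empty.

    `description` is a placeholder key; the real DataHub property to
    inspect would come out of the investigation above.
    """
    if not entities:
        return 0.0
    documented = sum(
        1 for e in entities if (e.get("description") or "").strip()
    )
    return documented / len(entities)


# Example: 1 of 3 datasets has a populated description.
sample = [
    {"urn": "urn:li:dataset:a", "description": "Daily sales extract"},
    {"urn": "urn:li:dataset:b", "description": ""},
    {"urn": "urn:li:dataset:c"},
]
print(f"{description_completeness(sample):.0%}")  # 33%
```

Comparing this number against the widget's percentage would show whether the widget is counting a different property than the in-catalogue description.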
Investigation complete: the team will discuss the outcome and decide which tickets need to be created off the back of this.
Following on from #62
Create a report/dashboard which pulls from Postgres (the DataHub entity store) to display metadata completeness.
Could be:
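One possible shape for that report, sketched against the entity store. The table and column names (`metadata_aspect_v2`, `urn`, `aspect`, `version`, `metadata`) and the `datasetProperties` aspect are assumptions to verify against the actual Postgres schema before building anything on top of this:

```python
# Hypothetical completeness query over DataHub's entity store (Postgres).
# Assumes `metadata` holds the aspect as a JSON string and version = 0 is
# the latest aspect version -- both to be confirmed against the schema.
COMPLETENESS_SQL = """
SELECT
  COUNT(DISTINCT urn) AS total_datasets,
  COUNT(DISTINCT urn) FILTER (
    WHERE aspect = 'datasetProperties'
      AND NULLIF(TRIM(metadata::json ->> 'description'), '') IS NOT NULL
  ) AS documented_datasets
FROM metadata_aspect_v2
WHERE urn LIKE 'urn:li:dataset:%'
  AND version = 0;
"""


def completeness_pct(total: int, documented: int) -> float:
    """Turn the two counts from the query into a percentage for the report."""
    return 0.0 if total == 0 else 100.0 * documented / total
```

A dashboard tile would run the query on a schedule and display `completeness_pct` alongside the raw counts, so the number can be cross-checked against the Analytics page tiles.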