Dataflow cost metrics #3
Comments
The table is updated regularly for running jobs (you cannot calculate the cost once; you have to subscribe to the changes / update it regularly).
The JSON is obsolete; you have to use this API instead: https://cloud.google.com/billing/v1/how-tos/catalog-api
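A minimal sketch of how the Catalog API could be queried to locate the Dataflow billing service (Python with `requests` and the `API_KEY` placeholder are assumptions on my part, not something decided in this issue):

```python
# Sketch: resolve the Dataflow service via the Cloud Billing Catalog API.
# Assumes an API key is available; error handling is omitted.
import requests

BILLING_API = "https://cloudbilling.googleapis.com/v1"
API_KEY = "..."  # placeholder, see the development key discussed below

def find_dataflow_service():
    """Return the resource name (e.g. 'services/XXXX-XXXX-XXXX') of the Dataflow service."""
    page_token = ""
    while True:
        resp = requests.get(
            f"{BILLING_API}/services",
            params={"key": API_KEY, "pageToken": page_token},
        ).json()
        for service in resp.get("services", []):
            if service.get("displayName") == "Dataflow":
                return service["name"]
        page_token = resp.get("nextPageToken", "")
        if not page_token:
            return None
```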
Usage:
Relevant entries:
- Each region might have different prices; the API returns the regions in the "serviceRegions" field. The Dataflow region is in the table below the "Resource metrics" table.
- For "Current" fields the value is {price} * {current value} / {time period}; for "Total" fields it is {price} * {total value} (see the sketch after this list).
- For development, you can use the API key 'AIzaSyBZVfVwDKpduSuNOJlvWildIeQ5AsNtnWM' (not sure if we should use this for prod, or get it from the user via config).
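To make the formulas above concrete, a small sketch (the parameter names are illustrative; the price is assumed to already be a per-unit number in the chosen currency):

```python
# Sketch of the cost formulas quoted above; names are illustrative placeholders.

def current_cost(price: float, current_value: float, time_period: float) -> float:
    """Cost shown next to a "Current ..." metric: {price} * {current value} / {time period}."""
    return price * current_value / time_period


def total_cost(price: float, total_value: float) -> float:
    """Cost shown next to a "Total ..." metric: {price} * {total value}."""
    return price * total_value


# Example with made-up numbers: 12.5 total vCPU hours at a price of 0.07 per hour.
# total_cost(0.07, 12.5) -> 0.875
```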
✅ Currency code can be specified: https://cloud.google.com/billing/reference/rest/v1/services.skus/list
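For illustration, the `currencyCode` query parameter could be passed on the skus.list call like this (again Python/`requests`, the placeholders, and the `EUR` default are just assumptions):

```python
# Sketch: list all SKUs of a billing service, priced in the requested currency.
import requests

BILLING_API = "https://cloudbilling.googleapis.com/v1"
API_KEY = "..."  # placeholder

def list_skus(service_name: str, currency_code: str = "EUR"):
    """Yield all SKUs of `service_name` (e.g. 'services/XXXX-XXXX-XXXX')."""
    page_token = ""
    while True:
        resp = requests.get(
            f"{BILLING_API}/{service_name}/skus",
            params={"key": API_KEY, "currencyCode": currency_code, "pageToken": page_token},
        ).json()
        yield from resp.get("skus", [])
        page_token = resp.get("nextPageToken", "")
        if not page_token:
            return
```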
The cost metrics should be displayed in the user's preferred currency format.
I think project-specific configuration is over-engineering. I believe a global, GCPimp-level configuration is more than enough.
Description of the SKU API response: https://cloud.google.com/billing/reference/rest/v1/services.skus/list#PricingExpression
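A sketch of turning a SKU's PricingExpression into a single unit price (it assumes the first `pricingInfo` entry and the first tiered rate are enough, which may not hold for tiered SKUs):

```python
def unit_price(sku: dict) -> tuple[float, str, str]:
    """Return (price per usage unit, currency code, usage unit) for one SKU.

    pricingExpression.tieredRates[].unitPrice is a Money object split into
    whole 'units' (a string in JSON) and 'nanos' (1e-9 fractions).
    """
    expr = sku["pricingInfo"][0]["pricingExpression"]
    rate = expr["tieredRates"][0]["unitPrice"]
    price = int(rate.get("units", 0)) + rate.get("nanos", 0) / 1e9
    return price, rate["currencyCode"], expr["usageUnit"]
```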
On the Dataflow job detail page (https://console.cloud.google.com/dataflow?project=projectId -> go to any job), in the right-hand panel, under "Resource metrics", add two new rows:
Note that the page looks different for batch and streaming jobs (and possibly for SDK 1/2 jobs as well).