
Metrics are not working for v3.9 #18941

Closed
mareklibra opened this issue Mar 12, 2018 · 5 comments

Comments

@mareklibra

Description

The `hawkular-metrics` service in the `openshift-infra` project is not running after `oc cluster up`.

Version
# oc version
oc v3.9.0-alpha.4+6d21b7d-539
kubernetes v1.9.1+a0ce1bc657
features: Basic-Auth GSSAPI Kerberos SPNEGO

Server https://127.0.0.1:8443
openshift v3.9.0-alpha.4+6d21b7d-539
kubernetes v1.9.1+a0ce1bc657
Steps To Reproduce
  1. oc cluster up --metrics=true --service-catalog=true --version=v3.9
...
The metrics service is available at:
    https://hawkular-metrics-openshift-infra.127.0.0.1.nip.io/hawkular/metrics
...
  2. Open the metrics service URL above in a browser.
Current Result

Following the metrics service URL above, the browser says:

Application is not available
...
Expected Result

The metrics service is up.

Additional Information
# oc get pods -n openshift-infra
NAME                                  READY     STATUS    RESTARTS   AGE
openshift-ansible-metrics-job-2c45d   0/1       Error     0          38m
openshift-ansible-metrics-job-45cps   0/1       Error     0          37m
openshift-ansible-metrics-job-4ls64   0/1       Error     0          38m
openshift-ansible-metrics-job-h65d8   0/1       Error     0          38m
openshift-ansible-metrics-job-w84q2   0/1       Error     0          40m
openshift-ansible-metrics-job-w8j8l   0/1       Error     0          36m
# oc logs po/openshift-ansible-metrics-job-w8j8l -n openshift-infra
Using /usr/share/ansible/openshift-ansible/ansible.cfg as config file
PLAY [Initialization Checkpoint Start] *****************************************
TASK [Set install initialization 'In Progress'] ********************************
Monday 12 March 2018  14:23:24 +0000 (0:00:00.102)       0:00:00.102 ********** 
ok: [127.0.0.1] => {"ansible_stats": {"aggregate": true, "data": {"installer_phase_initialize": {"start": "20180312142324Z", "status": "In Progress"}}, "per_host": false}, "changed": false}
PLAY [Populate config host groups] *********************************************
TASK [Load group name mapping variables] ***************************************
Monday 12 March 2018  14:23:24 +0000 (0:00:00.070)       0:00:00.173 ********** 
ok: [127.0.0.1] => {"ansible_facts": {"g_all_hosts": "{{ g_master_hosts | union(g_node_hosts) | union(g_etcd_hosts) | union(g_new_etcd_hosts) | union(g_lb_hosts) | union(g_nfs_hosts) | union(g_new_node_hosts)| union(g_new_master_hosts) | union(g_glusterfs_hosts) | union(g_glusterfs_registry_hosts) | default([]) }}", "g_etcd_hosts": "{{ groups.etcd | default([]) }}", "g_glusterfs_hosts": "{{ groups.glusterfs | default([]) }}", "g_glusterfs_registry_hosts": "{{ groups.glusterfs_registry | default(g_glusterfs_hosts) }}", "g_lb_hosts": "{{ groups.lb | default([]) }}", "g_master_hosts": "{{ groups.masters | default([]) }}", "g_new_etcd_hosts": "{{ groups.new_etcd | default([]) }}", "g_new_master_hosts": "{{ groups.new_masters | default([]) }}", "g_new_node_hosts": "{{ groups.new_nodes | default([]) }}", "g_nfs_hosts": "{{ groups.nfs | default([]) }}", "g_node_hosts": "{{ groups.nodes | default([]) }}"}, "ansible_included_var_files": ["/usr/share/ansible/openshift-ansible/playbooks/init/vars/cluster_hosts.yml"], "changed": false}
TASK [Evaluate groups - g_etcd_hosts or g_new_etcd_hosts required] *************
Monday 12 March 2018  14:23:24 +0000 (0:00:00.037)       0:00:00.210 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Evaluate groups - g_master_hosts or g_new_master_hosts required] *********
Monday 12 March 2018  14:23:24 +0000 (0:00:00.038)       0:00:00.249 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Evaluate groups - g_node_hosts or g_new_node_hosts required] *************
Monday 12 March 2018  14:23:24 +0000 (0:00:00.034)       0:00:00.283 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Evaluate groups - g_lb_hosts required] ***********************************
Monday 12 March 2018  14:23:24 +0000 (0:00:00.033)       0:00:00.317 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Evaluate groups - g_nfs_hosts required] **********************************
Monday 12 March 2018  14:23:24 +0000 (0:00:00.032)       0:00:00.349 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Evaluate groups - g_nfs_hosts is single host] ****************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.033)       0:00:00.383 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Evaluate groups - g_glusterfs_hosts required] ****************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.038)       0:00:00.422 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Evaluate groups - Fail if no etcd hosts group is defined] ****************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.032)       0:00:00.454 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Evaluate oo_all_hosts] ***************************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.032)       0:00:00.487 ********** 
ok: [127.0.0.1] => (item=127.0.0.1) => {"add_host": {"groups": ["oo_all_hosts"], "host_name": "127.0.0.1", "host_vars": {}}, "changed": false, "item": "127.0.0.1"}
TASK [Evaluate oo_masters] *****************************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.085)       0:00:00.573 ********** 
ok: [127.0.0.1] => (item=127.0.0.1) => {"add_host": {"groups": ["oo_masters"], "host_name": "127.0.0.1", "host_vars": {}}, "changed": false, "item": "127.0.0.1"}
TASK [Evaluate oo_first_master] ************************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.058)       0:00:00.631 ********** 
ok: [127.0.0.1] => {"add_host": {"groups": ["oo_first_master"], "host_name": "127.0.0.1", "host_vars": {}}, "changed": false}
TASK [Evaluate oo_new_etcd_to_config] ******************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.046)       0:00:00.678 ********** 
TASK [Evaluate oo_masters_to_config] *******************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.033)       0:00:00.711 ********** 
ok: [127.0.0.1] => (item=127.0.0.1) => {"add_host": {"groups": ["oo_masters_to_config"], "host_name": "127.0.0.1", "host_vars": {}}, "changed": false, "item": "127.0.0.1"}
TASK [Evaluate oo_etcd_to_config] **********************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.053)       0:00:00.765 ********** 
ok: [127.0.0.1] => (item=127.0.0.1) => {"add_host": {"groups": ["oo_etcd_to_config"], "host_name": "127.0.0.1", "host_vars": {}}, "changed": false, "item": "127.0.0.1"}
TASK [Evaluate oo_first_etcd] **************************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.054)       0:00:00.819 ********** 
ok: [127.0.0.1] => {"add_host": {"groups": ["oo_first_etcd"], "host_name": "127.0.0.1", "host_vars": {}}, "changed": false}
TASK [Evaluate oo_etcd_hosts_to_upgrade] ***************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.049)       0:00:00.869 ********** 
ok: [127.0.0.1] => (item=127.0.0.1) => {"add_host": {"groups": ["oo_etcd_hosts_to_upgrade"], "host_name": "127.0.0.1", "host_vars": {}}, "changed": false, "item": "127.0.0.1"}
TASK [Evaluate oo_etcd_hosts_to_backup] ****************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.049)       0:00:00.919 ********** 
ok: [127.0.0.1] => (item=127.0.0.1) => {"add_host": {"groups": ["oo_etcd_hosts_to_backup"], "host_name": "127.0.0.1", "host_vars": {}}, "changed": false, "item": "127.0.0.1"}
TASK [Evaluate oo_nodes_to_config] *********************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.049)       0:00:00.968 ********** 
ok: [127.0.0.1] => (item=127.0.0.1) => {"add_host": {"groups": ["oo_nodes_to_config"], "host_name": "127.0.0.1", "host_vars": {}}, "changed": false, "item": "127.0.0.1"}
TASK [Add master to oo_nodes_to_config] ****************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.070)       0:00:01.039 ********** 
skipping: [127.0.0.1] => (item=127.0.0.1)  => {"changed": false, "item": "127.0.0.1", "skip_reason": "Conditional result was False"}
TASK [Evaluate oo_lb_to_config] ************************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.045)       0:00:01.084 ********** 
TASK [Evaluate oo_nfs_to_config] ***********************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.069)       0:00:01.154 ********** 
TASK [Evaluate oo_glusterfs_to_config] *****************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.031)       0:00:01.185 ********** 
TASK [Evaluate oo_etcd_to_migrate] *********************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.034)       0:00:01.220 ********** 
ok: [127.0.0.1] => (item=127.0.0.1) => {"add_host": {"groups": ["oo_etcd_to_migrate"], "host_name": "127.0.0.1", "host_vars": {}}, "changed": false, "item": "127.0.0.1"}
 [WARNING]: Could not match supplied host pattern, ignoring: oo_lb_to_config
 [WARNING]: Could not match supplied host pattern, ignoring: oo_nfs_to_config
PLAY [Ensure that all non-node hosts are accessible] ***************************
TASK [Gathering Facts] *********************************************************
Monday 12 March 2018  14:23:25 +0000 (0:00:00.068)       0:00:01.289 ********** 
ok: [127.0.0.1]
PLAY [Initialize basic host facts] *********************************************
TASK [openshift_sanitize_inventory : include_tasks] ****************************
Monday 12 March 2018  14:23:27 +0000 (0:00:01.909)       0:00:03.199 ********** 
included: /usr/share/ansible/openshift-ansible/roles/openshift_sanitize_inventory/tasks/deprecations.yml for 127.0.0.1
TASK [openshift_sanitize_inventory : Check for usage of deprecated variables] ***
Monday 12 March 2018  14:23:27 +0000 (0:00:00.081)       0:00:03.280 ********** 
ok: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : debug] ************************************
Monday 12 March 2018  14:23:28 +0000 (0:00:00.175)       0:00:03.455 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : set_stats] ********************************
Monday 12 March 2018  14:23:28 +0000 (0:00:00.035)       0:00:03.490 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Assign deprecated variables to correct counterparts] ***
Monday 12 March 2018  14:23:28 +0000 (0:00:00.056)       0:00:03.547 ********** 
included: /usr/share/ansible/openshift-ansible/roles/openshift_sanitize_inventory/tasks/__deprecations_logging.yml for 127.0.0.1
included: /usr/share/ansible/openshift-ansible/roles/openshift_sanitize_inventory/tasks/__deprecations_metrics.yml for 127.0.0.1
TASK [openshift_sanitize_inventory : conditional_set_fact] *********************
Monday 12 March 2018  14:23:28 +0000 (0:00:00.120)       0:00:03.667 ********** 
ok: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : set_fact] *********************************
Monday 12 March 2018  14:23:28 +0000 (0:00:00.153)       0:00:03.821 ********** 
ok: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : conditional_set_fact] *********************
Monday 12 March 2018  14:23:28 +0000 (0:00:00.063)       0:00:03.885 ********** 
ok: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Standardize on latest variable names] *****
Monday 12 March 2018  14:23:28 +0000 (0:00:00.160)       0:00:04.045 ********** 
ok: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Normalize openshift_release] **************
Monday 12 March 2018  14:23:28 +0000 (0:00:00.047)       0:00:04.092 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Abort when openshift_release is invalid] ***
Monday 12 March 2018  14:23:28 +0000 (0:00:00.059)       0:00:04.152 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : include_tasks] ****************************
Monday 12 March 2018  14:23:28 +0000 (0:00:00.048)       0:00:04.201 ********** 
included: /usr/share/ansible/openshift-ansible/roles/openshift_sanitize_inventory/tasks/unsupported.yml for 127.0.0.1
TASK [openshift_sanitize_inventory : Ensure that openshift_use_dnsmasq is true] ***
Monday 12 March 2018  14:23:28 +0000 (0:00:00.087)       0:00:04.289 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Ensure that openshift_node_dnsmasq_install_network_manager_hook is true] ***
Monday 12 March 2018  14:23:28 +0000 (0:00:00.044)       0:00:04.334 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : set_fact] *********************************
Monday 12 March 2018  14:23:28 +0000 (0:00:00.038)       0:00:04.373 ********** 
TASK [openshift_sanitize_inventory : Ensure that dynamic provisioning is set if using dynamic storage] ***
Monday 12 March 2018  14:23:29 +0000 (0:00:00.115)       0:00:04.488 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Ensure the hosted registry's GlusterFS storage is configured correctly] ***
Monday 12 March 2018  14:23:29 +0000 (0:00:00.053)       0:00:04.542 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Ensure the hosted registry's GlusterFS storage is configured correctly] ***
Monday 12 March 2018  14:23:29 +0000 (0:00:00.051)       0:00:04.593 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Ensure clusterid is set along with the cloudprovider] ***
Monday 12 March 2018  14:23:29 +0000 (0:00:00.047)       0:00:04.640 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Ensure ansible_service_broker_remove and ansible_service_broker_install are mutually exclusive] ***
Monday 12 March 2018  14:23:29 +0000 (0:00:00.046)       0:00:04.686 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Ensure template_service_broker_remove and template_service_broker_install are mutually exclusive] ***
Monday 12 March 2018  14:23:29 +0000 (0:00:00.039)       0:00:04.726 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [openshift_sanitize_inventory : Ensure that all requires vsphere configuration variables are set] ***
Monday 12 March 2018  14:23:29 +0000 (0:00:00.039)       0:00:04.765 ********** 
skipping: [127.0.0.1] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"}
TASK [Detecting Operating System from ostree_booted] ***************************
Monday 12 March 2018  14:23:29 +0000 (0:00:00.037)       0:00:04.803 ********** 
ok: [127.0.0.1] => {"changed": false, "stat": {"exists": false}}
TASK [set openshift_deployment_type if unset] **********************************
Monday 12 March 2018  14:23:29 +0000 (0:00:00.411)       0:00:05.214 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [initialize_facts set fact openshift_is_atomic and openshift_is_containerized] ***
Monday 12 March 2018  14:23:29 +0000 (0:00:00.034)       0:00:05.249 ********** 
ok: [127.0.0.1] => {"ansible_facts": {"openshift_is_atomic": false, "openshift_is_containerized": false}, "changed": false}
TASK [Determine Atomic Host Docker Version] ************************************
Monday 12 March 2018  14:23:29 +0000 (0:00:00.063)       0:00:05.312 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [assert atomic host docker version is 1.12 or later] **********************
Monday 12 March 2018  14:23:29 +0000 (0:00:00.040)       0:00:05.353 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
PLAY [Initialize special first-master variables] *******************************
TASK [set_fact] ****************************************************************
Monday 12 March 2018  14:23:30 +0000 (0:00:00.055)       0:00:05.408 ********** 
ok: [127.0.0.1] => {"ansible_facts": {"first_master_client_binary": "oc", "openshift_client_binary": "oc"}, "changed": false}
PLAY [Disable web console if required] *****************************************
TASK [set_fact] ****************************************************************
Monday 12 March 2018  14:23:30 +0000 (0:00:00.061)       0:00:05.470 ********** 
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
PLAY [Install packages necessary for installer] ********************************
TASK [Ensure openshift-ansible installer package deps are installed] ***********
Monday 12 March 2018  14:23:30 +0000 (0:00:00.064)       0:00:05.534 ********** 
skipping: [127.0.0.1] => (item=iproute)  => {"changed": false, "item": "iproute", "skip_reason": "Conditional result was False"}
skipping: [127.0.0.1] => (item=dbus-python)  => {"changed": false, "item": "dbus-python", "skip_reason": "Conditional result was False"}
skipping: [127.0.0.1] => (item=PyYAML)  => {"changed": false, "item": "PyYAML", "skip_reason": "Conditional result was False"}
skipping: [127.0.0.1] => (item=python-ipaddress)  => {"changed": false, "item": "python-ipaddress", "skip_reason": "Conditional result was False"}
skipping: [127.0.0.1] => (item=yum-utils)  => {"changed": false, "item": "yum-utils", "skip_reason": "Conditional result was False"}
TASK [Ensure various deps for running system containers are installed] *********
Monday 12 March 2018  14:23:30 +0000 (0:00:00.110)       0:00:05.645 ********** 
skipping: [127.0.0.1] => (item=atomic)  => {"changed": false, "item": "atomic", "skip_reason": "Conditional result was False"}
skipping: [127.0.0.1] => (item=ostree)  => {"changed": false, "item": "ostree", "skip_reason": "Conditional result was False"}
skipping: [127.0.0.1] => (item=runc)  => {"changed": false, "item": "runc", "skip_reason": "Conditional result was False"}
PLAY [Initialize cluster facts] ************************************************
TASK [Gather Cluster facts] ****************************************************
Monday 12 March 2018  14:23:30 +0000 (0:00:00.075)       0:00:05.720 ********** 
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: KeyError: 'ansible_default_ipv4'
fatal: [127.0.0.1]: FAILED! => {"changed": false, "module_stderr": "Traceback (most recent call last):\n  File \"/tmp/ansible_ptRXuV/ansible_module_openshift_facts.py\", line 1692, in <module>\n    main()\n  File \"/tmp/ansible_ptRXuV/ansible_module_openshift_facts.py\", line 1679, in main\n    additive_facts_to_overwrite)\n  File \"/tmp/ansible_ptRXuV/ansible_module_openshift_facts.py\", line 1344, in __init__\n    additive_facts_to_overwrite)\n  File \"/tmp/ansible_ptRXuV/ansible_module_openshift_facts.py\", line 1373, in generate_facts\n    defaults = self.get_defaults(roles, deployment_type, deployment_subtype)\n  File \"/tmp/ansible_ptRXuV/ansible_module_openshift_facts.py\", line 1405, in get_defaults\n    ip_addr = self.system_facts['ansible_default_ipv4']['address']\nKeyError: 'ansible_default_ipv4'\n", "module_stdout": "", "msg": "MODULE FAILURE", "rc": 1}
PLAY RECAP *********************************************************************
127.0.0.1                  : ok=25   changed=0    unreachable=0    failed=1   
INSTALLER STATUS ***************************************************************
Initialization             : In Progress (0:00:08)
Monday 12 March 2018  14:23:32 +0000 (0:00:02.046)       0:00:07.767 ********** 
=============================================================================== 
Gather Cluster facts ---------------------------------------------------- 2.05s
Gathering Facts --------------------------------------------------------- 1.91s
Detecting Operating System from ostree_booted --------------------------- 0.41s
openshift_sanitize_inventory : Check for usage of deprecated variables --- 0.18s
openshift_sanitize_inventory : conditional_set_fact --------------------- 0.16s
openshift_sanitize_inventory : conditional_set_fact --------------------- 0.15s
openshift_sanitize_inventory : Assign deprecated variables to correct counterparts --- 0.12s
openshift_sanitize_inventory : set_fact --------------------------------- 0.12s
Ensure openshift-ansible installer package deps are installed ----------- 0.11s
openshift_sanitize_inventory : include_tasks ---------------------------- 0.09s
Evaluate oo_all_hosts --------------------------------------------------- 0.09s
openshift_sanitize_inventory : include_tasks ---------------------------- 0.08s
Ensure various deps for running system containers are installed --------- 0.08s
Evaluate oo_nodes_to_config --------------------------------------------- 0.07s
Set install initialization 'In Progress' -------------------------------- 0.07s
Evaluate oo_lb_to_config ------------------------------------------------ 0.07s
Evaluate oo_etcd_to_migrate --------------------------------------------- 0.07s
set_fact ---------------------------------------------------------------- 0.06s
openshift_sanitize_inventory : set_fact --------------------------------- 0.06s
initialize_facts set fact openshift_is_atomic and openshift_is_containerized --- 0.06s
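
The fatal task above shows the `openshift_facts` module crashing on `self.system_facts['ansible_default_ipv4']['address']` when fact gathering produced no default IPv4 fact for the host. A minimal sketch of that failure mode and of a guarded lookup; the `system_facts` dict and the `get_default_ip` helper here are hypothetical, for illustration only, not the module's actual API:

```python
# Sketch of the failing lookup seen in the traceback (names are illustrative).
# When Ansible fact gathering finds no default IPv4 interface, the
# 'ansible_default_ipv4' key is absent and a plain index raises KeyError.

system_facts = {}  # what fact gathering returned for this host: no default IPv4

def get_default_ip(facts):
    """Hypothetical guarded version: fall back to None instead of crashing."""
    return facts.get('ansible_default_ipv4', {}).get('address')

# The unguarded access fails the same way the module did:
try:
    ip_addr = system_facts['ansible_default_ipv4']['address']
except KeyError as err:
    ip_addr = None
    print(f"KeyError: {err}")  # prints: KeyError: 'ansible_default_ipv4'

assert get_default_ip(system_facts) is None
assert get_default_ip({'ansible_default_ipv4': {'address': '10.0.0.1'}}) == '10.0.0.1'
```

This suggests the job fails before any metrics components are deployed, which would explain why the `hawkular-metrics` pod never appears and the route returns "Application is not available".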
@mareklibra (Author)

Any update, please?

Can someone confirm that the steps above to enable metrics are correct? They appear to match the documentation.

@jwforres (Member)

@spadgett I think you opened a BZ related to this, right?

@spadgett (Member)

I don't know if there is a BZ. @jcantrill had opened this p0 issue in openshift-ansible:

openshift/openshift-ansible#7006

@spadgett (Member)

@jwforres (Member)

We'll track this in the BZ.
