
Agent driver requires agent_url in driver_internal_info #132

Closed
yakiyoshi opened this issue Jan 20, 2020 · 4 comments

Comments

@yakiyoshi

I got the error messages below from the ironic container and couldn't deploy a BareMetalHost.

  • Error Messages ( $ kubectl logs ironic-79cdf49594-cd4qh -n metal3 -c ironic )
    2020-01-10 07:05:03.736 34 ERROR ironic.conductor.utils [req-1c1e975c-f9ee-4149-bfbf-212e898c53fc - - - - -] Node a2a01673-cefa-4c16-8f7e-7f4cfcd26d36 failed deploy step {u'priority': 100, u'interface': u'deploy', u'step': u'deploy', u'argsinfo': None}. Error: Agent driver requires agent_url in driver_internal_info: IronicException: Agent driver requires agent_url in driver_internal_info
    2020-01-10 07:05:03.922 34 ERROR ironic.conductor.task_manager [req-1c1e975c-f9ee-4149-bfbf-212e898c53fc - - - - -] Node a2a01673-cefa-4c16-8f7e-7f4cfcd26d36 moved to provision state "deploy failed" from state "deploying"; target provision state is "active": IronicException: Agent driver requires agent_url in driver_internal_info

  • BareMetalHost CR
    apiVersion: metal3.io/v1alpha1
    kind: BareMetalHost
    metadata:
      name: baremetalhost01
      namespace: metal3
    spec:
      bmc:
        address: ipmi://10.1.1.17:623
        credentialsName: baremetalhost01-secret
      image:
        checksum: http://10.1.19.14:8080/images/my-image02.qcow2.md5sum
        url: http://10.1.19.14:8080/images/my-image02.qcow2
      online: true

Does anyone have ideas for solving this issue?

@juliakreger
Member

Greetings @yakiyoshi! You can see some discussion on issue #130, which should shed light on what the issue is and how to remedy it. We have a patch in upstream ironic that is in review at this time. In essence, a security fix broke the fast-track feature, and the testing didn't catch it, because of the way it had been set up to use RPMs, until metal3 was already impacted.

Your fastest fix, until the upstream patch lands, is to turn off the fast-track configuration in ironic.conf in your ironic-image container.
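For reference, fast track is controlled by a single option in ironic.conf; a minimal sketch of the change (the section and option name are taken from the upstream ironic configuration reference; verify them against your ironic release):

```ini
# ironic.conf — disable the fast track feature until the upstream fix lands
[deploy]
fast_track = false
```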

@maelk
Member

maelk commented Jan 20, 2020

You can see an example in the baremetal-operator deployment. The workaround configmap is here: https://github.com/metal3-io/baremetal-operator/blob/master/deploy/ironic_bmo_configmap.env#L9
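For context, the referenced configmap is a plain KEY=value env file consumed by the ironic containers; based on this thread, the relevant entry would look like the following (illustrative excerpt, the file's exact contents may have changed since):

```ini
# ironic_bmo_configmap.env (excerpt) — workaround for the fast-track regression
IRONIC_FAST_TRACK=false
```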

@yakiyoshi
Author

@juliakreger @maelk
Thank you for the valuable suggestion.
Setting the IRONIC_FAST_TRACK=false environment variable in the ironic container works fine!

@rkamudhan

@yakiyoshi @maelk @juliakreger , I am getting the same error

2020-02-13 21:35:24.172 27 ERROR ironic.conductor.utils [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Node b20054f5-493d-494f-88c9-3af5038ad300 failed deploy step {'step': 'deploy', 'priority': 100, 'argsinfo': None, 'interface': 'deploy'}. Error: Agent driver requires agent_url in driver_internal_info: ironic_lib.exception.IronicException: Agent driver requires agent_url in driver_internal_info

in the latest ironic image, as follows:

2020-02-13 21:35:24.101 27 DEBUG ironic.common.states [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Exiting old state 'deploying' in response to event 'wait' on_exit /usr/lib/python3.6/site-packages/ironic/common/states.py:294
2020-02-13 21:35:24.102 27 DEBUG ironic.common.states [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Entering new state 'wait call-back' in response to event 'wait' on_enter /usr/lib/python3.6/site-packages/ironic/common/states.py:300
2020-02-13 21:35:24.134 27 INFO ironic.conductor.task_manager [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Node b20054f5-493d-494f-88c9-3af5038ad300 moved to provision state "wait call-back" from state "deploying"; target provision state is "active"
2020-02-13 21:35:24.136 27 DEBUG ironic.common.states [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Exiting old state 'wait call-back' in response to event 'resume' on_exit /usr/lib/python3.6/site-packages/ironic/common/states.py:294
2020-02-13 21:35:24.136 27 DEBUG ironic.common.states [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Entering new state 'deploying' in response to event 'resume' on_enter /usr/lib/python3.6/site-packages/ironic/common/states.py:300
2020-02-13 21:35:24.168 27 INFO ironic.conductor.task_manager [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Node b20054f5-493d-494f-88c9-3af5038ad300 moved to provision state "deploying" from state "wait call-back"; target provision state is "active"
2020-02-13 21:35:24.169 27 DEBUG ironic.drivers.modules.agent [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Continuing deploy for node b20054f5-493d-494f-88c9-3af5038ad300 with image http://172.22.0.1/images/bionic-server-cloudimg-amd64.img continue_deploy /usr/lib/python3.6/site-packages/ironic/drivers/modules/agent.py:206
2020-02-13 21:35:24.170 27 DEBUG ironic.drivers.modules.agent_client [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Preparing image bionic-server-cloudimg-amd64.img on node b20054f5-493d-494f-88c9-3af5038ad300. prepare_image /usr/lib/python3.6/site-packages/ironic/drivers/modules/agent_client.py:193
2020-02-13 21:35:24.172 27 ERROR ironic.conductor.utils [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Node b20054f5-493d-494f-88c9-3af5038ad300 failed deploy step {'step': 'deploy', 'priority': 100, 'argsinfo': None, 'interface': 'deploy'}. Error: Agent driver requires agent_url in driver_internal_info: ironic_lib.exception.IronicException: Agent driver requires agent_url in driver_internal_info
2020-02-13 21:35:24.209 27 DEBUG ironic.common.pxe_utils [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Cleaning up PXE config for node b20054f5-493d-494f-88c9-3af5038ad300 clean_up_pxe_config /usr/lib/python3.6/site-packages/ironic/common/pxe_utils.py:341
2020-02-13 21:35:24.210 27 DEBUG oslo_concurrency.lockutils [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Lock "master_image" acquired by "ironic.drivers.modules.image_cache.ImageCache.clean_up" :: waited 0.000s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:358
2020-02-13 21:35:24.211 27 DEBUG ironic.drivers.modules.image_cache [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Starting clean up for master image cache /shared/tftpboot clean_up /usr/lib/python3.6/site-packages/ironic/drivers/modules/image_cache.py:198
2020-02-13 21:35:24.211 27 DEBUG oslo_concurrency.lockutils [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Lock "master_image" released by "ironic.drivers.modules.image_cache.ImageCache.clean_up" :: held 0.002s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:370
2020-02-13 21:35:24.212 27 DEBUG ironic.common.pxe_utils [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Cleaning up PXE config for node b20054f5-493d-494f-88c9-3af5038ad300 clean_up_pxe_config /usr/lib/python3.6/site-packages/ironic/common/pxe_utils.py:341
2020-02-13 21:35:24.213 27 DEBUG oslo_concurrency.lockutils [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Lock "master_image" acquired by "ironic.drivers.modules.image_cache.ImageCache.clean_up" :: waited 0.000s inner /usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py:358
2020-02-13 21:35:24.214 27 DEBUG ironic.drivers.modules.image_cache [req-ff7c67e9-c321-4dec-b174-4a3e5bc3d340 - - - - -] Starting clean up for master image cache /shared/tftpboot clean_up /usr/lib/python3.6/site-packages/ironic/drivers/modules/image_cache.py:198

elfosardo pushed a commit to elfosardo/ironic-image that referenced this issue Jan 19, 2021
Bug 1916145: Explicitly set minimum versions of python libraries