
pylxd broken by dependency requests 2.32.0 #579

Closed
NucciTheBoss opened this issue May 22, 2024 · 3 comments · Fixed by #580
@NucciTheBoss

It looks like the latest minor version release of the requests package breaks pylxd if you are communicating with the LXD API via the unix socket on the host. Here's the link to the issue: psf/requests#6707

requests now checks the scheme of the URL used to communicate with an endpoint; it does not recognize the http+unix scheme used to talk to the LXD unix socket, so it throws an error when we attempt to contact LXD. Here's the error the HPC team is currently seeing in our CI when we attempt to pre-configure LXD before deploying Juju applications:

Run tox -e integration
integration: install_deps> python -I -m pip install -r /home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/dev-requirements.txt -r /home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/requirements.txt
integration: freeze> python -m pip freeze --all
integration: asttokens==2.4.1,bcrypt==4.1.3,cachetools==5.3.3,certifi==2024.2.2,cffi==1.16.0,charset-normalizer==3.3.2,cryptography==42.0.7,decorator==5.1.1,exceptiongroup==1.2.1,executing==2.0.1,google-auth==2.29.0,hvac==2.2.0,idna==3.7,iniconfig==2.0.0,ipdb==0.13.13,ipython==8.24.0,jedi==0.19.1,Jinja2==3.1.4,juju==3.4.0.0,kubernetes==29.0.0,macaroonbakery==1.3.4,MarkupSafe==2.1.5,matplotlib-inline==0.1.7,mypy-extensions==1.0.0,oauthlib==3.2.2,ops==2.13.0,packaging==24.0,paramiko==3.4.0,parso==0.8.4,pexpect==4.9.0,pip==24.0,pluggy==1.5.0,prompt-toolkit==3.0.43,protobuf==5.26.1,ptyprocess==0.7.0,pure-eval==0.2.2,pyasn1==0.6.0,pyasn1_modules==0.4.0,pycparser==2.22,Pygments==2.18.0,pylxd==2.3.3,pymacaroons==0.13.0,PyNaCl==1.5.0,pyRFC3339==1.1,pytest==7.4.4,pytest-asyncio==0.21.2,pytest-operator==0.35.0,pytest-order==1.2.1,python-dateutil==2.9.0.post0,pytz==2024.1,PyYAML==6.0.1,requests==2.32.2,requests-oauthlib==2.0.0,requests-toolbelt==1.0.0,requests-unixsocket==0.3.0,rsa==4.9,setuptools==69.5.1,six==1.16.0,st
integration: commands[0]> pytest -v -s --tb native --log-cli-level=INFO /home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/tests/integration
============================= test session starts ==============================
platform linux -- Python 3.10.12, pytest-7.4.4, pluggy-1.5.0 -- /home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/bin/python
cachedir: .tox/integration/.pytest_cache
rootdir: /home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator
configfile: pyproject.toml
plugins: operator-0.35.0, asyncio-0.21.2, order-1.2.1
asyncio: mode=strict
collecting ... collected 4 items
tests/integration/test_charm.py::test_build_and_deploy[ubuntu@22.04] 
-------------------------------- live log setup --------------------------------
INFO     pytest_operator.plugin:plugin.py:702 Adding model github-pr-a55e2-lxd:test-charm-4ejs on cloud localhost
FAILED
tests/integration/test_charm.py::test_integrate XFAIL (aborted)
tests/integration/test_charm.py::test_share_active XFAIL (aborted)
tests/integration/test_charm.py::test_reintegrate XFAIL (aborted)
------------------------------ live log teardown -------------------------------
INFO     pytest_operator.plugin:plugin.py:862 Model status:
Model            Controller           Cloud/Region         Version  SLA          Timestamp
test-charm-4ejs  github-pr-a55e2-lxd  localhost/localhost  3.1.8    unsupported  14:02:47Z
INFO     pytest_operator.plugin:plugin.py:868 Juju error logs:
INFO     pytest_operator.plugin:plugin.py:882 juju-crashdump finished [0]
INFO     pytest_operator.plugin:plugin.py:971 Resetting model test-charm-4ejs...
INFO     pytest_operator.plugin:plugin.py:976 Not waiting on reset to complete.
INFO     pytest_operator.plugin:plugin.py:947 Forgetting model main...
=================================== FAILURES ===================================
_____________________ test_build_and_deploy[ubuntu@22.04] ______________________
Traceback (most recent call last):
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/requests/adapters.py", line 555, in send
    conn = self.get_connection_with_tls_context(
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/requests/adapters.py", line 411, in get_connection_with_tls_context
    conn = self.poolmanager.connection_from_host(
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/urllib3/poolmanager.py", line 246, in connection_from_host
    return self.connection_from_context(request_context)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/urllib3/poolmanager.py", line 258, in connection_from_context
    raise URLSchemeUnknown(scheme)
urllib3.exceptions.URLSchemeUnknown: Not supported URL scheme http+unix
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pylxd/client.py", line 410, in __init__
    response = self.api.get()
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pylxd/client.py", line 206, in get
    response = self.session.get(self._api_endpoint, *args, **kwargs)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/requests/adapters.py", line 559, in send
    raise InvalidURL(e, request=request)
requests.exceptions.InvalidURL: Not supported URL scheme http+unix
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/_pytest/runner.py", line 341, in from_call
    result: Optional[TResult] = func()
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/_pytest/runner.py", line 262, in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pluggy/_hooks.py", line 513, in __call__
    return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pluggy/_manager.py", line 120, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pluggy/_callers.py", line 182, in _multicall
    return outcome.get_result()
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pluggy/_result.py", line 100, in get_result
    raise exc.with_traceback(exc.__traceback__)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pluggy/_callers.py", line 103, in _multicall
    res = hook_impl.function(*args)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/_pytest/runner.py", line 177, in pytest_runtest_call
    raise e
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/_pytest/runner.py", line 169, in pytest_runtest_call
    item.runtest()
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/_pytest/python.py", line 1792, in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pluggy/_hooks.py", line 513, in __call__
    return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pluggy/_manager.py", line 120, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pluggy/_callers.py", line 182, in _multicall
    return outcome.get_result()
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pluggy/_result.py", line 100, in get_result
    raise exc.with_traceback(exc.__traceback__)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pluggy/_callers.py", line 103, in _multicall
    res = hook_impl.function(*args)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/_pytest/python.py", line 194, in pytest_pyfunc_call
    result = testfunction(**testargs)
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pytest_asyncio/plugin.py", line 529, in inner
    _loop.run_until_complete(task)
  File "/usr/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/tests/integration/test_charm.py", line 30, in test_build_and_deploy
    share_info, auth_info = bootstrap_microceph()
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/tests/integration/helpers.py", line 26, in bootstrap_microceph
    client = Client()
  File "/home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/.tox/integration/lib/python3.10/site-packages/pylxd/client.py", line 419, in __init__
    raise exceptions.ClientConnectionFailed(str(e))
pylxd.exceptions.ClientConnectionFailed: Not supported URL scheme http+unix
------------------------------ Captured log setup ------------------------------
INFO     pytest_operator.plugin:plugin.py:702 Adding model github-pr-a55e2-lxd:test-charm-4ejs on cloud localhost
=========================== short test summary info ============================
FAILED tests/integration/test_charm.py::test_build_and_deploy[ubuntu@22.04] - pylxd.exceptions.ClientConnectionFailed: Not supported URL scheme http+unix
========================= 1 failed, 3 xfailed in 2.57s =========================
integration: exit 1 (3.36 seconds) /home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator> pytest -v -s --tb native --log-cli-level=INFO /home/runner/work/cephfs-server-proxy-operator/cephfs-server-proxy-operator/tests/integration pid=8151
  integration: FAIL code 1 (17.30=setup[13.94]+cmd[3.36] seconds)
  evaluation failed :( (17.45 seconds)
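For context on the failing scheme check: requests-unixsocket percent-encodes the socket path into the URL host and uses the http+unix scheme, which urllib3 does not recognize. A minimal standard-library sketch (the socket path below is illustrative, not taken from the log) shows what such a URL decomposes into:

```python
from urllib.parse import urlsplit, unquote

# Illustrative URL of the form requests-unixsocket builds; the exact
# socket path here is an assumption for this example.
url = "http+unix://%2Fvar%2Fsnap%2Flxd%2Fcommon%2Flxd%2Funix.socket/1.0"

parts = urlsplit(url)
print(parts.scheme)           # "http+unix" -- the scheme urllib3 rejects
print(unquote(parts.netloc))  # decoded unix socket path
print(parts.path)             # "/1.0" -- the LXD API endpoint
```

The whole socket path rides in the netloc, so the scheme is the only thing urllib3 sees before it raises URLSchemeUnknown.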

Workarounds

Looks like we'll either need to pin requests to < 2.32 or find a workaround that lets pylxd keep communicating over HTTP via the LXD unix socket. requests_unixsocket isn't pinned to a specific version of requests, so it will pull in the latest requests regardless.
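A sketch of the pinning workaround, using standard pip version specifiers (the file names match the ones in the CI log above):

```shell
# Pin requests below the breaking release in requirements.txt /
# dev-requirements.txt (or a pip constraints file):
#   requests<2.32
# Or pin at install time:
pip install 'requests<2.32'
```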

@simondeziel
Member

I proposed a fix in msabramo/requests-unixsocket#72

@cbartz

cbartz commented May 23, 2024

+1. This also affects the self-hosted runners (https://github.com/canonical/github-runner-operator) using local LXD: a freshly packed charm can no longer be installed.

@tomponline
Member

@simondeziel can we pin our deps in this repo for the time being?
