
Error setting up entry Worx - xxx@xxx.de for landroid_cloud #372

Closed
tofrie opened this issue May 20, 2023 · 4 comments · Fixed by #374
Labels: bug (Something isn't working) · in progress (Development in progress) · pyworxcloud (Indicates that this issue is from pyworxcloud)

Comments

tofrie commented May 20, 2023

Describe the issue

An error occurs during setup; the integration won't work.

What version of Home Assistant Core has the issue?

Home Assistant 2023.5.3

What was the last working version of Home Assistant Core?

No response

What version of the Landroid Cloud integration do you have installed?

3.0.0

What type of installation are you running?

Home Assistant OS

Which make and model is the mower used for this integration?

Landroid Vision M600

Diagnostics information (NOT log entries!)

Xxx

Relevant log entries

This error was caused by a custom integration

Logger: homeassistant.config_entries
Source: custom_components/landroid_cloud/__init__.py:206 
Integration: Landroid Cloud (documentation, issues) 
First occurred: 21:28:23 (1 occurrences) 
Last logged: 21:28:23

Error setting up entry Worx - tobiasfriedrich@t-online.de for landroid_cloud
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 387, in async_setup
    result = await component.async_setup_entry(hass, self)
  File "/config/custom_components/landroid_cloud/__init__.py", line 63, in async_setup_entry
    result = await _async_setup(hass, entry)
  File "/config/custom_components/landroid_cloud/__init__.py", line 206, in _async_setup
    await hass.async_add_executor_job(cloud.connect)
  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.10/site-packages/pyworxcloud/__init__.py", line 235, in connect
    self._fetch()
  File "/usr/local/lib/python3.10/site-packages/pyworxcloud/__init__.py", line 516, in _fetch
    device = DeviceHandler(self._api, mower)
  File "/usr/local/lib/python3.10/site-packages/pyworxcloud/utils/devices.py", line 46, in __init__
    self.__mapinfo(api, mower)
  File "/usr/local/lib/python3.10/site-packages/pyworxcloud/utils/devices.py", line 100, in __mapinfo
    self.zone = Zone(data)
  File "/usr/local/lib/python3.10/site-packages/pyworxcloud/utils/zone.py", line 22, in __init__
    self["index"] = data["last_status"]["payload"]["dat"]["lz"]
KeyError: 'lz'
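The traceback ends in `Zone.__init__`, where pyworxcloud indexes the status payload directly: the Vision series mower apparently reports a payload whose `dat` object has no `lz` (last zone) key, so the chained lookup raises `KeyError`. A minimal sketch of the failure mode and a defensive lookup, assuming the field names from the traceback; the `safe_zone_index` helper and the default of `0` are hypothetical, not part of pyworxcloud:

```python
# Reproduces the failure mode from the traceback: Zone.__init__ does
#   self["index"] = data["last_status"]["payload"]["dat"]["lz"]
# which raises KeyError when "lz" is absent from the payload.
# safe_zone_index is a hypothetical defensive variant using dict.get
# with a fallback index of 0 (an assumption, not pyworxcloud's fix).

def safe_zone_index(data: dict) -> int:
    """Return the last-zone index, defaulting to 0 when 'lz' is missing."""
    return (
        data.get("last_status", {})
        .get("payload", {})
        .get("dat", {})
        .get("lz", 0)
    )

# Payload shape as reported by the Vision M600 (no "lz" key present):
vision_payload = {"last_status": {"payload": {"dat": {}}}}
print(safe_zone_index(vision_payload))  # -> 0 instead of KeyError

# A payload that does carry the key still resolves normally:
classic_payload = {"last_status": {"payload": {"dat": {"lz": 2}}}}
print(safe_zone_index(classic_payload))  # -> 2
```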

Additional information

No response

tofrie added the bug (Something isn't working) label May 20, 2023
MTrab (Owner) commented May 20, 2023

Please enter the actual version numbers!

tofrie (Author) commented May 20, 2023

Done

MTrab added the pyworxcloud (Indicates that this issue is from pyworxcloud) label May 20, 2023
MTrab (Owner) commented May 20, 2023

As this is a pyworxcloud issue, I'll lock this thread and continue any discussion in the issue created for this in the pyworxcloud repo (MTrab/pyworxcloud#148).

Repository owner locked and limited conversation to collaborators May 20, 2023
MTrab added the in progress (Development in progress) label May 22, 2023
MTrab (Owner) commented May 22, 2023

Fixed in the next release.
