[Device Support Request] Zemismart Curtain Motor with Rail (Mains powered) (_TZE200_rmymn92d) #1294
I got it to add the Cover entity with this:
Just.. The buttons all throw:
@operinko check this #744 (comment)
Any update on supporting this device?
@operinko When adding a custom quirk, you also want to update the information here: zha-device-handlers/zhaquirks/tuya/__init__.py Lines 94 to 111 in e4e6250
(I don't know which codes need to be added here -- or if this is even enough.) If I remember correctly, the structure for the custom quirk directory should then look like this: Edit: Discussion continued in #1245 (comment)
So you will see the device in HA, and you can even check the status (open/close), but the buttons will not work.
/config/custom_zha_quirks/__init__.py lines 109-110
@Kiread-work The new HA version borked my quirk. Actually, the custom quirk borked my whole ZHA network, so I've removed it, which means I can't even track the state of _TZE200_rmymn92d. We have confirmation that the code works in MQTT, and I submitted a PR for the cover inversion while trying to figure out the difference in the code, but I think we're going to need a new custom quirk, is that correct? Something has changed, and HA really didn't like that custom quirk being there. How are things looking for you? Still the same? This is now the last device in my home that HA cannot control :(
@SergeantPup shame on you =) Just updated HA to core-2022.4.1 on HAOS 5.10.108 and my Xiaomi Humidifier stopped working =))
@SergeantPup well, it works (status, not buttons). Nothing changed.
I put it back in and it borked my ZHA network again, so I took it out again. Tracking the curtain isn't THAT important without control. This must be something trivial. I'm half tempted to just buy another head unit, but I have a 50% shot it might be the same chip :(
Hello Julian! Here is the error I get when the quirk is present but makes the rest of my ZHA unresponsive:
Here is the last known working quirk before the April update.
Try this one: tuya.zip
The ZHA network and device tracking of _TZE200_rmymn92d are both working again with 2022.4.3. Thank you! I think I'm getting different errors for the _TZE200_rmymn92d command sends now (which is what the other user was reporting). Here is a successful command on the working device _TZE200_xaabybja with Nwk: 0xd775 (this device works fine):
Here is the new error from _TZE200_rmymn92d with Nwk: 0xc7ac (this is the device that's accurately tracking state but not responding to button presses):
Just to confirm, do your debug logs (from
I just ran an open/stop/close command on _TZE200_xaabybja 0xb41c. 61 entries for "0xb41c" and 38 entries for "tuya" (none say sending:
The device that's tracking state but not controlling has "sending tuya" commands:
Can you see if there's any difference with this: tuya.zip
Can you completely delete and re-pair the device with the quirk update I sent a minute ago?
removed device, swapped quirk, rebooted, added device:
I don't know why your working blinds don't log any "Sending Tuya". That would mean that this code is never called. Please check if your logger config (or service call where you set Also, can you send me the device signatures from both the working cover that does not output the commands and from the non-working cover that outputs the command?
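A note for anyone reproducing this debugging session: the "Sending Tuya" lines only show up once debug logging is enabled for the quirk modules. A minimal sketch of the relevant configuration.yaml section (standard Home Assistant logger integration; exactly which loggers to raise to debug is a judgement call):

```yaml
logger:
  default: info
  logs:
    zigpy: debug
    zhaquirks: debug
    homeassistant.components.zha: debug
```

The same levels can also be set at runtime via the logger.set_level service, which avoids a restart.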
That's correct:
Zigbee device signature of the controllable unit:
Device signature of the tracked but non-controlled unit:
Yes
@Kiread-work Wow, I didn't know that there is percentage control. Found it, works, thanks! But the buttons are still unusable. Yes, I've added my device to TUYA_COVER_COMMAND.
Try changing TUYA_COVER_COMMAND and I think your buttons will work.
I've tried almost all combinations of 0x0001, 0x0002 and 0x0000, no success :/
@kolmakova Mine only works after I added it to the TUYA_COVER_INVERTED_BY_DEFAULT section
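The two tables being discussed here can be sketched in isolation. The dictionary values below are copied from the quirk later in this thread; the two helper functions are hypothetical illustrations, not zhaquirks API:

```python
# ZCL side: 0x0000 = up/open, 0x0001 = down/close, 0x0002 = stop.
# Each entry remaps those IDs to the values a given manufacturer expects.
TUYA_COVER_COMMAND = {
    "_TZE200_rmymn92d": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_rddyvrci": {0x0000: 0x0002, 0x0001: 0x0001, 0x0002: 0x0000},
}

# Covers listed here report 0 = open / 100 = closed instead of the default.
TUYA_COVER_INVERTED_BY_DEFAULT = ["_TZE200_rmymn92d"]


def tuya_command(manufacturer: str, zcl_command_id: int) -> int:
    """Translate a standard ZCL cover command into the device-specific value."""
    return TUYA_COVER_COMMAND[manufacturer][zcl_command_id]


def tuya_position(manufacturer: str, zcl_position: int) -> int:
    """Invert the lift percentage for covers that treat 0 as fully open."""
    if manufacturer in TUYA_COVER_INVERTED_BY_DEFAULT:
        return 100 - zcl_position
    return zcl_position


# A ZCL "close" (0x0001) becomes Tuya value 0x0002 for this motor.
assert tuya_command("_TZE200_rmymn92d", 0x0001) == 0x0002
assert tuya_position("_TZE200_rmymn92d", 75) == 25
```

So "trying combinations" in TUYA_COVER_COMMAND amounts to permuting the three right-hand values until open/close/stop match what the motor actually does.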
What’s the latest on this? I just purchased this Zemismart curtain motor and I would like to use it in ZHA. Are we able to create an official quirk for it?
Can someone create a PR for this, if it is working now? Thanks! 🥇
I'm wondering the same. |
I can confirm the quirk, in addition to this change, works!
I have a _TZE200_rmymn92d and am struggling to get the quirk working. I have tried:
(1) Downloaded https://github.com/zigpy/zha-device-handlers/blob/dev/zhaquirks/tuya/ts0601_cover.py and saved it to config/custom_zha_quirks
(2) Replaced line 266 under MODELS_INFO from ("_TZE200_3i3exuay", "TS0601"), to ("_TZE200_rmymn92d", "TS0601"), then saved
(3) Added /config/custom_zha_quirks/__init__.py
(4) Changed "set_data", {"param": Command}, False, is_manufacturer_specific=True to "set_data", {"param": Command}, False, is_manufacturer_specific=False
(5) Deleted the cached file in /config/custom_zha_quirks/__pycache__
(6) Re-paired the curtain motor to ZHA
(7) Restarted HA
I've also tried a combination of (4) and (3) in a different order. I have quirks for other devices, so I assume I'm (mostly) doing the right procedure, but I can't see that the quirk is loading for this device. Log
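Step (5) is easy to get wrong: Python keeps loading the stale byte-compiled quirk until the cache directory is removed. A small sketch (the path is the custom-quirk directory used in this thread and is an assumption; adjust it to your install):

```python
import pathlib
import shutil

# Assumed custom quirk directory from this thread; change if yours differs.
QUIRK_DIR = pathlib.Path("/config/custom_zha_quirks")

cache = QUIRK_DIR / "__pycache__"
if cache.exists():
    # Drop the stale .pyc files so the edited quirk is re-compiled on restart.
    shutil.rmtree(cache)
```

After clearing the cache, restart Home Assistant so ZHA re-imports the quirk.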
Hi all, has anyone got any suggestions on why this is not working for me? Thanks.
You're in luck. I just migrated my Home Assistant this week, and my _TZE200_rmymn92d was the last device I successfully moved yesterday; I can confirm that my quirk is working on a brand-new install. The easiest way I've found to do this (in the last dozen times I've done it) is to use something like Samba share to swap out the files. After swapping the folders and a reboot, put this in your config yaml to start reading the quirk: I can give you a copy of my file if you choose how you want me to get it to you. There's no other custom quirk in it; the only custom quirk I have is for _TZE200_rmymn92d, and that's all that's in the fileset. I DO recall from the previous times that the file structure matters and it won't work if you don't have everything aligned just so (and I think there's a counterintuitive requirement here). Yesterday I basically did a lift and shift to a HA Yellow, and the curtain immediately showed entities after I installed this. I did not build this quirk; somebody at HA more experienced helped me build it a year ago. I confirmed with my new install that this quirk is still not part of core, but it definitely works for this device.
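For reference, ZHA loads custom quirks from the folder named by the zha integration's custom_quirks_path option. A minimal configuration.yaml fragment, assuming the /config/custom_zha_quirks directory mentioned earlier in this thread:

```yaml
zha:
  custom_quirks_path: /config/custom_zha_quirks/
```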
Isn't there a way to get this quirk included in main ZHA so it works for everyone? |
Yes, but. If I recall correctly, the person who was helping me with this quirk had some unanswered questions about this device and the logs, and I think that's what was preventing a permanent solution. My HA suggests I should consider a PR for this custom quirk. I just never considered mine a "complete" or "correct" solution; however, I will admit it works well enough that I've had this curtain rail automated for a year with it. The outstanding item on this device was: from all we can tell, the curtain functionality works (open/close/pause), but it was still throwing some errors in the ZHA logs (nothing critical). Something about it looked like it was trying to call home with a date/time (which didn't make sense). This functionality isn't present in the other motor that's just like this one with a different model number (_TZE200_xaabybja). I thought it was also missing LQI or RSSI, but I just looked and it appears to be functioning. Perhaps my memory is failing me on what the missing functionality was. I think we always figured somebody would come along, solve the mystery, and "finish" the quirk. If you think it's helpful to put in a PR for this quirk just with the knowledge that open/close/pause works, then I'll be happy to submit my quirk for that. I'm just not sure my quirk is "best".
Thanks a lot, SergeantPup. Can you upload it here: https://www.dropbox.com/request/X6Oj3LXZpuoYaz9MGERM
Done. Just reboot before you add the line in your config file pointing at the quirk.
Thanks. Before using your files, I removed my existing quirks and the pycache, removed my custom quirks reference in configuration.yaml and rebooted HA. I then copied across your files straight from what you uploaded. I rebooted, added in the zha quirks reference to configuration.yaml, and rebooted once again. I re-paired the device but it wasn't loading the quirk. I looked at the logs and it gave this error:
Looking at your quirk, line 13 was a little different to what I had before. Yours had:
So I changed that to:
After another reboot, the error has gone and I can see the quirk is being applied. I now also see the controls for the device. Unfortunately, they don't do anything. |
I changed line 13 from **ts0601_cover.py**
**__init__.py**
That's much better than having nothing at all. I would definitely appreciate this PR. 👍🏼
Thank you for providing the files. I needed to change line 13 to
Still, it would be better if this was provided as a PR.
=== updated on Jul 30, 2023 ===
Changes are rebased on
Those files are:
/config/zha-device-handlers/zhaquirks/tuya/__init__.py

"""Tuya devices."""
import dataclasses
import datetime
import logging
from typing import Any, Callable, Dict, List, Optional, Tuple, Union
from zigpy.quirks import CustomCluster, CustomDevice
import zigpy.types as t
from zigpy.zcl import foundation
from zigpy.zcl.clusters.closures import WindowCovering
from zigpy.zcl.clusters.general import LevelControl, OnOff, PowerConfiguration
from zigpy.zcl.clusters.homeautomation import ElectricalMeasurement
from zigpy.zcl.clusters.hvac import Thermostat, UserInterface
from zigpy.zcl.clusters.smartenergy import Metering
from zhaquirks import Bus, EventableCluster, LocalDataCluster
from zhaquirks.const import (
DOUBLE_PRESS,
LEFT,
LONG_PRESS,
RIGHT,
SHORT_PRESS,
ZHA_SEND_EVENT,
)
# ---------------------------------------------------------
# Tuya Custom Cluster ID
# ---------------------------------------------------------
TUYA_CLUSTER_ID = 0xEF00
TUYA_CLUSTER_E000_ID = 0xE000
TUYA_CLUSTER_E001_ID = 0xE001
# ---------------------------------------------------------
# Tuya Cluster Commands
# ---------------------------------------------------------
TUYA_SET_DATA = 0x00
TUYA_GET_DATA = 0x01
TUYA_SET_DATA_RESPONSE = 0x02
TUYA_SEND_DATA = 0x04
TUYA_ACTIVE_STATUS_RPT = 0x06
TUYA_SET_TIME = 0x24
# TODO: To be checked
TUYA_MCU_VERSION_REQ = 0x10
TUYA_MCU_VERSION_RSP = 0x11
#
TUYA_LEVEL_COMMAND = 514
COVER_EVENT = "cover_event"
LEVEL_EVENT = "level_event"
TUYA_MCU_COMMAND = "tuya_mcu_command"
# Rotating for remotes
STOP = "stop"  # to constants
# ---------------------------------------------------------
# Value for dp_type
# ---------------------------------------------------------
# ID Name Description
# ---------------------------------------------------------
# 0x00 DP_TYPE_RAW ?
# 0x01 DP_TYPE_BOOL ?
# 0x02 DP_TYPE_VALUE 4 byte unsigned integer
# 0x03 DP_TYPE_STRING variable length string
# 0x04 DP_TYPE_ENUM 1 byte enum
# 0x05 DP_TYPE_FAULT 1 byte bitmap (didn't test yet)
TUYA_DP_TYPE_RAW = 0x0000
TUYA_DP_TYPE_BOOL = 0x0100
TUYA_DP_TYPE_VALUE = 0x0200
TUYA_DP_TYPE_STRING = 0x0300
TUYA_DP_TYPE_ENUM = 0x0400
TUYA_DP_TYPE_FAULT = 0x0500
# ---------------------------------------------------------
# Value for dp_identifier (These are device specific)
# ---------------------------------------------------------
# ID Name Type Description
# ---------------------------------------------------------
# 0x01 control enum open, stop, close, continue
# 0x02 percent_control value 0-100% control
# 0x03 percent_state value Report from motor about current percentage
# 0x04 control_back enum Configures motor direction (untested)
# 0x05 work_state enum Motor Direction Setting
# 0x06 situation_set enum Configures if 100% equals to fully closed or fully open (untested)
# 0x07 fault bitmap Anything but 0 means something went wrong (untested)
TUYA_DP_ID_CONTROL = 0x01
TUYA_DP_ID_PERCENT_CONTROL = 0x02
TUYA_DP_ID_PERCENT_STATE = 0x03
TUYA_DP_ID_DIRECTION_CHANGE = 0x05
TUYA_DP_ID_COVER_INVERTED = 0x06
# ---------------------------------------------------------
# Window Cover Server Commands
# ---------------------------------------------------------
WINDOW_COVER_COMMAND_UPOPEN = 0x0000
WINDOW_COVER_COMMAND_DOWNCLOSE = 0x0001
WINDOW_COVER_COMMAND_STOP = 0x0002
WINDOW_COVER_COMMAND_LIFTPERCENT = 0x0005
WINDOW_COVER_COMMAND_CUSTOM = 0x0006
# ---------------------------------------------------------
# TUYA Cover Custom Values
# ---------------------------------------------------------
COVER_EVENT = "cover_event"
ATTR_COVER_POSITION = 0x0008
ATTR_COVER_DIRECTION = 0x8001
ATTR_COVER_INVERTED = 0x8002
# For most tuya devices 0 = Up/Open, 1 = Stop, 2 = Down/Close
TUYA_COVER_COMMAND = {
"_TZE200_zah67ekd": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_fzo2pocs": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_xuzcvlku": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_rddyvrci": {0x0000: 0x0002, 0x0001: 0x0001, 0x0002: 0x0000},
"_TZE200_3i3exuay": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_nueqqe6k": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_gubdgai2": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_zpzndjez": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_cowvfni3": {0x0000: 0x0002, 0x0001: 0x0000, 0x0002: 0x0001},
"_TYST11_wmcdj3aq": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_yenbr4om": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_5sbebbzs": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_xaabybja": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_hsgrhjpf": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_iossyxra": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_68nvbio9": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_zuz7f94z": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_ergbiejo": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_rmymn92d": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
}
# Taken from zigbee-herdsman-converters
# Contains all covers which need their position inverted by default
# Default is 100 = open, 0 = closed; Devices listed here will use 0 = open, 100 = closed instead
# Use manufacturerName to identify device!
# Don't invert _TZE200_cowvfni3: https://github.com/Koenkk/zigbee2mqtt/issues/6043
TUYA_COVER_INVERTED_BY_DEFAULT = [
"_TZE200_wmcdj3aq",
"_TZE200_nogaemzt",
"_TZE200_xuzcvlku",
"_TZE200_xaabybja",
"_TZE200_yenbr4om",
"_TZE200_zpzndjez",
"_TZE200_zuz7f94z",
"_TZE200_rmymn92d",
]
# ---------------------------------------------------------
# TUYA Switch Custom Values
# ---------------------------------------------------------
SWITCH_EVENT = "switch_event"
ATTR_ON_OFF = 0x0000
ATTR_COVER_POSITION = 0x0008
TUYA_CMD_BASE = 0x0100
# ---------------------------------------------------------
# DP Value meanings in Status Report
# ---------------------------------------------------------
# Type ID IntDP Description
# ---------------------------------------------------------
# 0x04 0x01 1025 Confirm opening/closing/stopping (triggered from Zigbee)
# 0x02 0x02 514 Started moving to position (triggered from Zigbee)
# 0x04 0x07 1031 Started moving (triggered by transmitter order pulling on curtain)
# 0x02 0x03 515 Arrived at position
# 0x01 0x05 261 Returned by configuration set; ignore
# 0x02 0x69 617 Not sure what this is
# 0x04 0x05 1029 Changed the Motor Direction
# 0x04 0x65 1125 Change of tilt/lift mode 1 = lift 0=tilt
# ---------------------------------------------------------
_LOGGER = logging.getLogger(__name__)
class BigEndianInt16(int):
"""Helper class to represent big endian 16 bit value."""
def serialize(self) -> bytes:
"""Value serialisation."""
try:
return self.to_bytes(2, "big", signed=False)
except OverflowError as e:
# OverflowError is not a subclass of ValueError, making it annoying to catch
raise ValueError(str(e)) from e
@classmethod
def deserialize(cls, data: bytes) -> Tuple["BigEndianInt16", bytes]:
"""Value deserialisation."""
if len(data) < 2:
raise ValueError("Data is too short to contain 2 bytes")
r = cls.from_bytes(data[:2], "big", signed=False)
data = data[2:]
return r, data
class TuyaTimePayload(t.LVList, item_type=t.uint8_t, length_type=BigEndianInt16):
"""Tuya set time payload definition."""
class TuyaDPType(t.enum8):
"""DataPoint Type."""
RAW = 0x00
BOOL = 0x01
VALUE = 0x02
STRING = 0x03
ENUM = 0x04
BITMAP = 0x05
class TuyaData(t.Struct):
"""Tuya Data type."""
dp_type: TuyaDPType
function: t.uint8_t
raw: t.LVBytes
@classmethod
def deserialize(cls, data: bytes) -> Tuple["TuyaData", bytes]:
"""Deserialize data."""
res = cls()
res.dp_type, data = TuyaDPType.deserialize(data)
res.function, data = t.uint8_t.deserialize(data)
res.raw, data = t.LVBytes.deserialize(data)
if res.dp_type not in (TuyaDPType.BITMAP, TuyaDPType.STRING, TuyaDPType.ENUM):
res.raw = res.raw[::-1]
return res, data
@property
def payload(self) -> Union[t.Bool, t.CharacterString, t.uint32_t, t.data32]:
"""Payload accordingly to data point type."""
if self.dp_type == TuyaDPType.VALUE:
return t.uint32_t.deserialize(self.raw)[0]
elif self.dp_type == TuyaDPType.BOOL:
return t.Bool.deserialize(self.raw)[0]
elif self.dp_type == TuyaDPType.STRING:
return self.raw.decode("utf8")
elif self.dp_type == TuyaDPType.ENUM:
return t.enum8.deserialize(self.raw)[0]
elif self.dp_type == TuyaDPType.BITMAP:
bitmaps = {1: t.bitmap8, 2: t.bitmap16, 4: t.bitmap32}
try:
return bitmaps[len(self.raw)].deserialize(self.raw)[0]
except KeyError as exc:
raise ValueError(f"Wrong bitmap length: {len(self.raw)}") from exc
raise ValueError(f"Unknown {self.dp_type} datapoint type")
class Data(t.List, item_type=t.uint8_t):
"""list of uint8_t."""
@classmethod
def from_value(cls, value):
"""Convert from a zigpy typed value to a tuya data payload."""
# serialized in little-endian by zigpy
data = cls(value.serialize())
# we want big-endian, with length prepended
data.append(len(data))
data.reverse()
return data
def to_value(self, ztype):
"""Convert from a tuya data payload to a zigpy typed value."""
# first uint8_t is the length of the remaining data
# tuya data is in big endian whereas ztypes use little endian
value, _ = ztype.deserialize(bytes(reversed(self[1:])))
return value
class TuyaCommand(t.Struct):
"""Tuya manufacturer cluster command."""
status: t.uint8_t
tsn: t.uint8_t
dp: t.uint8_t
data: TuyaData
class TuyaManufCluster(CustomCluster):
"""Tuya manufacturer specific cluster."""
name = "Tuya Manufacturer Specific"
cluster_id = TUYA_CLUSTER_ID
ep_attribute = "tuya_manufacturer"
set_time_offset = 0
set_time_local_offset = None
class Command(t.Struct):
"""Tuya manufacturer cluster command."""
status: t.uint8_t
tsn: t.uint8_t
command_id: t.uint16_t
function: t.uint8_t
data: Data
class MCUVersionRsp(t.Struct):
"""Tuya MCU version response Zcl payload."""
tsn: t.uint16_t
version: t.uint8_t
""" Time sync command (It's transparent between MCU and server)
Time request device -> server
payloadSize = 0
Set time, server -> device
payloadSize, should be always 8
payload[0-3] - UTC timestamp (big endian)
payload[4-7] - Local timestamp (big endian)
Zigbee payload is very similar to the UART payload which is described here: https://developer.tuya.com/en/docs/iot/device-development/access-mode-mcu/zigbee-general-solution/tuya-zigbee-module-uart-communication-protocol/tuya-zigbee-module-uart-communication-protocol?id=K9ear5khsqoty#title-10-Time%20synchronization
Some devices need the timestamp in seconds from 1/1/1970 and others in seconds from 1/1/2000.
Also, there are devices which use both timestamp variants (probably a bug). Use the set_time_local_offset var in these cases.
NOTE: You need to wait for time request before setting it. You can't set time without request."""
server_commands = {
0x0000: foundation.ZCLCommandDef(
"set_data", {"param": Command}, False, is_manufacturer_specific=False
),
0x0010: foundation.ZCLCommandDef(
"mcu_version_req",
{"param": t.uint16_t},
False,
is_manufacturer_specific=True,
),
0x0024: foundation.ZCLCommandDef(
"set_time", {"param": TuyaTimePayload}, False, is_manufacturer_specific=True
),
}
client_commands = {
0x0001: foundation.ZCLCommandDef(
"get_data", {"param": Command}, True, is_manufacturer_specific=True
),
0x0002: foundation.ZCLCommandDef(
"set_data_response", {"param": Command}, True, is_manufacturer_specific=True
),
0x0006: foundation.ZCLCommandDef(
"active_status_report",
{"param": Command},
True,
is_manufacturer_specific=True,
),
0x0011: foundation.ZCLCommandDef(
"mcu_version_rsp",
{"param": MCUVersionRsp},
True,
is_manufacturer_specific=True,
),
0x0024: foundation.ZCLCommandDef(
"set_time_request", {"param": t.data16}, True, is_manufacturer_specific=True
),
}
def __init__(self, *args, **kwargs):
"""Init."""
super().__init__(*args, **kwargs)
self.endpoint.device.command_bus = Bus()
self.endpoint.device.command_bus.add_listener(self) # listen MCU commands
def tuya_mcu_command(self, command: Command):
"""Tuya MCU command listener. Only endpoint:1 must listen to MCU commands."""
self.create_catching_task(
self.command(TUYA_SET_DATA, command, expect_reply=True)
)
def handle_cluster_request(
self,
hdr: foundation.ZCLHeader,
args: Tuple,
*,
dst_addressing: Optional[
Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
] = None,
) -> None:
"""Handle time request."""
if hdr.command_id != 0x0024 or self.set_time_offset == 0:
return super().handle_cluster_request(
hdr, args, dst_addressing=dst_addressing
)
# Send default response because the MCU expects it
if not hdr.frame_control.disable_default_response:
self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)
_LOGGER.debug(
"[0x%04x:%s:0x%04x] Got set time request (command 0x%04x)",
self.endpoint.device.nwk,
self.endpoint.endpoint_id,
self.cluster_id,
hdr.command_id,
)
payload = TuyaTimePayload()
utc_timestamp = int(
(
datetime.datetime.utcnow()
- datetime.datetime(self.set_time_offset, 1, 1)
).total_seconds()
)
local_timestamp = int(
(
datetime.datetime.now()
- datetime.datetime(
self.set_time_local_offset or self.set_time_offset, 1, 1
)
).total_seconds()
)
payload.extend(utc_timestamp.to_bytes(4, "big", signed=False))
payload.extend(local_timestamp.to_bytes(4, "big", signed=False))
self.create_catching_task(
super().command(TUYA_SET_TIME, payload, expect_reply=False)
)
class TuyaManufClusterAttributes(TuyaManufCluster):
"""Manufacturer specific cluster for Tuya converting attributes <-> commands."""
def handle_cluster_request(
self,
hdr: foundation.ZCLHeader,
args: Tuple,
*,
dst_addressing: Optional[
Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
] = None,
) -> None:
"""Handle cluster request."""
if hdr.command_id not in (0x0001, 0x0002):
return super().handle_cluster_request(
hdr, args, dst_addressing=dst_addressing
)
# Send default response because the MCU expects it
if not hdr.frame_control.disable_default_response:
self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)
tuya_cmd = args[0].command_id
tuya_data = args[0].data
_LOGGER.debug(
"[0x%04x:%s:0x%04x] Received value %s "
"for attribute 0x%04x (command 0x%04x)",
self.endpoint.device.nwk,
self.endpoint.endpoint_id,
self.cluster_id,
repr(tuya_data[1:]),
tuya_cmd,
hdr.command_id,
)
if tuya_cmd not in self.attributes:
return
ztype = self.attributes[tuya_cmd].type
zvalue = tuya_data.to_value(ztype)
self._update_attribute(tuya_cmd, zvalue)
def read_attributes(
self, attributes, allow_cache=False, only_cache=False, manufacturer=None
):
"""Ignore remote reads as the "get_data" command doesn't seem to do anything."""
return super().read_attributes(
attributes, allow_cache=True, only_cache=True, manufacturer=manufacturer
)
async def write_attributes(self, attributes, manufacturer=None):
"""Defer attributes writing to the set_data tuya command."""
records = self._write_attr_records(attributes)
for record in records:
cmd_payload = TuyaManufCluster.Command()
cmd_payload.status = 0
cmd_payload.tsn = self.endpoint.device.application.get_sequence()
cmd_payload.command_id = record.attrid
cmd_payload.function = 0
cmd_payload.data = Data.from_value(record.value.value)
await super().command(
TUYA_SET_DATA,
cmd_payload,
manufacturer=manufacturer,
expect_reply=False,
tsn=cmd_payload.tsn,
)
return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]
class TuyaOnOff(CustomCluster, OnOff):
"""Tuya On/Off cluster for On/Off device."""
def __init__(self, *args, **kwargs):
"""Init."""
super().__init__(*args, **kwargs)
self.endpoint.device.switch_bus.add_listener(self)
def switch_event(self, channel, state):
"""Switch event."""
_LOGGER.debug(
"%s - Received switch event message, channel: %d, state: %d",
self.endpoint.device.ieee,
channel,
state,
)
# update status only if event == endpoint
if self.endpoint.endpoint_id == channel:
self._update_attribute(ATTR_ON_OFF, state)
async def command(
self,
command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
*args,
manufacturer: Optional[Union[int, t.uint16_t]] = None,
expect_reply: bool = True,
tsn: Optional[Union[int, t.uint8_t]] = None,
):
"""Override the default Cluster command."""
if command_id in (0x0000, 0x0001):
cmd_payload = TuyaManufCluster.Command()
cmd_payload.status = 0
# cmd_payload.tsn = tsn if tsn else self.endpoint.device.application.get_sequence()
cmd_payload.tsn = 0
cmd_payload.command_id = TUYA_CMD_BASE + self.endpoint.endpoint_id
cmd_payload.function = 0
cmd_payload.data = [1, command_id]
self.endpoint.device.command_bus.listener_event(
TUYA_MCU_COMMAND,
cmd_payload,
)
return foundation.Status.SUCCESS
return foundation.Status.UNSUP_CLUSTER_COMMAND
class TuyaManufacturerClusterOnOff(TuyaManufCluster):
"""Manufacturer Specific Cluster of On/Off device."""
def handle_cluster_request(
self,
hdr: foundation.ZCLHeader,
args: Tuple[TuyaManufCluster.Command],
*,
dst_addressing: Optional[
Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
] = None,
) -> None:
"""Handle cluster request."""
if hdr.command_id in (0x0002, 0x0001):
# Send default response because the MCU expects it
if not hdr.frame_control.disable_default_response:
self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)
tuya_payload = args[0]
self.endpoint.device.switch_bus.listener_event(
SWITCH_EVENT,
tuya_payload.command_id - TUYA_CMD_BASE,
tuya_payload.data[1],
)
elif hdr.command_id == TUYA_SET_TIME:
"""Time event call super"""
_LOGGER.debug("TUYA_SET_TIME --> hdr: %s, args: %s", hdr, args)
super().handle_cluster_request(hdr, args, dst_addressing=dst_addressing)
else:
_LOGGER.warning("Unsupported command: %s", hdr)
class TuyaSwitch(CustomDevice):
"""Tuya switch device."""
def __init__(self, *args, **kwargs):
"""Init device."""
self.switch_bus = Bus()
super().__init__(*args, **kwargs)
class TuyaDimmerSwitch(TuyaSwitch):
"""Tuya dimmer switch device."""
def __init__(self, *args, **kwargs):
"""Init device."""
self.dimmer_bus = Bus()
super().__init__(*args, **kwargs)
class TuyaThermostatCluster(LocalDataCluster, Thermostat):
"""Thermostat cluster for Tuya thermostats."""
_CONSTANT_ATTRIBUTES = {0x001B: Thermostat.ControlSequenceOfOperation.Heating_Only}
def __init__(self, *args, **kwargs):
"""Init."""
super().__init__(*args, **kwargs)
self.endpoint.device.thermostat_bus.add_listener(self)
def temperature_change(self, attr, value):
"""Local or target temperature change from device."""
self._update_attribute(self.attributes_by_name[attr].id, value)
def state_change(self, value):
"""State update from device."""
if value == 0:
mode = self.RunningMode.Off
state = self.RunningState.Idle
else:
mode = self.RunningMode.Heat
state = self.RunningState.Heat_State_On
self._update_attribute(self.attributes_by_name["running_mode"].id, mode)
self._update_attribute(self.attributes_by_name["running_state"].id, state)
# pylint: disable=R0201
def map_attribute(self, attribute, value):
"""Map standardized attribute value to dict of manufacturer values."""
return {}
async def write_attributes(self, attributes, manufacturer=None):
"""Implement writeable attributes."""
records = self._write_attr_records(attributes)
if not records:
return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]
manufacturer_attrs = {}
for record in records:
attr_name = self.attributes[record.attrid].name
new_attrs = self.map_attribute(attr_name, record.value.value)
_LOGGER.debug(
"[0x%04x:%s:0x%04x] Mapping standard %s (0x%04x) "
"with value %s to custom %s",
self.endpoint.device.nwk,
self.endpoint.endpoint_id,
self.cluster_id,
attr_name,
record.attrid,
repr(record.value.value),
repr(new_attrs),
)
manufacturer_attrs.update(new_attrs)
if not manufacturer_attrs:
return [
[
foundation.WriteAttributesStatusRecord(
foundation.Status.FAILURE, r.attrid
)
for r in records
]
]
await self.endpoint.tuya_manufacturer.write_attributes(
manufacturer_attrs, manufacturer=manufacturer
)
return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]
# pylint: disable=W0236
async def command(
self,
command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
*args,
manufacturer: Optional[Union[int, t.uint16_t]] = None,
expect_reply: bool = True,
tsn: Optional[Union[int, t.uint8_t]] = None,
):
"""Implement thermostat commands."""
if command_id != 0x0000:
return foundation.GENERAL_COMMANDS[
foundation.GeneralCommand.Default_Response
].schema(
command_id=command_id, status=foundation.Status.UNSUP_CLUSTER_COMMAND
)
mode, offset = args
if mode not in (self.SetpointMode.Heat, self.SetpointMode.Both):
return foundation.GENERAL_COMMANDS[
foundation.GeneralCommand.Default_Response
].schema(command_id=command_id, status=foundation.Status.INVALID_VALUE)
attrid = self.attributes_by_name["occupied_heating_setpoint"].id
success, _ = await self.read_attributes((attrid,), manufacturer=manufacturer)
try:
current = success[attrid]
except KeyError:
return foundation.Status.FAILURE
# offset is given in decidegrees, see Zigbee cluster specification
(res,) = await self.write_attributes(
{"occupied_heating_setpoint": current + offset * 10},
manufacturer=manufacturer,
)
return foundation.GENERAL_COMMANDS[
foundation.GeneralCommand.Default_Response
].schema(command_id=command_id, status=res[0].status)
class TuyaUserInterfaceCluster(LocalDataCluster, UserInterface):
"""HVAC User interface cluster for tuya thermostats."""
def __init__(self, *args, **kwargs):
"""Init."""
super().__init__(*args, **kwargs)
self.endpoint.device.ui_bus.add_listener(self)
def child_lock_change(self, mode):
"""Change of child lock setting."""
if mode == 0:
lockout = self.KeypadLockout.No_lockout
else:
lockout = self.KeypadLockout.Level_1_lockout
self._update_attribute(self.attributes_by_name["keypad_lockout"].id, lockout)
def map_attribute(self, attribute, value):
"""Map standardized attribute value to dict of manufacturer values."""
return {}
async def write_attributes(self, attributes, manufacturer=None):
"""Defer the keypad_lockout attribute to child_lock."""
records = self._write_attr_records(attributes)
manufacturer_attrs = {}
for record in records:
if record.attrid == self.attributes_by_name["keypad_lockout"].id:
lock = 0 if record.value.value == self.KeypadLockout.No_lockout else 1
new_attrs = {self._CHILD_LOCK_ATTR: lock}
else:
attr_name = self.attributes[record.attrid].name
new_attrs = self.map_attribute(attr_name, record.value.value)
_LOGGER.debug(
"[0x%04x:%s:0x%04x] Mapping standard %s (0x%04x) "
"with value %s to custom %s",
self.endpoint.device.nwk,
self.endpoint.endpoint_id,
self.cluster_id,
attr_name,
record.attrid,
repr(record.value.value),
repr(new_attrs),
)
manufacturer_attrs.update(new_attrs)
if not manufacturer_attrs:
return [
[
foundation.WriteAttributesStatusRecord(
foundation.Status.FAILURE, r.attrid
)
for r in records
]
]
await self.endpoint.tuya_manufacturer.write_attributes(
manufacturer_attrs, manufacturer=manufacturer
)
return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]
class TuyaPowerConfigurationCluster(LocalDataCluster, PowerConfiguration):
"""PowerConfiguration cluster for battery-operated thermostats."""
def __init__(self, *args, **kwargs):
"""Init."""
super().__init__(*args, **kwargs)
self.endpoint.device.battery_bus.add_listener(self)
def battery_change(self, value):
"""Change of reported battery percentage remaining."""
self._update_attribute(
self.attributes_by_name["battery_percentage_remaining"].id, value * 2
)
class TuyaPowerConfigurationCluster2AA(TuyaPowerConfigurationCluster):
"""PowerConfiguration cluster for battery-operated TRVs with 2 AA."""
BATTERY_SIZES = 0x0031
BATTERY_RATED_VOLTAGE = 0x0034
BATTERY_QUANTITY = 0x0033
_CONSTANT_ATTRIBUTES = {
BATTERY_SIZES: 3,
BATTERY_RATED_VOLTAGE: 15,
BATTERY_QUANTITY: 2,
}
class TuyaPowerConfigurationCluster3AA(TuyaPowerConfigurationCluster):
"""PowerConfiguration cluster for battery-operated TRVs with 3 AA."""
BATTERY_SIZES = 0x0031
BATTERY_RATED_VOLTAGE = 0x0034
BATTERY_QUANTITY = 0x0033
_CONSTANT_ATTRIBUTES = {
BATTERY_SIZES: 3,
BATTERY_RATED_VOLTAGE: 15,
BATTERY_QUANTITY: 3,
}
class TuyaThermostat(CustomDevice):
"""Generic Tuya thermostat device."""
def __init__(self, *args, **kwargs):
"""Init device."""
self.thermostat_bus = Bus()
self.ui_bus = Bus()
self.battery_bus = Bus()
super().__init__(*args, **kwargs)
# Tuya Zigbee OnOff Cluster Attribute Implementation
class SwitchBackLight(t.enum8):
"""Tuya switch back light mode enum."""
Mode_0 = 0x00
Mode_1 = 0x01
Mode_2 = 0x02
class SwitchMode(t.enum8):
"""Tuya switch mode enum."""
Command = 0x00
Event = 0x01
class PowerOnState(t.enum8):
"""Tuya power on state enum."""
Off = 0x00
On = 0x01
LastState = 0x02
class TuyaZBOnOffAttributeCluster(CustomCluster, OnOff):
"""Tuya Zigbee On Off cluster with extra attributes."""
attributes = OnOff.attributes.copy()
attributes.update({0x8000: ("child_lock", t.Bool)})
attributes.update({0x8001: ("backlight_mode", SwitchBackLight)})
attributes.update({0x8002: ("power_on_state", PowerOnState)})
attributes.update({0x8004: ("switch_mode", SwitchMode)})
class TuyaSmartRemoteOnOffCluster(OnOff, EventableCluster):
"""TuyaSmartRemoteOnOffCluster: fire events corresponding to press type."""
rotate_type = {
0x00: RIGHT,
0x01: LEFT,
0x02: STOP,
}
press_type = {
0x00: SHORT_PRESS,
0x01: DOUBLE_PRESS,
0x02: LONG_PRESS,
}
name = "TS004X_cluster"
ep_attribute = "TS004X_cluster"
attributes = OnOff.attributes.copy()
attributes.update({0x8001: ("backlight_mode", SwitchBackLight)})
attributes.update({0x8002: ("power_on_state", PowerOnState)})
attributes.update({0x8004: ("switch_mode", SwitchMode)})
def __init__(self, *args, **kwargs):
"""Init."""
self.last_tsn = -1
super().__init__(*args, **kwargs)
server_commands = OnOff.server_commands.copy()
server_commands.update(
{
0xFC: foundation.ZCLCommandDef(
"rotate_type",
{"rotate_type": t.uint8_t},
False,
is_manufacturer_specific=True,
),
0xFD: foundation.ZCLCommandDef(
"press_type",
{"press_type": t.uint8_t},
False,
is_manufacturer_specific=True,
),
}
)
def handle_cluster_request(
self,
hdr: foundation.ZCLHeader,
args: List[Any],
*,
dst_addressing: Optional[
Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
] = None,
):
"""Handle press_types command."""
        # Normally, once a default response is sent, the TS004x doesn't repeat a ZCL frame (same sequence number),
        # but for robustness (e.g. when the response never reaches the device) duplicates are simply ignored
if hdr.tsn == self.last_tsn:
_LOGGER.debug("TS004X: ignoring duplicate frame")
return
# save last sequence number
self.last_tsn = hdr.tsn
# send default response (as soon as possible), so avoid repeated zclframe from device
if not hdr.frame_control.disable_default_response:
self.debug("TS004X: send default response")
self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)
# handle command
if hdr.command_id == 0xFC:
rotate_type = args[0]
self.listener_event(
ZHA_SEND_EVENT, self.rotate_type.get(rotate_type, "unknown"), []
)
elif hdr.command_id == 0xFD:
press_type = args[0]
self.listener_event(
ZHA_SEND_EVENT, self.press_type.get(press_type, "unknown"), []
)
# Tuya Zigbee Metering Cluster Correction Implementation
class TuyaZBMeteringCluster(CustomCluster, Metering):
"""Divides the kWh for tuya."""
MULTIPLIER = 0x0301
DIVISOR = 0x0302
_CONSTANT_ATTRIBUTES = {MULTIPLIER: 1, DIVISOR: 100}
class TuyaZBElectricalMeasurement(CustomCluster, ElectricalMeasurement):
"""Divides the Current for tuya."""
AC_CURRENT_MULTIPLIER = 0x0602
AC_CURRENT_DIVISOR = 0x0603
_CONSTANT_ATTRIBUTES = {AC_CURRENT_MULTIPLIER: 1, AC_CURRENT_DIVISOR: 1000}
# Tuya Zigbee Cluster 0xE000 Implementation
class TuyaZBE000Cluster(CustomCluster):
"""Tuya manufacturer specific cluster 57344."""
name = "Tuya Manufacturer Specific"
cluster_id = TUYA_CLUSTER_E000_ID
ep_attribute = "tuya_is_pita_0"
# Tuya Zigbee Cluster 0xE001 Implementation
class ExternalSwitchType(t.enum8):
"""Tuya external switch type enum."""
Toggle = 0x00
State = 0x01
Momentary = 0x02
class TuyaZBExternalSwitchTypeCluster(CustomCluster):
"""Tuya External Switch Type Cluster."""
name = "Tuya External Switch Type Cluster"
cluster_id = TUYA_CLUSTER_E001_ID
ep_attribute = "tuya_external_switch_type"
attributes = {0xD030: ("external_switch_type", ExternalSwitchType)}
# Tuya Window Cover Implementation
class TuyaManufacturerWindowCover(TuyaManufCluster):
"""Manufacturer Specific Cluster for cover device."""
def handle_cluster_request(
self,
hdr: foundation.ZCLHeader,
args: Tuple[TuyaManufCluster.Command],
*,
dst_addressing: Optional[
Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
] = None,
) -> None:
"""Handle cluster request."""
"""Tuya Specific Cluster Commands"""
if hdr.command_id in (TUYA_GET_DATA, TUYA_SET_DATA_RESPONSE):
tuya_payload = args[0]
_LOGGER.debug(
"%s Received Attribute Report. Command is 0x%04x, Tuya Paylod values"
"[Status : %s, TSN: %s, Command: 0x%04x, Function: 0x%02x, Data: %s]",
self.endpoint.device.ieee,
hdr.command_id,
tuya_payload.status,
tuya_payload.tsn,
tuya_payload.command_id,
tuya_payload.function,
tuya_payload.data,
)
            if tuya_payload.command_id in (
                TUYA_DP_TYPE_VALUE + TUYA_DP_ID_PERCENT_STATE,
                TUYA_DP_TYPE_VALUE + TUYA_DP_ID_PERCENT_CONTROL,
            ):
                self.endpoint.device.cover_bus.listener_event(
                    COVER_EVENT,
                    ATTR_COVER_POSITION,
                    tuya_payload.data[4],
                )
elif (
tuya_payload.command_id
== TUYA_DP_TYPE_ENUM + TUYA_DP_ID_DIRECTION_CHANGE
):
self.endpoint.device.cover_bus.listener_event(
COVER_EVENT,
ATTR_COVER_DIRECTION,
tuya_payload.data[1],
)
elif (
tuya_payload.command_id == TUYA_DP_TYPE_ENUM + TUYA_DP_ID_COVER_INVERTED
):
self.endpoint.device.cover_bus.listener_event(
COVER_EVENT,
ATTR_COVER_INVERTED,
tuya_payload.data[1], # Check this
)
elif hdr.command_id == TUYA_SET_TIME:
"""Time event call super"""
super().handle_cluster_request(hdr, args, dst_addressing=dst_addressing)
else:
_LOGGER.debug(
"%s Received Attribute Report - Unknown Command. Self [%s], Header [%s], Tuya Paylod [%s]",
self.endpoint.device.ieee,
self,
hdr,
args,
)
class TuyaWindowCoverControl(LocalDataCluster, WindowCovering):
"""Manufacturer Specific Cluster of Device cover."""
"""Add additional attributes for direction"""
attributes = WindowCovering.attributes.copy()
attributes.update({ATTR_COVER_DIRECTION: ("motor_direction", t.Bool)})
attributes.update({ATTR_COVER_INVERTED: ("cover_inverted", t.Bool)})
def __init__(self, *args, **kwargs):
"""Initialize instance."""
super().__init__(*args, **kwargs)
self.endpoint.device.cover_bus.add_listener(self)
def cover_event(self, attribute, value):
"""Event listener for cover events."""
if attribute == ATTR_COVER_POSITION:
invert_attr = self._attr_cache.get(ATTR_COVER_INVERTED) == 1
invert = (
not invert_attr
if self.endpoint.device.manufacturer in TUYA_COVER_INVERTED_BY_DEFAULT
else invert_attr
)
value = value if invert else 100 - value
self._update_attribute(attribute, value)
_LOGGER.debug(
"%s Tuya Attribute Cache : [%s]",
self.endpoint.device.ieee,
self._attr_cache,
)
def command(
self,
command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
*args,
manufacturer: Optional[Union[int, t.uint16_t]] = None,
expect_reply: bool = True,
tsn: Optional[Union[int, t.uint8_t]] = None,
):
"""Override the default Cluster command."""
if manufacturer is None:
manufacturer = self.endpoint.device.manufacturer
_LOGGER.debug(
"%s Sending Tuya Cluster Command.. Manufacturer is %s Cluster Command is 0x%04x, Arguments are %s",
self.endpoint.device.ieee,
manufacturer,
command_id,
args,
)
# Open Close or Stop commands
tuya_payload = TuyaManufCluster.Command()
if command_id in (
WINDOW_COVER_COMMAND_UPOPEN,
WINDOW_COVER_COMMAND_DOWNCLOSE,
WINDOW_COVER_COMMAND_STOP,
):
tuya_payload.status = 0
tuya_payload.tsn = tsn if tsn else 0
tuya_payload.command_id = TUYA_DP_TYPE_ENUM + TUYA_DP_ID_CONTROL
tuya_payload.function = 0
tuya_payload.data = [
1,
# need to implement direction change
TUYA_COVER_COMMAND[manufacturer][command_id],
] # remap the command to the Tuya command
# Set Position Command
elif command_id == WINDOW_COVER_COMMAND_LIFTPERCENT:
tuya_payload.status = 0
tuya_payload.tsn = tsn if tsn else 0
tuya_payload.command_id = TUYA_DP_TYPE_VALUE + TUYA_DP_ID_PERCENT_CONTROL
tuya_payload.function = 0
"""Check direction and correct value"""
invert_attr = self._attr_cache.get(ATTR_COVER_INVERTED) == 1
invert = (
not invert_attr
if self.endpoint.device.manufacturer in TUYA_COVER_INVERTED_BY_DEFAULT
else invert_attr
)
position = args[0] if invert else 100 - args[0]
tuya_payload.data = [
4,
0,
0,
0,
position,
]
# Custom Command
elif command_id == WINDOW_COVER_COMMAND_CUSTOM:
tuya_payload.status = args[0]
tuya_payload.tsn = args[1]
tuya_payload.command_id = args[2]
tuya_payload.function = args[3]
tuya_payload.data = args[4]
        else:
            _LOGGER.debug("Unrecognised command: 0x%04x", command_id)
            return foundation.Status.UNSUP_CLUSTER_COMMAND
        # Send the command
        _LOGGER.debug(
            "%s Sending Tuya Command. Payload values [endpoint_id : %s, "
            "Status : %s, TSN: %s, Command: 0x%04x, Function: %s, Data: %s]",
            self.endpoint.device.ieee,
            self.endpoint.endpoint_id,
            tuya_payload.status,
            tuya_payload.tsn,
            tuya_payload.command_id,
            tuya_payload.function,
            tuya_payload.data,
        )
        return self.endpoint.tuya_manufacturer.command(
            TUYA_SET_DATA, tuya_payload, expect_reply=True
        )
class TuyaWindowCover(CustomDevice):
"""Tuya switch device."""
def __init__(self, *args, **kwargs):
"""Init device."""
self.cover_bus = Bus()
super().__init__(*args, **kwargs)
class TuyaManufacturerLevelControl(TuyaManufCluster):
"""Manufacturer Specific Cluster for cover device."""
def handle_cluster_request(
self,
hdr: foundation.ZCLHeader,
args: Tuple[TuyaManufCluster.Command],
*,
dst_addressing: Optional[
Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
] = None,
) -> None:
"""Handle cluster request."""
tuya_payload = args[0]
_LOGGER.debug(
"%s Received Attribute Report. Command is %x, Tuya Paylod values"
"[Status : %s, TSN: %s, Command: %s, Function: %s, Data: %s]",
self.endpoint.device.ieee,
hdr.command_id,
tuya_payload.status,
tuya_payload.tsn,
tuya_payload.command_id,
tuya_payload.function,
tuya_payload.data,
)
if hdr.command_id in (0x0002, 0x0001):
if tuya_payload.command_id == TUYA_LEVEL_COMMAND:
self.endpoint.device.dimmer_bus.listener_event(
LEVEL_EVENT,
tuya_payload.command_id,
tuya_payload.data,
)
else:
self.endpoint.device.switch_bus.listener_event(
SWITCH_EVENT,
tuya_payload.command_id - TUYA_CMD_BASE,
tuya_payload.data[1],
)
class TuyaLevelControl(CustomCluster, LevelControl):
"""Tuya Level cluster for dimmable device."""
def __init__(self, *args, **kwargs):
"""Init."""
super().__init__(*args, **kwargs)
self.endpoint.device.dimmer_bus.add_listener(self)
def level_event(self, channel, state):
"""Level event."""
level = (((state[3] << 8) + state[4]) * 255) // 1000
_LOGGER.debug(
"%s - Received level event message, channel: %d, level: %d, data: %d",
self.endpoint.device.ieee,
channel,
level,
state,
)
self._update_attribute(self.attributes_by_name["current_level"].id, level)
def command(
self,
command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
*args,
manufacturer: Optional[Union[int, t.uint16_t]] = None,
expect_reply: bool = True,
tsn: Optional[Union[int, t.uint8_t]] = None,
):
"""Override the default Cluster command."""
_LOGGER.debug(
"%s Sending Tuya Cluster Command.. Cluster Command is %x, Arguments are %s",
self.endpoint.device.ieee,
command_id,
args,
)
# Move to level
# move_to_level_with_on_off
if command_id in (0x0000, 0x0001, 0x0004):
cmd_payload = TuyaManufCluster.Command()
cmd_payload.status = 0
cmd_payload.tsn = 0
cmd_payload.command_id = TUYA_LEVEL_COMMAND
cmd_payload.function = 0
brightness = (args[0] * 1000) // 255
val1 = brightness >> 8
val2 = brightness & 0xFF
cmd_payload.data = [4, 0, 0, val1, val2] # Custom Command
return self.endpoint.tuya_manufacturer.command(
TUYA_SET_DATA, cmd_payload, expect_reply=True
)
return foundation.Status.UNSUP_CLUSTER_COMMAND
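As a side note on the scaling used by TuyaLevelControl above: ZCL levels run 0-255 while Tuya brightness runs 0-1000, split into a high and a low byte in the DP payload. A minimal standalone sketch of both directions (helper names are mine, not part of the quirk):

```python
def zcl_to_tuya_brightness(level: int) -> tuple[int, int]:
    """Scale a ZCL level (0-255) to Tuya's 0-1000 range, split into two bytes."""
    brightness = (level * 1000) // 255
    return brightness >> 8, brightness & 0xFF  # (high byte, low byte)


def tuya_to_zcl_level(high: int, low: int) -> int:
    """Inverse mapping, as done in level_event()."""
    return (((high << 8) + low) * 255) // 1000
```

Full brightness (255) maps to bytes (3, 232), i.e. 0x03E8 = 1000, and the round trip returns 255; intermediate values can drift by one step because of integer truncation.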
class TuyaLocalCluster(LocalDataCluster):
"""Tuya virtual clusters.
Prevents attribute reads and writes. Attribute writes could be converted
to DataPoint updates.
"""
def update_attribute(self, attr_name: str, value: Any) -> None:
"""Update attribute by attribute name."""
try:
attr = self.attributes_by_name[attr_name]
except KeyError:
self.debug("no such attribute: %s", attr_name)
return
return self._update_attribute(attr.id, value)
@dataclasses.dataclass
class DPToAttributeMapping:
"""Container for datapoint to cluster attribute update mapping."""
ep_attribute: str
attribute_name: str
converter: Optional[
Callable[
[
Any,
],
Any,
]
] = None
endpoint_id: Optional[int] = None
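For illustration, a quirk built on this framework wires datapoints to cluster attributes via a dict of DPToAttributeMapping entries. A self-contained sketch (the dataclass is repeated here so the snippet runs alone, and the DP numbers are made up, not taken from any real device):

```python
import dataclasses
from typing import Any, Callable, Optional


@dataclasses.dataclass
class DPToAttributeMapping:
    """Same shape as the dataclass above, repeated so this sketch runs alone."""

    ep_attribute: str
    attribute_name: str
    converter: Optional[Callable[[Any], Any]] = None
    endpoint_id: Optional[int] = None


# Hypothetical wiring: DP 2 carries battery percent (ZCL reports half-percent
# units, hence the converter); DP 3 is a plain on/off state.
dp_to_attribute = {
    2: DPToAttributeMapping(
        ep_attribute="power",
        attribute_name="battery_percentage_remaining",
        converter=lambda x: x * 2,
    ),
    3: DPToAttributeMapping(ep_attribute="on_off", attribute_name="on_off"),
}
```

In a real quirk the dict lives on the manufacturer cluster and `_dp_2_attr_update()` looks up the incoming DP number, applies the converter if present, and pushes the value to the target cluster.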
class TuyaNewManufCluster(CustomCluster):
"""Tuya manufacturer specific cluster.
This is an attempt to consolidate the multiple above clusters into a
single framework. Instead of overriding the handle_cluster_request()
method, implement handlers for commands, like get_data, set_data_response,
set_time_request, etc.
"""
name: str = "Tuya Manufacturer Specific"
cluster_id: t.uint16_t = TUYA_CLUSTER_ID
ep_attribute: str = "tuya_manufacturer"
server_commands = {
TUYA_SET_DATA: foundation.ZCLCommandDef(
"set_data", {"data": TuyaCommand}, False, is_manufacturer_specific=True
),
TUYA_SEND_DATA: foundation.ZCLCommandDef(
"send_data", {"data": TuyaCommand}, False, is_manufacturer_specific=True
),
TUYA_SET_TIME: foundation.ZCLCommandDef(
"set_time", {"time": TuyaTimePayload}, False, is_manufacturer_specific=True
),
}
client_commands = {
TUYA_GET_DATA: foundation.ZCLCommandDef(
"get_data", {"data": TuyaCommand}, True, is_manufacturer_specific=True
),
TUYA_SET_DATA_RESPONSE: foundation.ZCLCommandDef(
"set_data_response",
{"data": TuyaCommand},
True,
is_manufacturer_specific=True,
),
TUYA_ACTIVE_STATUS_RPT: foundation.ZCLCommandDef(
"active_status_report",
{"data": TuyaCommand},
True,
is_manufacturer_specific=True,
),
TUYA_SET_TIME: foundation.ZCLCommandDef(
"set_time_request", {"data": t.data16}, True, is_manufacturer_specific=True
),
}
data_point_handlers: Dict[int, str] = {}
def handle_cluster_request(
self,
hdr: foundation.ZCLHeader,
args: Tuple,
*,
dst_addressing: Optional[
Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
] = None,
) -> None:
"""Handle cluster specific request."""
try:
if hdr.is_reply:
# server_cluster -> client_cluster cluster specific command
handler_name = f"handle_{self.client_commands[hdr.command_id].name}"
else:
handler_name = f"handle_{self.server_commands[hdr.command_id].name}"
except KeyError:
self.debug(
"Received unknown manufacturer command %s: %s", hdr.command_id, args
)
if not hdr.frame_control.disable_default_response:
self.send_default_rsp(
hdr, status=foundation.Status.UNSUP_CLUSTER_COMMAND
)
return
try:
status = getattr(self, handler_name)(*args)
except AttributeError:
self.warning(
"No '%s' tuya handler found for %s",
handler_name,
args,
)
status = foundation.Status.UNSUP_CLUSTER_COMMAND
if not hdr.frame_control.disable_default_response:
self.send_default_rsp(hdr, status=status)
def handle_get_data(self, command: TuyaCommand) -> foundation.Status:
"""Handle get_data response (report)."""
try:
dp_handler = self.data_point_handlers[command.dp]
getattr(self, dp_handler)(command)
except (AttributeError, KeyError):
self.debug("No datapoint handler for %s", command)
            return foundation.Status.UNSUPPORTED_ATTRIBUTE
return foundation.Status.SUCCESS
handle_set_data_response = handle_get_data
handle_active_status_report = handle_get_data
def handle_set_time_request(self, payload: t.uint16_t) -> foundation.Status:
"""Handle Time set request."""
return foundation.Status.SUCCESS
def _dp_2_attr_update(self, command: TuyaCommand) -> None:
"""Handle data point to attribute report conversion."""
try:
dp_map = self.dp_to_attribute[command.dp]
except KeyError:
self.debug("No attribute mapping for %s data point", command.dp)
return
endpoint = self.endpoint
if dp_map.endpoint_id:
endpoint = self.endpoint.device.endpoints[dp_map.endpoint_id]
cluster = getattr(endpoint, dp_map.ep_attribute)
value = command.data.payload
if dp_map.converter:
value = dp_map.converter(value)
        cluster.update_attribute(dp_map.attribute_name, value)

/config/zha-device-handlers/zhaquirks/tuya/ts0601_cover.py
"""Tuya based cover and blinds."""
from zigpy.profiles import zha
from zigpy.zcl.clusters.general import (
    Basic,
    GreenPowerProxy,
    Groups,
    Identify,
    OnOff,
    Ota,
    Scenes,
    Time,
)
from zhaquirks.const import (
DEVICE_TYPE,
ENDPOINTS,
INPUT_CLUSTERS,
MODELS_INFO,
OUTPUT_CLUSTERS,
PROFILE_ID,
)
from . import (
TuyaManufacturerWindowCover,
TuyaManufCluster,
TuyaWindowCover,
TuyaWindowCoverControl,
)
class TuyaZemismartSmartCover0601(TuyaWindowCover):
"""Tuya Zemismart blind cover motor."""
signature = {
# "node_descriptor": "<NodeDescriptor byte1=1 byte2=64 mac_capability_flags=142 manufacturer_code=4098
# maximum_buffer_size=82 maximum_incoming_transfer_size=82 server_mask=11264
# maximum_outgoing_transfer_size=82 descriptor_capability_field=0>",
# input_clusters=[0x0000, 0x0004, 0x0005, 0x000a, 0xef00]
# output_clusters=[0x0019]
# <SimpleDescriptor endpoint=1 profile=260 device_type=51 input_clusters=[0, 4, 5, 61184] output_clusters=[25]>
MODELS_INFO: [
("_TZE200_fzo2pocs", "TS0601"),
("_TZE200_zpzndjez", "TS0601"),
("_TZE200_cowvfni3", "TS0601"),
],
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
INPUT_CLUSTERS: [
Basic.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
Time.cluster_id,
TuyaManufCluster.cluster_id,
],
OUTPUT_CLUSTERS: [Ota.cluster_id],
},
},
}
replacement = {
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
INPUT_CLUSTERS: [
Basic.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
Time.cluster_id,
TuyaManufacturerWindowCover,
TuyaWindowCoverControl,
],
OUTPUT_CLUSTERS: [Ota.cluster_id],
},
},
}
# From: https://github.com/zigpy/zha-device-handlers/issues/1294#issuecomment-1014843749
class TuyaZemismartSmartCover0601_4(TuyaWindowCover):
"""Tuya blind controller device."""
signature = {
# "node_descriptor": "NodeDescriptor(byte1=1, byte2=64, mac_capability_flags=142, manufacturer_code=4417,
# maximum_buffer_size=66, maximum_incoming_transfer_size=66, server_mask=10752,
# maximum_outgoing_transfer_size=66, descriptor_capability_field=0>,
# "endpoints": { "1": { "profile_id": 260, "device_type": "0x0051", "in_clusters": [ "0x0000", "0x0004",
# "0x0005","0xef00"], "out_clusters": ["0x000a","0x0019"] }, "242": { "profile_id": 41440, "device_type":
# "0x0061", in_clusters": [], "out_clusters": [ "0x0021" ] } }, "manufacturer": "_TZE200_rmymn92d",
# "model": "TS0601", "class": "zigpy.device.Device" }
MODELS_INFO: [
("_TZE200_rmymn92d", "TS0601"),
],
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
INPUT_CLUSTERS: [
Basic.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
TuyaManufCluster.cluster_id,
],
OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
},
242: {
PROFILE_ID: 41440,
DEVICE_TYPE: 0x0061,
INPUT_CLUSTERS: [],
OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
},
},
}
replacement = {
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
INPUT_CLUSTERS: [
Basic.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
TuyaManufacturerWindowCover,
TuyaWindowCoverControl,
],
OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
},
242: {
PROFILE_ID: 41440,
DEVICE_TYPE: 0x0061,
INPUT_CLUSTERS: [],
OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
},
}
}
class TuyaZemismartSmartCover0601_3(TuyaWindowCover):
"""Tuya Zemismart blind cover motor."""
signature = {
# "node_descriptor": "<NodeDescriptor byte1=1 byte2=64 mac_capability_flags=142 manufacturer_code=4098
# maximum_buffer_size=82 maximum_incoming_transfer_size=82 server_mask=11264
# maximum_outgoing_transfer_size=82 descriptor_capability_field=0>",
# input_clusters=[0x0000, 0x0004, 0x0005, 0x000a, 0xef00]
# output_clusters=[0x0019]
# <SimpleDescriptor endpoint=1 profile=260 device_type=51 input_clusters=[0, 4, 5, 61184] output_clusters=[25]>
MODELS_INFO: [
("_TZE200_fzo2pocs", "TS0601"),
("_TZE200_zpzndjez", "TS0601"),
("_TZE200_iossyxra", "TS0601"),
],
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
INPUT_CLUSTERS: [
Basic.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
TuyaManufCluster.cluster_id,
],
OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
},
},
}
replacement = {
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
INPUT_CLUSTERS: [
Basic.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
TuyaManufacturerWindowCover,
TuyaWindowCoverControl,
],
OUTPUT_CLUSTERS: [Ota.cluster_id],
},
},
}
class TuyaZemismartSmartCover0601_2(TuyaWindowCover):
"""Tuya Zemismart curtain cover motor."""
signature = {
# "node_descriptor": "<NodeDescriptor byte1=1 byte2=64 mac_capability_flags=142 manufacturer_code=4098
# maximum_buffer_size=82 maximum_incoming_transfer_size=82 server_mask=11264
# maximum_outgoing_transfer_size=82 descriptor_capability_field=0>",
# input_clusters=[0x0000, 0x000a, 0x0004, 0x0005, 0xef00]
# output_clusters=[0x0019]
# <SimpleDescriptor endpoint=1 profile=260 device_type=81 input_clusters=[0, 10, 4, 5, 61184] output_clusters=[25]>
MODELS_INFO: [
("_TZE200_3i3exuay", "TS0601"),
],
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
INPUT_CLUSTERS: [
Basic.cluster_id,
Time.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
TuyaManufCluster.cluster_id,
],
OUTPUT_CLUSTERS: [Ota.cluster_id],
},
},
}
replacement = {
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
INPUT_CLUSTERS: [
Basic.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
Time.cluster_id,
TuyaManufacturerWindowCover,
TuyaWindowCoverControl,
],
OUTPUT_CLUSTERS: [Ota.cluster_id],
},
},
}
class TuyaMoesCover0601(TuyaWindowCover):
"""Tuya blind controller device."""
signature = {
# "node_descriptor": "NodeDescriptor(byte1=2, byte2=64, mac_capability_flags=128, manufacturer_code=4098,
# maximum_buffer_size=82, maximum_incoming_transfer_size=82, server_mask=11264,
# maximum_outgoing_transfer_size=82, descriptor_capability_field=0)",
# "endpoints": {
# "1": { "profile_id": 260, "device_type": "0x0051", "in_clusters": [ "0x0000", "0x0004","0x0005","0xef00"], "out_clusters": ["0x000a","0x0019"] }
# },
# "manufacturer": "_TZE200_zah67ekd",
# "model": "TS0601",
# "class": "zigpy.device.Device"
# }
MODELS_INFO: [
("_TZE200_zah67ekd", "TS0601"),
("_TZE200_xuzcvlku", "TS0601"),
("_TZE200_rddyvrci", "TS0601"),
("_TZE200_nueqqe6k", "TS0601"),
("_TZE200_gubdgai2", "TS0601"),
("_TZE200_yenbr4om", "TS0601"),
("_TZE200_5sbebbzs", "TS0601"),
("_TZE200_xaabybja", "TS0601"),
("_TZE200_hsgrhjpf", "TS0601"),
("_TZE200_68nvbio9", "TS0601"),
("_TZE200_zuz7f94z", "TS0601"),
("_TZE200_ergbiejo", "TS0601"),
],
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
INPUT_CLUSTERS: [
Basic.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
TuyaManufCluster.cluster_id,
],
OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
}
},
}
replacement = {
ENDPOINTS: {
1: {
DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
INPUT_CLUSTERS: [
Basic.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
TuyaManufacturerWindowCover,
TuyaWindowCoverControl,
],
OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
}
}
}
class TuyaCloneCover0601(TuyaWindowCover):
"""Tuya blind controller device."""
signature = {
# <SimpleDescriptor endpoint=1 profile=260 device_type=256 device_version=0
# input_clusters=[0, 3, 4, 5, 6]
# output_clusters=[25]>
# },
# "manufacturer": "_TYST11_wmcdj3aq",
# "model": "mcdj3aq",
# "class": "zigpy.device.Device"
# }
MODELS_INFO: [("_TYST11_wmcdj3aq", "mcdj3aq")], # Not tested
ENDPOINTS: {
1: {
PROFILE_ID: zha.PROFILE_ID,
DEVICE_TYPE: zha.DeviceType.ON_OFF_LIGHT,
INPUT_CLUSTERS: [
Basic.cluster_id,
Identify.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
OnOff.cluster_id,
],
OUTPUT_CLUSTERS: [Ota.cluster_id],
}
},
}
replacement = {
ENDPOINTS: {
1: {
DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
INPUT_CLUSTERS: [
Basic.cluster_id,
Groups.cluster_id,
Scenes.cluster_id,
TuyaManufacturerWindowCover,
TuyaWindowCoverControl,
],
OUTPUT_CLUSTERS: [Ota.cluster_id],
}
}
    }
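The position handling in TuyaWindowCoverControl.command can be checked against the logs below: for a motor that is not in TUYA_COVER_INVERTED_BY_DEFAULT, the ZCL lift percentage p is flipped to 100 - p before being packed into the DP data field (e.g. Arguments (69,) becomes Data [4, 0, 0, 0, 31]). A minimal sketch with a hypothetical helper name:

```python
def zcl_to_tuya_position(position: int, inverted: bool = False) -> list[int]:
    """Build the Tuya percent-control data field, mirroring command().

    `position` is the ZCL lift percentage; for a non-inverted motor the
    value is flipped before being sent. The leading 4 is the payload
    length, followed by the value as four bytes (big-endian).
    """
    value = position if inverted else 100 - position
    return [4, 0, 0, 0, value]
```

This matches the Data fields in the debug output, so the command path itself looks correct; the motor simply never reacts to the frames.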
The UI shows the status and the buttons, but neither the slider nor the buttons work. I don't see Tuya-related import errors, and I can see "Sending Tuya Command" entries in the log. The curtain track just doesn't move at all.

zhaquirks.tuya logging:

2023-07-27 02:10:46.649 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (69,)
2023-07-27 02:10:46.650 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 31]]
2023-07-27 02:10:47.599 INFO (MainThread) [pyhap.hap_protocol] ('192.168.86.111', 59989): Connection made to 192 168 86 116
2023-07-27 02:10:47.607 INFO (MainThread) [pyhap.hap_protocol] ('192.168.86.111', 59990): Connection made to HASS Bridge GZ
2023-07-27 02:10:47.665 INFO (MainThread) [pyhap.hap_protocol] ('192.168.86.111', 59991): Connection made to HASS Bridge 1W
2023-07-27 02:10:48.010 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (88,)
2023-07-27 02:10:48.011 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 12]]
2023-07-27 02:10:49.418 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (0,)
2023-07-27 02:10:49.418 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 100]]
2023-07-27 02:10:50.934 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (99,)
2023-07-27 02:10:50.935 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 1]]
2023-07-27 02:10:51.505 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (85,)
2023-07-27 02:10:51.506 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 15]]
2023-07-27 02:10:52.128 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (54,)
2023-07-27 02:10:52.129 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 46]]
2023-07-27 02:10:52.663 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (22,)
2023-07-27 02:10:52.664 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 78]]
2023-07-27 02:10:54.164 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0000, Arguments are ()
2023-07-27 02:10:54.165 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 0]]
2023-07-27 02:10:54.432 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0000, Arguments are ()
2023-07-27 02:10:54.433 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 0]]
2023-07-27 02:10:54.628 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0000, Arguments are ()
2023-07-27 02:10:54.628 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 0]]
2023-07-27 02:10:55.277 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0001, Arguments are ()
2023-07-27 02:10:55.277 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 2]]
2023-07-27 02:10:55.432 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0001, Arguments are ()
2023-07-27 02:10:55.432 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 2]]
2023-07-27 02:10:55.577 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0001, Arguments are ()
2023-07-27 02:10:55.578 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 2]]
2023-07-27 02:10:56.178 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0002, Arguments are ()
2023-07-27 02:10:56.179 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 1]]
2023-07-27 02:11:07.887 INFO (MainThread) [homeassistant.components.mqtt.discovery] Component has already been discovered: sensor 95d685af-b8d0-43ef-af19-3fd3346bb293 a9054a47-2b53-47a5-92c3-101318046926_info, sending update
2023-07-27 03:37:23.399 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0002, Arguments are ()
2023-07-27 03:37:23.399 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 1]]
2023-07-27 03:37:23.400 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Sending request header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=True, direction=<Direction.Server_to_Client: 0>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), manufacturer=4417, tsn=197, command_id=0, *direction=<Direction.Server_to_Client: 0>)
2023-07-27 03:37:23.401 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Sending request: set_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 1]))
2023-07-27 03:37:23.403 DEBUG (MainThread) [bellows.zigbee.application] Sending packet ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), dst_ep=1, source_route=None, extended_timeout=False, tsn=197, profile_id=260, cluster_id=61184, data=Serialized[b'\x05A\x11\xc5\x00\x00\x00\x01\x04\x00\x01\x01'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=None, rssi=None)
2023-07-27 03:37:23.403 DEBUG (MainThread) [bellows.ezsp.protocol] Send command sendUnicast: (<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 0x8b32, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=197), 198, b'\x05A\x11\xc5\x00\x00\x00\x01\x04\x00\x01\x01')
2023-07-27 03:37:23.405 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'71b5b1a9112a15b65894a524ab5593499c3def65df459874f7cf0b8bfc8b3aa6ebccdeb6b87e'
2023-07-27 03:37:23.406 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8070787e'
2023-07-27 03:37:23.406 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'10ba21a9602a1580d2904b25455493499d4e276e2bc262caec036389fc7f3ba7eacc73427e'
2023-07-27 03:37:23.408 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=115), 200, -50, 0x8b32, 255, 255, b'\th\x02\x00\xf5\x05\x01\x00\x01\x00']
2023-07-27 03:37:23.409 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=115), 200, -50, 0x8b32, 255, 255, b'\th\x02\x00\xf5\x05\x01\x00\x01\x00']
2023-07-27 03:37:23.409 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=115, profile_id=260, cluster_id=61184, data=Serialized[b'\th\x02\x00\xf5\x05\x01\x00\x01\x00'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=200, rssi=-50)
2023-07-27 03:37:23.410 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\th\x02\x00\xf5\x05\x01\x00\x01\x00'
2023-07-27 03:37:23.411 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=104, command_id=2, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:23.416 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'02baa1a9602a15a0c88b7e'
2023-07-27 03:37:23.416 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8160597e'
2023-07-27 03:37:23.413 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:set_data_response(param=Command(status=0, tsn=245, command_id=261, function=0, data=[1, 0]))
2023-07-27 03:37:23.427 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received command 0x02 (TSN 104): set_data_response(param=Command(status=0, tsn=245, command_id=261, function=0, data=[1, 0]))
2023-07-27 03:37:23.428 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Received Attribute Report. Command is 0x0002, Tuya Paylod values[Status : 0, TSN: 245, Command: 0x0105, Function: 0x00, Data: [1, 0]]
2023-07-27 03:37:23.429 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received sendUnicast: [<EmberStatus.SUCCESS: 0>, 18]
2023-07-27 03:37:23.430 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'12bab1a96b2a1580d2904b25455493499d4e27b92bce6726f57e'
2023-07-27 03:37:23.430 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'82503a7e'
2023-07-27 03:37:23.437 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received messageSentHandler: [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 35634, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=18), 198, <EmberStatus.SUCCESS: 0>, b'']
2023-07-27 03:37:23.437 DEBUG (MainThread) [bellows.zigbee.application] Received messageSentHandler frame with [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 35634, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=18), 198, <EmberStatus.SUCCESS: 0>, b'']
2023-07-27 03:37:23.467 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'22bab1a9112a15b65894a524ab5593499c3aef65df459874f8dea682fcfd011c7e'
2023-07-27 03:37:23.467 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'83401b7e'
2023-07-27 03:37:23.468 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=116), 200, -50, 0x8b32, 255, 255, b'\x18\xc5\x0b\x00\x83']
2023-07-27 03:37:23.468 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=116), 200, -50, 0x8b32, 255, 255, b'\x18\xc5\x0b\x00\x83']
2023-07-27 03:37:23.468 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=116, profile_id=260, cluster_id=61184, data=Serialized[b'\x18\xc5\x0b\x00\x83'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=200, rssi=-50)
2023-07-27 03:37:23.469 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\x18\xc5\x0b\x00\x83'
2023-07-27 03:37:23.470 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.GLOBAL_COMMAND: 0>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=1, reserved=0, *is_cluster=False, *is_general=True), tsn=197, command_id=11, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:23.471 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2023-07-27 03:37:23.540 DEBUG (MainThread) [bellows.ezsp.protocol] Send command readCounters: ()
2023-07-27 03:37:23.541 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'23bb21a9a52a326d7e'
2023-07-27 03:37:23.559 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'33bba1a9a52aa4b309945c24d655f249834e23abe9ced78ba5c67289f27e31a7e7cddf6f8fffc7dbd5d2698c4623a9ec763ba5ea758241984c2607b1e070381c0e07bbe5ca658e459a4d9e4f9ff7c3d9d46a35a2519048244f987e'
2023-07-27 03:37:23.560 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8430fc7e'
2023-07-27 03:37:23.561 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received readCounters: [[433, 80, 278, 124, 96, 31, 4, 4, 176, 88, 17, 14, 14, 12, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 20, 0, 0, 0, 0, 0, 4, 0, 0, 0, 0, 0, 0, 0, 0]]
2023-07-27 03:37:23.561 DEBUG (MainThread) [bellows.ezsp.protocol] Send command getValue: (<EzspValueId.VALUE_FREE_BUFFERS: 3>,)
2023-07-27 03:37:23.562 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'34b821a9fe2a1647067e'
2023-07-27 03:37:23.568 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'44b8a1a9fe2a15b3aeb2957e'
2023-07-27 03:37:23.568 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8520dd7e'
2023-07-27 03:37:23.569 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received getValue: [<EzspStatus.SUCCESS: 0>, b'\xf7']
2023-07-27 03:37:23.570 DEBUG (MainThread) [bellows.zigbee.application] Free buffers status EzspStatus.SUCCESS, value: 247
2023-07-27 03:37:23.570 DEBUG (MainThread) [bellows.zigbee.application] ezsp_counters: [MAC_RX_BROADCAST = 433, MAC_TX_BROADCAST = 80, MAC_RX_UNICAST = 278, MAC_TX_UNICAST_SUCCESS = 124, MAC_TX_UNICAST_RETRY = 96, MAC_TX_UNICAST_FAILED = 31, APS_DATA_RX_BROADCAST = 4, APS_DATA_TX_BROADCAST = 4, APS_DATA_RX_UNICAST = 176, APS_DATA_TX_UNICAST_SUCCESS = 88, APS_DATA_TX_UNICAST_RETRY = 17, APS_DATA_TX_UNICAST_FAILED = 14, ROUTE_DISCOVERY_INITIATED = 14, NEIGHBOR_ADDED = 12, NEIGHBOR_REMOVED = 1, NEIGHBOR_STALE = 0, JOIN_INDICATION = 0, CHILD_REMOVED = 0, ASH_OVERFLOW_ERROR = 0, ASH_FRAMING_ERROR = 0, ASH_OVERRUN_ERROR = 0, NWK_FRAME_COUNTER_FAILURE = 0, APS_FRAME_COUNTER_FAILURE = 0, UTILITY = 0, APS_LINK_KEY_NOT_AUTHORIZED = 0, NWK_DECRYPTION_FAILURE = 0, APS_DECRYPTION_FAILURE = 20, ALLOCATE_PACKET_BUFFER_FAILURE = 0, RELAYED_UNICAST = 0, PHY_TO_MAC_QUEUE_LIMIT_REACHED = 0, PACKET_VALIDATE_LIBRARY_DROPPED_COUNT = 0, TYPE_NWK_RETRY_OVERFLOW = 0, PHY_CCA_FAIL_COUNT = 4, BROADCAST_TABLE_FULL = 0, PTA_LO_PRI_REQUESTED = 0, PTA_HI_PRI_REQUESTED = 0, PTA_LO_PRI_DENIED = 0, PTA_HI_PRI_DENIED = 0, PTA_LO_PRI_TX_ABORTED = 0, PTA_HI_PRI_TX_ABORTED = 0, ADDRESS_CONFLICT_SENT = 0, EZSP_FREE_BUFFERS = 247]
2023-07-27 03:37:23.601 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'54b8b1a9112a15b65894a524ab5593499c3be366df459874f7cf0b8bfc8b3aa6ebccde3b3a7e'
2023-07-27 03:37:23.601 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8610be7e'
2023-07-27 03:37:23.602 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=117), 196, -51, 0x8b32, 255, 255, b'\th\x02\x00\xf5\x05\x01\x00\x01\x00']
2023-07-27 03:37:23.603 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=117), 196, -51, 0x8b32, 255, 255, b'\th\x02\x00\xf5\x05\x01\x00\x01\x00']
2023-07-27 03:37:23.603 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=117, profile_id=260, cluster_id=61184, data=Serialized[b'\th\x02\x00\xf5\x05\x01\x00\x01\x00'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=196, rssi=-51)
2023-07-27 03:37:23.604 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\th\x02\x00\xf5\x05\x01\x00\x01\x00'
2023-07-27 03:37:23.606 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=104, command_id=2, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:23.613 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:set_data_response(param=Command(status=0, tsn=245, command_id=261, function=0, data=[1, 0]))
2023-07-27 03:37:23.614 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received command 0x02 (TSN 104): set_data_response(param=Command(status=0, tsn=245, command_id=261, function=0, data=[1, 0]))
2023-07-27 03:37:23.615 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Received Attribute Report. Command is 0x0002, Tuya Paylod values[Status : 0, TSN: 245, Command: 0x0105, Function: 0x00, Data: [1, 0]]
2023-07-27 03:37:23.855 DEBUG (MainThread) [homeassistant.components.zha.core.device] [0xE266](DG6HD): Device seen - marking the device available and resetting counter
2023-07-27 03:37:23.856 DEBUG (MainThread) [homeassistant.components.zha.core.device] [0xE266](DG6HD): Update device availability - device available: True - new availability: True - changed: False
2023-07-27 03:37:24.011 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'64b8b1a9112a15b65894a524ab5593499c38ef65df459874f7cf0a8bfc883ea3ebccdf23867e'
2023-07-27 03:37:24.011 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'87009f7e'
2023-07-27 03:37:24.030 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=118), 200, -50, 0x8b32, 255, 255, b'\ti\x02\x00\xf6\x01\x04\x00\x01\x01']
2023-07-27 03:37:24.031 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=118), 200, -50, 0x8b32, 255, 255, b'\ti\x02\x00\xf6\x01\x04\x00\x01\x01']
2023-07-27 03:37:24.031 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=118, profile_id=260, cluster_id=61184, data=Serialized[b'\ti\x02\x00\xf6\x01\x04\x00\x01\x01'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=200, rssi=-50)
2023-07-27 03:37:24.032 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\ti\x02\x00\xf6\x01\x04\x00\x01\x01'
2023-07-27 03:37:24.032 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=105, command_id=2, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:24.033 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:set_data_response(param=Command(status=0, tsn=246, command_id=1025, function=0, data=[1, 1]))
2023-07-27 03:37:24.034 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received command 0x02 (TSN 105): set_data_response(param=Command(status=0, tsn=246, command_id=1025, function=0, data=[1, 1]))
2023-07-27 03:37:24.034 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Received Attribute Report. Command is 0x0002, Tuya Paylod values[Status : 0, TSN: 246, Command: 0x0401, Function: 0x00, Data: [1, 1]]
2023-07-27 03:37:24.250 DEBUG (MainThread) [homeassistant.core] Bus:Handling <Event call_service[L]: domain=cover, service=close_cover, service_data=entity_id=cover.tze200_rmymn92d_ts0601_cover>
2023-07-27 03:37:24.253 DEBUG (MainThread) [zigpy.util] Tries remaining: 3
2023-07-27 03:37:24.253 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0001, Arguments are ()
2023-07-27 03:37:24.253 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 2]]
2023-07-27 03:37:24.254 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Sending request header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=True, direction=<Direction.Server_to_Client: 0>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), manufacturer=4417, tsn=199, command_id=0, *direction=<Direction.Server_to_Client: 0>)
2023-07-27 03:37:24.255 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Sending request: set_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2023-07-27 03:37:24.256 DEBUG (MainThread) [bellows.zigbee.application] Sending packet ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), dst_ep=1, source_route=None, extended_timeout=False, tsn=199, profile_id=260, cluster_id=61184, data=Serialized[b'\x05A\x11\xc7\x00\x00\x00\x01\x04\x00\x01\x02'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=None, rssi=None)
2023-07-27 03:37:24.256 DEBUG (MainThread) [bellows.ezsp.protocol] Send command sendUnicast: (<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 0x8b32, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=199), 200, b'\x05A\x11\xc7\x00\x00\x00\x01\x04\x00\x01\x02')
2023-07-27 03:37:24.263 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'47b921a9602a1580d2904b25455493499d4e276c25c262caec016389fc7f3ba7eacf44f27e'
2023-07-27 03:37:24.272 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'75b9a1a9602a15a1773c7e'
2023-07-27 03:37:24.272 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8070787e'
2023-07-27 03:37:24.274 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received sendUnicast: [<EmberStatus.SUCCESS: 0>, 19]
2023-07-27 03:37:24.286 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'05b9b1a96b2a1580d2904b25455493499d4e27b825ce67e64e7e'
2023-07-27 03:37:24.286 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8160597e'
2023-07-27 03:37:24.289 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received messageSentHandler: [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 35634, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=19), 200, <EmberStatus.SUCCESS: 0>, b'']
2023-07-27 03:37:24.289 DEBUG (MainThread) [bellows.zigbee.application] Received messageSentHandler frame with [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 35634, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=19), 200, <EmberStatus.SUCCESS: 0>, b'']
2023-07-27 03:37:24.323 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'15b9b1a9112a15b65894a524ab5593499c39ef65df459874f8dea482fcfdbfb67e'
2023-07-27 03:37:24.324 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'82503a7e'
2023-07-27 03:37:24.326 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=119), 200, -50, 0x8b32, 255, 255, b'\x18\xc7\x0b\x00\x83']
2023-07-27 03:37:24.327 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=119), 200, -50, 0x8b32, 255, 255, b'\x18\xc7\x0b\x00\x83']
2023-07-27 03:37:24.327 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=119, profile_id=260, cluster_id=61184, data=Serialized[b'\x18\xc7\x0b\x00\x83'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=200, rssi=-50)
2023-07-27 03:37:24.328 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\x18\xc7\x0b\x00\x83'
2023-07-27 03:37:24.329 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.GLOBAL_COMMAND: 0>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=1, reserved=0, *is_cluster=False, *is_general=True), tsn=199, command_id=11, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:24.330 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2023-07-27 03:37:24.419 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'25b9b1a9112a15b65894a524ab5593499c36e366df459874f0cf098bfc893ca5ebc9de6f8f9b9e437e'
2023-07-27 03:37:24.420 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'83401b7e'
2023-07-27 03:37:24.421 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=120), 196, -51, 0x8b32, 255, 255, b'\tj\x02\x00\xf7\x03\x02\x00\x04\x00\x00\x00d']
2023-07-27 03:37:24.421 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=120), 196, -51, 0x8b32, 255, 255, b'\tj\x02\x00\xf7\x03\x02\x00\x04\x00\x00\x00d']
2023-07-27 03:37:24.422 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=120, profile_id=260, cluster_id=61184, data=Serialized[b'\tj\x02\x00\xf7\x03\x02\x00\x04\x00\x00\x00d'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=196, rssi=-51)
2023-07-27 03:37:24.423 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\tj\x02\x00\xf7\x03\x02\x00\x04\x00\x00\x00d'
2023-07-27 03:37:24.423 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=106, command_id=2, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:24.425 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:set_data_response(param=Command(status=0, tsn=247, command_id=515, function=0, data=[4, 0, 0, 0, 100]))
2023-07-27 03:37:24.432 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received command 0x02 (TSN 106): set_data_response(param=Command(status=0, tsn=247, command_id=515, function=0, data=[4, 0, 0, 0, 100]))
2023-07-27 03:37:24.433 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Received Attribute Report. Command is 0x0002, Tuya Paylod values[Status : 0, TSN: 247, Command: 0x0203, Function: 0x00, Data: [4, 0, 0, 0, 100]]
2023-07-27 03:37:24.434 DEBUG (MainThread) [homeassistant.components.zha.core.cluster_handlers] [0x8B32:1:0x0102]: Attribute report 'Window Covering'[current_position_lift_percentage] = 0
2023-07-27 03:37:24.434 DEBUG (MainThread) [homeassistant.components.zha.cover] setting position: 0
2023-07-27 03:37:24.435 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Tuya Attribute Cache : [{8: 0}]
2023-07-27 03:37:24.828 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'35b9b1a9112a15b6589e4a24ab5593499c37ef65df459874f8c60889fb7e23d37e'

Does anyone have any idea what's going wrong here? Any help is appreciated.
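The set_data_response frames in the log above can be checked by hand. Here is a minimal decoder sketch for the Tuya DP payload that follows the 3-byte ZCL header, assuming the standard Tuya serial-protocol layout (status, Tuya TSN, DP id, DP type, big-endian length, value) — the function name is illustrative, not from zhaquirks:

```python
import struct

def decode_tuya_payload(zcl_frame: bytes):
    """Decode a Tuya EF00 cluster frame as seen in the ZHA debug log.

    Skips the 3-byte ZCL header (frame control, TSN, command id) and
    unpacks the Tuya datapoint payload that follows it.
    """
    status, tsn, dp_id, dp_type = struct.unpack_from(">BBBB", zcl_frame, 3)
    (length,) = struct.unpack_from(">H", zcl_frame, 7)  # big-endian data length
    value = zcl_frame[9 : 9 + length]
    return status, tsn, dp_id, dp_type, value

# Frame from the log: "Received ZCL frame: b'\th\x02\x00\xf5\x05\x01\x00\x01\x00'"
frame = b"\x09\x68\x02\x00\xf5\x05\x01\x00\x01\x00"
print(decode_tuya_payload(frame))  # → (0, 245, 5, 1, b'\x00')
```

This matches the log's "TSN: 245, Command: 0x0105, Data: [1, 0]" report: zhaquirks packs the DP type (0x01, boolean) and DP id (0x05) into a single 16-bit command_id, and keeps the length byte as the first element of data.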
There hasn't been any activity on this issue recently. Due to the high number of incoming GitHub notifications, we have to clean some of the old issues, as many of them have already been resolved with the latest updates. Please make sure to update to the latest version and check if that solves the issue. Let us know if that works for you by adding a comment 👍 This issue has now been marked as stale and will be closed if no further activity occurs. Thank you for your contributions. |
Is your feature request related to a problem? Please describe.
ts0601_cover.py needs a new quirk added to support this device properly.
Currently, it pairs with no entities attached.
Describe the solution you'd like
Be able to pair the device and control my curtains through HA.
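For anyone comparing radio traffic while testing a quirk: the outgoing set_data frames in the debug logs earlier in the thread can be reproduced byte-for-byte. A minimal sketch, assuming the layout visible in those logs (manufacturer-specific ZCL header, then status, Tuya TSN, DP id, DP type, big-endian length, value); the helper name is made up here, and the DP-1 control values are inferred from the log (the close_cover service sent value 2), not from any documentation:

```python
import struct

TUYA_MANUFACTURER = 0x1141  # 4417, as in the log's ZCL headers

def build_set_data_frame(zcl_tsn: int, dp_id: int, dp_type: int, value: bytes) -> bytes:
    """Build a manufacturer-specific Tuya set_data (command 0x00) ZCL frame."""
    # 0x05 = cluster command + manufacturer specific; manufacturer code little-endian
    header = struct.pack("<BHBB", 0x05, TUYA_MANUFACTURER, zcl_tsn, 0x00)
    # Tuya payload: status, Tuya TSN, DP id, DP type, big-endian length, value
    payload = struct.pack(">BBBBH", 0x00, 0x00, dp_id, dp_type, len(value)) + value
    return header + payload

# DP 1 (curtain control, enum type 0x04), value 1 — matches the frame logged at
# 03:37:23.401 for window-covering command 0x0002 (stop)
frame = build_set_data_frame(zcl_tsn=0xC5, dp_id=0x01, dp_type=0x04, value=b"\x01")
assert frame == b"\x05\x41\x11\xc5\x00\x00\x00\x01\x04\x00\x01\x02"[:4] + frame[4:]  # header check
assert frame == b"\x05\x41\x11\xc5\x00\x00\x00\x01\x04\x00\x01\x01"
```

Since the device answers these frames with Default_Response status 0x83 (UNSUP_MANUF_CLUSTER_COMMAND), the framing itself seems to match what the stock quirk sends; the rejection presumably happens at the device's DP handling rather than in the ZCL encoding.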
Device signature - this can be acquired by removing the device from ZHA and pairing it again from the add devices screen. Be sure to add the entire content of the log panel after pairing the device to a code block below this line.