
[Device Support Request] Zemismart Curtain Motor with Rail (Mains powered) (_TZE200_rmymn92d) #1294

Closed
operinko opened this issue Jan 17, 2022 · 96 comments
Labels:
custom quirk available: A custom quirk is available to solve the issue, but it's not merged in the repo yet
stale: Issue is inactive and might get closed soon
Tuya: Request/PR regarding a Tuya device

Comments

@operinko

Is your feature request related to a problem? Please describe.
ts0601_cover.py needs a new quirk added to support this device properly.
Currently, it pairs with no entities attached.

Describe the solution you'd like
Be able to pair the device and control my curtains through HA.

Device signature - this can be acquired by removing the device from ZHA and pairing it again from the add devices screen. Be sure to add the entire content of the log panel after pairing the device to a code block below this line.

{
  "node_descriptor": "NodeDescriptor(logical_type=<LogicalType.Router: 1>, complex_descriptor_available=0, user_descriptor_available=0, reserved=0, aps_flags=0, frequency_band=<FrequencyBand.Freq2400MHz: 8>, mac_capability_flags=<MACCapabilityFlags.AllocateAddress|RxOnWhenIdle|MainsPowered|FullFunctionDevice: 142>, manufacturer_code=4417, maximum_buffer_size=66, maximum_incoming_transfer_size=66, server_mask=10752, maximum_outgoing_transfer_size=66, descriptor_capability_field=<DescriptorCapability.NONE: 0>, *allocate_address=True, *is_alternate_pan_coordinator=False, *is_coordinator=False, *is_end_device=False, *is_full_function_device=True, *is_mains_powered=True, *is_receiver_on_when_idle=True, *is_router=True, *is_security_capable=False)",
  "endpoints": {
    "1": {
      "profile_id": 260,
      "device_type": "0x0051",
      "in_clusters": [
        "0x0000",
        "0x0004",
        "0x0005",
        "0xef00"
      ],
      "out_clusters": [
        "0x000a",
        "0x0019"
      ]
    },
    "242": {
      "profile_id": 41440,
      "device_type": "0x0061",
      "in_clusters": [],
      "out_clusters": [
        "0x0021"
      ]
    }
  },
  "manufacturer": "_TZE200_rmymn92d",
  "model": "TS0601",
  "class": "zigpy.device.Device"
}
@operinko
Author

I got it to add the Cover entity with this:

class TuyaZemismartCover0601_custom(TuyaWindowCover):
    """Tuya blind controller device."""

    signature = {
        # "node_descriptor": "NodeDescriptor(byte1=1, byte2=64, mac_capability_flags=142, manufacturer_code=4417,
        #                     maximum_buffer_size=66, maximum_incoming_transfer_size=66, server_mask=10752,
        #                     maximum_outgoing_transfer_size=66, descriptor_capability_field=0>,
        # "endpoints": {
        # "1": { "profile_id": 260, "device_type": "0x0051", "in_clusters": [ "0x0000", "0x0004","0x0005","0xef00"], "out_clusters": ["0x000a","0x0019"] },
        # "242": { "profile_id": 41440, "device_type": "0x0061", "in_clusters": [], "out_clusters": [ "0x0021" ] }
        # },
        # "manufacturer": "_TZE200_rmymn92d",
        # "model": "TS0601",
        # "class": "zigpy.device.Device"
        # }
        MODELS_INFO: [
            ("_TZE200_rmymn92d", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: 260,
                DEVICE_TYPE: 0x0051,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            },
            242: {
                PROFILE_ID: 41440,
                DEVICE_TYPE: 0x0061,
                INPUT_CLUSTERS: [],
                OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
            },
        },
    }

    replacement = {
        ENDPOINTS: {
            1: {
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            },
            242: {
                PROFILE_ID: 41440,
                DEVICE_TYPE: 0x0061,
                INPUT_CLUSTERS: [],
                OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
            },
        }
    }

Just one problem: the buttons all throw:

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/websocket_api/commands.py", line 185, in handle_call_service
    await hass.services.async_call(
  File "/usr/src/homeassistant/homeassistant/core.py", line 1495, in async_call
    task.result()
  File "/usr/src/homeassistant/homeassistant/core.py", line 1530, in _execute_service
    await handler.job.target(service_call)
  File "/usr/src/homeassistant/homeassistant/helpers/entity_component.py", line 209, in handle_service
    await self.hass.helpers.service.entity_service_call(
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 663, in entity_service_call
    future.result()  # pop exception if have
  File "/usr/src/homeassistant/homeassistant/helpers/entity.py", line 896, in async_request_call
    await coro
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 700, in _handle_entity_call
    await result
  File "/usr/src/homeassistant/homeassistant/components/zha/cover.py", line 132, in async_close_cover
    res = await self._cover_channel.down_close()
  File "/usr/src/homeassistant/homeassistant/components/zha/core/channels/base.py", line 59, in wrapper
    result = await command(*args, **kwds)
  File "/usr/local/lib/python3.9/site-packages/zhaquirks/tuya/__init__.py", line 1039, in command
    TUYA_COVER_COMMAND[manufacturer][command_id],
KeyError: '_TZE200_rmymn92d'
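The traceback above fails on the first dictionary lookup: `TUYA_COVER_COMMAND` in `zhaquirks/tuya/__init__.py` is keyed by manufacturer string, so a manufacturer with a quirk but no command-table entry raises `KeyError` before any command is sent. A minimal sketch of the failure mode and fix (the command codes for `_TZE200_rmymn92d` below are an assumption copied from the other entries, not confirmed values):

```python
# Mirrors the shape of TUYA_COVER_COMMAND in zhaquirks/tuya/__init__.py.
TUYA_COVER_COMMAND = {
    "_TZE200_zah67ekd": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
}

def translate(manufacturer: str, command_id: int) -> int:
    # The quirk indexes by manufacturer first, so an unlisted manufacturer
    # raises KeyError before the command id is even considered.
    return TUYA_COVER_COMMAND[manufacturer][command_id]

try:
    translate("_TZE200_rmymn92d", 0x0001)
except KeyError as err:
    print("KeyError:", err)

# Registering the manufacturer (codes assumed, not verified) removes the error:
TUYA_COVER_COMMAND["_TZE200_rmymn92d"] = {
    0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001,
}
print(hex(translate("_TZE200_rmymn92d", 0x0001)))  # 0x2
```

This is why adding the quirk class alone produces a cover entity whose buttons immediately fail: the signature match succeeds, but the command translation table has no row for this manufacturer.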

@Kiread-work

@operinko check this #744 (comment)
The buttons work now without errors, but nothing happens.

@raphaeujp

Any updates on supporting this device?

@TheJulianJES
Collaborator

TheJulianJES commented Feb 27, 2022

@operinko When adding a custom quirk, you also want to update the information here:

TUYA_COVER_COMMAND = {
"_TZE200_zah67ekd": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_fzo2pocs": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_xuzcvlku": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_rddyvrci": {0x0000: 0x0002, 0x0001: 0x0001, 0x0002: 0x0000},
"_TZE200_3i3exuay": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_nueqqe6k": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_gubdgai2": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_zpzndjez": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_cowvfni3": {0x0000: 0x0002, 0x0001: 0x0000, 0x0002: 0x0001},
"_TYST11_wmcdj3aq": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_yenbr4om": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_5sbebbzs": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_xaabybja": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_hsgrhjpf": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_iossyxra": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
"_TZE200_68nvbio9": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
}

(I don't know which codes need to be added here -- or if this is even enough)

(For some other devices, TUYA_COVER_INVERTED_BY_DEFAULT also needs to be updated, but apparently not for `_TZE200_rmymn92d`.)

If I remember correctly, the custom quirk directory should then be structured like this:
/config/custom_quirks/tuya:
-- __init__.py (modified)
-- ts0601_cover.py (modified)

Note:
Also edit the ts0601_cover.py file and change the following line (13):
from zhaquirks.tuya import (
to
from . import (

Edit: Discussion continued in #1245 (comment)

@Kiread-work

With this, you will see the device in HA and can even check its status (open/close), but the buttons will not work.
/config/custom_zha_quirks/TS0601_Covertest.py

"""Tuya based cover and blinds."""
from zigpy.profiles import zha
from zigpy.zcl.clusters.general import Basic, Groups, Identify, OnOff, Ota, Scenes, Time, GreenPowerProxy
# from zigpy.zcl.clusters.closures import WindowCovering
# from zigpy.quirks import CustomCluster, CustomDevice
# import zigpy.types as t

from zhaquirks.const import (
    DEVICE_TYPE,
    ENDPOINTS,
    INPUT_CLUSTERS,
    MODELS_INFO,
    OUTPUT_CLUSTERS,
    PROFILE_ID,
)
from zhaquirks.tuya import (
    TuyaManufacturerWindowCover,
    TuyaManufCluster,
    TuyaWindowCover,
    TuyaWindowCoverControl,
)

class TuyaCover0601_TO_GPP(TuyaWindowCover):
    """Tuya blind controller device with time on out and GPP."""

    signature = {
        # "node_descriptor": "NodeDescriptor(byte1=2, byte2=64, mac_capability_flags=128, manufacturer_code=4098,
        #                    maximum_buffer_size=82, maximum_incoming_transfer_size=82, server_mask=11264,
        #                    maximum_outgoing_transfer_size=82, descriptor_capability_field=0)",
        # "endpoints": {
        # "1": { "profile_id": 260, "device_type": "0x0051", "in_clusters": [ "0x0000", "0x0004","0x0005","0xef00"],
        # "out_clusters": ["0x000a","0x0019"] } },
        MODELS_INFO: [
            ("_TZE200_r0jdjrvi", "TS0601"),
            ("_TZE200_rmymn92d", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: 0x0051, # zha.DeviceType.SMART_PLUG,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufCluster.cluster_id,
                    #WindowCovering.cluster_id
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            },
            242: {
                # <SimpleDescriptor endpoint=242 profile=41440 device_type=97
                # input_clusters=[]
                # output_clusters=[33]
                PROFILE_ID: 41440,
                DEVICE_TYPE: 0x0061, #97,
                INPUT_CLUSTERS: [],
                OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
            },
        },
    }

    replacement = {
        ENDPOINTS: {
            1: {
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                    #TuyaCoveringCluster,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            },
            242: {
                PROFILE_ID: 41440,
                DEVICE_TYPE: 97,
                INPUT_CLUSTERS: [],
                OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
            },
        },
    }

/config/custom_zha_quirks/__init__.py, lines 109-110

    "_TZE200_r0jdjrvi": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_rmymn92d": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},

@SergeantPup

@Kiread-work The new HA version borked my quirk. Actually, the custom quirk borked my whole ZHA network, so I've removed it, which means I can't even track the state of _TZE200_rmymn92d anymore. We have confirmation that the code works in MQTT, and I submitted a PR for the cover inversion while trying to figure out the difference in the code, but I think we're going to need a new custom quirk, is that correct? Something has changed such that HA really didn't like that custom quirk being there.

How are things looking for you? Still the same? This is now the last device in my home that HA can't control :(

@Kiread-work

Kiread-work commented Apr 9, 2022

@SergeantPup shame on you =) Just updated HA to core-2022.4.1 on HAOS 5.10.108 and my Xiaomi Humidifier stopped working =))
Maybe I can find some time to restore the humidifier integration and the window cover status for _TZE200_r0jdjrvi.

@Kiread-work

@SergeantPup Well, it works (status, not buttons). Nothing has changed.
A button press throws a different error now.

@SergeantPup

SergeantPup commented Apr 13, 2022

I put it back in and it borked my ZHA network again, so I took it out again. Tracking the curtain isn't THAT important without control. This must be something trivial. I'm half tempted to just buy another head unit, but there's a 50% chance it uses the same chip :(

@TheJulianJES
Collaborator

  • Can you send the latest version of the quirk that used to "work"?
  • Is there anything in the (debug) logs when the quirk screws up ZHA?

@SergeantPup

Hello Julian!

Here is the error I get when the quirk is present but makes the rest of my ZHA unresponsive:

Logger: homeassistant.config_entries
Source: custom_zha_quirks/tuya/__init__.py:263
First occurred: 8:04:45 PM (1 occurrences)
Last logged: 8:04:45 PM

Error setting up entry Zigbee Network for zha
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 335, in async_setup
    result = await component.async_setup_entry(hass, self)
  File "/usr/src/homeassistant/homeassistant/components/zha/__init__.py", line 99, in async_setup_entry
    setup_quirks(config)
  File "/usr/local/lib/python3.9/site-packages/zhaquirks/__init__.py", line 409, in setup
    importer.find_module(modname).load_module(modname)
  File "<frozen importlib._bootstrap_external>", line 529, in _check_name_wrapper
  File "<frozen importlib._bootstrap_external>", line 1029, in load_module
  File "<frozen importlib._bootstrap_external>", line 854, in load_module
  File "<frozen importlib._bootstrap>", line 274, in _load_module_shim
  File "<frozen importlib._bootstrap>", line 711, in _load
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/config/custom_zha_quirks/tuya/__init__.py", line 263, in <module>
    class TuyaManufCluster(CustomCluster):
  File "/usr/local/lib/python3.9/site-packages/zigpy/zcl/__init__.py", line 91, in __init_subclass__
    raise TypeError(
TypeError: `manufacturer_client_commands` is deprecated. Copy the parent class's `client_commands` dictionary and update it with your manufacturer-specific `client_commands`. Make sure to specify that it is manufacturer-specific through the  appropriate constructor or tuple!
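This TypeError comes from a newer zigpy rejecting the old `manufacturer_client_commands` class attribute at import time, which is why the whole quirk module (and with it ZHA setup) fails. What the message asks for is a copy-and-merge of the parent's command table. A plain-Python sketch of that merge pattern (the tuples below stand in for zigpy's real command-definition objects; the names and ids are illustrative, not zigpy's API):

```python
# Old style (now rejected): a separate manufacturer_client_commands dict on
# the CustomCluster subclass.
# New style: a single client_commands dict, built by copying the parent's
# table and merging in the manufacturer-specific entries, each flagged as
# manufacturer-specific.
parent_client_commands = {
    0x0B: ("default_response", "zcl", False),  # inherited general ZCL command
}
manufacturer_specific = {
    0x01: ("get_data", "tuya", True),          # True = manufacturer-specific
    0x02: ("set_data_response", "tuya", True),
}

client_commands = parent_client_commands.copy()
client_commands.update(manufacturer_specific)

assert set(client_commands) == {0x0B, 0x01, 0x02}
```

The updated `tuya.zip` quirks below apply this migration, which is why they load on the newer zigpy that ships with HA 2022.4.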

Here is the last known working quirk before the April update.
tuya.zip

@TheJulianJES
Collaborator

TheJulianJES commented Apr 14, 2022

You can try this tuya folder:
tuya.zip

@SergeantPup

Logger: homeassistant.config_entries
Source: custom_zha_quirks/tuya/ts0601_cover.py:101
First occurred: 8:20:25 PM (1 occurrences)
Last logged: 8:20:25 PM

Error setting up entry Zigbee Network for zha
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 335, in async_setup
    result = await component.async_setup_entry(hass, self)
  File "/usr/src/homeassistant/homeassistant/components/zha/__init__.py", line 99, in async_setup_entry
    setup_quirks(config)
  File "/usr/local/lib/python3.9/site-packages/zhaquirks/__init__.py", line 409, in setup
    importer.find_module(modname).load_module(modname)
  File "<frozen importlib._bootstrap_external>", line 529, in _check_name_wrapper
  File "<frozen importlib._bootstrap_external>", line 1029, in load_module
  File "<frozen importlib._bootstrap_external>", line 854, in load_module
  File "<frozen importlib._bootstrap>", line 274, in _load_module_shim
  File "<frozen importlib._bootstrap>", line 711, in _load
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/config/custom_zha_quirks/tuya/ts0601_cover.py", line 71, in <module>
    class TuyaZemismartSmartCover0601_4(TuyaWindowCover):
  File "/config/custom_zha_quirks/tuya/ts0601_cover.py", line 101, in TuyaZemismartSmartCover0601_4
    OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
NameError: name 'GreenPowerProxy' is not defined

@TheJulianJES
Collaborator

TheJulianJES commented Apr 14, 2022

Try this one: tuya.zip

@SergeantPup

ZHA network and device tracking of _TZE200_rmymn92d are concurrently working again with 2022.4.3. Thank you!

I think I'm getting different errors for the _TZE200_rmymn92d command sends now (which is what the other user was reporting).

Here is a successful command on working device: _TZE200_xaabybja with Nwk: 0xd775 (this device works fine)

2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 89 (incomingRouteRecordHandler) received: b'75d7abb7d1feff81f68cffb401acc7'
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Received incomingRouteRecordHandler frame with [0xd775, 8c:f6:81:ff:fe:d1:b7:ab, 255, -76, [0xc7ac]]
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Processing route record request: (0xd775, 8c:f6:81:ff:fe:d1:b7:ab, 255, -76, [0xc7ac])
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000cfffb775d7ffff0a09530100c7070400010004'
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=207), 255, -73, 0xd775, 255, 255, b'\tS\x01\x00\xc7\x07\x04\x00\x01\x00']
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Received ZCL frame: b'\tS\x01\x00\xc7\x07\x04\x00\x01\x00'
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=83, command_id=1, *is_reply=True)
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=199, command_id=1031, function=0, data=[1, 0]))
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Received command 0x01 (TSN 83): get_data(param=Command(status=0, tsn=199, command_id=1031, function=0, data=[1, 0]))
2022-04-13 20:45:41 DEBUG (MainThread) [tuya] 8c:f6:81:ff:fe:d1:b7:ab Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 199, Command: 0x0407, Function: 0x00, Data: [1, 0]]
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 89 (incomingRouteRecordHandler) received: b'75d7abb7d1feff81f68cffb801acc7'
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Received incomingRouteRecordHandler frame with [0xd775, 8c:f6:81:ff:fe:d1:b7:ab, 255, -72, [0xc7ac]]
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Processing route record request: (0xd775, 8c:f6:81:ff:fe:d1:b7:ab, 255, -72, [0xc7ac])
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000d0ffb575d7ffff0d09540100c8030200040000006404'
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=208), 255, -75, 0xd775, 255, 255, b'\tT\x01\x00\xc8\x03\x02\x00\x04\x00\x00\x00d']
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Received ZCL frame: b'\tT\x01\x00\xc8\x03\x02\x00\x04\x00\x00\x00d'
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=84, command_id=1, *is_reply=True)
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=200, command_id=515, function=0, data=[4, 0, 0, 0, 100]))
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Received command 0x01 (TSN 84): get_data(param=Command(status=0, tsn=200, command_id=515, function=0, data=[4, 0, 0, 0, 100]))
2022-04-13 20:45:41 DEBUG (MainThread) [tuya] 8c:f6:81:ff:fe:d1:b7:ab Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 200, Command: 0x0203, Function: 0x00, Data: [4, 0, 0, 0, 100]]
2022-04-13 20:45:41 DEBUG (MainThread) [homeassistant.components.zha.core.channels.base] [0xD775:1:0x0102]: Attribute report 'Window Covering'[current_position_lift_percentage] = 100
2022-04-13 20:45:41 DEBUG (MainThread) [tuya] 8c:f6:81:ff:fe:d1:b7:ab Tuya Attribute Cache : [{8: 100}]
2022-04-13 20:45:41 DEBUG (MainThread) [homeassistant.components.zha.cover] setting position: 100

Here is the new error from _TZE200_rmymn92d with Nwk: 0xc7ac (this is the device that accurately tracks state but does not respond to button presses).

2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Sending request header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=True, is_reply=0, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), manufacturer=4417, tsn=95, command_id=0, *is_reply=False)
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Sending request: set_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 1]))
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Send command sendUnicast: (<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 0xC7AC, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=95), 96, b'\x05A\x11_\x00\x00\x00\x01\x04\x00\x01\x01')
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 52 (sendUnicast) received: b'00a8'
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 63 (messageSentHandler) received: b'00acc7040100ef010140010000a8600000'
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Received messageSentHandler frame with [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 51116, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=168), 96, <EmberStatus.SUCCESS: 0>, b'']
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef0101000100003effb5acc7ffff05185f0b008302'
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=62), 255, -75, 0xc7ac, 255, 255, b'\x18_\x0b\x00\x83']
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Received ZCL frame: b'\x18_\x0b\x00\x83'
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.GLOBAL_COMMAND: 0>, is_manufacturer_specific=0, is_reply=1, disable_default_response=1, reserved=0, *is_cluster=False, *is_general=True), tsn=95, command_id=11, *is_reply=True)
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2022-04-13 20:45:41 DEBUG (MainThread) [homeassistant.components.zha.core.channels.base] [0xC7AC:1:0x0102]: executed 'stop' command with args: '()' kwargs: '{}' result: Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 89 (incomingRouteRecordHandler) received: b'75d7abb7d1feff81f68cffb401acc7'
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Received incomingRouteRecordHandler frame with [0xd775, 8c:f6:81:ff:fe:d1:b7:ab, 255, -76, [0xc7ac]]
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Processing route record request: (0xd775, 8c:f6:81:ff:fe:d1:b7:ab, 255, -76, [0xc7ac])
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000cfffb775d7ffff0a09530100c7070400010004'
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=207), 255, -73, 0xd775, 255, 255, b'\tS\x01\x00\xc7\x07\x04\x00\x01\x00']
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Received ZCL frame: b'\tS\x01\x00\xc7\x07\x04\x00\x01\x00'
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=83, command_id=1, *is_reply=True)
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=199, command_id=1031, function=0, data=[1, 0]))
2022-04-13 20:45:41 DEBUG (MainThread) [zigpy.zcl] [0xD775:1:0xef00] Received command 0x01 (TSN 83): get_data(param=Command(status=0, tsn=199, command_id=1031, function=0, data=[1, 0]))
2022-04-13 20:45:41 DEBUG (MainThread) [tuya] 8c:f6:81:ff:fe:d1:b7:ab Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 199, Command: 0x0407, Function: 0x00, Data: [1, 0]]
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 89 (incomingRouteRecordHandler) received: b'75d7abb7d1feff81f68cffb801acc7'
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Received incomingRouteRecordHandler frame with [0xd775, 8c:f6:81:ff:fe:d1:b7:ab, 255, -72, [0xc7ac]]
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.zigbee.application] Processing route record request: (0xd775, 8c:f6:81:ff:fe:d1:b7:ab, 255, -72, [0xc7ac])
2022-04-13 20:45:41 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000d0ffb575d7ffff0d09540100c8030200040000006404'
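The telling line in the failing trace is the Default_Response with status 0x83. Hand-decoding the raw reply frame `b'\x18_\x0b\x00\x83'` from the log above (byte layout per the ZCL general frame format) shows the device itself rejecting the command:

```python
frame = b"\x18\x5f\x0b\x00\x83"  # raw reply frame from the log above
# Iterating a bytes object yields ints, so the five fields unpack directly:
frame_control, tsn, command_id, failed_command, status = frame

assert tsn == 95            # matches tsn=95 of the request that was sent
assert command_id == 0x0B   # ZCL Default Response
assert failed_command == 0  # the set_data (0x00) request that failed
assert status == 0x83       # UNSUP_MANUF_CLUSTER_COMMAND (131)
print("device rejected command 0x%02x with status 0x%02x" % (failed_command, status))
```

So the motor acknowledges receipt but reports that it does not support the manufacturer-specific cluster command as sent, which matches the observed behavior: state reporting works, control does not.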

@TheJulianJES
Collaborator

TheJulianJES commented Apr 14, 2022

Just to confirm, do your debug logs (from [tuya]) ever log something including the text Sending Tuya?
(when sending an open/close command to a working cover and/or the non-working one)

@SergeantPup

I just ran an open/stop/close command on _TZE200_xaabybja (0xb41c): 61 entries for "0xb41c" and 38 entries for "tuya", none of which say "Sending":

2022-04-13 21:31:34 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:34 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:35 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000b5ffbf1cb4ffff0a09590100000104000102'
2022-04-13 21:31:35 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=181), 255, -65, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:35 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:35 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000b6ffbf1cb4ffff0a09590100000104000102'
2022-04-13 21:31:35 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=182), 255, -65, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:35 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:35 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000b7ffbf1cb4ffff0a09590100000104000102'
2022-04-13 21:31:35 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=183), 255, -65, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:35 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:35 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000b8ffbf1cb4ffff0a09590100000104000102'
2022-04-13 21:31:35 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=184), 255, -65, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:35 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:35 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000b9ffbf1cb4ffff0a09590100000104000102'
2022-04-13 21:31:35 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=185), 255, -65, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:35 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:35 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000baffbe1cb4ffff0a09590100000104000102'
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=186), 255, -66, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:36 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'0004010005010140010000fbffbfd7dcffff09092200210000000000'
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=1280, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=251), 255, -65, 0xdcd7, 255, 255, b'\t"\x00!\x00\x00\x00\x00\x00']
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xDCD7:1:0x0500] Received ZCL frame: b'\t"\x00!\x00\x00\x00\x00\x00'
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xDCD7:1:0x0500] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=34, command_id=0, *is_reply=True)
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xDCD7:1:0x0500] Decoded ZCL frame: IasZone:status_change_notification(zone_status=<ZoneStatus.Restore_reports|Alarm_1: 33>, extended_status=<bitmap8.0: 0>, zone_id=0, delay=0)
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xDCD7:1:0x0500] Received command 0x00 (TSN 34): status_change_notification(zone_status=<ZoneStatus.Restore_reports|Alarm_1: 33>, extended_status=<bitmap8.0: 0>, zone_id=0, delay=0)
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xDCD7:1:0x0500] Sending reply header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.GLOBAL_COMMAND: 0>, is_manufacturer_specific=False, is_reply=1, disable_default_response=1, reserved=0, *is_cluster=False, *is_general=True), tsn=34, command_id=<GeneralCommand.Default_Response: 11>, *is_reply=True)
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xDCD7:1:0x0500] Sending reply: Default_Response(command_id=0, status=<Status.SUCCESS: 0>)
2022-04-13 21:31:36 DEBUG (MainThread) [homeassistant.components.zha.core.channels.base] [0xDCD7:1:0x0500]: Updated alarm state: ZoneStatus.Alarm_1
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.ezsp.protocol] Send command sendUnicast: (<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 0xDCD7, EmberApsFrame(profileId=260, clusterId=1280, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=34), 147, b'\x18"\x0b\x00\x00')
2022-04-13 21:31:36 DEBUG (MainThread) [homeassistant.core] Bus:Handling <Event state_changed[L]: entity_id=binary_sensor.media_room_motion_sensor, old_state=<state binary_sensor.media_room_motion_sensor=off; device_class=motion, friendly_name=Media Room Motion Sensor @ 2022-04-13T21:31:24.290528-04:00>, new_state=<state binary_sensor.media_room_motion_sensor=on; device_class=motion, friendly_name=Media Room Motion Sensor @ 2022-04-13T21:31:36.257601-04:00>>
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 52 (sendUnicast) received: b'0089'
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000bbffbe1cb4ffff0a09590100000104000102'
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=187), 255, -66, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:36 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 63 (messageSentHandler) received: b'00d7dc0401000501014001000089930000'
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.zigbee.application] Received messageSentHandler frame with [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 56535, EmberApsFrame(profileId=260, clusterId=1280, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=137), 147, <EmberStatus.SUCCESS: 0>, b'']
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000bcf0bf1cb4ffff0a09590100000104000102'
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=188), 240, -65, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:36 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000bdffbf1cb4ffff0a09590100000104000102'
2022-04-13 21:31:36 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=189), 255, -65, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:36 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:36 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:37 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000beffbf1cb4ffff0a09590100000104000102'
2022-04-13 21:31:37 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=190), 255, -65, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:37 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:37 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000bfffbf1cb4ffff0a09590100000104000102'
2022-04-13 21:31:37 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=191), 255, -65, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:37 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
2022-04-13 21:31:37 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000c0ffbf1cb4ffff0a09590100000104000102'
2022-04-13 21:31:37 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=192), 255, -65, 0xb41c, 255, 255, b'\tY\x01\x00\x00\x01\x04\x00\x01\x02']
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received ZCL frame: b'\tY\x01\x00\x00\x01\x04\x00\x01\x02'
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, is_reply=1, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=89, command_id=1, *is_reply=True)
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:37 DEBUG (MainThread) [zigpy.zcl] [0xB41C:1:0xef00] Received command 0x01 (TSN 89): get_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2022-04-13 21:31:37 DEBUG (MainThread) [tuya] b4:e3:f9:ff:fe:0c:51:6e Received Attribute Report. Command is 0x0001, Tuya Paylod values[Status : 0, TSN: 0, Command: 0x0401, Function: 0x00, Data: [1, 2]]
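For reference, the repeated incoming frames above all carry the same Tuya datapoint report on cluster 0xEF00. A minimal sketch of how that payload breaks down, assuming the common Tuya DP serial layout (field names are mine, not the quirk's):

```python
import struct

def decode_tuya_dp(zcl_payload: bytes) -> dict:
    """Decode a Tuya EF00 datapoint report payload (ZCL header already stripped).

    Assumed layout: status (1 byte), Tuya transaction id (1 byte),
    datapoint id (1 byte), datapoint type (1 byte),
    data length (2 bytes, big-endian), then the raw data bytes.
    """
    status, tsn, dp_id, dp_type = zcl_payload[0], zcl_payload[1], zcl_payload[2], zcl_payload[3]
    (length,) = struct.unpack(">H", zcl_payload[4:6])
    data = zcl_payload[6 : 6 + length]
    return {"status": status, "tsn": tsn, "dp": dp_id, "type": dp_type, "data": data}

# Payload from the log above, with the 3-byte ZCL header (b'\x09\x59\x01') stripped:
report = decode_tuya_dp(b"\x00\x00\x01\x04\x00\x01\x02")
print(report)  # dp 1 with enum type 4 and value 2
```

This matches what the quirk logs as `Command: 0x0401, Data: [1, 2]`: the quirk packs the DP id and type into one 16-bit "command_id" and keeps the length byte in front of the value.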

@SergeantPup


The device that's tracking state but not controlling shows these "Sending Tuya" commands:

2022-04-13 21:39:18 DEBUG (MainThread) [bellows.ezsp.protocol] Send command nop: ()
2022-04-13 21:39:18 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 5 (nop) received: b''
2022-04-13 21:39:18 DEBUG (MainThread) [homeassistant.core] Bus:Handling <Event call_service[L]: domain=cover, service=open_cover, service_data=entity_id=cover.media_room_curtains_2>
2022-04-13 21:39:18 DEBUG (MainThread) [tuya] a4:c1:38:bd:76:6f:43:4c Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0000, Arguments are ()
2022-04-13 21:39:18 DEBUG (MainThread) [tuya] a4:c1:38:bd:76:6f:43:4c Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 0]]
2022-04-13 21:39:18 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Sending request header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=True, is_reply=0, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), manufacturer=4417, tsn=58, command_id=0, *is_reply=False)
2022-04-13 21:39:18 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Sending request: set_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 0]))
2022-04-13 21:39:18 DEBUG (MainThread) [bellows.ezsp.protocol] Send command sendUnicast: (<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 0xC7AC, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=58), 59, b'\x05A\x11:\x00\x00\x00\x01\x04\x00\x01\x00')
2022-04-13 21:39:18 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 52 (sendUnicast) received: b'001a'
2022-04-13 21:39:18 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 63 (messageSentHandler) received: b'00acc7040100ef0101400100001a3b0000'
2022-04-13 21:39:18 DEBUG (MainThread) [bellows.zigbee.application] Received messageSentHandler frame with [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 51116, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=26), 59, <EmberStatus.SUCCESS: 0>, b'']
2022-04-13 21:39:18 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000ccfdb6acc7ffff05183a0b008302'
2022-04-13 21:39:18 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=204), 253, -74, 0xc7ac, 255, 255, b'\x18:\x0b\x00\x83']
2022-04-13 21:39:18 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Received ZCL frame: b'\x18:\x0b\x00\x83'
2022-04-13 21:39:18 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.GLOBAL_COMMAND: 0>, is_manufacturer_specific=0, is_reply=1, disable_default_response=1, reserved=0, *is_cluster=False, *is_general=True), tsn=58, command_id=11, *is_reply=True)
2022-04-13 21:39:18 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2022-04-13 21:39:18 DEBUG (MainThread) [homeassistant.components.zha.core.channels.base] [0xC7AC:1:0x0102]: executed 'up_open' command with args: '()' kwargs: '{}' result: Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2022-04-13 21:39:20 DEBUG (MainThread) [homeassistant.core] Bus:Handling <Event call_service[L]: domain=cover, service=stop_cover, service_data=entity_id=cover.media_room_curtains_2>
2022-04-13 21:39:20 DEBUG (MainThread) [tuya] a4:c1:38:bd:76:6f:43:4c Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0002, Arguments are ()
2022-04-13 21:39:20 DEBUG (MainThread) [tuya] a4:c1:38:bd:76:6f:43:4c Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 1]]
2022-04-13 21:39:20 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Sending request header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=True, is_reply=0, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), manufacturer=4417, tsn=60, command_id=0, *is_reply=False)
2022-04-13 21:39:20 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Sending request: set_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 1]))
2022-04-13 21:39:20 DEBUG (MainThread) [bellows.ezsp.protocol] Send command sendUnicast: (<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 0xC7AC, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=60), 61, b'\x05A\x11<\x00\x00\x00\x01\x04\x00\x01\x01')
2022-04-13 21:39:20 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 52 (sendUnicast) received: b'001b'
2022-04-13 21:39:20 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 63 (messageSentHandler) received: b'00acc7040100ef0101400100001b3d0000'
2022-04-13 21:39:20 DEBUG (MainThread) [bellows.zigbee.application] Received messageSentHandler frame with [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 51116, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=27), 61, <EmberStatus.SUCCESS: 0>, b'']
2022-04-13 21:39:20 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000cdf0b9acc7ffff05183c0b008302'
2022-04-13 21:39:20 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=205), 240, -71, 0xc7ac, 255, 255, b'\x18<\x0b\x00\x83']
2022-04-13 21:39:20 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Received ZCL frame: b'\x18<\x0b\x00\x83'
2022-04-13 21:39:20 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.GLOBAL_COMMAND: 0>, is_manufacturer_specific=0, is_reply=1, disable_default_response=1, reserved=0, *is_cluster=False, *is_general=True), tsn=60, command_id=11, *is_reply=True)
2022-04-13 21:39:20 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
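Both `open_cover` and `stop_cover` get answered with a ZCL Default Response carrying status 0x83 (UNSUP_MANUF_CLUSTER_COMMAND), i.e. the device rejects the manufacturer-specific command outright. A quick sketch of how that 5-byte reply decodes (field names are mine, not zigpy's):

```python
def decode_default_response(frame: bytes) -> dict:
    """Decode a minimal ZCL global Default_Response frame.

    Assumed layout: frame control (1 byte), tsn (1 byte), global command id
    (1 byte, 0x0B for Default_Response), then the command id being responded
    to (1 byte) and the status code (1 byte).
    """
    frame_control, tsn, cmd_id, resp_cmd, status = frame[:5]
    assert cmd_id == 0x0B, "not a Default_Response frame"
    return {"tsn": tsn, "command": resp_cmd, "status": status}

# The reply from the log above:
resp = decode_default_response(b"\x18\x3a\x0b\x00\x83")
print(hex(resp["status"]))  # 0x83 = UNSUP_MANUF_CLUSTER_COMMAND
```

Status 0x83 here suggests the device dislikes the manufacturer code in the request header rather than the Tuya payload itself, which is one reason quirk variants that send the command without a manufacturer id are worth trying.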

@TheJulianJES
Collaborator

Can you see if there's any difference with this: tuya.zip

@SergeantPup

2022-04-13 20_36_17-Greenshot
Not sure if it helps, but this is how the device shows up with this quirk in place.

@TheJulianJES
Collaborator

TheJulianJES commented Apr 14, 2022

Can you completely delete and re-pair the device with the quirk update I sent a minute ago?
I doubt it'll change anything, but maybe check if the controls work.

@SergeantPup

Removed the device, swapped the quirk, rebooted, added the device again:

2022-04-13 22:16:20 DEBUG (MainThread) [homeassistant.core] Bus:Handling <Event call_service[L]: domain=cover, service=open_cover, service_data=entity_id=cover.tze200_rmymn92d_ts0601_4c436f76_window_covering>
2022-04-13 22:16:20 DEBUG (MainThread) [tuya] a4:c1:38:bd:76:6f:43:4c Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0000, Arguments are ()
2022-04-13 22:16:20 DEBUG (MainThread) [tuya] a4:c1:38:bd:76:6f:43:4c Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 0]]
2022-04-13 22:16:20 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Sending request header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=True, is_reply=0, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), manufacturer=4417, tsn=1, command_id=0, *is_reply=False)
2022-04-13 22:16:20 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Sending request: set_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 0]))
2022-04-13 22:16:20 DEBUG (MainThread) [bellows.ezsp.protocol] Send command sendUnicast: (<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 0xC7AC, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=1), 2, b'\x05A\x11\x01\x00\x00\x00\x01\x04\x00\x01\x00')
2022-04-13 22:16:20 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 52 (sendUnicast) received: b'0060'
2022-04-13 22:16:20 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 63 (messageSentHandler) received: b'00acc7040100ef01014001000060020000'
2022-04-13 22:16:20 DEBUG (MainThread) [bellows.zigbee.application] Received messageSentHandler frame with [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 51116, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=96), 2, <EmberStatus.SUCCESS: 0>, b'']
2022-04-13 22:16:20 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000fbfcb9acc7ffff0518010b0083'
2022-04-13 22:16:20 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=251), 252, -71, 0xc7ac, 255, 255, b'\x18\x01\x0b\x00\x83']
2022-04-13 22:16:20 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Received ZCL frame: b'\x18\x01\x0b\x00\x83'
2022-04-13 22:16:20 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.GLOBAL_COMMAND: 0>, is_manufacturer_specific=0, is_reply=1, disable_default_response=1, reserved=0, *is_cluster=False, *is_general=True), tsn=1, command_id=11, *is_reply=True)
2022-04-13 22:16:20 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2022-04-13 22:16:20 DEBUG (MainThread) [homeassistant.components.zha.core.channels.base] [0xC7AC:1:0x0102]: executed 'up_open' command with args: '()' kwargs: '{}' result: Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2022-04-13 22:16:21 DEBUG (MainThread) [homeassistant.core] Bus:Handling <Event call_service[L]: domain=cover, service=stop_cover, service_data=entity_id=cover.tze200_rmymn92d_ts0601_4c436f76_window_covering>
2022-04-13 22:16:21 DEBUG (MainThread) [tuya] a4:c1:38:bd:76:6f:43:4c Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0002, Arguments are ()
2022-04-13 22:16:21 DEBUG (MainThread) [tuya] a4:c1:38:bd:76:6f:43:4c Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 1]]
2022-04-13 22:16:21 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Sending request header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=True, is_reply=0, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), manufacturer=4417, tsn=3, command_id=0, *is_reply=False)
2022-04-13 22:16:21 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Sending request: set_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 1]))
2022-04-13 22:16:21 DEBUG (MainThread) [bellows.ezsp.protocol] Send command sendUnicast: (<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 0xC7AC, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=3), 4, b'\x05A\x11\x03\x00\x00\x00\x01\x04\x00\x01\x01')
2022-04-13 22:16:21 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 52 (sendUnicast) received: b'0061'
2022-04-13 22:16:21 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 63 (messageSentHandler) received: b'00acc7040100ef01014001000061040000'
2022-04-13 22:16:21 DEBUG (MainThread) [bellows.zigbee.application] Received messageSentHandler frame with [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 51116, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY|APS_OPTION_RETRY: 320>, groupId=0, sequence=97), 4, <EmberStatus.SUCCESS: 0>, b'']
2022-04-13 22:16:21 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 69 (incomingMessageHandler) received: b'00040100ef010100010000fcffb9acc7ffff0518030b0083'
2022-04-13 22:16:21 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=252), 255, -71, 0xc7ac, 255, 255, b'\x18\x03\x0b\x00\x83']
2022-04-13 22:16:21 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Received ZCL frame: b'\x18\x03\x0b\x00\x83'
2022-04-13 22:16:21 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.GLOBAL_COMMAND: 0>, is_manufacturer_specific=0, is_reply=1, disable_default_response=1, reserved=0, *is_cluster=False, *is_general=True), tsn=3, command_id=11, *is_reply=True)
2022-04-13 22:16:21 DEBUG (MainThread) [zigpy.zcl] [0xC7AC:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2022-04-13 22:16:21 DEBUG (MainThread) [homeassistant.components.zha.core.channels.base] [0xC7AC:1:0x0102]: executed 'stop' command with args: '()' kwargs: '{}' result: Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2022-04-13 22:16:22 DEBUG (MainThread) [bellows.ezsp.protocol] Send command nop: ()
2022-04-13 22:16:22 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame 5 (nop) received: b''

@TheJulianJES

I don't know why your working blinds don't log any "Sending Tuya" lines. That would mean this code is never called.

Please check if your logger config (or service call where you set tuya: debug) also contains this:
zhaquirks: debug

Also, can you send me the device signature from both the working cover that does not output the commands and from the non-working cover that outputs the command?

@SergeantPup

That's correct:

logger:
  default: info
  logs:
    homeassistant.core: debug
    homeassistant.components.zha: debug
    bellows.zigbee.application: debug
    bellows.ezsp: debug
    zigpy: debug
    zigpy_deconz.zigbee.application: debug
    zigpy_deconz.api: debug
    zigpy_xbee.zigbee.application: debug
    zigpy_xbee.api: debug
    zigpy_zigate: debug
    zigpy_znp: debug
    zhaquirks: debug
    tuya: debug

@SergeantPup

Zigbee device signature of controllable unit:

{
  "node_descriptor": "NodeDescriptor(logical_type=<LogicalType.Router: 1>, complex_descriptor_available=0, user_descriptor_available=0, reserved=0, aps_flags=0, frequency_band=<FrequencyBand.Freq2400MHz: 8>, mac_capability_flags=<MACCapabilityFlags.AllocateAddress|RxOnWhenIdle|MainsPowered|FullFunctionDevice: 142>, manufacturer_code=4098, maximum_buffer_size=82, maximum_incoming_transfer_size=82, server_mask=11264, maximum_outgoing_transfer_size=82, descriptor_capability_field=<DescriptorCapability.NONE: 0>, *allocate_address=True, *is_alternate_pan_coordinator=False, *is_coordinator=False, *is_end_device=False, *is_full_function_device=True, *is_mains_powered=True, *is_receiver_on_when_idle=True, *is_router=True, *is_security_capable=False)",
  "endpoints": {
    "1": {
      "profile_id": 260,
      "device_type": "0x0202",
      "in_clusters": [
        "0x0000",
        "0x0004",
        "0x0005",
        "0x0102",
        "0xef00"
      ],
      "out_clusters": [
        "0x000a",
        "0x0019"
      ]
    }
  },
  "manufacturer": "_TZE200_xaabybja",
  "model": "TS0601",
  "class": "tuya.ts0601_cover.TuyaMoesCover0601"
}

Device signature of tracked but non controlled unit:

{
  "node_descriptor": "NodeDescriptor(logical_type=<LogicalType.Router: 1>, complex_descriptor_available=0, user_descriptor_available=0, reserved=0, aps_flags=0, frequency_band=<FrequencyBand.Freq2400MHz: 8>, mac_capability_flags=<MACCapabilityFlags.AllocateAddress|RxOnWhenIdle|MainsPowered|FullFunctionDevice: 142>, manufacturer_code=4417, maximum_buffer_size=66, maximum_incoming_transfer_size=66, server_mask=10752, maximum_outgoing_transfer_size=66, descriptor_capability_field=<DescriptorCapability.NONE: 0>, *allocate_address=True, *is_alternate_pan_coordinator=False, *is_coordinator=False, *is_end_device=False, *is_full_function_device=True, *is_mains_powered=True, *is_receiver_on_when_idle=True, *is_router=True, *is_security_capable=False)",
  "endpoints": {
    "1": {
      "profile_id": 260,
      "device_type": "0x0202",
      "in_clusters": [
        "0x0000",
        "0x0004",
        "0x0005",
        "0x0102",
        "0xef00"
      ],
      "out_clusters": [
        "0x000a",
        "0x0019"
      ]
    },
    "242": {
      "profile_id": 41440,
      "device_type": "0x0061",
      "in_clusters": [],
      "out_clusters": [
        "0x0021"
      ]
    }
  },
  "manufacturer": "_TZE200_rmymn92d",
  "model": "TS0601",
  "class": "tuya.ts0601_cover.TuyaZemismartSmartCover0601_4"
}

@kolmakova

@kolmakova Did you change the is_manufacturer_specific=True to is_manufacturer_specific=False (near TuyaManufCluster in __init__.py) for the custom quirk?

Yes

@Kiread-work

Kiread-work commented May 18, 2022

@kolmakova Did you try the percentage control? Does it work?
[screenshot]

And did you add your device to the TUYA_COVER_COMMAND section?
[screenshot]
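For context, `TUYA_COVER_COMMAND` (in `zhaquirks/tuya/__init__.py`) maps each manufacturer string to a dict translating the standard window-covering command IDs into the Tuya values the motor expects. A minimal sketch of the kind of entry being discussed; the exact value mapping for `_TZE200_rmymn92d` is precisely what this thread is still trying to pin down, so the values below are illustrative only:

```python
# Sketch of a TUYA_COVER_COMMAND entry.  The 0x0000/0x0001/0x0002 values on
# the right are illustrative -- commenters below report trying several
# combinations before finding one that works for their unit.
TUYA_COVER_COMMAND = {
    "_TZE200_rmymn92d": {
        0x0000: 0x0000,  # ZCL up_open    -> Tuya "open" value
        0x0001: 0x0002,  # ZCL down_close -> Tuya "close" value
        0x0002: 0x0001,  # ZCL stop       -> Tuya "stop" value
    },
}
```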

@kolmakova

kolmakova commented May 19, 2022

@Kiread-work Wow, didn't know that there is percentage control. Found it, works, thanks! But buttons are still unusable.

Yes, I've added my device into TUYA_COVER_COMMAND.

@ruudkoeyvoets

Try changing TUYA_COVER_COMMAND and I think your buttons will work.

@kolmakova

Try changing TUYA_COVER_COMMAND and I think your buttons will work.

I've tried almost all combinations of 0x0001, 0x0002 and 0x0000, no success :/

@Kiread-work

@kolmakova Mine only works after I added it to the TUYA_COVER_INVERTED_BY_DEFAULT section
Also changed True to False here

    server_commands = {
        0x0000: foundation.ZCLCommandDef(
            "set_data", {"param": Command}, False, is_manufacturer_specific=False#True
        ),
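For reference, `TUYA_COVER_INVERTED_BY_DEFAULT` (also in `zhaquirks/tuya/__init__.py`) is a plain list of manufacturer strings whose reported position is inverted. A sketch of the change described above; the existing entry shown is illustrative, not necessarily the upstream list contents:

```python
# Sketch: adding the device to TUYA_COVER_INVERTED_BY_DEFAULT, per the
# comment above.  The pre-existing entry here is only an example.
TUYA_COVER_INVERTED_BY_DEFAULT = [
    "_TZE200_wmcdj3aq",  # example existing entry
    "_TZE200_rmymn92d",  # added for this device
]
```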

@sushant-here
Contributor

What’s the latest on this? I just purchased this Zemismart curtain motor and I would like to use it in ZHA. Are we able to create an official quirk for it?

@sebastian3107

Can someone create a PR for this, if this is working now? Thanks! 🥇

@Daniel-dev22

Can someone create a PR for this, if this is working now? Thanks! 🥇

I'm wondering the same.

@pedrams1

@kolmakova Mine only works after I added it to the TUYA_COVER_INVERTED_BY_DEFAULT section Also changed True to False here

    server_commands = {
        0x0000: foundation.ZCLCommandDef(
            "set_data", {"param": Command}, False, is_manufacturer_specific=False#True
        ),

I can confirm the quirk, in addition to this change works!

@TheJulianJES TheJulianJES added Tuya Request/PR regarding a Tuya device custom quirk available A custom quirk is available to solve the issue, but it's not merged in the repo yet labels Jan 4, 2023
@ToastySefac

ToastySefac commented Apr 8, 2023

I have a _TZE200_rmymn92d and am struggling to get the quirk working.

I have tried:

(1) downloaded https://github.com/zigpy/zha-device-handlers/blob/dev/zhaquirks/tuya/ts0601_cover.py and saved to config/custom_zha_quirks

(2) Replaced line 266 under MODELS_INFO from ("_TZE200_3i3exuay", "TS0601"), to ("_TZE200_rmymn92d", "TS0601"), then saved

(3) Added /config/custom_zha_quirks/__init__.py

(4) Changed "set_data", {"param": Command}, False, is_manufacturer_specific=True to "set_data", {"param": Command}, False, is_manufacturer_specific=False#True

(5) Deleted the cached file in /config/custom_zha_quirks/__pycache__

(6) Re-paired the curtain motor to ZHA

(7) Restarted HA

I've also tried steps (3) and (4) in a different order.

I have quirks for other devices, so I assume I'm doing the right procedure (mostly), but I can't see that the quirk is loading for this device.
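For reference, steps (2) and (4) above amount to two edits inside the downloaded ts0601_cover.py; roughly the following fragments, with line numbers and surrounding context varying between zhaquirks versions:

```python
# Step (2): register this device in the quirk's signature
MODELS_INFO: [
    ("_TZE200_rmymn92d", "TS0601"),  # was ("_TZE200_3i3exuay", "TS0601")
],

# Step (4): mark set_data as not manufacturer-specific
server_commands = {
    0x0000: foundation.ZCLCommandDef(
        "set_data", {"param": Command}, False, is_manufacturer_specific=False
    ),
}
```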

Log
2023-03-14 19:07:24.955 WARNING (MainThread) [zhaquirks] Loaded custom quirks. Please contribute them to https://github.com/zigpy/zha-device-handlers
2023-03-14 19:08:16.009 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:16.200 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:16.396 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:16.597 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:16.795 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:16.998 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:17.209 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:17.398 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:17.602 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:17.825 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:18.001 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:18.206 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:18.402 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:18.601 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-03-14 19:08:18.801 WARNING (MainThread) [zigpy.zcl] [0x9729:1:0xef00] Unknown cluster command 2 b'\x05\xf8\x01\x04\x00\x01\x01'
2023-04-08 21:16:18.098 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x0e5c000700aa00
2023-04-08 21:16:18.099 DEBUG (MainThread) [zigpy_deconz.api] Received command device_state_changed[<DeviceState.128|APSDE_DATA_REQUEST_SLOTS_AVAILABLE|APSDE_DATA_INDICATION|2: 170>, 0]
2023-04-08 21:16:18.099 DEBUG (MainThread) [zigpy_deconz.api] Device state changed response: [<DeviceState.128|APSDE_DATA_REQUEST_SLOTS_AVAILABLE|APSDE_DATA_INDICATION|2: 170>, 0]
2023-04-08 21:16:18.100 DEBUG (MainThread) [zigpy_deconz.api] Command Command.aps_data_indication (1, <DataIndicationFlags.Always_Use_NWK_Source_Addr: 1>)
2023-04-08 21:16:18.100 DEBUG (MainThread) [zigpy_deconz.uart] Send: 0x175b000800010001
2023-04-08 21:16:18.103 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x175b0025001e00220200000102a8580104010100070018e20a2000201e00afff9c518000cc
2023-04-08 21:16:18.104 DEBUG (MainThread) [zigpy_deconz.api] Received command aps_data_indication[30, <DeviceState.APSDE_DATA_REQUEST_SLOTS_AVAILABLE|2: 34>, <DeconzAddress address_mode=AddressMode.NWK address=0x0000>, 1, <DeconzAddress address_mode=AddressMode.NWK address=0x58a8>, 1, 260, 1, b'\x18\xe2\n \x00 \x1e', 0, 175, 255, 156, 81, 128, 0, -52]
2023-04-08 21:16:18.104 DEBUG (MainThread) [zigpy_deconz.api] APS data indication response: [30, <DeviceState.APSDE_DATA_REQUEST_SLOTS_AVAILABLE|2: 34>, <DeconzAddress address_mode=AddressMode.NWK address=0x0000>, 1, <DeconzAddress address_mode=AddressMode.NWK address=0x58a8>, 1, 260, 1, b'\x18\xe2\n \x00 \x1e', 0, 175, 255, 156, 81, 128, 0, -52]
2023-04-08 21:16:18.119 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x0e5c000700aa00
2023-04-08 21:16:18.119 DEBUG (MainThread) [zigpy_deconz.api] Received command device_state_changed[<DeviceState.128|APSDE_DATA_REQUEST_SLOTS_AVAILABLE|APSDE_DATA_INDICATION|2: 170>, 0]
2023-04-08 21:16:18.120 DEBUG (MainThread) [zigpy_deconz.api] Device state changed response: [<DeviceState.128|APSDE_DATA_REQUEST_SLOTS_AVAILABLE|APSDE_DATA_INDICATION|2: 170>, 0]
2023-04-08 21:16:18.120 DEBUG (MainThread) [zigpy_deconz.api] 'aps_data_indication' response from <DeconzAddress address_mode=AddressMode.NWK address=0x58a8>, ep: 1, profile: 0x0104, cluster_id: 0x0001, data: b'18e20a2000201e'
2023-04-08 21:16:18.121 DEBUG (MainThread) [zigpy_deconz.api] Command Command.aps_data_indication (1, <DataIndicationFlags.Always_Use_NWK_Source_Addr: 1>)
2023-04-08 21:16:18.121 DEBUG (MainThread) [zigpy_deconz.uart] Send: 0x175c000800010001
2023-04-08 21:16:18.123 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x175c0025001e00220200000102a8580104010100070018e30a210020c800afff9c518000cc
2023-04-08 21:16:18.124 DEBUG (MainThread) [zigpy_deconz.api] Received command aps_data_indication[30, <DeviceState.APSDE_DATA_REQUEST_SLOTS_AVAILABLE|2: 34>, <DeconzAddress address_mode=AddressMode.NWK address=0x0000>, 1, <DeconzAddress address_mode=AddressMode.NWK address=0x58a8>, 1, 260, 1, b'\x18\xe3\n!\x00 \xc8', 0, 175, 255, 156, 81, 128, 0, -52]
2023-04-08 21:16:18.124 DEBUG (MainThread) [zigpy_deconz.api] APS data indication response: [30, <DeviceState.APSDE_DATA_REQUEST_SLOTS_AVAILABLE|2: 34>, <DeconzAddress address_mode=AddressMode.NWK address=0x0000>, 1, <DeconzAddress address_mode=AddressMode.NWK address=0x58a8>, 1, 260, 1, b'\x18\xe3\n!\x00 \xc8', 0, 175, 255, 156, 81, 128, 0, -52]
2023-04-08 21:16:18.134 DEBUG (MainThread) [zigpy_deconz.api] 'aps_data_indication' response from <DeconzAddress address_mode=AddressMode.NWK address=0x58a8>, ep: 1, profile: 0x0104, cluster_id: 0x0001, data: b'18e30a210020c8'
2023-04-08 21:16:18.268 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c5d000c00050002b756ffc9
2023-04-08 21:16:18.269 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:18.465 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x0e5e000700aa00
2023-04-08 21:16:18.465 DEBUG (MainThread) [zigpy_deconz.api] Received command device_state_changed[<DeviceState.128|APSDE_DATA_REQUEST_SLOTS_AVAILABLE|APSDE_DATA_INDICATION|2: 170>, 0]
2023-04-08 21:16:18.466 DEBUG (MainThread) [zigpy_deconz.api] Device state changed response: [<DeviceState.128|APSDE_DATA_REQUEST_SLOTS_AVAILABLE|APSDE_DATA_INDICATION|2: 170>, 0]
2023-04-08 21:16:18.466 DEBUG (MainThread) [zigpy_deconz.api] Command Command.aps_data_indication (1, <DataIndicationFlags.Always_Use_NWK_Source_Addr: 1>)
2023-04-08 21:16:18.467 DEBUG (MainThread) [zigpy_deconz.uart] Send: 0x175d000800010001
2023-04-08 21:16:18.475 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x175d002b0024002202000001026da001040100ef0d000903023e01060200040000000000afff9c518000cc
2023-04-08 21:16:18.475 DEBUG (MainThread) [zigpy_deconz.api] Received command aps_data_indication[36, <DeviceState.APSDE_DATA_REQUEST_SLOTS_AVAILABLE|2: 34>, <DeconzAddress address_mode=AddressMode.NWK address=0x0000>, 1, <DeconzAddress address_mode=AddressMode.NWK address=0xa06d>, 1, 260, 61184, b'\t\x03\x02>\x01\x06\x02\x00\x04\x00\x00\x00\x00', 0, 175, 255, 156, 81, 128, 0, -52]
2023-04-08 21:16:18.476 DEBUG (MainThread) [zigpy_deconz.api] APS data indication response: [36, <DeviceState.APSDE_DATA_REQUEST_SLOTS_AVAILABLE|2: 34>, <DeconzAddress address_mode=AddressMode.NWK address=0x0000>, 1, <DeconzAddress address_mode=AddressMode.NWK address=0xa06d>, 1, 260, 61184, b'\t\x03\x02>\x01\x06\x02\x00\x04\x00\x00\x00\x00', 0, 175, 255, 156, 81, 128, 0, -52]
2023-04-08 21:16:18.479 DEBUG (MainThread) [zigpy_deconz.zigbee.application] Sending packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0xA06D), dst_ep=1, source_route=None, extended_timeout=False, tsn=3, profile_id=260, cluster_id=61184, data=Serialized[b'\x18\x03\x0b\x02\x86'], tx_options=<TransmitOptions.ACK: 1>, radius=0, non_member_radius=0, lqi=None, rssi=None)
2023-04-08 21:16:18.480 DEBUG (MainThread) [zigpy_deconz.api] 'aps_data_indication' response from <DeconzAddress address_mode=AddressMode.NWK address=0xa06d>, ep: 1, profile: 0x0104, cluster_id: 0xef00, data: b'0903023e010602000400000000'
2023-04-08 21:16:18.481 DEBUG (MainThread) [zigpy_deconz.api] Command Command.aps_data_request (20, 206, <DeconzSendDataFlags.NONE: 0>, <DeconzAddressEndpoint address_mode=AddressMode.NWK address=0xa06d endpoint=1>, 260, 61184, 1, b'\x18\x03\x0b\x02\x86', <DeconzTransmitOptions.USE_APS_ACKS|USE_NWK_KEY_SECURITY: 6>, 0)
2023-04-08 21:16:18.481 DEBUG (MainThread) [zigpy_deconz.uart] Send: 0x125e001b001400ce00026da001040100ef01050018030b02860600
2023-04-08 21:16:18.491 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x125e000900020022ce
2023-04-08 21:16:18.491 DEBUG (MainThread) [zigpy_deconz.api] Received command aps_data_request[2, <DeviceState.APSDE_DATA_REQUEST_SLOTS_AVAILABLE|2: 34>, 206]
2023-04-08 21:16:18.491 DEBUG (MainThread) [zigpy_deconz.api] APS data request response: [2, <DeviceState.APSDE_DATA_REQUEST_SLOTS_AVAILABLE|2: 34>, 206]
2023-04-08 21:16:18.495 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x0e5f000700a600
2023-04-08 21:16:18.495 DEBUG (MainThread) [zigpy_deconz.api] Received command device_state_changed[<DeviceState.128|APSDE_DATA_REQUEST_SLOTS_AVAILABLE|APSDE_DATA_CONFIRM|2: 166>, 0]
2023-04-08 21:16:18.495 DEBUG (MainThread) [zigpy_deconz.api] Device state changed response: [<DeviceState.128|APSDE_DATA_REQUEST_SLOTS_AVAILABLE|APSDE_DATA_CONFIRM|2: 166>, 0]
2023-04-08 21:16:18.495 DEBUG (MainThread) [zigpy_deconz.api] Command Command.aps_data_confirm (0,)
2023-04-08 21:16:18.495 DEBUG (MainThread) [zigpy_deconz.uart] Send: 0x045f0007000000
2023-04-08 21:16:18.497 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x045f0013000c0022ce026da001010000000000
2023-04-08 21:16:18.497 DEBUG (MainThread) [zigpy_deconz.api] Received command aps_data_confirm[12, <DeviceState.APSDE_DATA_REQUEST_SLOTS_AVAILABLE|2: 34>, 206, <DeconzAddressEndpoint address_mode=AddressMode.NWK address=0xa06d endpoint=1>, 1, <TXStatus.SUCCESS: 0>, 0, 0, 0, 0]
2023-04-08 21:16:18.497 DEBUG (MainThread) [zigpy_deconz.api] APS data confirm response for request with id 206: 00
2023-04-08 21:16:18.498 DEBUG (MainThread) [zigpy_deconz.api] Request id: 0xce 'aps_data_confirm' for <DeconzAddressEndpoint address_mode=AddressMode.NWK address=0xa06d endpoint=1>, status: 0x00
2023-04-08 21:16:18.519 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c60000c00050002b756ffc9
2023-04-08 21:16:18.519 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:18.768 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c61000c00050002b756ffc9
2023-04-08 21:16:18.769 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:19.017 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c62000c00050002b756ffc9
2023-04-08 21:16:19.018 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:19.267 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c63000c00050002b756ffc9
2023-04-08 21:16:19.268 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:19.519 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c64000c00050002b756ffc9
2023-04-08 21:16:19.519 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:19.768 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c65000c00050002b756ffc9
2023-04-08 21:16:19.768 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:20.018 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c66000c00050002b756ffc9
2023-04-08 21:16:20.018 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:20.268 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c67000c00050002b756ffc9
2023-04-08 21:16:20.269 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:20.518 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c68000c00050002b756ffc9
2023-04-08 21:16:20.519 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:20.767 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c69000c00050002b756ffc9
2023-04-08 21:16:20.768 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:21.018 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c6a000c00050002b756ffc9
2023-04-08 21:16:21.018 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:21.265 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c6b000c00050002b756ffc9
2023-04-08 21:16:21.266 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:21.518 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c6c000c00050002b756ffc9
2023-04-08 21:16:21.518 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:21.766 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c6d000c00050002b756ffc9
2023-04-08 21:16:21.766 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:22.014 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c6e000c00050002b756ffc9
2023-04-08 21:16:22.015 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:22.264 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c6f000c00050002b756ffc9
2023-04-08 21:16:22.265 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:22.515 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c70000c00050002b756ffc9
2023-04-08 21:16:22.515 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:22.767 DEBUG (MainThread) [zigpy_deconz.uart] Frame received: 0x1c71000c00050002b756ffc9
2023-04-08 21:16:22.768 DEBUG (MainThread) [zigpy_deconz.api] Received command mac_poll[5, <DeconzAddress address_mode=AddressMode.NWK address=0x56b7>, 255, -55]
2023-04-08 21:16:29.386 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:29.570 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:29.779 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:29.975 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:30.184 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:30.377 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:30.577 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:30.778 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:30.979 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:31.177 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:31.379 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:31.581 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:31.777 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:31.980 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:32.180 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1R\x01\x04\x00\x01\x01'
2023-04-08 21:16:32.656 WARNING (MainThread) [zigpy.zcl] [0x819C:1:0xef00] Unknown cluster command 2 b'\xa1S\x03\x02\x00\x04\x00\x00\x002'
2023-04-08 21:16:37.122 INFO (MainThread) [zigpy.application] Device 0x0539 (60:a4:23:ff:fe:7d:df:57) joined the network
2023-04-08 21:16:37.239 INFO (MainThread) [zigpy.application] Device 0x23d7 (00:3c:84:ff:fe:21:56:67) joined the network
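The repeated "Unknown cluster command 2" payloads above do parse cleanly if you assume the common Tuya datapoint layout (2-byte sequence, DP id, DP type, 2-byte big-endian length, data); the layout itself is an assumption here, as is the DP naming in the comments:

```python
import struct

def decode_tuya_dp(payload: bytes):
    """Decode a Tuya DP report, assuming the common seq/dp/type/len layout."""
    seq, dp, dtype, length = struct.unpack(">HBBH", payload[:6])
    data = payload[6:6 + length]
    return seq, dp, dtype, int.from_bytes(data, "big")

# b'\x05\xf8\x01\x04\x00\x01\x01' -> DP 1 (commonly the control DP), enum 1
print(decode_tuya_dp(b"\x05\xf8\x01\x04\x00\x01\x01"))     # (1528, 1, 4, 1)

# b'\xa1S\x03\x02\x00\x04\x00\x00\x002' -> DP 3 (often position), value 50
print(decode_tuya_dp(b"\xa1S\x03\x02\x00\x04\x00\x00\x002"))  # (41299, 3, 2, 50)
```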

@ToastySefac

Hi all, has anyone got any suggestions on why this is not working for me? Thanks.

@SergeantPup

SergeantPup commented May 13, 2023

Hi all, has anyone got any suggestions on why this is not working for me? Thanks.

You're in luck. I just migrated my Home Assistant this week; my _TZE200_rmymn92d was the last device I successfully moved yesterday, and I can confirm that my quirk is working on a brand-new install. The easiest way I found to do this (in the last dozen times I've done it) is to use something like a Samba share to swap out the files.

After swapping the folders and a reboot, put this in your configuration.yaml to start loading the quirk:

zha:
  enable_quirks: true
  custom_quirks_path: /config/custom_zha_quirks/

I can give you a copy of my file; just choose how you want me to get it to you. There's no other custom quirk in it; the only custom quirk I have is for _TZE200_rmymn92d, and that's all that's in the fileset. I DO recall from previous times that the file structure matters, and it won't work unless everything is aligned just so (and I think there's a counterintuitive requirement here). Yesterday I basically did a lift-and-shift to a HA Yellow, and the curtain immediately showed entities after I installed this.

I did not build this quirk; someone at HA more experienced than me helped me build it a year ago. I confirmed with my new install that this quirk is still not part of core, but it definitely works for this device.

@sebastian3107

Isn't there a way to get this quirk included in main ZHA so it works for everyone?

@SergeantPup

Isn't there a way to get this quirk included in main ZHA so it works for everyone?

Yes, but if I recall correctly, the person helping me with this quirk had some unanswered questions about the device and its logs, and I think that's what prevented a permanent solution. My HA suggests I should consider a PR for this custom quirk, but I never considered mine a "complete" or "correct" solution; that said, it works well enough that this curtain rail has been automated for a year with it.

The outstanding item on this device was: from all we can tell, the curtain functionality works (open/close/pause), but it was still throwing some errors in the ZHA logs (nothing critical). Something about it looked like it was trying to call home with a date/time (which didn't make sense). This behavior isn't present in the other motor that's just like this one but has a different model number (_TZE200_xaabybja).

I thought it was also missing LQI or RSSI but I just looked and it appears to be functioning. Perhaps my memory is failing me on what the missing functionality was.

I think we always figured somebody would come along and solve the mystery and "finish" the quirk.

If you think it's helpful to put in a PR for this quirk with just the knowledge that open/close/pause works, then I'll be happy to submit my quirk for that. I'm just not sure mine is the "best".

@ToastySefac

Thanks a lot SergeantPup. Can you upload it to here: https://www.dropbox.com/request/X6Oj3LXZpuoYaz9MGERM

@SergeantPup

Thanks a lot SergeantPup. Can you upload it to here: https://www.dropbox.com/request/X6Oj3LXZpuoYaz9MGERM

Done. Just reboot before you add the lines in your config file pointing at the quirk.

@ToastySefac

Thanks.

Before using your files, I removed my existing quirks and the pycache, removed my custom quirks reference in configuration.yaml and rebooted HA.

I then copied across your files straight from what you uploaded. I rebooted, added in the zha quirks reference to configuration.yaml, and rebooted once again.

I re-paired the device but it wasn't loading the quirk. I looked at the logs and it gave this error:

Logger: zhaquirks
Source: custom_zha_quirks/ts0601_cover.py:13
First occurred: 10:06:58 AM (1 occurrences)
Last logged: 10:06:58 AM

Unexpected exception importing custom quirk 'ts0601_cover'
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/zhaquirks/__init__.py", line 454, in setup
    importer.find_module(modname).load_module(modname)
  File "<frozen importlib._bootstrap_external>", line 548, in _check_name_wrapper
  File "<frozen importlib._bootstrap_external>", line 1063, in load_module
  File "<frozen importlib._bootstrap_external>", line 888, in load_module
  File "<frozen importlib._bootstrap>", line 290, in _load_module_shim
  File "<frozen importlib._bootstrap>", line 719, in _load
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/config/custom_zha_quirks/ts0601_cover.py", line 13, in <module>
    from . import (
ImportError: attempted relative import with no known parent package

Looking at your quirk, line 13 was a little different from what I had before. Yours had:

from . import (

So I changed that to:

from zhaquirks.tuya import (

After another reboot, the error was gone and I could see the quirk being applied. I now also see the controls for the device. Unfortunately, they don't do anything.

@ToastySefac

I changed line 13 from `from zhaquirks.tuya import (` to `from custom_zha_quirks import (`.
It's now working perfectly - thanks for your help!
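
For anyone landing here later: the "line in your config file" mentioned above is ZHA's custom quirks option. A minimal sketch (the folder name is whatever you created; this thread used `custom_zha_quirks`):

```yaml
# configuration.yaml — point ZHA at the folder containing ts0601_cover.py
# and __init__.py, then restart Home Assistant.
zha:
  custom_quirks_path: /config/custom_zha_quirks/
```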

**ts0601_cover.py**
"""Tuya based cover and blinds."""
from zigpy.profiles import zha
from zigpy.zcl.clusters.general import Basic, GreenPowerProxy, Groups, Identify, OnOff, Ota, Scenes, Time

from zhaquirks.const import (
    DEVICE_TYPE,
    ENDPOINTS,
    INPUT_CLUSTERS,
    MODELS_INFO,
    OUTPUT_CLUSTERS,
    PROFILE_ID,
)
from custom_zha_quirks import (
    TuyaManufacturerWindowCover,
    TuyaManufCluster,
    TuyaWindowCover,
    TuyaWindowCoverControl,
)


class TuyaZemismartSmartCover0601(TuyaWindowCover):
    """Tuya Zemismart blind cover motor."""

    signature = {
        # "node_descriptor": "<NodeDescriptor byte1=1 byte2=64 mac_capability_flags=142 manufacturer_code=4098
        #                       maximum_buffer_size=82 maximum_incoming_transfer_size=82 server_mask=11264
        #                       maximum_outgoing_transfer_size=82 descriptor_capability_field=0>",
        # input_clusters=[0x0000, 0x0004, 0x0005, 0x000a, 0xef00]
        # output_clusters=[0x0019]
        # <SimpleDescriptor endpoint=1 profile=260 device_type=51 input_clusters=[0, 4, 5, 61184] output_clusters=[25]>
        MODELS_INFO: [
            ("_TZE200_fzo2pocs", "TS0601"),
            ("_TZE200_zpzndjez", "TS0601"),
            ("_TZE200_cowvfni3", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    Time.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            },
        },
    }
    replacement = {
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    Time.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            },
        },
    }


# From: https://github.com/zigpy/zha-device-handlers/issues/1294#issuecomment-1014843749
class TuyaZemismartSmartCover0601_4(TuyaWindowCover):
    """Tuya blind controller device."""

    signature = {
        # "node_descriptor": "NodeDescriptor(byte1=1, byte2=64, mac_capability_flags=142, manufacturer_code=4417,
        #                     maximum_buffer_size=66, maximum_incoming_transfer_size=66, server_mask=10752,
        #                     maximum_outgoing_transfer_size=66, descriptor_capability_field=0)",
        # "endpoints": { "1": { "profile_id": 260, "device_type": "0x0051", "in_clusters": [ "0x0000", "0x0004",
        # "0x0005","0xef00"], "out_clusters": ["0x000a","0x0019"] }, "242": { "profile_id": 41440, "device_type":
        # "0x0061", in_clusters": [], "out_clusters": [ "0x0021" ] } }, "manufacturer": "_TZE200_rmymn92d",
        # "model": "TS0601", "class": "zigpy.device.Device" }
        MODELS_INFO: [
            ("_TZE200_rmymn92d", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: 260,
                DEVICE_TYPE: 0x0051,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            },
            242: {
                PROFILE_ID: 41440,
                DEVICE_TYPE: 0x0061,
                INPUT_CLUSTERS: [],
                OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
            },
        },
    }

    replacement = {
        ENDPOINTS: {
            1: {
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            },
            242: {
                PROFILE_ID: 41440,
                DEVICE_TYPE: 0x0061,
                INPUT_CLUSTERS: [],
                OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
            },
        }
    }


class TuyaZemismartSmartCover0601_3(TuyaWindowCover):
    """Tuya Zemismart blind cover motor."""

    signature = {
        # "node_descriptor": "<NodeDescriptor byte1=1 byte2=64 mac_capability_flags=142 manufacturer_code=4098
        #                       maximum_buffer_size=82 maximum_incoming_transfer_size=82 server_mask=11264
        #                       maximum_outgoing_transfer_size=82 descriptor_capability_field=0>",
        # input_clusters=[0x0000, 0x0004, 0x0005, 0x000a, 0xef00]
        # output_clusters=[0x0019]
        # <SimpleDescriptor endpoint=1 profile=260 device_type=51 input_clusters=[0, 4, 5, 61184] output_clusters=[25]>
        MODELS_INFO: [
            ("_TZE200_fzo2pocs", "TS0601"),
            ("_TZE200_zpzndjez", "TS0601"),
            ("_TZE200_iossyxra", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            },
        },
    }
    replacement = {
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            },
        },
    }


class TuyaZemismartSmartCover0601_2(TuyaWindowCover):
    """Tuya Zemismart curtain cover motor."""

    signature = {
        # "node_descriptor": "<NodeDescriptor byte1=1 byte2=64 mac_capability_flags=142 manufacturer_code=4098
        #                       maximum_buffer_size=82 maximum_incoming_transfer_size=82 server_mask=11264
        #                       maximum_outgoing_transfer_size=82 descriptor_capability_field=0>",
        # input_clusters=[0x0000, 0x000a, 0x0004, 0x0005, 0xef00]
        # output_clusters=[0x0019]
        # <SimpleDescriptor endpoint=1 profile=260 device_type=81 input_clusters=[0, 10, 4, 5, 61184] output_clusters=[25]>
        MODELS_INFO: [
            ("_TZE200_3i3exuay", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Time.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            },
        },
    }
    replacement = {
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    Time.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            },
        },
    }


class TuyaMoesCover0601(TuyaWindowCover):
    """Tuya blind controller device."""

    signature = {
        # "node_descriptor": "NodeDescriptor(byte1=2, byte2=64, mac_capability_flags=128, manufacturer_code=4098,
        #                    maximum_buffer_size=82, maximum_incoming_transfer_size=82, server_mask=11264,
        #                    maximum_outgoing_transfer_size=82, descriptor_capability_field=0)",
        # "endpoints": {
        # "1": { "profile_id": 260, "device_type": "0x0051", "in_clusters": [ "0x0000", "0x0004","0x0005","0xef00"], "out_clusters": ["0x000a","0x0019"] }
        # },
        # "manufacturer": "_TZE200_zah67ekd",
        # "model": "TS0601",
        # "class": "zigpy.device.Device"
        # }
        MODELS_INFO: [
            ("_TZE200_zah67ekd", "TS0601"),
            ("_TZE200_xuzcvlku", "TS0601"),
            ("_TZE200_rddyvrci", "TS0601"),
            ("_TZE200_nueqqe6k", "TS0601"),
            ("_TZE200_gubdgai2", "TS0601"),
            ("_TZE200_yenbr4om", "TS0601"),
            ("_TZE200_5sbebbzs", "TS0601"),
            ("_TZE200_xaabybja", "TS0601"),
            ("_TZE200_hsgrhjpf", "TS0601"),
            ("_TZE200_68nvbio9", "TS0601"),
            ("_TZE200_zuz7f94z", "TS0601"),
            ("_TZE200_ergbiejo", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            }
        },
    }

    replacement = {
        ENDPOINTS: {
            1: {
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            }
        }
    }


class TuyaCloneCover0601(TuyaWindowCover):
    """Tuya blind controller device."""

    signature = {
        # <SimpleDescriptor endpoint=1 profile=260 device_type=256 device_version=0
        # input_clusters=[0, 3, 4, 5, 6]
        # output_clusters=[25]>
        # },
        # "manufacturer": "_TYST11_wmcdj3aq",
        # "model": "mcdj3aq",
        # "class": "zigpy.device.Device"
        # }
        MODELS_INFO: [("_TYST11_wmcdj3aq", "mcdj3aq")],  # Not tested
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.ON_OFF_LIGHT,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Identify.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    OnOff.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            }
        },
    }

    replacement = {
        ENDPOINTS: {
            1: {
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            }
        }
    }
**__init__.py**
"""Tuya devices."""
import dataclasses
import datetime
import logging
from typing import Any, Callable, Dict, List, Optional, Tuple, Union

from zigpy.quirks import CustomCluster, CustomDevice
import zigpy.types as t
from zigpy.zcl import foundation
from zigpy.zcl.clusters.closures import WindowCovering
from zigpy.zcl.clusters.general import LevelControl, OnOff, PowerConfiguration
from zigpy.zcl.clusters.homeautomation import ElectricalMeasurement
from zigpy.zcl.clusters.hvac import Thermostat, UserInterface
from zigpy.zcl.clusters.smartenergy import Metering

from zhaquirks import Bus, EventableCluster, LocalDataCluster
from zhaquirks.const import (
    DOUBLE_PRESS,
    LEFT,
    LONG_PRESS,
    RIGHT,
    SHORT_PRESS,
    ZHA_SEND_EVENT,
)

# ---------------------------------------------------------
# Tuya Custom Cluster ID
# ---------------------------------------------------------
TUYA_CLUSTER_ID = 0xEF00
TUYA_CLUSTER_E000_ID = 0xE000
TUYA_CLUSTER_E001_ID = 0xE001
# ---------------------------------------------------------
# Tuya Cluster Commands
# ---------------------------------------------------------
TUYA_SET_DATA = 0x00
TUYA_GET_DATA = 0x01
TUYA_SET_DATA_RESPONSE = 0x02
TUYA_SEND_DATA = 0x04
TUYA_ACTIVE_STATUS_RPT = 0x06
TUYA_SET_TIME = 0x24
# TODO: To be checked
TUYA_MCU_VERSION_REQ = 0x10
TUYA_MCU_VERSION_RSP = 0x11
#
TUYA_LEVEL_COMMAND = 514

COVER_EVENT = "cover_event"
LEVEL_EVENT = "level_event"
TUYA_MCU_COMMAND = "tuya_mcu_command"

# Rotating for remotes
STOP = "stop"  # To constants

# ---------------------------------------------------------
# Value for dp_type
# ---------------------------------------------------------
# ID    Name            Description
# ---------------------------------------------------------
# 0x00 	DP_TYPE_RAW 	?
# 0x01 	DP_TYPE_BOOL 	?
# 0x02 	DP_TYPE_VALUE 	4 byte unsigned integer
# 0x03 	DP_TYPE_STRING 	variable length string
# 0x04 	DP_TYPE_ENUM 	1 byte enum
# 0x05 	DP_TYPE_FAULT 	1 byte bitmap (didn't test yet)
TUYA_DP_TYPE_RAW = 0x0000
TUYA_DP_TYPE_BOOL = 0x0100
TUYA_DP_TYPE_VALUE = 0x0200
TUYA_DP_TYPE_STRING = 0x0300
TUYA_DP_TYPE_ENUM = 0x0400
TUYA_DP_TYPE_FAULT = 0x0500
# ---------------------------------------------------------
# Value for dp_identifier (These are device specific)
# ---------------------------------------------------------
# ID    Name               Type    Description
# ---------------------------------------------------------
# 0x01  control            enum    open, stop, close, continue
# 0x02  percent_control    value   0-100% control
# 0x03  percent_state      value   Report from motor about current percentage
# 0x04  control_back       enum    Configures motor direction (untested)
# 0x05  work_state         enum    Motor Direction Setting
# 0x06  situation_set      enum    Configures if 100% equals to fully closed or fully open (untested)
# 0x07  fault              bitmap  Anything but 0 means something went wrong (untested)
TUYA_DP_ID_CONTROL = 0x01
TUYA_DP_ID_PERCENT_CONTROL = 0x02
TUYA_DP_ID_PERCENT_STATE = 0x03
TUYA_DP_ID_DIRECTION_CHANGE = 0x05
TUYA_DP_ID_COVER_INVERTED = 0x06
# ---------------------------------------------------------
# Window Cover Server Commands
# ---------------------------------------------------------
WINDOW_COVER_COMMAND_UPOPEN = 0x0000
WINDOW_COVER_COMMAND_DOWNCLOSE = 0x0001
WINDOW_COVER_COMMAND_STOP = 0x0002
WINDOW_COVER_COMMAND_LIFTPERCENT = 0x0005
WINDOW_COVER_COMMAND_CUSTOM = 0x0006
# ---------------------------------------------------------
# TUYA Cover Custom Values
# ---------------------------------------------------------
COVER_EVENT = "cover_event"
ATTR_COVER_POSITION = 0x0008
ATTR_COVER_DIRECTION = 0x8001
ATTR_COVER_INVERTED = 0x8002
# For most tuya devices 0 = Up/Open, 1 = Stop, 2 = Down/Close
TUYA_COVER_COMMAND = {
    "_TZE200_zah67ekd": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_fzo2pocs": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_xuzcvlku": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_rddyvrci": {0x0000: 0x0002, 0x0001: 0x0001, 0x0002: 0x0000},
    "_TZE200_3i3exuay": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_nueqqe6k": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_gubdgai2": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_zpzndjez": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_cowvfni3": {0x0000: 0x0002, 0x0001: 0x0000, 0x0002: 0x0001},
    "_TYST11_wmcdj3aq": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_yenbr4om": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_5sbebbzs": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_xaabybja": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_hsgrhjpf": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_iossyxra": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_68nvbio9": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_zuz7f94z": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_ergbiejo": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_rmymn92d": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
}
# Taken from zigbee-herdsman-converters
# Contains all covers which need their position inverted by default
# Default is 100 = open, 0 = closed; Devices listed here will use 0 = open, 100 = closed instead
# Use manufacturerName to identify device!
# Don't invert _TZE200_cowvfni3: https://github.com/Koenkk/zigbee2mqtt/issues/6043
TUYA_COVER_INVERTED_BY_DEFAULT = [
    "_TZE200_wmcdj3aq",
    "_TZE200_nogaemzt",
    "_TZE200_xuzcvlku",
    "_TZE200_xaabybja",
    "_TZE200_yenbr4om",
    "_TZE200_zpzndjez",
    "_TZE200_zuz7f94z",
    "_TZE200_rmymn92d",
]

# ---------------------------------------------------------
# TUYA Switch Custom Values
# ---------------------------------------------------------
SWITCH_EVENT = "switch_event"
ATTR_ON_OFF = 0x0000
ATTR_COVER_POSITION = 0x0008
TUYA_CMD_BASE = 0x0100
# ---------------------------------------------------------
# DP Value meanings in Status Report
# ---------------------------------------------------------
# Type ID    IntDP   Description
# ---------------------------------------------------------
# 0x04 0x01  1025    Confirm opening/closing/stopping (triggered from Zigbee)
# 0x02 0x02   514    Started moving to position (triggered from Zigbee)
# 0x04 0x07  1031    Started moving (triggered by transmitter order pulling on curtain)
# 0x02 0x03   515    Arrived at position
# 0x01 0x05   261    Returned by configuration set; ignore
# 0x02 0x69   617    Not sure what this is
# 0x04 0x05  1029    Changed the Motor Direction
# 0x04 0x65  1125    Change of tilt/lift mode: 1 = lift, 0 = tilt
# ---------------------------------------------------------
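
An aside (illustrative only, not part of the pasted file): the IntDP column above is just the dp_type byte and the dp id packed into one 16-bit value, which is also where constants like `TUYA_LEVEL_COMMAND = 514` come from. A quick check with plain literals:

```python
# IntDP = (dp_type << 8) | dp_id, using the values from the tables above.
assert (0x04 << 8) | 0x01 == 1025  # ENUM control: confirm opening/closing/stopping
assert (0x02 << 8) | 0x02 == 514   # VALUE percent_control: started moving to position
assert (0x02 << 8) | 0x03 == 515   # VALUE percent_state: arrived at position
```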

_LOGGER = logging.getLogger(__name__)


class BigEndianInt16(int):
    """Helper class to represent big endian 16 bit value."""

    def serialize(self) -> bytes:
        """Value serialisation."""

        try:
            return self.to_bytes(2, "big", signed=False)
        except OverflowError as e:
            # OverflowError is not a subclass of ValueError, making it annoying to catch
            raise ValueError(str(e)) from e

    @classmethod
    def deserialize(cls, data: bytes) -> Tuple["BigEndianInt16", bytes]:
        """Value deserialisation."""

        if len(data) < 2:
            raise ValueError("Data is too short to contain 2 bytes")

        r = cls.from_bytes(data[:2], "big", signed=False)
        data = data[2:]
        return r, data


class TuyaTimePayload(t.LVList, item_type=t.uint8_t, length_type=BigEndianInt16):
    """Tuya set time payload definition."""


class TuyaDPType(t.enum8):
    """DataPoint Type."""

    RAW = 0x00
    BOOL = 0x01
    VALUE = 0x02
    STRING = 0x03
    ENUM = 0x04
    BITMAP = 0x05


class TuyaData(t.Struct):
    """Tuya Data type."""

    dp_type: TuyaDPType
    function: t.uint8_t
    raw: t.LVBytes

    @classmethod
    def deserialize(cls, data: bytes) -> Tuple["TuyaData", bytes]:
        """Deserialize data."""
        res = cls()
        res.dp_type, data = TuyaDPType.deserialize(data)
        res.function, data = t.uint8_t.deserialize(data)
        res.raw, data = t.LVBytes.deserialize(data)
        if res.dp_type not in (TuyaDPType.BITMAP, TuyaDPType.STRING, TuyaDPType.ENUM):
            res.raw = res.raw[::-1]
        return res, data

    @property
    def payload(self) -> Union[t.Bool, t.CharacterString, t.uint32_t, t.data32]:
        """Payload accordingly to data point type."""
        if self.dp_type == TuyaDPType.VALUE:
            return t.uint32_t.deserialize(self.raw)[0]
        elif self.dp_type == TuyaDPType.BOOL:
            return t.Bool.deserialize(self.raw)[0]
        elif self.dp_type == TuyaDPType.STRING:
            return self.raw.decode("utf8")
        elif self.dp_type == TuyaDPType.ENUM:
            return t.enum8.deserialize(self.raw)[0]
        elif self.dp_type == TuyaDPType.BITMAP:
            bitmaps = {1: t.bitmap8, 2: t.bitmap16, 4: t.bitmap32}
            try:
                return bitmaps[len(self.raw)].deserialize(self.raw)[0]
            except KeyError as exc:
                raise ValueError(f"Wrong bitmap length: {len(self.raw)}") from exc

        raise ValueError(f"Unknown {self.dp_type} datapoint type")


class Data(t.List, item_type=t.uint8_t):
    """list of uint8_t."""

    @classmethod
    def from_value(cls, value):
        """Convert from a zigpy typed value to a tuya data payload."""
        # serialized in little-endian by zigpy
        data = cls(value.serialize())
        # we want big-endian, with length prepended
        data.append(len(data))
        data.reverse()
        return data

    def to_value(self, ztype):
        """Convert from a tuya data payload to a zigpy typed value."""
        # first uint8_t is the length of the remaining data
        # tuya data is in big endian whereas ztypes use little endian
        value, _ = ztype.deserialize(bytes(reversed(self[1:])))
        return value
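
An aside (illustrative only, not part of the pasted file): the byte shuffling in `Data.from_value`/`to_value` can be shown with plain ints — zigpy serializes little-endian, while the Tuya MCU expects a length prefix followed by big-endian bytes:

```python
# Hypothetical uint16 attribute value, framed the way Data.from_value does it.
value = 0x1234
little = value.to_bytes(2, "little")           # zigpy wire order: b'\x34\x12'
payload = bytes([len(little)]) + little[::-1]  # length prefix + big-endian bytes
assert list(payload) == [0x02, 0x12, 0x34]

# And unframed the way Data.to_value does it: drop length, reverse, read LE.
assert int.from_bytes(payload[1:][::-1], "little") == value
```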


class TuyaCommand(t.Struct):
    """Tuya manufacturer cluster command."""

    status: t.uint8_t
    tsn: t.uint8_t
    dp: t.uint8_t
    data: TuyaData


class TuyaManufCluster(CustomCluster):
    """Tuya manufacturer specific cluster."""

    name = "Tuya Manufacturer Specific"
    cluster_id = TUYA_CLUSTER_ID
    ep_attribute = "tuya_manufacturer"
    set_time_offset = 0
    set_time_local_offset = None

    class Command(t.Struct):
        """Tuya manufacturer cluster command."""

        status: t.uint8_t
        tsn: t.uint8_t
        command_id: t.uint16_t
        function: t.uint8_t
        data: Data

    class MCUVersionRsp(t.Struct):
        """Tuya MCU version response Zcl payload."""

        tsn: t.uint16_t
        version: t.uint8_t

    """ Time sync command (It's transparent between MCU and server)
            Time request device -> server
               payloadSize = 0
            Set time, server -> device
               payloadSize, should be always 8
               payload[0-3] - UTC timestamp (big endian)
               payload[4-7] - Local timestamp (big endian)

            Zigbee payload is very similar to the UART payload which is described here: https://developer.tuya.com/en/docs/iot/device-development/access-mode-mcu/zigbee-general-solution/tuya-zigbee-module-uart-communication-protocol/tuya-zigbee-module-uart-communication-protocol?id=K9ear5khsqoty#title-10-Time%20synchronization

            Some devices need the timestamp in seconds since 1/1/1970 and others in seconds since 1/1/2000.
            There are also devices which use both timestamp variants (probably a bug); use the set_time_local_offset var in those cases.

            NOTE: You need to wait for a time request before setting the time. You can't set the time unsolicited."""

    server_commands = {
        0x0000: foundation.ZCLCommandDef(
            "set_data", {"param": Command}, False, is_manufacturer_specific=False
        ),
        0x0010: foundation.ZCLCommandDef(
            "mcu_version_req",
            {"param": t.uint16_t},
            False,
            is_manufacturer_specific=True,
        ),
        0x0024: foundation.ZCLCommandDef(
            "set_time", {"param": TuyaTimePayload}, False, is_manufacturer_specific=True
        ),
    }

    client_commands = {
        0x0001: foundation.ZCLCommandDef(
            "get_data", {"param": Command}, True, is_manufacturer_specific=True
        ),
        0x0002: foundation.ZCLCommandDef(
            "set_data_response", {"param": Command}, True, is_manufacturer_specific=True
        ),
        0x0006: foundation.ZCLCommandDef(
            "active_status_report",
            {"param": Command},
            True,
            is_manufacturer_specific=True,
        ),
        0x0011: foundation.ZCLCommandDef(
            "mcu_version_rsp",
            {"param": MCUVersionRsp},
            True,
            is_manufacturer_specific=True,
        ),
        0x0024: foundation.ZCLCommandDef(
            "set_time_request", {"param": t.data16}, True, is_manufacturer_specific=True
        ),
    }

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.command_bus = Bus()
        self.endpoint.device.command_bus.add_listener(self)  # listen MCU commands

    def tuya_mcu_command(self, command: Command):
        """Tuya MCU command listener. Only endpoint:1 must listen to MCU commands."""

        self.create_catching_task(
            self.command(TUYA_SET_DATA, command, expect_reply=True)
        )

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple,
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle time request."""

        if hdr.command_id != 0x0024 or self.set_time_offset == 0:
            return super().handle_cluster_request(
                hdr, args, dst_addressing=dst_addressing
            )

        # Send default response because the MCU expects it
        if not hdr.frame_control.disable_default_response:
            self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)

        _LOGGER.debug(
            "[0x%04x:%s:0x%04x] Got set time request (command 0x%04x)",
            self.endpoint.device.nwk,
            self.endpoint.endpoint_id,
            self.cluster_id,
            hdr.command_id,
        )
        payload = TuyaTimePayload()
        utc_timestamp = int(
            (
                datetime.datetime.utcnow()
                - datetime.datetime(self.set_time_offset, 1, 1)
            ).total_seconds()
        )
        local_timestamp = int(
            (
                datetime.datetime.now()
                - datetime.datetime(
                    self.set_time_local_offset or self.set_time_offset, 1, 1
                )
            ).total_seconds()
        )
        payload.extend(utc_timestamp.to_bytes(4, "big", signed=False))
        payload.extend(local_timestamp.to_bytes(4, "big", signed=False))

        self.create_catching_task(
            super().command(TUYA_SET_TIME, payload, expect_reply=False)
        )


class TuyaManufClusterAttributes(TuyaManufCluster):
    """Manufacturer specific cluster for Tuya converting attributes <-> commands."""

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple,
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle cluster request."""
        if hdr.command_id not in (0x0001, 0x0002):
            return super().handle_cluster_request(
                hdr, args, dst_addressing=dst_addressing
            )

        # Send default response because the MCU expects it
        if not hdr.frame_control.disable_default_response:
            self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)

        tuya_cmd = args[0].command_id
        tuya_data = args[0].data

        _LOGGER.debug(
            "[0x%04x:%s:0x%04x] Received value %s "
            "for attribute 0x%04x (command 0x%04x)",
            self.endpoint.device.nwk,
            self.endpoint.endpoint_id,
            self.cluster_id,
            repr(tuya_data[1:]),
            tuya_cmd,
            hdr.command_id,
        )

        if tuya_cmd not in self.attributes:
            return

        ztype = self.attributes[tuya_cmd].type
        zvalue = tuya_data.to_value(ztype)
        self._update_attribute(tuya_cmd, zvalue)

    def read_attributes(
        self, attributes, allow_cache=False, only_cache=False, manufacturer=None
    ):
        """Ignore remote reads as the "get_data" command doesn't seem to do anything."""

        return super().read_attributes(
            attributes, allow_cache=True, only_cache=True, manufacturer=manufacturer
        )

    async def write_attributes(self, attributes, manufacturer=None):
        """Defer attributes writing to the set_data tuya command."""

        records = self._write_attr_records(attributes)

        for record in records:
            cmd_payload = TuyaManufCluster.Command()
            cmd_payload.status = 0
            cmd_payload.tsn = self.endpoint.device.application.get_sequence()
            cmd_payload.command_id = record.attrid
            cmd_payload.function = 0
            cmd_payload.data = Data.from_value(record.value.value)

            await super().command(
                TUYA_SET_DATA,
                cmd_payload,
                manufacturer=manufacturer,
                expect_reply=False,
                tsn=cmd_payload.tsn,
            )

        return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]


class TuyaOnOff(CustomCluster, OnOff):
    """Tuya On/Off cluster for On/Off device."""

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.switch_bus.add_listener(self)

    def switch_event(self, channel, state):
        """Switch event."""
        _LOGGER.debug(
            "%s - Received switch event message, channel: %d, state: %d",
            self.endpoint.device.ieee,
            channel,
            state,
        )
        # update status only if event == endpoint
        if self.endpoint.endpoint_id == channel:
            self._update_attribute(ATTR_ON_OFF, state)

    async def command(
        self,
        command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
        *args,
        manufacturer: Optional[Union[int, t.uint16_t]] = None,
        expect_reply: bool = True,
        tsn: Optional[Union[int, t.uint8_t]] = None,
    ):
        """Override the default Cluster command."""

        if command_id in (0x0000, 0x0001):
            cmd_payload = TuyaManufCluster.Command()
            cmd_payload.status = 0
            # cmd_payload.tsn = tsn if tsn else self.endpoint.device.application.get_sequence()
            cmd_payload.tsn = 0
            cmd_payload.command_id = TUYA_CMD_BASE + self.endpoint.endpoint_id
            cmd_payload.function = 0
            cmd_payload.data = [1, command_id]

            self.endpoint.device.command_bus.listener_event(
                TUYA_MCU_COMMAND,
                cmd_payload,
            )
            return foundation.Status.SUCCESS

        return foundation.Status.UNSUP_CLUSTER_COMMAND


class TuyaManufacturerClusterOnOff(TuyaManufCluster):
    """Manufacturer Specific Cluster of On/Off device."""

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple[TuyaManufCluster.Command],
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle cluster request."""

        if hdr.command_id in (0x0002, 0x0001):
            # Send default response because the MCU expects it
            if not hdr.frame_control.disable_default_response:
                self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)

            tuya_payload = args[0]
            self.endpoint.device.switch_bus.listener_event(
                SWITCH_EVENT,
                tuya_payload.command_id - TUYA_CMD_BASE,
                tuya_payload.data[1],
            )
        elif hdr.command_id == TUYA_SET_TIME:
            """Time event call super"""
            _LOGGER.debug("TUYA_SET_TIME --> hdr: %s, args: %s", hdr, args)
            super().handle_cluster_request(hdr, args, dst_addressing=dst_addressing)
        else:
            _LOGGER.warning("Unsupported command: %s", hdr)


class TuyaSwitch(CustomDevice):
    """Tuya switch device."""

    def __init__(self, *args, **kwargs):
        """Init device."""
        self.switch_bus = Bus()
        super().__init__(*args, **kwargs)


class TuyaDimmerSwitch(TuyaSwitch):
    """Tuya dimmer switch device."""

    def __init__(self, *args, **kwargs):
        """Init device."""
        self.dimmer_bus = Bus()
        super().__init__(*args, **kwargs)


class TuyaThermostatCluster(LocalDataCluster, Thermostat):
    """Thermostat cluster for Tuya thermostats."""

    _CONSTANT_ATTRIBUTES = {0x001B: Thermostat.ControlSequenceOfOperation.Heating_Only}

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.thermostat_bus.add_listener(self)

    def temperature_change(self, attr, value):
        """Local or target temperature change from device."""
        self._update_attribute(self.attributes_by_name[attr].id, value)

    def state_change(self, value):
        """State update from device."""
        if value == 0:
            mode = self.RunningMode.Off
            state = self.RunningState.Idle
        else:
            mode = self.RunningMode.Heat
            state = self.RunningState.Heat_State_On
        self._update_attribute(self.attributes_by_name["running_mode"].id, mode)
        self._update_attribute(self.attributes_by_name["running_state"].id, state)

    # pylint: disable=R0201
    def map_attribute(self, attribute, value):
        """Map standardized attribute value to dict of manufacturer values."""
        return {}

    async def write_attributes(self, attributes, manufacturer=None):
        """Implement writeable attributes."""

        records = self._write_attr_records(attributes)

        if not records:
            return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]

        manufacturer_attrs = {}
        for record in records:
            attr_name = self.attributes[record.attrid].name
            new_attrs = self.map_attribute(attr_name, record.value.value)

            _LOGGER.debug(
                "[0x%04x:%s:0x%04x] Mapping standard %s (0x%04x) "
                "with value %s to custom %s",
                self.endpoint.device.nwk,
                self.endpoint.endpoint_id,
                self.cluster_id,
                attr_name,
                record.attrid,
                repr(record.value.value),
                repr(new_attrs),
            )

            manufacturer_attrs.update(new_attrs)

        if not manufacturer_attrs:
            return [
                [
                    foundation.WriteAttributesStatusRecord(
                        foundation.Status.FAILURE, r.attrid
                    )
                    for r in records
                ]
            ]

        await self.endpoint.tuya_manufacturer.write_attributes(
            manufacturer_attrs, manufacturer=manufacturer
        )

        return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]

    # pylint: disable=W0236
    async def command(
        self,
        command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
        *args,
        manufacturer: Optional[Union[int, t.uint16_t]] = None,
        expect_reply: bool = True,
        tsn: Optional[Union[int, t.uint8_t]] = None,
    ):
        """Implement thermostat commands."""

        if command_id != 0x0000:
            return foundation.GENERAL_COMMANDS[
                foundation.GeneralCommand.Default_Response
            ].schema(
                command_id=command_id, status=foundation.Status.UNSUP_CLUSTER_COMMAND
            )

        mode, offset = args
        if mode not in (self.SetpointMode.Heat, self.SetpointMode.Both):
            return foundation.GENERAL_COMMANDS[
                foundation.GeneralCommand.Default_Response
            ].schema(command_id=command_id, status=foundation.Status.INVALID_VALUE)

        attrid = self.attributes_by_name["occupied_heating_setpoint"].id

        success, _ = await self.read_attributes((attrid,), manufacturer=manufacturer)
        try:
            current = success[attrid]
        except KeyError:
            return foundation.Status.FAILURE

        # offset is given in decidegrees, see Zigbee cluster specification
        (res,) = await self.write_attributes(
            {"occupied_heating_setpoint": current + offset * 10},
            manufacturer=manufacturer,
        )
        return foundation.GENERAL_COMMANDS[
            foundation.GeneralCommand.Default_Response
        ].schema(command_id=command_id, status=res[0].status)
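

# Illustrative sketch of the unit conversion in command() above: the
# Setpoint_Raise_Lower amount arrives in decidegrees Celsius, while the
# occupied_heating_setpoint attribute is stored in centidegrees, hence
# 'current + offset * 10'. Hypothetical helper for illustration only.
def _raise_setpoint(current_centideg: int, offset_decideg: int) -> int:
    """Apply a decidegree offset to a centidegree setpoint."""
    return current_centideg + offset_decideg * 10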


class TuyaUserInterfaceCluster(LocalDataCluster, UserInterface):
    """HVAC User interface cluster for tuya thermostats."""

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.ui_bus.add_listener(self)

    def child_lock_change(self, mode):
        """Change of child lock setting."""
        if mode == 0:
            lockout = self.KeypadLockout.No_lockout
        else:
            lockout = self.KeypadLockout.Level_1_lockout

        self._update_attribute(self.attributes_by_name["keypad_lockout"].id, lockout)

    def map_attribute(self, attribute, value):
        """Map standardized attribute value to dict of manufacturer values."""
        return {}

    async def write_attributes(self, attributes, manufacturer=None):
        """Defer the keypad_lockout attribute to child_lock."""

        records = self._write_attr_records(attributes)

        manufacturer_attrs = {}
        for record in records:
            if record.attrid == self.attributes_by_name["keypad_lockout"].id:
                lock = 0 if record.value.value == self.KeypadLockout.No_lockout else 1
                new_attrs = {self._CHILD_LOCK_ATTR: lock}
            else:
                attr_name = self.attributes[record.attrid].name
                new_attrs = self.map_attribute(attr_name, record.value.value)

                _LOGGER.debug(
                    "[0x%04x:%s:0x%04x] Mapping standard %s (0x%04x) "
                    "with value %s to custom %s",
                    self.endpoint.device.nwk,
                    self.endpoint.endpoint_id,
                    self.cluster_id,
                    attr_name,
                    record.attrid,
                    repr(record.value.value),
                    repr(new_attrs),
                )

            manufacturer_attrs.update(new_attrs)

        if not manufacturer_attrs:
            return [
                [
                    foundation.WriteAttributesStatusRecord(
                        foundation.Status.FAILURE, r.attrid
                    )
                    for r in records
                ]
            ]

        await self.endpoint.tuya_manufacturer.write_attributes(
            manufacturer_attrs, manufacturer=manufacturer
        )

        return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]


class TuyaPowerConfigurationCluster(LocalDataCluster, PowerConfiguration):
    """PowerConfiguration cluster for battery-operated thermostats."""

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.battery_bus.add_listener(self)

    def battery_change(self, value):
        """Change of reported battery percentage remaining."""
        self._update_attribute(
            self.attributes_by_name["battery_percentage_remaining"].id, value * 2
        )
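

# Illustrative sketch: ZCL encodes battery_percentage_remaining in 0.5 %
# steps, which is why the 0-100 value reported by the device is doubled in
# battery_change() above. Hypothetical helper for illustration only.
def _battery_percentage_to_zcl(percent: int) -> int:
    """Convert a 0-100 battery percentage to ZCL half-percent units."""
    return percent * 2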


class TuyaPowerConfigurationCluster2AA(TuyaPowerConfigurationCluster):
    """PowerConfiguration cluster for battery-operated TRVs with 2 AA."""

    BATTERY_SIZES = 0x0031
    BATTERY_RATED_VOLTAGE = 0x0034
    BATTERY_QUANTITY = 0x0033

    _CONSTANT_ATTRIBUTES = {
        BATTERY_SIZES: 3,
        BATTERY_RATED_VOLTAGE: 15,
        BATTERY_QUANTITY: 2,
    }


class TuyaPowerConfigurationCluster3AA(TuyaPowerConfigurationCluster):
    """PowerConfiguration cluster for battery-operated TRVs with 3 AA."""

    BATTERY_SIZES = 0x0031
    BATTERY_RATED_VOLTAGE = 0x0034
    BATTERY_QUANTITY = 0x0033

    _CONSTANT_ATTRIBUTES = {
        BATTERY_SIZES: 3,
        BATTERY_RATED_VOLTAGE: 15,
        BATTERY_QUANTITY: 3,
    }


class TuyaThermostat(CustomDevice):
    """Generic Tuya thermostat device."""

    def __init__(self, *args, **kwargs):
        """Init device."""
        self.thermostat_bus = Bus()
        self.ui_bus = Bus()
        self.battery_bus = Bus()
        super().__init__(*args, **kwargs)


# Tuya Zigbee OnOff Cluster Attribute Implementation
class SwitchBackLight(t.enum8):
    """Tuya switch back light mode enum."""

    Mode_0 = 0x00
    Mode_1 = 0x01
    Mode_2 = 0x02


class SwitchMode(t.enum8):
    """Tuya switch mode enum."""

    Command = 0x00
    Event = 0x01


class PowerOnState(t.enum8):
    """Tuya power on state enum."""

    Off = 0x00
    On = 0x01
    LastState = 0x02


class TuyaZBOnOffAttributeCluster(CustomCluster, OnOff):
    """Tuya Zigbee On Off cluster with extra attributes."""

    attributes = OnOff.attributes.copy()
    attributes.update({0x8000: ("child_lock", t.Bool)})
    attributes.update({0x8001: ("backlight_mode", SwitchBackLight)})
    attributes.update({0x8002: ("power_on_state", PowerOnState)})
    attributes.update({0x8004: ("switch_mode", SwitchMode)})


class TuyaSmartRemoteOnOffCluster(OnOff, EventableCluster):
    """TuyaSmartRemoteOnOffCluster: fire events corresponding to press type."""

    rotate_type = {
        0x00: RIGHT,
        0x01: LEFT,
        0x02: STOP,
    }
    press_type = {
        0x00: SHORT_PRESS,
        0x01: DOUBLE_PRESS,
        0x02: LONG_PRESS,
    }
    name = "TS004X_cluster"
    ep_attribute = "TS004X_cluster"
    attributes = OnOff.attributes.copy()
    attributes.update({0x8001: ("backlight_mode", SwitchBackLight)})
    attributes.update({0x8002: ("power_on_state", PowerOnState)})
    attributes.update({0x8004: ("switch_mode", SwitchMode)})

    def __init__(self, *args, **kwargs):
        """Init."""
        self.last_tsn = -1
        super().__init__(*args, **kwargs)

    server_commands = OnOff.server_commands.copy()
    server_commands.update(
        {
            0xFC: foundation.ZCLCommandDef(
                "rotate_type",
                {"rotate_type": t.uint8_t},
                False,
                is_manufacturer_specific=True,
            ),
            0xFD: foundation.ZCLCommandDef(
                "press_type",
                {"press_type": t.uint8_t},
                False,
                is_manufacturer_specific=True,
            ),
        }
    )

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: List[Any],
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ):
        """Handle press_types command."""
        # Normally, once a default response has been sent, TS004x devices do not
        # repeat the same ZCL frame (same sequence number). For robustness (e.g.
        # when the response never reaches the device), duplicates are simply ignored.
        if hdr.tsn == self.last_tsn:
            _LOGGER.debug("TS004X: ignoring duplicate frame")
            return
        # save last sequence number
        self.last_tsn = hdr.tsn

        # send default response (as soon as possible), so avoid repeated zclframe from device
        if not hdr.frame_control.disable_default_response:
            self.debug("TS004X: send default response")
            self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)
        # handle command
        if hdr.command_id == 0xFC:
            rotate_type = args[0]
            self.listener_event(
                ZHA_SEND_EVENT, self.rotate_type.get(rotate_type, "unknown"), []
            )
        elif hdr.command_id == 0xFD:
            press_type = args[0]
            self.listener_event(
                ZHA_SEND_EVENT, self.press_type.get(press_type, "unknown"), []
            )


# Tuya Zigbee Metering Cluster Correction Implementation
class TuyaZBMeteringCluster(CustomCluster, Metering):
    """Divides the kWh for tuya."""

    MULTIPLIER = 0x0301
    DIVISOR = 0x0302
    _CONSTANT_ATTRIBUTES = {MULTIPLIER: 1, DIVISOR: 100}


class TuyaZBElectricalMeasurement(CustomCluster, ElectricalMeasurement):
    """Divides the Current for tuya."""

    AC_CURRENT_MULTIPLIER = 0x0602
    AC_CURRENT_DIVISOR = 0x0603
    _CONSTANT_ATTRIBUTES = {AC_CURRENT_MULTIPLIER: 1, AC_CURRENT_DIVISOR: 1000}


# Tuya Zigbee Cluster 0xE000 Implementation
class TuyaZBE000Cluster(CustomCluster):
    """Tuya manufacturer specific cluster 57344."""

    name = "Tuya Manufacturer Specific"
    cluster_id = TUYA_CLUSTER_E000_ID
    ep_attribute = "tuya_is_pita_0"


# Tuya Zigbee Cluster 0xE001 Implementation
class ExternalSwitchType(t.enum8):
    """Tuya external switch type enum."""

    Toggle = 0x00
    State = 0x01
    Momentary = 0x02


class TuyaZBExternalSwitchTypeCluster(CustomCluster):
    """Tuya External Switch Type Cluster."""

    name = "Tuya External Switch Type Cluster"
    cluster_id = TUYA_CLUSTER_E001_ID
    ep_attribute = "tuya_external_switch_type"
    attributes = {0xD030: ("external_switch_type", ExternalSwitchType)}


# Tuya Window Cover Implementation
class TuyaManufacturerWindowCover(TuyaManufCluster):
    """Manufacturer Specific Cluster for cover device."""

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple[TuyaManufCluster.Command],
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle cluster request."""
        """Tuya Specific Cluster Commands"""
        if hdr.command_id in (TUYA_GET_DATA, TUYA_SET_DATA_RESPONSE):
            tuya_payload = args[0]
            _LOGGER.debug(
                "%s Received Attribute Report. Command is 0x%04x, Tuya Paylod values"
                "[Status : %s, TSN: %s, Command: 0x%04x, Function: 0x%02x, Data: %s]",
                self.endpoint.device.ieee,
                hdr.command_id,
                tuya_payload.status,
                tuya_payload.tsn,
                tuya_payload.command_id,
                tuya_payload.function,
                tuya_payload.data,
            )

            if tuya_payload.command_id == TUYA_DP_TYPE_VALUE + TUYA_DP_ID_PERCENT_STATE:
                self.endpoint.device.cover_bus.listener_event(
                    COVER_EVENT,
                    ATTR_COVER_POSITION,
                    tuya_payload.data[4],
                )
            elif (
                tuya_payload.command_id
                == TUYA_DP_TYPE_VALUE + TUYA_DP_ID_PERCENT_CONTROL
            ):
                self.endpoint.device.cover_bus.listener_event(
                    COVER_EVENT,
                    ATTR_COVER_POSITION,
                    tuya_payload.data[4],
                )
            elif (
                tuya_payload.command_id
                == TUYA_DP_TYPE_ENUM + TUYA_DP_ID_DIRECTION_CHANGE
            ):
                self.endpoint.device.cover_bus.listener_event(
                    COVER_EVENT,
                    ATTR_COVER_DIRECTION,
                    tuya_payload.data[1],
                )
            elif (
                tuya_payload.command_id == TUYA_DP_TYPE_ENUM + TUYA_DP_ID_COVER_INVERTED
            ):
                self.endpoint.device.cover_bus.listener_event(
                    COVER_EVENT,
                    ATTR_COVER_INVERTED,
                    tuya_payload.data[1],  # Check this
                )
        elif hdr.command_id == TUYA_SET_TIME:
            """Time event call super"""
            super().handle_cluster_request(hdr, args, dst_addressing=dst_addressing)
        else:
            _LOGGER.debug(
                "%s Received Attribute Report - Unknown Command. Self [%s], Header [%s], Tuya Paylod [%s]",
                self.endpoint.device.ieee,
                self,
                hdr,
                args,
            )


class TuyaWindowCoverControl(LocalDataCluster, WindowCovering):
    """Manufacturer Specific Cluster of Device cover."""

    """Add additional attributes for direction"""
    attributes = WindowCovering.attributes.copy()
    attributes.update({ATTR_COVER_DIRECTION: ("motor_direction", t.Bool)})
    attributes.update({ATTR_COVER_INVERTED: ("cover_inverted", t.Bool)})

    def __init__(self, *args, **kwargs):
        """Initialize instance."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.cover_bus.add_listener(self)

    def cover_event(self, attribute, value):
        """Event listener for cover events."""
        if attribute == ATTR_COVER_POSITION:
            invert_attr = self._attr_cache.get(ATTR_COVER_INVERTED) == 1
            invert = (
                not invert_attr
                if self.endpoint.device.manufacturer in TUYA_COVER_INVERTED_BY_DEFAULT
                else invert_attr
            )
            value = value if invert else 100 - value
        self._update_attribute(attribute, value)
        _LOGGER.debug(
            "%s Tuya Attribute Cache : [%s]",
            self.endpoint.device.ieee,
            self._attr_cache,
        )

    def command(
        self,
        command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
        *args,
        manufacturer: Optional[Union[int, t.uint16_t]] = None,
        expect_reply: bool = True,
        tsn: Optional[Union[int, t.uint8_t]] = None,
    ):
        """Override the default Cluster command."""
        if manufacturer is None:
            manufacturer = self.endpoint.device.manufacturer
        _LOGGER.debug(
            "%s Sending Tuya Cluster Command.. Manufacturer is %s Cluster Command is 0x%04x, Arguments are %s",
            self.endpoint.device.ieee,
            manufacturer,
            command_id,
            args,
        )
        # Open Close or Stop commands
        tuya_payload = TuyaManufCluster.Command()
        if command_id in (
            WINDOW_COVER_COMMAND_UPOPEN,
            WINDOW_COVER_COMMAND_DOWNCLOSE,
            WINDOW_COVER_COMMAND_STOP,
        ):
            tuya_payload.status = 0
            tuya_payload.tsn = tsn if tsn else 0
            tuya_payload.command_id = TUYA_DP_TYPE_ENUM + TUYA_DP_ID_CONTROL
            tuya_payload.function = 0
            tuya_payload.data = [
                1,
                # need to implement direction change
                TUYA_COVER_COMMAND[manufacturer][command_id],
            ]  # remap the command to the Tuya command
        # Set Position Command
        elif command_id == WINDOW_COVER_COMMAND_LIFTPERCENT:
            tuya_payload.status = 0
            tuya_payload.tsn = tsn if tsn else 0
            tuya_payload.command_id = TUYA_DP_TYPE_VALUE + TUYA_DP_ID_PERCENT_CONTROL
            tuya_payload.function = 0
            """Check direction and correct value"""
            invert_attr = self._attr_cache.get(ATTR_COVER_INVERTED) == 1
            invert = (
                not invert_attr
                if self.endpoint.device.manufacturer in TUYA_COVER_INVERTED_BY_DEFAULT
                else invert_attr
            )
            position = args[0] if invert else 100 - args[0]
            tuya_payload.data = [
                4,
                0,
                0,
                0,
                position,
            ]
        # Custom Command
        elif command_id == WINDOW_COVER_COMMAND_CUSTOM:
            tuya_payload.status = args[0]
            tuya_payload.tsn = args[1]
            tuya_payload.command_id = args[2]
            tuya_payload.function = args[3]
            tuya_payload.data = args[4]
        else:
            tuya_payload = None
        # Send the command
        if tuya_payload is not None and tuya_payload.command_id:
            _LOGGER.debug(
                "%s Sending Tuya Command. Paylod values [endpoint_id : %s, "
                "Status : %s, TSN: %s, Command: 0x%04x, Function: %s, Data: %s]",
                self.endpoint.device.ieee,
                self.endpoint.endpoint_id,
                tuya_payload.status,
                tuya_payload.tsn,
                tuya_payload.command_id,
                tuya_payload.function,
                tuya_payload.data,
            )

            return self.endpoint.tuya_manufacturer.command(
                TUYA_SET_DATA, tuya_payload, expect_reply=True
            )
        else:
            _LOGGER.debug("Unrecognised command: %x", command_id)
            return foundation.Status.UNSUP_CLUSTER_COMMAND
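

# Illustrative, standalone sketch of the inversion rule shared by cover_event()
# and command() above: manufacturers listed in TUYA_COVER_INVERTED_BY_DEFAULT
# flip the meaning of the cover_inverted attribute before the '100 - value'
# conversion between Tuya and ZCL position conventions. Hypothetical helper.
def _cover_position_to_zcl(
    value: int, inverted_attr: bool, inverted_by_default: bool
) -> int:
    """Mirror the quirk's position inversion logic with plain booleans."""
    invert = (not inverted_attr) if inverted_by_default else inverted_attr
    return value if invert else 100 - value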


class TuyaWindowCover(CustomDevice):
    """Tuya switch device."""

    def __init__(self, *args, **kwargs):
        """Init device."""
        self.cover_bus = Bus()
        super().__init__(*args, **kwargs)


class TuyaManufacturerLevelControl(TuyaManufCluster):
    """Manufacturer Specific Cluster for cover device."""

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple[TuyaManufCluster.Command],
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle cluster request."""
        tuya_payload = args[0]

        _LOGGER.debug(
            "%s Received Attribute Report. Command is %x, Tuya Paylod values"
            "[Status : %s, TSN: %s, Command: %s, Function: %s, Data: %s]",
            self.endpoint.device.ieee,
            hdr.command_id,
            tuya_payload.status,
            tuya_payload.tsn,
            tuya_payload.command_id,
            tuya_payload.function,
            tuya_payload.data,
        )

        if hdr.command_id in (0x0002, 0x0001):
            if tuya_payload.command_id == TUYA_LEVEL_COMMAND:
                self.endpoint.device.dimmer_bus.listener_event(
                    LEVEL_EVENT,
                    tuya_payload.command_id,
                    tuya_payload.data,
                )
            else:
                self.endpoint.device.switch_bus.listener_event(
                    SWITCH_EVENT,
                    tuya_payload.command_id - TUYA_CMD_BASE,
                    tuya_payload.data[1],
                )


class TuyaLevelControl(CustomCluster, LevelControl):
    """Tuya Level cluster for dimmable device."""

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.dimmer_bus.add_listener(self)

    def level_event(self, channel, state):
        """Level event."""
        level = (((state[3] << 8) + state[4]) * 255) // 1000
        _LOGGER.debug(
            "%s - Received level event message, channel: %d, level: %d, data: %d",
            self.endpoint.device.ieee,
            channel,
            level,
            state,
        )
        self._update_attribute(self.attributes_by_name["current_level"].id, level)

    def command(
        self,
        command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
        *args,
        manufacturer: Optional[Union[int, t.uint16_t]] = None,
        expect_reply: bool = True,
        tsn: Optional[Union[int, t.uint8_t]] = None,
    ):
        """Override the default Cluster command."""
        _LOGGER.debug(
            "%s Sending Tuya Cluster Command.. Cluster Command is %x, Arguments are %s",
            self.endpoint.device.ieee,
            command_id,
            args,
        )
        # Move to level
        # move_to_level_with_on_off
        if command_id in (0x0000, 0x0001, 0x0004):
            cmd_payload = TuyaManufCluster.Command()
            cmd_payload.status = 0
            cmd_payload.tsn = 0
            cmd_payload.command_id = TUYA_LEVEL_COMMAND
            cmd_payload.function = 0
            brightness = (args[0] * 1000) // 255
            val1 = brightness >> 8
            val2 = brightness & 0xFF
            cmd_payload.data = [4, 0, 0, val1, val2]  # Custom Command

            return self.endpoint.tuya_manufacturer.command(
                TUYA_SET_DATA, cmd_payload, expect_reply=True
            )

        return foundation.Status.UNSUP_CLUSTER_COMMAND
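

# Illustrative sketch of the scaling in level_event()/command() above: ZCL
# levels are 0-255 while the Tuya MCU uses 0-1000 carried as two bytes; the
# integer division makes mid-range round trips lossy by design. Hypothetical
# helpers for illustration only.
def _zcl_level_to_tuya(level: int):
    """Split a 0-255 ZCL level into the Tuya high/low brightness bytes."""
    brightness = (level * 1000) // 255
    return brightness >> 8, brightness & 0xFF


def _tuya_level_to_zcl(hi: int, lo: int) -> int:
    """Rebuild a 0-255 ZCL level from the Tuya high/low bytes."""
    return (((hi << 8) + lo) * 255) // 1000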


class TuyaLocalCluster(LocalDataCluster):
    """Tuya virtual clusters.

    Prevents attribute reads and writes. Attribute writes could be converted
    to DataPoint updates.
    """

    def update_attribute(self, attr_name: str, value: Any) -> None:
        """Update attribute by attribute name."""

        try:
            attr = self.attributes_by_name[attr_name]
        except KeyError:
            self.debug("no such attribute: %s", attr_name)
            return
        return self._update_attribute(attr.id, value)


@dataclasses.dataclass
class DPToAttributeMapping:
    """Container for datapoint to cluster attribute update mapping."""

    ep_attribute: str
    attribute_name: str
    converter: Optional[
        Callable[
            [
                Any,
            ],
            Any,
        ]
    ] = None
    endpoint_id: Optional[int] = None


class TuyaNewManufCluster(CustomCluster):
    """Tuya manufacturer specific cluster.

    This is an attempt to consolidate the multiple above clusters into a
    single framework. Instead of overriding the handle_cluster_request()
    method, implement handlers for commands, like get_data, set_data_response,
    set_time_request, etc.
    """

    name: str = "Tuya Manufacturer Specific"
    cluster_id: t.uint16_t = TUYA_CLUSTER_ID
    ep_attribute: str = "tuya_manufacturer"

    server_commands = {
        TUYA_SET_DATA: foundation.ZCLCommandDef(
            "set_data", {"data": TuyaCommand}, False, is_manufacturer_specific=True
        ),
        TUYA_SEND_DATA: foundation.ZCLCommandDef(
            "send_data", {"data": TuyaCommand}, False, is_manufacturer_specific=True
        ),
        TUYA_SET_TIME: foundation.ZCLCommandDef(
            "set_time", {"time": TuyaTimePayload}, False, is_manufacturer_specific=True
        ),
    }

    client_commands = {
        TUYA_GET_DATA: foundation.ZCLCommandDef(
            "get_data", {"data": TuyaCommand}, True, is_manufacturer_specific=True
        ),
        TUYA_SET_DATA_RESPONSE: foundation.ZCLCommandDef(
            "set_data_response",
            {"data": TuyaCommand},
            True,
            is_manufacturer_specific=True,
        ),
        TUYA_ACTIVE_STATUS_RPT: foundation.ZCLCommandDef(
            "active_status_report",
            {"data": TuyaCommand},
            True,
            is_manufacturer_specific=True,
        ),
        TUYA_SET_TIME: foundation.ZCLCommandDef(
            "set_time_request", {"data": t.data16}, True, is_manufacturer_specific=True
        ),
    }

    data_point_handlers: Dict[int, str] = {}

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple,
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle cluster specific request."""

        try:
            if hdr.is_reply:
                # server_cluster -> client_cluster cluster specific command
                handler_name = f"handle_{self.client_commands[hdr.command_id].name}"
            else:
                handler_name = f"handle_{self.server_commands[hdr.command_id].name}"
        except KeyError:
            self.debug(
                "Received unknown manufacturer command %s: %s", hdr.command_id, args
            )
            if not hdr.frame_control.disable_default_response:
                self.send_default_rsp(
                    hdr, status=foundation.Status.UNSUP_CLUSTER_COMMAND
                )
            return

        try:
            status = getattr(self, handler_name)(*args)
        except AttributeError:
            self.warning(
                "No '%s' tuya handler found for %s",
                handler_name,
                args,
            )
            status = foundation.Status.UNSUP_CLUSTER_COMMAND

        if not hdr.frame_control.disable_default_response:
            self.send_default_rsp(hdr, status=status)

    def handle_get_data(self, command: TuyaCommand) -> foundation.Status:
        """Handle get_data response (report)."""
        try:
            dp_handler = self.data_point_handlers[command.dp]
            getattr(self, dp_handler)(command)
        except (AttributeError, KeyError):
            self.debug("No datapoint handler for %s", command)
            return foundation.Status.UNSUPPORTED_ATTRIBUTE

        return foundation.Status.SUCCESS

    handle_set_data_response = handle_get_data
    handle_active_status_report = handle_get_data

    def handle_set_time_request(self, payload: t.uint16_t) -> foundation.Status:
        """Handle Time set request."""
        return foundation.Status.SUCCESS

    def _dp_2_attr_update(self, command: TuyaCommand) -> None:
        """Handle data point to attribute report conversion."""
        try:
            dp_map = self.dp_to_attribute[command.dp]
        except KeyError:
            self.debug("No attribute mapping for %s data point", command.dp)
            return

        endpoint = self.endpoint
        if dp_map.endpoint_id:
            endpoint = self.endpoint.device.endpoints[dp_map.endpoint_id]
        cluster = getattr(endpoint, dp_map.ep_attribute)
        value = command.data.payload
        if dp_map.converter:
            value = dp_map.converter(value)

        cluster.update_attribute(dp_map.attribute_name, value)
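

# Illustrative, standalone sketch (hypothetical names, plain dicts instead of
# zigpy clusters) of the dispatch _dp_2_attr_update() performs: look up the
# mapping for a data point, apply the optional converter, then update the
# target attribute.
def _dispatch_dp(dp_map, dp, payload, attrs):
    """Apply a DP-to-attribute mapping to a simple attribute dict."""
    mapping = dp_map.get(dp)
    if mapping is None:
        return  # no attribute mapping for this data point
    attr_name, converter = mapping
    attrs[attr_name] = converter(payload) if converter else payload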

@sebastian3107

If you think it's helpful to put in a PR for this quirk just with the knowledge that open, close, and pause work, then I'll be happy to submit my quirk for that. I'm just not sure my quirk is "best".

That's much better than having nothing at all. I would definitely appreciate this PR. 👍🏼

@sebastian3107

sebastian3107 commented Jul 20, 2023

ts0601_cover.py
__init__.py

Thank you for providing the files. I needed to change line 13 to from zha_quirks import ( because I named the folder like this: /config/zha_quirks/ts0601_cover_local.py
Now it is working fine, so thank you to everyone involved. 🥇
I also own some curtain motors with device name TS0601 _TZE200_xaabybja and can confirm that they are also working fine with this custom quirk.

Still it would be better if this was provided as a PR.
That's also what the warning in the log says:
Loaded custom quirks. Please contribute them to https://github.com/zigpy/zha-device-handlers

@ChujieChen

ChujieChen commented Jul 27, 2023

=== updated on Jul 30, 2023 ===
The curtain is working now! I used the .zip uploaded by @TheJulianJES. Thanks!

The changes are rebased on tag 0.0.101.

Those files are:

/config/zha-device-handlers/zhaquirks/tuya/__init__.py
"""Tuya devices."""
import dataclasses
import datetime
import logging
from typing import Any, Callable, Dict, List, Optional, Tuple, Union

from zigpy.quirks import CustomCluster, CustomDevice
import zigpy.types as t
from zigpy.zcl import foundation
from zigpy.zcl.clusters.closures import WindowCovering
from zigpy.zcl.clusters.general import LevelControl, OnOff, PowerConfiguration
from zigpy.zcl.clusters.homeautomation import ElectricalMeasurement
from zigpy.zcl.clusters.hvac import Thermostat, UserInterface
from zigpy.zcl.clusters.smartenergy import Metering

from zhaquirks import Bus, EventableCluster, LocalDataCluster
from zhaquirks.const import (
    DOUBLE_PRESS,
    LEFT,
    LONG_PRESS,
    RIGHT,
    SHORT_PRESS,
    ZHA_SEND_EVENT,
)

# ---------------------------------------------------------
# Tuya Custom Cluster ID
# ---------------------------------------------------------
TUYA_CLUSTER_ID = 0xEF00
TUYA_CLUSTER_E000_ID = 0xE000
TUYA_CLUSTER_E001_ID = 0xE001
# ---------------------------------------------------------
# Tuya Cluster Commands
# ---------------------------------------------------------
TUYA_SET_DATA = 0x00
TUYA_GET_DATA = 0x01
TUYA_SET_DATA_RESPONSE = 0x02
TUYA_SEND_DATA = 0x04
TUYA_ACTIVE_STATUS_RPT = 0x06
TUYA_SET_TIME = 0x24
# TODO: To be checked
TUYA_MCU_VERSION_REQ = 0x10
TUYA_MCU_VERSION_RSP = 0x11
#
TUYA_LEVEL_COMMAND = 514

COVER_EVENT = "cover_event"
LEVEL_EVENT = "level_event"
TUYA_MCU_COMMAND = "tuya_mcu_command"

# Rotating for remotes
STOP = "stop"  # TODO: move to consts

# ---------------------------------------------------------
# Value for dp_type
# ---------------------------------------------------------
# ID    Name            Description
# ---------------------------------------------------------
# 0x00 	DP_TYPE_RAW 	?
# 0x01 	DP_TYPE_BOOL 	?
# 0x02 	DP_TYPE_VALUE 	4 byte unsigned integer
# 0x03 	DP_TYPE_STRING 	variable length string
# 0x04 	DP_TYPE_ENUM 	1 byte enum
# 0x05 	DP_TYPE_FAULT 	1 byte bitmap (didn't test yet)
TUYA_DP_TYPE_RAW = 0x0000
TUYA_DP_TYPE_BOOL = 0x0100
TUYA_DP_TYPE_VALUE = 0x0200
TUYA_DP_TYPE_STRING = 0x0300
TUYA_DP_TYPE_ENUM = 0x0400
TUYA_DP_TYPE_FAULT = 0x0500
# ---------------------------------------------------------
# Value for dp_identifier (These are device specific)
# ---------------------------------------------------------
# ID    Name               Type    Description
# ---------------------------------------------------------
# 0x01  control            enum    open, stop, close, continue
# 0x02  percent_control    value   0-100% control
# 0x03  percent_state      value   Report from motor about current percentage
# 0x04  control_back       enum    Configures motor direction (untested)
# 0x05  work_state         enum    Motor Direction Setting
# 0x06  situation_set      enum    Configures if 100% equals to fully closed or fully open (untested)
# 0x07  fault              bitmap  Anything but 0 means something went wrong (untested)
TUYA_DP_ID_CONTROL = 0x01
TUYA_DP_ID_PERCENT_CONTROL = 0x02
TUYA_DP_ID_PERCENT_STATE = 0x03
TUYA_DP_ID_DIRECTION_CHANGE = 0x05
TUYA_DP_ID_COVER_INVERTED = 0x06
# ---------------------------------------------------------
# Window Cover Server Commands
# ---------------------------------------------------------
WINDOW_COVER_COMMAND_UPOPEN = 0x0000
WINDOW_COVER_COMMAND_DOWNCLOSE = 0x0001
WINDOW_COVER_COMMAND_STOP = 0x0002
WINDOW_COVER_COMMAND_LIFTPERCENT = 0x0005
WINDOW_COVER_COMMAND_CUSTOM = 0x0006
# ---------------------------------------------------------
# TUYA Cover Custom Values
# ---------------------------------------------------------
COVER_EVENT = "cover_event"
ATTR_COVER_POSITION = 0x0008
ATTR_COVER_DIRECTION = 0x8001
ATTR_COVER_INVERTED = 0x8002
# For most tuya devices 0 = Up/Open, 1 = Stop, 2 = Down/Close
TUYA_COVER_COMMAND = {
    "_TZE200_zah67ekd": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_fzo2pocs": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_xuzcvlku": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_rddyvrci": {0x0000: 0x0002, 0x0001: 0x0001, 0x0002: 0x0000},
    "_TZE200_3i3exuay": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_nueqqe6k": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_gubdgai2": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_zpzndjez": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_cowvfni3": {0x0000: 0x0002, 0x0001: 0x0000, 0x0002: 0x0001},
    "_TYST11_wmcdj3aq": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_yenbr4om": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_5sbebbzs": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_xaabybja": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_hsgrhjpf": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_iossyxra": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_68nvbio9": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_zuz7f94z": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_ergbiejo": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
    "_TZE200_rmymn92d": {0x0000: 0x0000, 0x0001: 0x0002, 0x0002: 0x0001},
}
# Taken from zigbee-herdsman-converters
# Contains all covers which need their position inverted by default
# Default is 100 = open, 0 = closed; Devices listed here will use 0 = open, 100 = closed instead
# Use manufacturerName to identify device!
# Don't invert _TZE200_cowvfni3: https://github.com/Koenkk/zigbee2mqtt/issues/6043
TUYA_COVER_INVERTED_BY_DEFAULT = [
    "_TZE200_wmcdj3aq",
    "_TZE200_nogaemzt",
    "_TZE200_xuzcvlku",
    "_TZE200_xaabybja",
    "_TZE200_yenbr4om",
    "_TZE200_zpzndjez",
    "_TZE200_zuz7f94z",
    "_TZE200_rmymn92d",
]
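
A minimal sketch (the helper name `normalize_position` is hypothetical, not part of the quirk) of the rule this list encodes: the listed models report 0 = open / 100 = closed, so their percentage must be flipped to match the default 100 = open convention.

```python
# Hypothetical helper illustrating the inversion rule above; the set below
# is just a local excerpt of TUYA_COVER_INVERTED_BY_DEFAULT.
def normalize_position(manufacturer: str, raw_percent: int) -> int:
    inverted_by_default = {"_TZE200_rmymn92d", "_TZE200_xaabybja"}
    if manufacturer in inverted_by_default:
        return 100 - raw_percent  # 0 = open on the wire -> 100 = open in ZCL
    return raw_percent
```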

# ---------------------------------------------------------
# TUYA Switch Custom Values
# ---------------------------------------------------------
SWITCH_EVENT = "switch_event"
ATTR_ON_OFF = 0x0000
ATTR_COVER_POSITION = 0x0008
TUYA_CMD_BASE = 0x0100
# ---------------------------------------------------------
# DP Value meanings in Status Report
# ---------------------------------------------------------
# Type ID    IntDP   Description
# ---------------------------------------------------------
# 0x04 0x01  1025    Confirm opening/closing/stopping (triggered from Zigbee)
# 0x02 0x02   514    Started moving to position (triggered from Zigbee)
# 0x04 0x07  1031    Started moving (triggered by transmitter order pulling on curtain)
# 0x02 0x03   515    Arrived at position
# 0x01 0x05   261    Returned by configuration set; ignore
# 0x02 0x69   617    Not sure what this is
# 0x04 0x05  1029    Changed the Motor Direction
# 0x04 0x65  1125    Change of tilt/lift mode 1 = lift 0=tilt
# ---------------------------------------------------------

_LOGGER = logging.getLogger(__name__)


class BigEndianInt16(int):
    """Helper class to represent big endian 16 bit value."""

    def serialize(self) -> bytes:
        """Value serialisation."""

        try:
            return self.to_bytes(2, "big", signed=False)
        except OverflowError as e:
            # OverflowError is not a subclass of ValueError, making it annoying to catch
            raise ValueError(str(e)) from e

    @classmethod
    def deserialize(cls, data: bytes) -> Tuple["BigEndianInt16", bytes]:
        """Value deserialisation."""

        if len(data) < 2:
            raise ValueError("Data is too short to contain 2 bytes")

        r = cls.from_bytes(data[:2], "big", signed=False)
        data = data[2:]
        return r, data


class TuyaTimePayload(t.LVList, item_type=t.uint8_t, length_type=BigEndianInt16):
    """Tuya set time payload definition."""


class TuyaDPType(t.enum8):
    """DataPoint Type."""

    RAW = 0x00
    BOOL = 0x01
    VALUE = 0x02
    STRING = 0x03
    ENUM = 0x04
    BITMAP = 0x05


class TuyaData(t.Struct):
    """Tuya Data type."""

    dp_type: TuyaDPType
    function: t.uint8_t
    raw: t.LVBytes

    @classmethod
    def deserialize(cls, data: bytes) -> Tuple["TuyaData", bytes]:
        """Deserialize data."""
        res = cls()
        res.dp_type, data = TuyaDPType.deserialize(data)
        res.function, data = t.uint8_t.deserialize(data)
        res.raw, data = t.LVBytes.deserialize(data)
        if res.dp_type not in (TuyaDPType.BITMAP, TuyaDPType.STRING, TuyaDPType.ENUM):
            res.raw = res.raw[::-1]
        return res, data

    @property
    def payload(self) -> Union[t.Bool, t.CharacterString, t.uint32_t, t.data32]:
        """Payload accordingly to data point type."""
        if self.dp_type == TuyaDPType.VALUE:
            return t.uint32_t.deserialize(self.raw)[0]
        elif self.dp_type == TuyaDPType.BOOL:
            return t.Bool.deserialize(self.raw)[0]
        elif self.dp_type == TuyaDPType.STRING:
            return self.raw.decode("utf8")
        elif self.dp_type == TuyaDPType.ENUM:
            return t.enum8.deserialize(self.raw)[0]
        elif self.dp_type == TuyaDPType.BITMAP:
            bitmaps = {1: t.bitmap8, 2: t.bitmap16, 4: t.bitmap32}
            try:
                return bitmaps[len(self.raw)].deserialize(self.raw)[0]
            except KeyError as exc:
                raise ValueError(f"Wrong bitmap length: {len(self.raw)}") from exc

        raise ValueError(f"Unknown {self.dp_type} datapoint type")
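
A standalone sketch (no zigpy types, function name hypothetical) of the decoding `TuyaData.payload` performs: VALUE is a 4-byte integer, BOOL and ENUM are single bytes, STRING is UTF-8. Wire values are big-endian, which is why the real `deserialize()` reverses the raw bytes before handing them to little-endian zigpy types.

```python
# Stand-in for TuyaData.payload, working directly on big-endian wire bytes.
def decode_dp(dp_type: int, wire: bytes):
    if dp_type == 0x02:              # VALUE: 4-byte unsigned integer
        return int.from_bytes(wire, "big")
    if dp_type in (0x01, 0x04):      # BOOL / ENUM: single byte
        return wire[0]
    if dp_type == 0x03:              # STRING: variable-length utf-8
        return wire.decode("utf8")
    raise ValueError(f"unhandled dp_type 0x{dp_type:02x}")
```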


class Data(t.List, item_type=t.uint8_t):
    """list of uint8_t."""

    @classmethod
    def from_value(cls, value):
        """Convert from a zigpy typed value to a tuya data payload."""
        # serialized in little-endian by zigpy
        data = cls(value.serialize())
        # we want big-endian, with length prepended
        data.append(len(data))
        data.reverse()
        return data

    def to_value(self, ztype):
        """Convert from a tuya data payload to a zigpy typed value."""
        # first uint8_t is the length of the remaining data
        # tuya data is in big endian whereas ztypes use little endian
        value, _ = ztype.deserialize(bytes(reversed(self[1:])))
        return value
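
A pure-Python sketch (helper names hypothetical) of the framing `Data` implements above: zigpy serializes values little-endian, while the Tuya MCU expects a length byte followed by the value big-endian. Appending the length and reversing achieves both in one pass.

```python
# Mirror of Data.from_value / Data.to_value without zigpy.
def tuya_data_from_le(le_bytes: bytes) -> list:
    data = list(le_bytes)    # little-endian value bytes
    data.append(len(data))   # length appended at the end...
    data.reverse()           # ...leads after the reversal
    return data

def tuya_data_to_int(data: list) -> int:
    # first byte is the length; the remainder is the big-endian value
    return int.from_bytes(bytes(data[1:]), "big")
```

For example, `tuya_data_from_le((25).to_bytes(4, "little"))` yields `[4, 0, 0, 0, 25]`, and `tuya_data_to_int` recovers `25` from it.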


class TuyaCommand(t.Struct):
    """Tuya manufacturer cluster command."""

    status: t.uint8_t
    tsn: t.uint8_t
    dp: t.uint8_t
    data: TuyaData


class TuyaManufCluster(CustomCluster):
    """Tuya manufacturer specific cluster."""

    name = "Tuya Manufacturer Specific"
    cluster_id = TUYA_CLUSTER_ID
    ep_attribute = "tuya_manufacturer"
    set_time_offset = 0
    set_time_local_offset = None

    class Command(t.Struct):
        """Tuya manufacturer cluster command."""

        status: t.uint8_t
        tsn: t.uint8_t
        command_id: t.uint16_t
        function: t.uint8_t
        data: Data

    class MCUVersionRsp(t.Struct):
        """Tuya MCU version response Zcl payload."""

        tsn: t.uint16_t
        version: t.uint8_t

    """Time sync command (it's transparent between the MCU and the server).
            Time request, device -> server:
               payloadSize = 0
            Set time, server -> device:
               payloadSize should always be 8
               payload[0-3] - UTC timestamp (big endian)
               payload[4-7] - Local timestamp (big endian)

            The Zigbee payload is very similar to the UART payload described here: https://developer.tuya.com/en/docs/iot/device-development/access-mode-mcu/zigbee-general-solution/tuya-zigbee-module-uart-communication-protocol/tuya-zigbee-module-uart-communication-protocol?id=K9ear5khsqoty#title-10-Time%20synchronization

            Some devices need the timestamp in seconds since 1/1/1970, others in seconds since 1/1/2000.
            There are also devices which use both timestamp variants (probably a bug). Use the set_time_local_offset var in those cases.

            NOTE: You need to wait for a time request before setting the time. You can't set the time without a request."""

    server_commands = {
        0x0000: foundation.ZCLCommandDef(
            "set_data", {"param": Command}, False, is_manufacturer_specific=False
        ),
        0x0010: foundation.ZCLCommandDef(
            "mcu_version_req",
            {"param": t.uint16_t},
            False,
            is_manufacturer_specific=True,
        ),
        0x0024: foundation.ZCLCommandDef(
            "set_time", {"param": TuyaTimePayload}, False, is_manufacturer_specific=True
        ),
    }

    client_commands = {
        0x0001: foundation.ZCLCommandDef(
            "get_data", {"param": Command}, True, is_manufacturer_specific=True
        ),
        0x0002: foundation.ZCLCommandDef(
            "set_data_response", {"param": Command}, True, is_manufacturer_specific=True
        ),
        0x0006: foundation.ZCLCommandDef(
            "active_status_report",
            {"param": Command},
            True,
            is_manufacturer_specific=True,
        ),
        0x0011: foundation.ZCLCommandDef(
            "mcu_version_rsp",
            {"param": MCUVersionRsp},
            True,
            is_manufacturer_specific=True,
        ),
        0x0024: foundation.ZCLCommandDef(
            "set_time_request", {"param": t.data16}, True, is_manufacturer_specific=True
        ),
    }

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.command_bus = Bus()
        self.endpoint.device.command_bus.add_listener(self)  # listen MCU commands

    def tuya_mcu_command(self, command: Command):
        """Tuya MCU command listener. Only endpoint:1 must listen to MCU commands."""

        self.create_catching_task(
            self.command(TUYA_SET_DATA, command, expect_reply=True)
        )

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple,
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle time request."""

        if hdr.command_id != 0x0024 or self.set_time_offset == 0:
            return super().handle_cluster_request(
                hdr, args, dst_addressing=dst_addressing
            )

        # Send default response because the MCU expects it
        if not hdr.frame_control.disable_default_response:
            self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)

        _LOGGER.debug(
            "[0x%04x:%s:0x%04x] Got set time request (command 0x%04x)",
            self.endpoint.device.nwk,
            self.endpoint.endpoint_id,
            self.cluster_id,
            hdr.command_id,
        )
        payload = TuyaTimePayload()
        utc_timestamp = int(
            (
                datetime.datetime.utcnow()
                - datetime.datetime(self.set_time_offset, 1, 1)
            ).total_seconds()
        )
        local_timestamp = int(
            (
                datetime.datetime.now()
                - datetime.datetime(
                    self.set_time_local_offset or self.set_time_offset, 1, 1
                )
            ).total_seconds()
        )
        payload.extend(utc_timestamp.to_bytes(4, "big", signed=False))
        payload.extend(local_timestamp.to_bytes(4, "big", signed=False))

        self.create_catching_task(
            super().command(TUYA_SET_TIME, payload, expect_reply=False)
        )
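
A stdlib-only sketch (function name hypothetical) of the 8-byte set_time payload assembled above: UTC and local seconds, each 4 bytes big-endian, counted from the device's epoch year (1970 or 2000 depending on the model).

```python
import datetime

def build_set_time_payload(now_utc: datetime.datetime,
                           now_local: datetime.datetime,
                           epoch_year: int = 2000) -> bytes:
    # Seconds elapsed since 1 January of the device's epoch year.
    epoch = datetime.datetime(epoch_year, 1, 1)
    utc_s = int((now_utc - epoch).total_seconds())
    local_s = int((now_local - epoch).total_seconds())
    # payload[0-3] = UTC, payload[4-7] = local, both big-endian.
    return utc_s.to_bytes(4, "big") + local_s.to_bytes(4, "big")
```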


class TuyaManufClusterAttributes(TuyaManufCluster):
    """Manufacturer specific cluster for Tuya converting attributes <-> commands."""

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple,
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle cluster request."""
        if hdr.command_id not in (0x0001, 0x0002):
            return super().handle_cluster_request(
                hdr, args, dst_addressing=dst_addressing
            )

        # Send default response because the MCU expects it
        if not hdr.frame_control.disable_default_response:
            self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)

        tuya_cmd = args[0].command_id
        tuya_data = args[0].data

        _LOGGER.debug(
            "[0x%04x:%s:0x%04x] Received value %s "
            "for attribute 0x%04x (command 0x%04x)",
            self.endpoint.device.nwk,
            self.endpoint.endpoint_id,
            self.cluster_id,
            repr(tuya_data[1:]),
            tuya_cmd,
            hdr.command_id,
        )

        if tuya_cmd not in self.attributes:
            return

        ztype = self.attributes[tuya_cmd].type
        zvalue = tuya_data.to_value(ztype)
        self._update_attribute(tuya_cmd, zvalue)

    def read_attributes(
        self, attributes, allow_cache=False, only_cache=False, manufacturer=None
    ):
        """Ignore remote reads as the "get_data" command doesn't seem to do anything."""

        return super().read_attributes(
            attributes, allow_cache=True, only_cache=True, manufacturer=manufacturer
        )

    async def write_attributes(self, attributes, manufacturer=None):
        """Defer attributes writing to the set_data tuya command."""

        records = self._write_attr_records(attributes)

        for record in records:
            cmd_payload = TuyaManufCluster.Command()
            cmd_payload.status = 0
            cmd_payload.tsn = self.endpoint.device.application.get_sequence()
            cmd_payload.command_id = record.attrid
            cmd_payload.function = 0
            cmd_payload.data = Data.from_value(record.value.value)

            await super().command(
                TUYA_SET_DATA,
                cmd_payload,
                manufacturer=manufacturer,
                expect_reply=False,
                tsn=cmd_payload.tsn,
            )

        return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]


class TuyaOnOff(CustomCluster, OnOff):
    """Tuya On/Off cluster for On/Off device."""

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.switch_bus.add_listener(self)

    def switch_event(self, channel, state):
        """Switch event."""
        _LOGGER.debug(
            "%s - Received switch event message, channel: %d, state: %d",
            self.endpoint.device.ieee,
            channel,
            state,
        )
        # update status only if event == endpoint
        if self.endpoint.endpoint_id == channel:
            self._update_attribute(ATTR_ON_OFF, state)

    async def command(
        self,
        command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
        *args,
        manufacturer: Optional[Union[int, t.uint16_t]] = None,
        expect_reply: bool = True,
        tsn: Optional[Union[int, t.uint8_t]] = None,
    ):
        """Override the default Cluster command."""

        if command_id in (0x0000, 0x0001):
            cmd_payload = TuyaManufCluster.Command()
            cmd_payload.status = 0
            # cmd_payload.tsn = tsn if tsn else self.endpoint.device.application.get_sequence()
            cmd_payload.tsn = 0
            cmd_payload.command_id = TUYA_CMD_BASE + self.endpoint.endpoint_id
            cmd_payload.function = 0
            cmd_payload.data = [1, command_id]

            self.endpoint.device.command_bus.listener_event(
                TUYA_MCU_COMMAND,
                cmd_payload,
            )
            return foundation.Status.SUCCESS

        return foundation.Status.UNSUP_CLUSTER_COMMAND
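
A small sketch (helper names hypothetical) of the command-id arithmetic used above: on/off for endpoint N is sent as Tuya command `TUYA_CMD_BASE + N`, and `TuyaManufacturerClusterOnOff` recovers the channel by subtracting `TUYA_CMD_BASE` again.

```python
TUYA_CMD_BASE = 0x0100

def dp_for_endpoint(endpoint_id: int) -> int:
    # e.g. endpoint 1 -> 0x0101, endpoint 2 -> 0x0102
    return TUYA_CMD_BASE + endpoint_id

def channel_for_dp(command_id: int) -> int:
    # inverse mapping used when a status report arrives
    return command_id - TUYA_CMD_BASE
```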


class TuyaManufacturerClusterOnOff(TuyaManufCluster):
    """Manufacturer Specific Cluster of On/Off device."""

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple[TuyaManufCluster.Command],
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle cluster request."""

        if hdr.command_id in (0x0002, 0x0001):
            # Send default response because the MCU expects it
            if not hdr.frame_control.disable_default_response:
                self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)

            tuya_payload = args[0]
            self.endpoint.device.switch_bus.listener_event(
                SWITCH_EVENT,
                tuya_payload.command_id - TUYA_CMD_BASE,
                tuya_payload.data[1],
            )
        elif hdr.command_id == TUYA_SET_TIME:
            # Time event; defer to super()
            _LOGGER.debug("TUYA_SET_TIME --> hdr: %s, args: %s", hdr, args)
            super().handle_cluster_request(hdr, args, dst_addressing=dst_addressing)
        else:
            _LOGGER.warning("Unsupported command: %s", hdr)


class TuyaSwitch(CustomDevice):
    """Tuya switch device."""

    def __init__(self, *args, **kwargs):
        """Init device."""
        self.switch_bus = Bus()
        super().__init__(*args, **kwargs)


class TuyaDimmerSwitch(TuyaSwitch):
    """Tuya dimmer switch device."""

    def __init__(self, *args, **kwargs):
        """Init device."""
        self.dimmer_bus = Bus()
        super().__init__(*args, **kwargs)


class TuyaThermostatCluster(LocalDataCluster, Thermostat):
    """Thermostat cluster for Tuya thermostats."""

    _CONSTANT_ATTRIBUTES = {0x001B: Thermostat.ControlSequenceOfOperation.Heating_Only}

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.thermostat_bus.add_listener(self)

    def temperature_change(self, attr, value):
        """Local or target temperature change from device."""
        self._update_attribute(self.attributes_by_name[attr].id, value)

    def state_change(self, value):
        """State update from device."""
        if value == 0:
            mode = self.RunningMode.Off
            state = self.RunningState.Idle
        else:
            mode = self.RunningMode.Heat
            state = self.RunningState.Heat_State_On
        self._update_attribute(self.attributes_by_name["running_mode"].id, mode)
        self._update_attribute(self.attributes_by_name["running_state"].id, state)

    # pylint: disable=R0201
    def map_attribute(self, attribute, value):
        """Map standardized attribute value to dict of manufacturer values."""
        return {}

    async def write_attributes(self, attributes, manufacturer=None):
        """Implement writeable attributes."""

        records = self._write_attr_records(attributes)

        if not records:
            return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]

        manufacturer_attrs = {}
        for record in records:
            attr_name = self.attributes[record.attrid].name
            new_attrs = self.map_attribute(attr_name, record.value.value)

            _LOGGER.debug(
                "[0x%04x:%s:0x%04x] Mapping standard %s (0x%04x) "
                "with value %s to custom %s",
                self.endpoint.device.nwk,
                self.endpoint.endpoint_id,
                self.cluster_id,
                attr_name,
                record.attrid,
                repr(record.value.value),
                repr(new_attrs),
            )

            manufacturer_attrs.update(new_attrs)

        if not manufacturer_attrs:
            return [
                [
                    foundation.WriteAttributesStatusRecord(
                        foundation.Status.FAILURE, r.attrid
                    )
                    for r in records
                ]
            ]

        await self.endpoint.tuya_manufacturer.write_attributes(
            manufacturer_attrs, manufacturer=manufacturer
        )

        return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]

    # pylint: disable=W0236
    async def command(
        self,
        command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
        *args,
        manufacturer: Optional[Union[int, t.uint16_t]] = None,
        expect_reply: bool = True,
        tsn: Optional[Union[int, t.uint8_t]] = None,
    ):
        """Implement thermostat commands."""

        if command_id != 0x0000:
            return foundation.GENERAL_COMMANDS[
                foundation.GeneralCommand.Default_Response
            ].schema(
                command_id=command_id, status=foundation.Status.UNSUP_CLUSTER_COMMAND
            )

        mode, offset = args
        if mode not in (self.SetpointMode.Heat, self.SetpointMode.Both):
            return foundation.GENERAL_COMMANDS[
                foundation.GeneralCommand.Default_Response
            ].schema(command_id=command_id, status=foundation.Status.INVALID_VALUE)

        attrid = self.attributes_by_name["occupied_heating_setpoint"].id

        success, _ = await self.read_attributes((attrid,), manufacturer=manufacturer)
        try:
            current = success[attrid]
        except KeyError:
            return foundation.Status.FAILURE

        # offset is given in decidegrees, see Zigbee cluster specification
        (res,) = await self.write_attributes(
            {"occupied_heating_setpoint": current + offset * 10},
            manufacturer=manufacturer,
        )
        return foundation.GENERAL_COMMANDS[
            foundation.GeneralCommand.Default_Response
        ].schema(command_id=command_id, status=res[0].status)


class TuyaUserInterfaceCluster(LocalDataCluster, UserInterface):
    """HVAC User interface cluster for tuya thermostats."""

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.ui_bus.add_listener(self)

    def child_lock_change(self, mode):
        """Change of child lock setting."""
        if mode == 0:
            lockout = self.KeypadLockout.No_lockout
        else:
            lockout = self.KeypadLockout.Level_1_lockout

        self._update_attribute(self.attributes_by_name["keypad_lockout"].id, lockout)

    def map_attribute(self, attribute, value):
        """Map standardized attribute value to dict of manufacturer values."""
        return {}

    async def write_attributes(self, attributes, manufacturer=None):
        """Defer the keypad_lockout attribute to child_lock."""

        records = self._write_attr_records(attributes)

        manufacturer_attrs = {}
        for record in records:
            if record.attrid == self.attributes_by_name["keypad_lockout"].id:
                lock = 0 if record.value.value == self.KeypadLockout.No_lockout else 1
                new_attrs = {self._CHILD_LOCK_ATTR: lock}
            else:
                attr_name = self.attributes[record.attrid].name
                new_attrs = self.map_attribute(attr_name, record.value.value)

                _LOGGER.debug(
                    "[0x%04x:%s:0x%04x] Mapping standard %s (0x%04x) "
                    "with value %s to custom %s",
                    self.endpoint.device.nwk,
                    self.endpoint.endpoint_id,
                    self.cluster_id,
                    attr_name,
                    record.attrid,
                    repr(record.value.value),
                    repr(new_attrs),
                )

            manufacturer_attrs.update(new_attrs)

        if not manufacturer_attrs:
            return [
                [
                    foundation.WriteAttributesStatusRecord(
                        foundation.Status.FAILURE, r.attrid
                    )
                    for r in records
                ]
            ]

        await self.endpoint.tuya_manufacturer.write_attributes(
            manufacturer_attrs, manufacturer=manufacturer
        )

        return [[foundation.WriteAttributesStatusRecord(foundation.Status.SUCCESS)]]


class TuyaPowerConfigurationCluster(LocalDataCluster, PowerConfiguration):
    """PowerConfiguration cluster for battery-operated thermostats."""

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.battery_bus.add_listener(self)

    def battery_change(self, value):
        """Change of reported battery percentage remaining."""
        self._update_attribute(
            self.attributes_by_name["battery_percentage_remaining"].id, value * 2
        )


class TuyaPowerConfigurationCluster2AA(TuyaPowerConfigurationCluster):
    """PowerConfiguration cluster for battery-operated TRVs with 2 AA."""

    BATTERY_SIZES = 0x0031
    BATTERY_RATED_VOLTAGE = 0x0034
    BATTERY_QUANTITY = 0x0033

    _CONSTANT_ATTRIBUTES = {
        BATTERY_SIZES: 3,
        BATTERY_RATED_VOLTAGE: 15,
        BATTERY_QUANTITY: 2,
    }


class TuyaPowerConfigurationCluster3AA(TuyaPowerConfigurationCluster):
    """PowerConfiguration cluster for battery-operated TRVs with 3 AA."""

    BATTERY_SIZES = 0x0031
    BATTERY_RATED_VOLTAGE = 0x0034
    BATTERY_QUANTITY = 0x0033

    _CONSTANT_ATTRIBUTES = {
        BATTERY_SIZES: 3,
        BATTERY_RATED_VOLTAGE: 15,
        BATTERY_QUANTITY: 3,
    }


class TuyaThermostat(CustomDevice):
    """Generic Tuya thermostat device."""

    def __init__(self, *args, **kwargs):
        """Init device."""
        self.thermostat_bus = Bus()
        self.ui_bus = Bus()
        self.battery_bus = Bus()
        super().__init__(*args, **kwargs)


# Tuya Zigbee OnOff Cluster Attribute Implementation
class SwitchBackLight(t.enum8):
    """Tuya switch back light mode enum."""

    Mode_0 = 0x00
    Mode_1 = 0x01
    Mode_2 = 0x02


class SwitchMode(t.enum8):
    """Tuya switch mode enum."""

    Command = 0x00
    Event = 0x01


class PowerOnState(t.enum8):
    """Tuya power on state enum."""

    Off = 0x00
    On = 0x01
    LastState = 0x02


class TuyaZBOnOffAttributeCluster(CustomCluster, OnOff):
    """Tuya Zigbee On Off cluster with extra attributes."""

    attributes = OnOff.attributes.copy()
    attributes.update({0x8000: ("child_lock", t.Bool)})
    attributes.update({0x8001: ("backlight_mode", SwitchBackLight)})
    attributes.update({0x8002: ("power_on_state", PowerOnState)})
    attributes.update({0x8004: ("switch_mode", SwitchMode)})


class TuyaSmartRemoteOnOffCluster(OnOff, EventableCluster):
    """TuyaSmartRemoteOnOffCluster: fire events corresponding to press type."""

    rotate_type = {
        0x00: RIGHT,
        0x01: LEFT,
        0x02: STOP,
    }
    press_type = {
        0x00: SHORT_PRESS,
        0x01: DOUBLE_PRESS,
        0x02: LONG_PRESS,
    }
    name = "TS004X_cluster"
    ep_attribute = "TS004X_cluster"
    attributes = OnOff.attributes.copy()
    attributes.update({0x8001: ("backlight_mode", SwitchBackLight)})
    attributes.update({0x8002: ("power_on_state", PowerOnState)})
    attributes.update({0x8004: ("switch_mode", SwitchMode)})

    def __init__(self, *args, **kwargs):
        """Init."""
        self.last_tsn = -1
        super().__init__(*args, **kwargs)

    server_commands = OnOff.server_commands.copy()
    server_commands.update(
        {
            0xFC: foundation.ZCLCommandDef(
                "rotate_type",
                {"rotate_type": t.uint8_t},
                False,
                is_manufacturer_specific=True,
            ),
            0xFD: foundation.ZCLCommandDef(
                "press_type",
                {"press_type": t.uint8_t},
                False,
                is_manufacturer_specific=True,
            ),
        }
    )

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: List[Any],
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ):
        """Handle press_types command."""
        # Normally, once a default response is sent, a TS004x will not repeat a
        # ZCL frame with the same sequence number. For robustness (e.g. when the
        # response never reaches the device), ignore duplicates anyway.
        if hdr.tsn == self.last_tsn:
            _LOGGER.debug("TS004X: ignoring duplicate frame")
            return
        # save last sequence number
        self.last_tsn = hdr.tsn

        # send the default response as soon as possible, so the device does not repeat the frame
        if not hdr.frame_control.disable_default_response:
            self.debug("TS004X: send default response")
            self.send_default_rsp(hdr, status=foundation.Status.SUCCESS)
        # handle command
        if hdr.command_id == 0xFC:
            rotate_type = args[0]
            self.listener_event(
                ZHA_SEND_EVENT, self.rotate_type.get(rotate_type, "unknown"), []
            )
        elif hdr.command_id == 0xFD:
            press_type = args[0]
            self.listener_event(
                ZHA_SEND_EVENT, self.press_type.get(press_type, "unknown"), []
            )


# Tuya Zigbee Metering Cluster Correction Implementation
class TuyaZBMeteringCluster(CustomCluster, Metering):
    """Divides the kWh for tuya."""

    MULTIPLIER = 0x0301
    DIVISOR = 0x0302
    _CONSTANT_ATTRIBUTES = {MULTIPLIER: 1, DIVISOR: 100}


class TuyaZBElectricalMeasurement(CustomCluster, ElectricalMeasurement):
    """Divides the Current for tuya."""

    AC_CURRENT_MULTIPLIER = 0x0602
    AC_CURRENT_DIVISOR = 0x0603
    _CONSTANT_ATTRIBUTES = {AC_CURRENT_MULTIPLIER: 1, AC_CURRENT_DIVISOR: 1000}


# Tuya Zigbee Cluster 0xE000 Implementation
class TuyaZBE000Cluster(CustomCluster):
    """Tuya manufacturer specific cluster 57344."""

    name = "Tuya Manufacturer Specific"
    cluster_id = TUYA_CLUSTER_E000_ID
    ep_attribute = "tuya_is_pita_0"


# Tuya Zigbee Cluster 0xE001 Implementation
class ExternalSwitchType(t.enum8):
    """Tuya external switch type enum."""

    Toggle = 0x00
    State = 0x01
    Momentary = 0x02


class TuyaZBExternalSwitchTypeCluster(CustomCluster):
    """Tuya External Switch Type Cluster."""

    name = "Tuya External Switch Type Cluster"
    cluster_id = TUYA_CLUSTER_E001_ID
    ep_attribute = "tuya_external_switch_type"
    attributes = {0xD030: ("external_switch_type", ExternalSwitchType)}


# Tuya Window Cover Implementation
class TuyaManufacturerWindowCover(TuyaManufCluster):
    """Manufacturer Specific Cluster for cover device."""

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple[TuyaManufCluster.Command],
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle cluster request."""
        """Tuya Specific Cluster Commands"""
        if hdr.command_id in (TUYA_GET_DATA, TUYA_SET_DATA_RESPONSE):
            tuya_payload = args[0]
            _LOGGER.debug(
                "%s Received Attribute Report. Command is 0x%04x, Tuya Paylod values"
                "[Status : %s, TSN: %s, Command: 0x%04x, Function: 0x%02x, Data: %s]",
                self.endpoint.device.ieee,
                hdr.command_id,
                tuya_payload.status,
                tuya_payload.tsn,
                tuya_payload.command_id,
                tuya_payload.function,
                tuya_payload.data,
            )

            if tuya_payload.command_id == TUYA_DP_TYPE_VALUE + TUYA_DP_ID_PERCENT_STATE:
                self.endpoint.device.cover_bus.listener_event(
                    COVER_EVENT,
                    ATTR_COVER_POSITION,
                    tuya_payload.data[4],
                )
            elif (
                tuya_payload.command_id
                == TUYA_DP_TYPE_VALUE + TUYA_DP_ID_PERCENT_CONTROL
            ):
                self.endpoint.device.cover_bus.listener_event(
                    COVER_EVENT,
                    ATTR_COVER_POSITION,
                    tuya_payload.data[4],
                )
            elif (
                tuya_payload.command_id
                == TUYA_DP_TYPE_ENUM + TUYA_DP_ID_DIRECTION_CHANGE
            ):
                self.endpoint.device.cover_bus.listener_event(
                    COVER_EVENT,
                    ATTR_COVER_DIRECTION,
                    tuya_payload.data[1],
                )
            elif (
                tuya_payload.command_id == TUYA_DP_TYPE_ENUM + TUYA_DP_ID_COVER_INVERTED
            ):
                self.endpoint.device.cover_bus.listener_event(
                    COVER_EVENT,
                    ATTR_COVER_INVERTED,
                    tuya_payload.data[1],  # Check this
                )
        elif hdr.command_id == TUYA_SET_TIME:
            """Time event call super"""
            super().handle_cluster_request(hdr, args, dst_addressing=dst_addressing)
        else:
            _LOGGER.debug(
                "%s Received Attribute Report - Unknown Command. Self [%s], Header [%s], Tuya Paylod [%s]",
                self.endpoint.device.ieee,
                self,
                hdr,
                args,
            )


class TuyaWindowCoverControl(LocalDataCluster, WindowCovering):
    """Manufacturer Specific Cluster of Device cover."""

    """Add additional attributes for direction"""
    attributes = WindowCovering.attributes.copy()
    attributes.update({ATTR_COVER_DIRECTION: ("motor_direction", t.Bool)})
    attributes.update({ATTR_COVER_INVERTED: ("cover_inverted", t.Bool)})

    def __init__(self, *args, **kwargs):
        """Initialize instance."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.cover_bus.add_listener(self)

    def cover_event(self, attribute, value):
        """Event listener for cover events."""
        if attribute == ATTR_COVER_POSITION:
            invert_attr = self._attr_cache.get(ATTR_COVER_INVERTED) == 1
            invert = (
                not invert_attr
                if self.endpoint.device.manufacturer in TUYA_COVER_INVERTED_BY_DEFAULT
                else invert_attr
            )
            value = value if invert else 100 - value
        self._update_attribute(attribute, value)
        _LOGGER.debug(
            "%s Tuya Attribute Cache : [%s]",
            self.endpoint.device.ieee,
            self._attr_cache,
        )

    def command(
        self,
        command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
        *args,
        manufacturer: Optional[Union[int, t.uint16_t]] = None,
        expect_reply: bool = True,
        tsn: Optional[Union[int, t.uint8_t]] = None,
    ):
        """Override the default Cluster command."""
        if manufacturer is None:
            manufacturer = self.endpoint.device.manufacturer
        _LOGGER.debug(
            "%s Sending Tuya Cluster Command.. Manufacturer is %s Cluster Command is 0x%04x, Arguments are %s",
            self.endpoint.device.ieee,
            manufacturer,
            command_id,
            args,
        )
        # Open Close or Stop commands
        tuya_payload = TuyaManufCluster.Command()
        if command_id in (
            WINDOW_COVER_COMMAND_UPOPEN,
            WINDOW_COVER_COMMAND_DOWNCLOSE,
            WINDOW_COVER_COMMAND_STOP,
        ):
            tuya_payload.status = 0
            tuya_payload.tsn = tsn if tsn else 0
            tuya_payload.command_id = TUYA_DP_TYPE_ENUM + TUYA_DP_ID_CONTROL
            tuya_payload.function = 0
            tuya_payload.data = [
                1,
                # need to implement direction change
                TUYA_COVER_COMMAND[manufacturer][command_id],
            ]  # remap the command to the Tuya command
        # Set Position Command
        elif command_id == WINDOW_COVER_COMMAND_LIFTPERCENT:
            tuya_payload.status = 0
            tuya_payload.tsn = tsn if tsn else 0
            tuya_payload.command_id = TUYA_DP_TYPE_VALUE + TUYA_DP_ID_PERCENT_CONTROL
            tuya_payload.function = 0
            """Check direction and correct value"""
            invert_attr = self._attr_cache.get(ATTR_COVER_INVERTED) == 1
            invert = (
                not invert_attr
                if self.endpoint.device.manufacturer in TUYA_COVER_INVERTED_BY_DEFAULT
                else invert_attr
            )
            position = args[0] if invert else 100 - args[0]
            tuya_payload.data = [
                4,
                0,
                0,
                0,
                position,
            ]
        # Custom Command
        elif command_id == WINDOW_COVER_COMMAND_CUSTOM:
            tuya_payload.status = args[0]
            tuya_payload.tsn = args[1]
            tuya_payload.command_id = args[2]
            tuya_payload.function = args[3]
            tuya_payload.data = args[4]
        else:
            tuya_payload = None
        # Send the command (tuya_payload is None for an unrecognised command)
        if tuya_payload is not None:
            _LOGGER.debug(
                "%s Sending Tuya Command. Paylod values [endpoint_id : %s, "
                "Status : %s, TSN: %s, Command: 0x%04x, Function: %s, Data: %s]",
                self.endpoint.device.ieee,
                self.endpoint.endpoint_id,
                tuya_payload.status,
                tuya_payload.tsn,
                tuya_payload.command_id,
                tuya_payload.function,
                tuya_payload.data,
            )

            return self.endpoint.tuya_manufacturer.command(
                TUYA_SET_DATA, tuya_payload, expect_reply=True
            )
        else:
            _LOGGER.debug("Unrecognised command: %x", command_id)
            return foundation.Status.UNSUP_CLUSTER_COMMAND


class TuyaWindowCover(CustomDevice):
    """Tuya switch device."""

    def __init__(self, *args, **kwargs):
        """Init device."""
        self.cover_bus = Bus()
        super().__init__(*args, **kwargs)


class TuyaManufacturerLevelControl(TuyaManufCluster):
    """Manufacturer Specific Cluster for cover device."""

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple[TuyaManufCluster.Command],
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle cluster request."""
        tuya_payload = args[0]

        _LOGGER.debug(
            "%s Received Attribute Report. Command is %x, Tuya Paylod values"
            "[Status : %s, TSN: %s, Command: %s, Function: %s, Data: %s]",
            self.endpoint.device.ieee,
            hdr.command_id,
            tuya_payload.status,
            tuya_payload.tsn,
            tuya_payload.command_id,
            tuya_payload.function,
            tuya_payload.data,
        )

        if hdr.command_id in (0x0002, 0x0001):
            if tuya_payload.command_id == TUYA_LEVEL_COMMAND:
                self.endpoint.device.dimmer_bus.listener_event(
                    LEVEL_EVENT,
                    tuya_payload.command_id,
                    tuya_payload.data,
                )
            else:
                self.endpoint.device.switch_bus.listener_event(
                    SWITCH_EVENT,
                    tuya_payload.command_id - TUYA_CMD_BASE,
                    tuya_payload.data[1],
                )


class TuyaLevelControl(CustomCluster, LevelControl):
    """Tuya Level cluster for dimmable device."""

    def __init__(self, *args, **kwargs):
        """Init."""
        super().__init__(*args, **kwargs)
        self.endpoint.device.dimmer_bus.add_listener(self)

    def level_event(self, channel, state):
        """Level event."""
        level = (((state[3] << 8) + state[4]) * 255) // 1000
        _LOGGER.debug(
            "%s - Received level event message, channel: %d, level: %d, data: %d",
            self.endpoint.device.ieee,
            channel,
            level,
            state,
        )
        self._update_attribute(self.attributes_by_name["current_level"].id, level)

    def command(
        self,
        command_id: Union[foundation.GeneralCommand, int, t.uint8_t],
        *args,
        manufacturer: Optional[Union[int, t.uint16_t]] = None,
        expect_reply: bool = True,
        tsn: Optional[Union[int, t.uint8_t]] = None,
    ):
        """Override the default Cluster command."""
        _LOGGER.debug(
            "%s Sending Tuya Cluster Command.. Cluster Command is %x, Arguments are %s",
            self.endpoint.device.ieee,
            command_id,
            args,
        )
        # Move to level
        # move_to_level_with_on_off
        if command_id in (0x0000, 0x0001, 0x0004):
            cmd_payload = TuyaManufCluster.Command()
            cmd_payload.status = 0
            cmd_payload.tsn = 0
            cmd_payload.command_id = TUYA_LEVEL_COMMAND
            cmd_payload.function = 0
            brightness = (args[0] * 1000) // 255
            val1 = brightness >> 8
            val2 = brightness & 0xFF
            cmd_payload.data = [4, 0, 0, val1, val2]  # Custom Command

            return self.endpoint.tuya_manufacturer.command(
                TUYA_SET_DATA, cmd_payload, expect_reply=True
            )

        return foundation.Status.UNSUP_CLUSTER_COMMAND
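

# Standalone sketch (illustrative only, not part of zhaquirks) of the
# brightness scaling used by TuyaLevelControl above: ZCL levels are 0-255,
# Tuya datapoint values are 0-1000, carried in two payload bytes (high, low).
def _zcl_level_to_tuya_bytes(level):
    """Convert a ZCL level (0-255) to the two Tuya payload bytes."""
    brightness = (level * 1000) // 255
    return [brightness >> 8, brightness & 0xFF]


def _tuya_bytes_to_zcl_level(high, low):
    """Convert two Tuya payload bytes back to a ZCL level (0-255)."""
    return (((high << 8) + low) * 255) // 1000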


class TuyaLocalCluster(LocalDataCluster):
    """Tuya virtual clusters.

    Prevents attribute reads and writes. Attribute writes could be converted
    to DataPoint updates.
    """

    def update_attribute(self, attr_name: str, value: Any) -> None:
        """Update attribute by attribute name."""

        try:
            attr = self.attributes_by_name[attr_name]
        except KeyError:
            self.debug("no such attribute: %s", attr_name)
            return
        return self._update_attribute(attr.id, value)


@dataclasses.dataclass
class DPToAttributeMapping:
    """Container for datapoint to cluster attribute update mapping."""

    ep_attribute: str
    attribute_name: str
    converter: Optional[
        Callable[
            [
                Any,
            ],
            Any,
        ]
    ] = None
    endpoint_id: Optional[int] = None


class TuyaNewManufCluster(CustomCluster):
    """Tuya manufacturer specific cluster.

    This is an attempt to consolidate the multiple above clusters into a
    single framework. Instead of overriding the handle_cluster_request()
    method, implement handlers for commands, like get_data, set_data_response,
    set_time_request, etc.
    """

    name: str = "Tuya Manufacturer Specific"
    cluster_id: t.uint16_t = TUYA_CLUSTER_ID
    ep_attribute: str = "tuya_manufacturer"

    server_commands = {
        TUYA_SET_DATA: foundation.ZCLCommandDef(
            "set_data", {"data": TuyaCommand}, False, is_manufacturer_specific=True
        ),
        TUYA_SEND_DATA: foundation.ZCLCommandDef(
            "send_data", {"data": TuyaCommand}, False, is_manufacturer_specific=True
        ),
        TUYA_SET_TIME: foundation.ZCLCommandDef(
            "set_time", {"time": TuyaTimePayload}, False, is_manufacturer_specific=True
        ),
    }

    client_commands = {
        TUYA_GET_DATA: foundation.ZCLCommandDef(
            "get_data", {"data": TuyaCommand}, True, is_manufacturer_specific=True
        ),
        TUYA_SET_DATA_RESPONSE: foundation.ZCLCommandDef(
            "set_data_response",
            {"data": TuyaCommand},
            True,
            is_manufacturer_specific=True,
        ),
        TUYA_ACTIVE_STATUS_RPT: foundation.ZCLCommandDef(
            "active_status_report",
            {"data": TuyaCommand},
            True,
            is_manufacturer_specific=True,
        ),
        TUYA_SET_TIME: foundation.ZCLCommandDef(
            "set_time_request", {"data": t.data16}, True, is_manufacturer_specific=True
        ),
    }

    data_point_handlers: Dict[int, str] = {}

    def handle_cluster_request(
        self,
        hdr: foundation.ZCLHeader,
        args: Tuple,
        *,
        dst_addressing: Optional[
            Union[t.Addressing.Group, t.Addressing.IEEE, t.Addressing.NWK]
        ] = None,
    ) -> None:
        """Handle cluster specific request."""

        try:
            if hdr.is_reply:
                # server_cluster -> client_cluster cluster specific command
                handler_name = f"handle_{self.client_commands[hdr.command_id].name}"
            else:
                handler_name = f"handle_{self.server_commands[hdr.command_id].name}"
        except KeyError:
            self.debug(
                "Received unknown manufacturer command %s: %s", hdr.command_id, args
            )
            if not hdr.frame_control.disable_default_response:
                self.send_default_rsp(
                    hdr, status=foundation.Status.UNSUP_CLUSTER_COMMAND
                )
            # handler_name is undefined for unknown commands, so always bail out
            return

        try:
            status = getattr(self, handler_name)(*args)
        except AttributeError:
            self.warning(
                "No '%s' tuya handler found for %s",
                handler_name,
                args,
            )
            status = foundation.Status.UNSUP_CLUSTER_COMMAND

        if not hdr.frame_control.disable_default_response:
            self.send_default_rsp(hdr, status=status)

    def handle_get_data(self, command: TuyaCommand) -> foundation.Status:
        """Handle get_data response (report)."""
        try:
            dp_handler = self.data_point_handlers[command.dp]
            getattr(self, dp_handler)(command)
        except (AttributeError, KeyError):
            self.debug("No datapoint handler for %s", command)
            return foundation.Status.UNSUPPORTED_ATTRIBUTE

        return foundation.Status.SUCCESS

    handle_set_data_response = handle_get_data
    handle_active_status_report = handle_get_data

    def handle_set_time_request(self, payload: t.uint16_t) -> foundation.Status:
        """Handle Time set request."""
        return foundation.Status.SUCCESS

    def _dp_2_attr_update(self, command: TuyaCommand) -> None:
        """Handle data point to attribute report conversion."""
        try:
            dp_map = self.dp_to_attribute[command.dp]
        except KeyError:
            self.debug("No attribute mapping for %s data point", command.dp)
            return

        endpoint = self.endpoint
        if dp_map.endpoint_id:
            endpoint = self.endpoint.device.endpoints[dp_map.endpoint_id]
        cluster = getattr(endpoint, dp_map.ep_attribute)
        value = command.data.payload
        if dp_map.converter:
            value = dp_map.converter(value)

        cluster.update_attribute(dp_map.attribute_name, value)
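

# Standalone sketch (illustrative only, no zigpy dependency) of the dispatch
# scheme above: TuyaNewManufCluster maps each datapoint id to a handler *name*
# in data_point_handlers and resolves it with getattr() at call time, so a
# subclass only needs to fill in the mapping dicts.
class _DispatchSketch:
    data_point_handlers = {1: "_dp_2_attr_update"}  # dp id -> handler name
    dp_to_attribute = {1: "on_off"}  # dp id -> target attribute name

    def __init__(self):
        self.attributes = {}

    def handle_get_data(self, dp, value):
        """Route a datapoint report to its named handler."""
        handler = getattr(self, self.data_point_handlers[dp])
        handler(dp, value)

    def _dp_2_attr_update(self, dp, value):
        """Store the value under the mapped attribute name."""
        self.attributes[self.dp_to_attribute[dp]] = value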
# File: /config/zha-device-handlers/zhaquirks/tuya/ts0601_cover.py

"""Tuya based cover and blinds."""
from zigpy.profiles import zha
from zigpy.zcl.clusters.general import (
    Basic,
    GreenPowerProxy,
    Groups,
    Identify,
    OnOff,
    Ota,
    Scenes,
    Time,
)

from zhaquirks.const import (
    DEVICE_TYPE,
    ENDPOINTS,
    INPUT_CLUSTERS,
    MODELS_INFO,
    OUTPUT_CLUSTERS,
    PROFILE_ID,
)
from . import (
    TuyaManufacturerWindowCover,
    TuyaManufCluster,
    TuyaWindowCover,
    TuyaWindowCoverControl,
)


class TuyaZemismartSmartCover0601(TuyaWindowCover):
    """Tuya Zemismart blind cover motor."""

    signature = {
        # "node_descriptor": "<NodeDescriptor byte1=1 byte2=64 mac_capability_flags=142 manufacturer_code=4098
        #                       maximum_buffer_size=82 maximum_incoming_transfer_size=82 server_mask=11264
        #                       maximum_outgoing_transfer_size=82 descriptor_capability_field=0>",
        # input_clusters=[0x0000, 0x0004, 0x0005, 0x000a, 0xef00]
        # output_clusters=[0x0019]
        # <SimpleDescriptor endpoint=1 profile=260 device_type=51 input_clusters=[0, 4, 5, 61184] output_clusters=[25]>
        MODELS_INFO: [
            ("_TZE200_fzo2pocs", "TS0601"),
            ("_TZE200_zpzndjez", "TS0601"),
            ("_TZE200_cowvfni3", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    Time.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            },
        },
    }
    replacement = {
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    Time.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            },
        },
    }


# From: https://github.com/zigpy/zha-device-handlers/issues/1294#issuecomment-1014843749
class TuyaZemismartSmartCover0601_4(TuyaWindowCover):
    """Tuya blind controller device."""

    signature = {
        # "node_descriptor": "NodeDescriptor(byte1=1, byte2=64, mac_capability_flags=142, manufacturer_code=4417,
        #                     maximum_buffer_size=66, maximum_incoming_transfer_size=66, server_mask=10752,
        #                     maximum_outgoing_transfer_size=66, descriptor_capability_field=0>,
        # "endpoints": { "1": { "profile_id": 260, "device_type": "0x0051", "in_clusters": [ "0x0000", "0x0004",
        # "0x0005","0xef00"], "out_clusters": ["0x000a","0x0019"] }, "242": { "profile_id": 41440, "device_type":
        # "0x0061", in_clusters": [], "out_clusters": [ "0x0021" ] } }, "manufacturer": "_TZE200_rmymn92d",
        # "model": "TS0601", "class": "zigpy.device.Device" }
        MODELS_INFO: [
            ("_TZE200_rmymn92d", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            },
            242: {
                PROFILE_ID: 41440,
                DEVICE_TYPE: 0x0061,
                INPUT_CLUSTERS: [],
                OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
            },
        },
    }

    replacement = {
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            },
            242: {
                PROFILE_ID: 41440,
                DEVICE_TYPE: 0x0061,
                INPUT_CLUSTERS: [],
                OUTPUT_CLUSTERS: [GreenPowerProxy.cluster_id],
            },
        }
    }


class TuyaZemismartSmartCover0601_3(TuyaWindowCover):
    """Tuya Zemismart blind cover motor."""

    signature = {
        # "node_descriptor": "<NodeDescriptor byte1=1 byte2=64 mac_capability_flags=142 manufacturer_code=4098
        #                       maximum_buffer_size=82 maximum_incoming_transfer_size=82 server_mask=11264
        #                       maximum_outgoing_transfer_size=82 descriptor_capability_field=0>",
        # input_clusters=[0x0000, 0x0004, 0x0005, 0x000a, 0xef00]
        # output_clusters=[0x0019]
        # <SimpleDescriptor endpoint=1 profile=260 device_type=51 input_clusters=[0, 4, 5, 61184] output_clusters=[25]>
        MODELS_INFO: [
            ("_TZE200_fzo2pocs", "TS0601"),
            ("_TZE200_zpzndjez", "TS0601"),
            ("_TZE200_iossyxra", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            },
        },
    }
    replacement = {
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            },
        },
    }


class TuyaZemismartSmartCover0601_2(TuyaWindowCover):
    """Tuya Zemismart curtain cover motor."""

    signature = {
        # "node_descriptor": "<NodeDescriptor byte1=1 byte2=64 mac_capability_flags=142 manufacturer_code=4098
        #                       maximum_buffer_size=82 maximum_incoming_transfer_size=82 server_mask=11264
        #                       maximum_outgoing_transfer_size=82 descriptor_capability_field=0>",
        # input_clusters=[0x0000, 0x000a, 0x0004, 0x0005, 0xef00]
        # output_clusters=[0x0019]
        # <SimpleDescriptor endpoint=1 profile=260 device_type=81 input_clusters=[0, 10, 4, 5, 61184] output_clusters=[25]>
        MODELS_INFO: [
            ("_TZE200_3i3exuay", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Time.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            },
        },
    }
    replacement = {
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    Time.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            },
        },
    }


class TuyaMoesCover0601(TuyaWindowCover):
    """Tuya blind controller device."""

    signature = {
        # "node_descriptor": "NodeDescriptor(byte1=2, byte2=64, mac_capability_flags=128, manufacturer_code=4098,
        #                    maximum_buffer_size=82, maximum_incoming_transfer_size=82, server_mask=11264,
        #                    maximum_outgoing_transfer_size=82, descriptor_capability_field=0)",
        # "endpoints": {
        # "1": { "profile_id": 260, "device_type": "0x0051", "in_clusters": [ "0x0000", "0x0004","0x0005","0xef00"], "out_clusters": ["0x000a","0x0019"] }
        # },
        # "manufacturer": "_TZE200_zah67ekd",
        # "model": "TS0601",
        # "class": "zigpy.device.Device"
        # }
        MODELS_INFO: [
            ("_TZE200_zah67ekd", "TS0601"),
            ("_TZE200_xuzcvlku", "TS0601"),
            ("_TZE200_rddyvrci", "TS0601"),
            ("_TZE200_nueqqe6k", "TS0601"),
            ("_TZE200_gubdgai2", "TS0601"),
            ("_TZE200_yenbr4om", "TS0601"),
            ("_TZE200_5sbebbzs", "TS0601"),
            ("_TZE200_xaabybja", "TS0601"),
            ("_TZE200_hsgrhjpf", "TS0601"),
            ("_TZE200_68nvbio9", "TS0601"),
            ("_TZE200_zuz7f94z", "TS0601"),
            ("_TZE200_ergbiejo", "TS0601"),
        ],
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.SMART_PLUG,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufCluster.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            }
        },
    }

    replacement = {
        ENDPOINTS: {
            1: {
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Time.cluster_id, Ota.cluster_id],
            }
        }
    }


class TuyaCloneCover0601(TuyaWindowCover):
    """Tuya blind controller device."""

    signature = {
        # <SimpleDescriptor endpoint=1 profile=260 device_type=256 device_version=0
        # input_clusters=[0, 3, 4, 5, 6]
        # output_clusters=[25]>
        # },
        # "manufacturer": "_TYST11_wmcdj3aq",
        # "model": "mcdj3aq",
        # "class": "zigpy.device.Device"
        # }
        MODELS_INFO: [("_TYST11_wmcdj3aq", "mcdj3aq")],  # Not tested
        ENDPOINTS: {
            1: {
                PROFILE_ID: zha.PROFILE_ID,
                DEVICE_TYPE: zha.DeviceType.ON_OFF_LIGHT,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Identify.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    OnOff.cluster_id,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            }
        },
    }

    replacement = {
        ENDPOINTS: {
            1: {
                DEVICE_TYPE: zha.DeviceType.WINDOW_COVERING_DEVICE,
                INPUT_CLUSTERS: [
                    Basic.cluster_id,
                    Groups.cluster_id,
                    Scenes.cluster_id,
                    TuyaManufacturerWindowCover,
                    TuyaWindowCoverControl,
                ],
                OUTPUT_CLUSTERS: [Ota.cluster_id],
            }
        }
    }

=== first post on Jul 27, 2023 ===
I have a _TZE200_rmymn92d here. I tried everything, including

  • removing __pycache__
  • using the __init__.py and ts0601_cover.py from @ToastySefac's comment above
  • enabling ZHA's enable_quirks in YAML

The UI shows the status and buttons, but neither the slider nor the buttons works. I don't see any Tuya-related import errors, and I can see "Sending Tuya Command" entries in the log. The curtain track just doesn't move at all.

zhaquirks.tuya logging
  2023-07-27 02:10:46.649 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (69,)
  2023-07-27 02:10:46.650 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 31]]
  2023-07-27 02:10:48.010 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (88,)
  2023-07-27 02:10:48.011 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 12]]
  2023-07-27 02:10:49.418 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (0,)
  2023-07-27 02:10:49.418 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 100]]
  2023-07-27 02:10:50.934 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (99,)
  2023-07-27 02:10:50.935 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 1]]
  2023-07-27 02:10:51.505 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (85,)
  2023-07-27 02:10:51.506 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 15]]
  2023-07-27 02:10:52.128 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (54,)
  2023-07-27 02:10:52.129 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 46]]
  2023-07-27 02:10:52.663 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0005, Arguments are (22,)
  2023-07-27 02:10:52.664 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0202, Function: 0, Data: [4, 0, 0, 0, 78]]
  2023-07-27 02:10:54.164 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0000, Arguments are ()
  2023-07-27 02:10:54.165 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 0]]
  2023-07-27 02:10:54.432 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0000, Arguments are ()
  2023-07-27 02:10:54.433 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 0]]
  2023-07-27 02:10:54.628 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0000, Arguments are ()
  2023-07-27 02:10:54.628 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 0]]
  2023-07-27 02:10:55.277 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0001, Arguments are ()
  2023-07-27 02:10:55.277 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 2]]
  2023-07-27 02:10:55.432 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0001, Arguments are ()
  2023-07-27 02:10:55.432 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 2]]
  2023-07-27 02:10:55.577 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0001, Arguments are ()
  2023-07-27 02:10:55.578 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 2]]
  2023-07-27 02:10:56.178 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0002, Arguments are ()
  2023-07-27 02:10:56.179 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 1]]
  2023-07-27 02:11:07.887 INFO (MainThread) [homeassistant.components.mqtt.discovery] Component has already been discovered: sensor 95d685af-b8d0-43ef-af19-3fd3346bb293 a9054a47-2b53-47a5-92c3-101318046926_info, sending update
Full logging:
2023-07-27 03:37:23.399 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0002, Arguments are ()
2023-07-27 03:37:23.399 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 1]]
2023-07-27 03:37:23.400 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Sending request header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=True, direction=<Direction.Server_to_Client: 0>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), manufacturer=4417, tsn=197, command_id=0, *direction=<Direction.Server_to_Client: 0>)
2023-07-27 03:37:23.401 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Sending request: set_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 1]))
2023-07-27 03:37:23.403 DEBUG (MainThread) [bellows.zigbee.application] Sending packet ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), dst_ep=1, source_route=None, extended_timeout=False, tsn=197, profile_id=260, cluster_id=61184, data=Serialized[b'\x05A\x11\xc5\x00\x00\x00\x01\x04\x00\x01\x01'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=None, rssi=None)
2023-07-27 03:37:23.403 DEBUG (MainThread) [bellows.ezsp.protocol] Send command sendUnicast: (<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 0x8b32, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=197), 198, b'\x05A\x11\xc5\x00\x00\x00\x01\x04\x00\x01\x01')
2023-07-27 03:37:23.405 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'71b5b1a9112a15b65894a524ab5593499c3def65df459874f7cf0b8bfc8b3aa6ebccdeb6b87e'
2023-07-27 03:37:23.406 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8070787e'
2023-07-27 03:37:23.406 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'10ba21a9602a1580d2904b25455493499d4e276e2bc262caec036389fc7f3ba7eacc73427e'
2023-07-27 03:37:23.408 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=115), 200, -50, 0x8b32, 255, 255, b'\th\x02\x00\xf5\x05\x01\x00\x01\x00']
2023-07-27 03:37:23.409 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=115), 200, -50, 0x8b32, 255, 255, b'\th\x02\x00\xf5\x05\x01\x00\x01\x00']
2023-07-27 03:37:23.409 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=115, profile_id=260, cluster_id=61184, data=Serialized[b'\th\x02\x00\xf5\x05\x01\x00\x01\x00'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=200, rssi=-50)
2023-07-27 03:37:23.410 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\th\x02\x00\xf5\x05\x01\x00\x01\x00'
2023-07-27 03:37:23.411 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=104, command_id=2, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:23.416 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'02baa1a9602a15a0c88b7e'
2023-07-27 03:37:23.416 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8160597e'
2023-07-27 03:37:23.413 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:set_data_response(param=Command(status=0, tsn=245, command_id=261, function=0, data=[1, 0]))
2023-07-27 03:37:23.427 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received command 0x02 (TSN 104): set_data_response(param=Command(status=0, tsn=245, command_id=261, function=0, data=[1, 0]))
2023-07-27 03:37:23.428 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Received Attribute Report. Command is 0x0002, Tuya Paylod values[Status : 0, TSN: 245, Command: 0x0105, Function: 0x00, Data: [1, 0]]
2023-07-27 03:37:23.429 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received sendUnicast: [<EmberStatus.SUCCESS: 0>, 18]
2023-07-27 03:37:23.430 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'12bab1a96b2a1580d2904b25455493499d4e27b92bce6726f57e'
2023-07-27 03:37:23.430 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'82503a7e'
2023-07-27 03:37:23.437 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received messageSentHandler: [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 35634, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=18), 198, <EmberStatus.SUCCESS: 0>, b'']
2023-07-27 03:37:23.437 DEBUG (MainThread) [bellows.zigbee.application] Received messageSentHandler frame with [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 35634, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=18), 198, <EmberStatus.SUCCESS: 0>, b'']
2023-07-27 03:37:23.467 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'22bab1a9112a15b65894a524ab5593499c3aef65df459874f8dea682fcfd011c7e'
2023-07-27 03:37:23.467 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'83401b7e'
2023-07-27 03:37:23.468 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=116), 200, -50, 0x8b32, 255, 255, b'\x18\xc5\x0b\x00\x83']
2023-07-27 03:37:23.468 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=116), 200, -50, 0x8b32, 255, 255, b'\x18\xc5\x0b\x00\x83']
2023-07-27 03:37:23.468 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=116, profile_id=260, cluster_id=61184, data=Serialized[b'\x18\xc5\x0b\x00\x83'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=200, rssi=-50)
2023-07-27 03:37:23.469 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\x18\xc5\x0b\x00\x83'
2023-07-27 03:37:23.470 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.GLOBAL_COMMAND: 0>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=1, reserved=0, *is_cluster=False, *is_general=True), tsn=197, command_id=11, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:23.471 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2023-07-27 03:37:23.540 DEBUG (MainThread) [bellows.ezsp.protocol] Send command readCounters: ()
2023-07-27 03:37:23.541 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'23bb21a9a52a326d7e'
2023-07-27 03:37:23.559 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'33bba1a9a52aa4b309945c24d655f249834e23abe9ced78ba5c67289f27e31a7e7cddf6f8fffc7dbd5d2698c4623a9ec763ba5ea758241984c2607b1e070381c0e07bbe5ca658e459a4d9e4f9ff7c3d9d46a35a2519048244f987e'
2023-07-27 03:37:23.560 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8430fc7e'
2023-07-27 03:37:23.561 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received readCounters: [[433, 80, 278, 124, 96, 31, 4, 4, 176, 88, 17, 14, 14, 12, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 20, 0, 0, 0, 0, 0, 4, 0, 0, 0, 0, 0, 0, 0, 0]]
2023-07-27 03:37:23.561 DEBUG (MainThread) [bellows.ezsp.protocol] Send command getValue: (<EzspValueId.VALUE_FREE_BUFFERS: 3>,)
2023-07-27 03:37:23.562 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'34b821a9fe2a1647067e'
2023-07-27 03:37:23.568 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'44b8a1a9fe2a15b3aeb2957e'
2023-07-27 03:37:23.568 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8520dd7e'
2023-07-27 03:37:23.569 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received getValue: [<EzspStatus.SUCCESS: 0>, b'\xf7']
2023-07-27 03:37:23.570 DEBUG (MainThread) [bellows.zigbee.application] Free buffers status EzspStatus.SUCCESS, value: 247
2023-07-27 03:37:23.570 DEBUG (MainThread) [bellows.zigbee.application] ezsp_counters: [MAC_RX_BROADCAST = 433, MAC_TX_BROADCAST = 80, MAC_RX_UNICAST = 278, MAC_TX_UNICAST_SUCCESS = 124, MAC_TX_UNICAST_RETRY = 96, MAC_TX_UNICAST_FAILED = 31, APS_DATA_RX_BROADCAST = 4, APS_DATA_TX_BROADCAST = 4, APS_DATA_RX_UNICAST = 176, APS_DATA_TX_UNICAST_SUCCESS = 88, APS_DATA_TX_UNICAST_RETRY = 17, APS_DATA_TX_UNICAST_FAILED = 14, ROUTE_DISCOVERY_INITIATED = 14, NEIGHBOR_ADDED = 12, NEIGHBOR_REMOVED = 1, NEIGHBOR_STALE = 0, JOIN_INDICATION = 0, CHILD_REMOVED = 0, ASH_OVERFLOW_ERROR = 0, ASH_FRAMING_ERROR = 0, ASH_OVERRUN_ERROR = 0, NWK_FRAME_COUNTER_FAILURE = 0, APS_FRAME_COUNTER_FAILURE = 0, UTILITY = 0, APS_LINK_KEY_NOT_AUTHORIZED = 0, NWK_DECRYPTION_FAILURE = 0, APS_DECRYPTION_FAILURE = 20, ALLOCATE_PACKET_BUFFER_FAILURE = 0, RELAYED_UNICAST = 0, PHY_TO_MAC_QUEUE_LIMIT_REACHED = 0, PACKET_VALIDATE_LIBRARY_DROPPED_COUNT = 0, TYPE_NWK_RETRY_OVERFLOW = 0, PHY_CCA_FAIL_COUNT = 4, BROADCAST_TABLE_FULL = 0, PTA_LO_PRI_REQUESTED = 0, PTA_HI_PRI_REQUESTED = 0, PTA_LO_PRI_DENIED = 0, PTA_HI_PRI_DENIED = 0, PTA_LO_PRI_TX_ABORTED = 0, PTA_HI_PRI_TX_ABORTED = 0, ADDRESS_CONFLICT_SENT = 0, EZSP_FREE_BUFFERS = 247]
2023-07-27 03:37:23.601 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'54b8b1a9112a15b65894a524ab5593499c3be366df459874f7cf0b8bfc8b3aa6ebccde3b3a7e'
2023-07-27 03:37:23.601 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8610be7e'
2023-07-27 03:37:23.602 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=117), 196, -51, 0x8b32, 255, 255, b'\th\x02\x00\xf5\x05\x01\x00\x01\x00']
2023-07-27 03:37:23.603 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=117), 196, -51, 0x8b32, 255, 255, b'\th\x02\x00\xf5\x05\x01\x00\x01\x00']
2023-07-27 03:37:23.603 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=117, profile_id=260, cluster_id=61184, data=Serialized[b'\th\x02\x00\xf5\x05\x01\x00\x01\x00'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=196, rssi=-51)
2023-07-27 03:37:23.604 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\th\x02\x00\xf5\x05\x01\x00\x01\x00'
2023-07-27 03:37:23.606 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=104, command_id=2, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:23.613 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:set_data_response(param=Command(status=0, tsn=245, command_id=261, function=0, data=[1, 0]))
2023-07-27 03:37:23.614 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received command 0x02 (TSN 104): set_data_response(param=Command(status=0, tsn=245, command_id=261, function=0, data=[1, 0]))
2023-07-27 03:37:23.615 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Received Attribute Report. Command is 0x0002, Tuya Paylod values[Status : 0, TSN: 245, Command: 0x0105, Function: 0x00, Data: [1, 0]]
2023-07-27 03:37:23.855 DEBUG (MainThread) [homeassistant.components.zha.core.device] [0xE266](DG6HD): Device seen - marking the device available and resetting counter
2023-07-27 03:37:23.856 DEBUG (MainThread) [homeassistant.components.zha.core.device] [0xE266](DG6HD): Update device availability -  device available: True - new availability: True - changed: False
2023-07-27 03:37:24.011 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'64b8b1a9112a15b65894a524ab5593499c38ef65df459874f7cf0a8bfc883ea3ebccdf23867e'
2023-07-27 03:37:24.011 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'87009f7e'
2023-07-27 03:37:24.030 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=118), 200, -50, 0x8b32, 255, 255, b'\ti\x02\x00\xf6\x01\x04\x00\x01\x01']
2023-07-27 03:37:24.031 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=118), 200, -50, 0x8b32, 255, 255, b'\ti\x02\x00\xf6\x01\x04\x00\x01\x01']
2023-07-27 03:37:24.031 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=118, profile_id=260, cluster_id=61184, data=Serialized[b'\ti\x02\x00\xf6\x01\x04\x00\x01\x01'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=200, rssi=-50)
2023-07-27 03:37:24.032 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\ti\x02\x00\xf6\x01\x04\x00\x01\x01'
2023-07-27 03:37:24.032 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=105, command_id=2, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:24.033 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:set_data_response(param=Command(status=0, tsn=246, command_id=1025, function=0, data=[1, 1]))
2023-07-27 03:37:24.034 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received command 0x02 (TSN 105): set_data_response(param=Command(status=0, tsn=246, command_id=1025, function=0, data=[1, 1]))
2023-07-27 03:37:24.034 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Received Attribute Report. Command is 0x0002, Tuya Paylod values[Status : 0, TSN: 246, Command: 0x0401, Function: 0x00, Data: [1, 1]]
2023-07-27 03:37:24.250 DEBUG (MainThread) [homeassistant.core] Bus:Handling <Event call_service[L]: domain=cover, service=close_cover, service_data=entity_id=cover.tze200_rmymn92d_ts0601_cover>
2023-07-27 03:37:24.253 DEBUG (MainThread) [zigpy.util] Tries remaining: 3
2023-07-27 03:37:24.253 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Cluster Command.. Manufacturer is _TZE200_rmymn92d Cluster Command is 0x0001, Arguments are ()
2023-07-27 03:37:24.253 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Sending Tuya Command. Paylod values [endpoint_id : 1, Status : 0, TSN: 0, Command: 0x0401, Function: 0, Data: [1, 2]]
2023-07-27 03:37:24.254 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Sending request header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=True, direction=<Direction.Server_to_Client: 0>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), manufacturer=4417, tsn=199, command_id=0, *direction=<Direction.Server_to_Client: 0>)
2023-07-27 03:37:24.255 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Sending request: set_data(param=Command(status=0, tsn=0, command_id=1025, function=0, data=[1, 2]))
2023-07-27 03:37:24.256 DEBUG (MainThread) [bellows.zigbee.application] Sending packet ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), dst_ep=1, source_route=None, extended_timeout=False, tsn=199, profile_id=260, cluster_id=61184, data=Serialized[b'\x05A\x11\xc7\x00\x00\x00\x01\x04\x00\x01\x02'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=None, rssi=None)
2023-07-27 03:37:24.256 DEBUG (MainThread) [bellows.ezsp.protocol] Send command sendUnicast: (<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 0x8b32, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=199), 200, b'\x05A\x11\xc7\x00\x00\x00\x01\x04\x00\x01\x02')
2023-07-27 03:37:24.263 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'47b921a9602a1580d2904b25455493499d4e276c25c262caec016389fc7f3ba7eacf44f27e'
2023-07-27 03:37:24.272 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'75b9a1a9602a15a1773c7e'
2023-07-27 03:37:24.272 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8070787e'
2023-07-27 03:37:24.274 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received sendUnicast: [<EmberStatus.SUCCESS: 0>, 19]
2023-07-27 03:37:24.286 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'05b9b1a96b2a1580d2904b25455493499d4e27b825ce67e64e7e'
2023-07-27 03:37:24.286 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'8160597e'
2023-07-27 03:37:24.289 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received messageSentHandler: [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 35634, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=19), 200, <EmberStatus.SUCCESS: 0>, b'']
2023-07-27 03:37:24.289 DEBUG (MainThread) [bellows.zigbee.application] Received messageSentHandler frame with [<EmberOutgoingMessageType.OUTGOING_DIRECT: 0>, 35634, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=19), 200, <EmberStatus.SUCCESS: 0>, b'']
2023-07-27 03:37:24.323 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'15b9b1a9112a15b65894a524ab5593499c39ef65df459874f8dea482fcfdbfb67e'
2023-07-27 03:37:24.324 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'82503a7e'
2023-07-27 03:37:24.326 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=119), 200, -50, 0x8b32, 255, 255, b'\x18\xc7\x0b\x00\x83']
2023-07-27 03:37:24.327 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=119), 200, -50, 0x8b32, 255, 255, b'\x18\xc7\x0b\x00\x83']
2023-07-27 03:37:24.327 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=119, profile_id=260, cluster_id=61184, data=Serialized[b'\x18\xc7\x0b\x00\x83'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=200, rssi=-50)
2023-07-27 03:37:24.328 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\x18\xc7\x0b\x00\x83'
2023-07-27 03:37:24.329 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.GLOBAL_COMMAND: 0>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=1, reserved=0, *is_cluster=False, *is_general=True), tsn=199, command_id=11, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:24.330 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:Default_Response(command_id=0, status=<Status.UNSUP_MANUF_CLUSTER_COMMAND: 131>)
2023-07-27 03:37:24.419 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'25b9b1a9112a15b65894a524ab5593499c36e366df459874f0cf098bfc893ca5ebc9de6f8f9b9e437e'
2023-07-27 03:37:24.420 DEBUG (bellows.thread_0) [bellows.uart] Sending: b'83401b7e'
2023-07-27 03:37:24.421 DEBUG (MainThread) [bellows.ezsp.protocol] Application frame received incomingMessageHandler: [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=120), 196, -51, 0x8b32, 255, 255, b'\tj\x02\x00\xf7\x03\x02\x00\x04\x00\x00\x00d']
2023-07-27 03:37:24.421 DEBUG (MainThread) [bellows.zigbee.application] Received incomingMessageHandler frame with [<EmberIncomingMessageType.INCOMING_UNICAST: 0>, EmberApsFrame(profileId=260, clusterId=61184, sourceEndpoint=1, destinationEndpoint=1, options=<EmberApsOption.APS_OPTION_ENABLE_ROUTE_DISCOVERY: 256>, groupId=0, sequence=120), 196, -51, 0x8b32, 255, 255, b'\tj\x02\x00\xf7\x03\x02\x00\x04\x00\x00\x00d']
2023-07-27 03:37:24.422 DEBUG (MainThread) [zigpy.application] Received a packet: ZigbeePacket(src=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x8B32), src_ep=1, dst=AddrModeAddress(addr_mode=<AddrMode.NWK: 2>, address=0x0000), dst_ep=1, source_route=None, extended_timeout=False, tsn=120, profile_id=260, cluster_id=61184, data=Serialized[b'\tj\x02\x00\xf7\x03\x02\x00\x04\x00\x00\x00d'], tx_options=<TransmitOptions.NONE: 0>, radius=0, non_member_radius=0, lqi=196, rssi=-51)
2023-07-27 03:37:24.423 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received ZCL frame: b'\tj\x02\x00\xf7\x03\x02\x00\x04\x00\x00\x00d'
2023-07-27 03:37:24.423 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame header: ZCLHeader(frame_control=FrameControl(frame_type=<FrameType.CLUSTER_COMMAND: 1>, is_manufacturer_specific=0, direction=<Direction.Client_to_Server: 1>, disable_default_response=0, reserved=0, *is_cluster=True, *is_general=False), tsn=106, command_id=2, *direction=<Direction.Client_to_Server: 1>)
2023-07-27 03:37:24.425 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Decoded ZCL frame: TuyaManufacturerWindowCover:set_data_response(param=Command(status=0, tsn=247, command_id=515, function=0, data=[4, 0, 0, 0, 100]))
2023-07-27 03:37:24.432 DEBUG (MainThread) [zigpy.zcl] [0x8B32:1:0xef00] Received command 0x02 (TSN 106): set_data_response(param=Command(status=0, tsn=247, command_id=515, function=0, data=[4, 0, 0, 0, 100]))
2023-07-27 03:37:24.433 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Received Attribute Report. Command is 0x0002, Tuya Paylod values[Status : 0, TSN: 247, Command: 0x0203, Function: 0x00, Data: [4, 0, 0, 0, 100]]
2023-07-27 03:37:24.434 DEBUG (MainThread) [homeassistant.components.zha.core.cluster_handlers] [0x8B32:1:0x0102]: Attribute report 'Window Covering'[current_position_lift_percentage] = 0
2023-07-27 03:37:24.434 DEBUG (MainThread) [homeassistant.components.zha.cover] setting position: 0
2023-07-27 03:37:24.435 DEBUG (MainThread) [zhaquirks.tuya] a4:c1:38:d0:15:3b:4d:38 Tuya Attribute Cache : [{8: 0}]
2023-07-27 03:37:24.828 DEBUG (bellows.thread_0) [bellows.uart] Data frame: b'35b9b1a9112a15b6589e4a24ab5593499c37ef65df459874f8c60889fb7e23d37e'

Does anyone have any idea what's going wrong here? Any help is appreciated.
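In case it helps anyone comparing the raw frames above by hand: the set_data / set_data_response payloads can be decoded manually. This is a hypothetical standalone helper (not part of zhaquirks), assuming the field layout of zhaquirks' Tuya Command struct — status, tsn, a little-endian uint16 command_id (DP id in the low byte, Tuya data type in the high byte), function, then a length-prefixed data blob:

```python
def decode_tuya_frame(payload: bytes) -> dict:
    """Decode the body of a Tuya 0xEF00 cluster command (ZCL header stripped).

    Assumed layout (matching zhaquirks' Command struct):
      status(1) | tsn(1) | command_id(2, little-endian) | function(1)
      | data: length(1) + value bytes
    command_id packs the datapoint: DP id in the low byte, Tuya data type
    in the high byte (e.g. 0x01 = bool/enum, 0x02 = 4-byte value).
    """
    command_id = int.from_bytes(payload[2:4], "little")
    length = payload[5]
    return {
        "status": payload[0],
        "tsn": payload[1],
        "dp_id": command_id & 0xFF,
        "dp_type": command_id >> 8,
        "function": payload[4],
        "value": list(payload[6:6 + length]),
    }


# Example: the position report seen in the log at 03:37:24.423
frame = b"\x00\xf7\x03\x02\x00\x04\x00\x00\x00d"
print(decode_tuya_frame(frame))
# -> {'status': 0, 'tsn': 247, 'dp_id': 3, 'dp_type': 2,
#     'function': 0, 'value': [0, 0, 0, 100]}
```

Decoded this way, the device is reporting DP 3 (position) with value 100, which matches the `Command: 0x0203 ... Data: [4, 0, 0, 0, 100]` line in the log.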


There hasn't been any activity on this issue recently. Due to the high number of incoming GitHub notifications, we have to clean some of the old issues, as many of them have already been resolved with the latest updates. Please make sure to update to the latest version and check if that solves the issue. Let us know if that works for you by adding a comment 👍 This issue has now been marked as stale and will be closed if no further activity occurs. Thank you for your contributions.

@github-actions github-actions bot added the stale label on Jan 26, 2024
@github-actions github-actions bot closed this as not planned (stale) on Feb 2, 2024