Possible bug in cooldowns #2057

Closed
Lulalaby opened this issue May 8, 2023 · 6 comments · Fixed by #2091
Labels
priority: high · status: todo · unconfirmed bug

Milestone
v2.5

Comments

Lulalaby (Member) commented May 8, 2023

It might be that cooldowns and before_invoke never run if you don't have max concurrency set for slash commands:

if hasattr(self, "_max_concurrency"):
    if self._max_concurrency is not None:
        # For this application, context can be duck-typed as a Message
        await self._max_concurrency.acquire(ctx)  # type: ignore # ctx instead of non-existent message

    try:
        self._prepare_cooldowns(ctx)
        await self.call_before_hooks(ctx)
    except:
        if self._max_concurrency is not None:
            await self._max_concurrency.release(ctx)  # type: ignore # ctx instead of non-existent message
        raise

Reported by @Om1609
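
A minimal, self-contained sketch of the same control flow (not Pycord code; the class and helper names below are made up) shows why this matters: if the attribute is missing, the whole block is skipped and neither the cooldown check nor the before_invoke hooks ever run.

import asyncio

# Toy reproduction of the guard above: FakeSlashCommand mimics a command object
# that, like a bare group, was never given a _max_concurrency attribute.
class FakeSlashCommand:
    async def prepare(self, ctx):
        if hasattr(self, "_max_concurrency"):  # never True here
            if self._max_concurrency is not None:
                await self._max_concurrency.acquire(ctx)
            try:
                self._prepare_cooldowns(ctx)       # skipped along with the guard
                await self.call_before_hooks(ctx)  # skipped as well
            except Exception:
                if self._max_concurrency is not None:
                    await self._max_concurrency.release(ctx)
                raise

    def _prepare_cooldowns(self, ctx):
        print("cooldowns checked")

    async def call_before_hooks(self, ctx):
        print("before_invoke ran")

asyncio.run(FakeSlashCommand().prepare(ctx=None))  # prints nothing: both steps are skipped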

Lulalaby added the priority: high, unconfirmed bug, and python labels on May 8, 2023
Lulalaby added this to the v2.5 milestone on May 8, 2023
Lulalaby added the status: todo label on May 8, 2023
NeloBlivion (Member) commented

All commands have the _max_concurrency attribute

self._max_concurrency: MaxConcurrency | None = max_concurrency

Odd logic, but not a bug (and in my personal experience, cooldowns work fine without max_concurrency).

OmLanke (Contributor) commented May 21, 2023

> All commands have the _max_concurrency attribute
>
> self._max_concurrency: MaxConcurrency | None = max_concurrency
>
> Odd logic, but not a bug (and in my personal experience, cooldowns work fine without max_concurrency).

No. SlashCommandGroup overrides __init__ and doesn't call the super method:

def __init__(
    self,
    name: str,
    description: str | None = None,
    guild_ids: list[int] | None = None,
    parent: SlashCommandGroup | None = None,
    **kwargs,
) -> None:
    self.name = str(name)
    self.description = description or "No description provided"
    validate_chat_input_name(self.name)
    validate_chat_input_description(self.description)
    self.input_type = SlashCommandOptionType.sub_command_group
    self.subcommands: list[
        SlashCommand | SlashCommandGroup
    ] = self.__initial_commands__
    self.guild_ids = guild_ids
    self.parent = parent
    self.attached_to_group: bool = False
    self.checks = kwargs.get("checks", [])
    self._before_invoke = None
    self._after_invoke = None
    self.cog = MISSING
    self.id = None

    # Permissions
    self.default_member_permissions: Permissions | None = kwargs.get(
        "default_member_permissions", None
    )
    self.guild_only: bool | None = kwargs.get("guild_only", None)
    self.nsfw: bool | None = kwargs.get("nsfw", None)

    self.name_localizations: dict[str, str] = kwargs.get(
        "name_localizations", MISSING
    )
    self.description_localizations: dict[str, str] = kwargs.get(
        "description_localizations", MISSING
    )

It doesn't have the max_concurrency and cooldown attributes. If we add those to SlashCommandGroup, then the logic here

if hasattr(self, "_max_concurrency"):
    if self._max_concurrency is not None:
        # For this application, context can be duck-typed as a Message
        await self._max_concurrency.acquire(ctx)  # type: ignore # ctx instead of non-existent message

    try:
        self._prepare_cooldowns(ctx)
        await self.call_before_hooks(ctx)
    except:
        if self._max_concurrency is not None:
            await self._max_concurrency.release(ctx)  # type: ignore # ctx instead of non-existent message
        raise
can be simplified
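
A quick way to see the gap (a hypothetical check, not from the docs; it relies only on the __init__ shown above):

import discord

# A bare group never goes through ApplicationCommand.__init__,
# so neither attribute exists on it.
group = discord.SlashCommandGroup("admin", "admin commands")

print(hasattr(group, "_max_concurrency"))  # False
print(hasattr(group, "_buckets"))          # False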

NeloBlivion (Member) commented May 21, 2023

> No. SlashCommandGroup overrides __init__ and doesn't call the super method.

This is true, but it's irrelevant: you can't use a group on its own, so it doesn't have the attributes; nor have you ever been able to add cooldowns to groups in their __init__, and you can't create them with a decorator either. All of its subcommands still have _max_concurrency, so everything works as intended.
What you're looking for is a rewrite, not a bug fix.
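
For instance (a quick sketch, not copied from the repo): a subcommand registered on a group is a plain SlashCommand, so it does go through ApplicationCommand.__init__ and picks up the attribute.

import discord

group = discord.SlashCommandGroup("util", "utility commands")

@group.command()
async def ping(ctx: discord.ApplicationContext):
    await ctx.respond("pong")

# The subcommand has the attribute even though the group itself does not.
print(hasattr(ping, "_max_concurrency"))   # True
print(hasattr(group, "_max_concurrency"))  # False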

OmLanke (Contributor) commented May 21, 2023

> nor have you ever been able to add cooldowns to groups in their __init__, and you can't create them with a decorator either

Yes, I'm aware. But you could accept them through the kwargs for SlashCommandGroup's __init__.

I added this to the __init__:

class SlashCommandGroup(ApplicationCommand):

    ...

    def __init__(...):

        ...

        from ..ext.commands.cooldowns import BucketType, CooldownMapping, MaxConcurrency

        cooldown = getattr(self, "__commands_cooldown__", kwargs.get("cooldown"))

        if cooldown is None:
            buckets = CooldownMapping(cooldown, BucketType.default)
        elif isinstance(cooldown, CooldownMapping):
            buckets = cooldown
        else:
            raise TypeError(
                "Cooldown must be a an instance of CooldownMapping or None."
            )

        self._buckets: CooldownMapping = buckets

        max_concurrency = getattr(
            self, "__commands_max_concurrency__", kwargs.get("max_concurrency")
        )

        self._max_concurrency: MaxConcurrency | None = max_concurrency

and changed the logic to

        if self._max_concurrency:
            # For this application, context can be duck-typed as a Message
            await self._max_concurrency.acquire(ctx)  # type: ignore # ctx instead of non-existent message

        try:
            self._prepare_cooldowns(ctx)
            await self.call_before_hooks(ctx)
        except:
            if self._max_concurrency:
                await self._max_concurrency.release(ctx)  # type: ignore # ctx instead of non-existent message
            raise

I didn't face any issues in my limited testing.

If you want to read those changes in a better format, check out my commit from when I was trying things out. Ignore the untested fix for #2058
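
For reference, with those kwargs accepted, a group-level cooldown could be passed like this (hypothetical usage; it only works with the __init__ additions above, the released library doesn't accept these kwargs):

import discord
from discord.ext.commands.cooldowns import BucketType, Cooldown, CooldownMapping, MaxConcurrency

# Hypothetical: the cooldown/max_concurrency kwargs exist only with the patch above.
mod = discord.SlashCommandGroup(
    "mod",
    "moderation commands",
    cooldown=CooldownMapping(Cooldown(rate=1, per=5.0), BucketType.user),
    max_concurrency=MaxConcurrency(1, per=BucketType.guild, wait=False),
)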

NeloBlivion (Member) commented

That's fair; my point was that this isn't a bug, it's a feature request. Just throw in a feat PR and go for it.

OmLanke (Contributor) commented May 21, 2023

Oh lol alright.
