
Feature Request: reply to message #31

Closed
nickjohn1912 opened this issue May 18, 2023 · 11 comments

@nickjohn1912

Was just wondering if it would be possible to add a reply option. I see the reply-to-thread feature, but was wondering about having a command, or knowing how to have the bot reply to individual users.

@chrisrude
Owner

By reply, do you mean having the bot's responses posted as a "reply" in Discord (with the little link at the top to what it is responding to?)

If so, I actually had it do that initially, but it had some challenges, curious what you think about the tradeoffs...

On slower channels, it would work fine. The bot would be summoned by a message, it would read and respond to it, and tag the message that it was responding to as a reply. Everything worked fine.

But on busier channels, what can happen is that after the bot is summoned, but before it can reply, other messages get posted. The bot then has a choice... does it "see" those newer messages when building its reply?

If it doesn't see those other messages, the response it writes is relevant to the original request, but might be out-of-context relative to the overall flow of the conversation. For instance someone might have asked a different question in the meantime, and it could be unclear to the participants who the bot was referring to.

If it does "see" those other messages, it's the same problem but the other way around. With the additional context it may no longer decide to reply to the message that summoned it, but rather a later request.

Having tried it both ways, the latter feels more like a natural participant, but of course ymmv based on your server.

Thinking out loud, one compromise I could think of would be to have the bot respond differently based on whether or not it was hard-summoned (i.e. referred to by its name or an @-mention). In that case, we could follow the first path: don't include any context after the @-mention, and then post its response as a reply.

But then when posting an "unsolicited" response, it would always include the full context, and not explicitly mark any message as the one it is replying to.

I think I might play around with this behavior to see how it feels, but other ideas are welcome!
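The compromise above hinges on telling a hard summon apart from ordinary chatter. A minimal sketch of that check, in Python — note that `BOT_NAME`, `BOT_USER_ID`, and `is_hard_summon` are hypothetical names for illustration, not identifiers from this repo:

```python
# Hypothetical sketch of the compromise described above: reply with a
# tagged Discord "reply" only when the bot was hard-summoned (named
# directly or @-mentioned); otherwise post a plain message.

BOT_NAME = "oobabot"          # placeholder bot name
BOT_USER_ID = 1234567890      # placeholder Discord user id

def is_hard_summon(content: str, mentioned_ids: list[int]) -> bool:
    """True if the message names the bot or @-mentions it."""
    return BOT_USER_ID in mentioned_ids or BOT_NAME.lower() in content.lower()
```

In a discord.py `on_message` handler this would drive the two paths roughly like: `await message.reply(response)` when `is_hard_summon(...)` is true (reply tag, no context after the mention), else `await message.channel.send(response)` (unsolicited, full context, no reply tag).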

@chrisrude chrisrude changed the title from "Feature Request" to "Feature Request: reply to message" May 19, 2023
@nickjohn1912
Author

Sorry for the late reply @chrisrude, and ya, that was exactly what i meant. i think that compromise would do well. for me it gets a bit confusing to see who the bot responded to if there are more than several people, but i think that idea would work. thanks for the response and hope all goes well. it's a great bot nonetheless, and i have enjoyed using it.

@nickjohn1912
Author

i'll close this since i got the answer i was looking for

@chrisrude
Owner

Glad you got the answer you needed! I'd like to keep this open until I can get the change you want done (it helps me track work to do).

@chrisrude chrisrude reopened this May 20, 2023
@chrisrude
Owner

This is committed as 6e2ebcf

@nickjohn1912 if you're able to run from source, do you want to test this out? It should just magically work.

Otherwise it should be included in the next release (0.1.8).

@chrisrude chrisrude added this to the 0.1.8 milestone May 20, 2023
@nickjohn1912
Author

nickjohn1912 commented May 20, 2023

@chrisrude glad i came to check here. ya, i'll check this out for you. apologies for my delayed replies; i've got some projects going, so my communication can be random.

@nickjohn1912
Author

nickjohn1912 commented May 20, 2023

but i can wait as well if needed. i understand that being a dev is a time-consuming task, but i'll give it a try when i can.

@nickjohn1912
Author

alright, i gave it a try in my group and it works wonderfully. thank you very much for sharing this kind of thing.

@rebek43

rebek43 commented May 26, 2023

@chrisrude something i've noticed with this is that with response splitting on, every single split response replies to the same message, which can look a little messy. Off the top of my head there are two obvious solutions: either have only the first split part of the response reply to the request message (which i guess would sometimes make it unclear where the response ends), or have the whole response be a single message (which is very humanlike, but could be slow).
I think the second solution could be more elegant, and it could also be used to revive the deprecated response-streaming feature: instead of dumping the entire response at once, the bot would split it up as usual, but rather than sending the splits as separate messages, it could edit the original message split by split. My impression is that this would make far fewer calls to the Discord API than the original text-streaming implementation, and work much more smoothly.
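One way to picture the edit-based streaming idea: each successive Discord edit carries everything accumulated so far, so a single `channel.send` is followed by `message.edit` calls instead of new messages. A minimal sketch, assuming a hypothetical helper name (`accumulated_edits` is illustrative, not the deprecated implementation):

```python
def accumulated_edits(splits: list[str]) -> list[str]:
    """Turn a list of response splits into successive message contents.

    The first element would go out via channel.send(); each later
    element would be applied with message.edit(content=...), so the
    one visible message grows split by split.
    """
    payloads: list[str] = []
    text = ""
    for part in splits:
        text = part if not text else text + "\n" + part
        payloads.append(text)
    return payloads
```

For example, splits `["Hello!", "Here's more."]` would produce one send (`"Hello!"`) and one edit (`"Hello!\nHere's more."`) — two API calls total, versus one call per split (or per token, in the old streaming scheme).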

@chrisrude
Owner

Yeah, I noticed the same thing. I think keeping the reply tagging is important, since otherwise it could be hard to understand what the bot is doing.

The half-streaming idea is interesting! I'll play around with that and see how it feels.

@chrisrude
Owner

Split this off into a separate open issue, so that I don't miss the convo on this closed one.
