
Performance degradation over time #11

Open
progmancod opened this issue Apr 14, 2022 · 12 comments · Fixed by #12

Comments

@progmancod

Hi, first of all, congrats on your amazing project!

I noticed that every connection on the hub registers a callback:

public function subscribe(callable $callback): void
{
    $this->init();
    // A new 'message' listener is attached for every subscriber,
    // but nothing ever removes it when the client disconnects.
    $this->subscriber->on('message', function (string $channel, string $payload) use ($callback) {
        $callback($this->serializer->deserialize($payload));
    });
}

After the client closes the connection, the callback remains registered.

So, after a few hours and hundreds of thousands of subscriptions, performance gets terrible, since every publish has to call hundreds of thousands of callbacks, most of them unnecessary (the subscribers are long gone).

The solution is to listen for connection close events, keep a map of subscribers (address => callback), and remove the entry when the connection closes (see the sketch below).

That's how the original implementation works: https://github.com/dunglas/mercure/blob/main/subscribe.go
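Something along these lines could work (just a sketch: the subscriber key, the unsubscribe() method and the $listeners property are assumptions on my side, not the actual Freddie API):

public function subscribe(string $subscriberId, callable $callback): void
{
    $this->init();
    $listener = function (string $channel, string $payload) use ($callback) {
        $callback($this->serializer->deserialize($payload));
    };
    // Remember which closure belongs to which subscriber...
    $this->listeners[$subscriberId] = $listener;
    $this->subscriber->on('message', $listener);
}

public function unsubscribe(string $subscriberId): void
{
    if (!isset($this->listeners[$subscriberId])) {
        return;
    }
    // ...so it can be detached once the connection is gone and stops
    // costing anything on publish.
    $this->subscriber->removeListener('message', $this->listeners[$subscriberId]);
    unset($this->listeners[$subscriberId]);
}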

@bpolaszek
Owner

Hi @progmancod,

Thank you for spotting this! It makes sense indeed.

The problem is that Framework X doesn't seem to expose the Connection object, meaning we have no way to get notified when the connection closes. I'll try to get some help over there.

@bpolaszek
Owner

Wait, after a coffee ☕ I think I found another way to address this, as the stream exposes a close event itself.
Can you please have a look at this and maybe try on your side?
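Roughly the idea (sketch only, assuming $stream is the ReactPHP stream backing the SSE response and that the hub gets an unsubscribe() method):

$stream->on('close', function () use ($subscriber): void {
    // Drop this subscriber's 'message' listener as soon as the client goes away.
    $this->hub->unsubscribe($subscriber);
});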

@progmancod
Author

With 2,000 users connected, Freddie still degrades in performance after a few hours. Maybe just listening to the "close" event of the stream is not enough in some scenarios; I'm also concerned about the "error" and "end" cases. Let's look at the docs (https://github.com/reactphp/stream):

  • The error event will be emitted once a fatal error occurs, usually while trying to read from this stream.

  • The end event will be emitted once the source stream has successfully reached the end of the stream (EOF).

  • The close event will be emitted once the stream closes (terminates).

  • After the stream errors, it MUST close the stream and SHOULD thus be followed by a close event (OK)

  • Many common streams (such as a TCP/IP connection or a file-based stream) will likely choose to emit this (close) event after reading a successful end event or after a fatal transmission error event. (Maybe this is the problem? "will likely choose to emit"? See the diagnostic sketch below.)
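To check whether listeners really do pile up when "close" is never emitted, something like this could be logged periodically (a diagnostic sketch; $loop and $subscriber are assumptions about how things are wired):

$loop->addPeriodicTimer(60.0, function () use ($subscriber): void {
    // If this number keeps growing while the count of connected clients
    // stays flat, some disconnects are not triggering the cleanup.
    printf("active 'message' listeners: %d\n", count($subscriber->listeners('message')));
});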

@bpolaszek
Owner

Oh, that's bad news 🙁

Unfortunately, I'm quite overwhelmed these days, and I don't even have the time to perform a stress test.
I'll try to look at this ASAP, but I can't give any ETA at the moment.

@bpolaszek reopened this Jul 7, 2022
@progmancod
Author

No problem, I'm trying to resolve the issue.

@bpolaszek
Owner

Hey @progmancod - it's been a while since this issue was opened, and I wanted to make sure everything's fine now. I'm running Freddie on an intensive IoT platform (using Redis) and it can run for months without blinking an eye.

Do you mind if I close that issue?

@progmancod
Author

Unfortunately, the problem persists in our application. There are about 200 million posts per day, and within a few days the degradation occurs, forcing us to recycle the application. This change has helped a lot, tripling the time it takes for the degradation to happen. But there's still some scenario I haven't been able to find:

$stream->on('close', fn() => $this->hub->unsubscribe($subscriber, 'close'));
$stream->on('end', fn() => $this->hub->unsubscribe($subscriber, 'end'));
$stream->on('error', fn() => $this->hub->unsubscribe($subscriber, 'error'));
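Since the docs quoted above say that an error or end is normally followed by a close as well, the same subscriber can get unsubscribed more than once here; a small guard keeps that harmless (sketch, same assumptions as above):

$unsubscribed = false;
$cleanup = function (string $reason) use (&$unsubscribed, $subscriber): void {
    if ($unsubscribed) {
        return;
    }
    // Run the hub cleanup only once, whichever event fires first.
    $unsubscribed = true;
    $this->hub->unsubscribe($subscriber, $reason);
};

$stream->on('close', fn () => $cleanup('close'));
$stream->on('end', fn () => $cleanup('end'));
$stream->on('error', fn () => $cleanup('error'));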

@SimonFrings

Hey @progmancod and @bpolaszek, I'm one of the maintainers of ReactPHP and Framework X, and we recently received a ticket in ReactPHP's HTTP component that sounds a little bit similar to the problem described here (not 100% sure if it's related though). The problem was that the connection close handler wasn't cleaned up properly for each request, resulting in some memory growth.

This issue got fixed through reactphp/http#515 and was already released with reactphp/http v1.10.0. Like I said, I'm not sure if this is the same issue described here. Does the problem still persist with the new changes?

@progmancod
Author

Unfortunately, the problem persists, and we have given up on using the library in production after spending many hours investigating. Our application receives millions of hits per day, and performance degrades after a few hours. We have reverted to using the original library in Go.

@bpolaszek
Owner

Hi @progmancod, no worries.
When you speak about performance degradation, what do you mean exactly? Memory leaks? High response times? Hub not responding?

@progmancod
Author

The memory increases slightly over time, but the main symptom is an ever-increasing slowness, until it becomes impractical under high demand.

@SimonFrings

SimonFrings commented Apr 25, 2024

The memory increases slightly over time, but the main symptom is an ever-increasing slowness, until it becomes impractical under high demand.

@progmancod Hm, when this performance degradation occurs, do you see any high CPU or RAM usage or anything similar? We're currently not aware of anything like this in ReactPHP or Framework X, but we need a way to reproduce it in order to find out which project is responsible for this behavior and what a fix could look like.

FYI: Together with @clue, we have helped others with similar problems in the past. So if you're interested, you can drop us an email and we can set up a quick consulting call to take a look at this together.
