Fix LRU cache bloat #136

Merged
merged 12 commits into coapjs:master on Dec 7, 2016

Conversation

@richardms (Contributor)

See #135.

@mcollina (Collaborator) left a comment

Good job. I think we should create an interval when the server is started, rather than calling setTimeout.

// Start timer to actively prune from LRU cache after expiration
buf.lru_timer = setTimeout(function () {
  lru.get(key);
}, that._lruMaxAge + 1000)
@mcollina (Collaborator)

You can create an interval and call unref() on it; then you won't need to reset the timer every time.
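A minimal sketch of that approach, assuming an lru-cache instance named lru and a maxAge in milliseconds (both placeholder names, not code from this PR):

// Create a single interval when the server starts and unref() it, so the
// pending timer alone does not keep the process alive.
var pruneTimer = setInterval(function () {
  lru.prune() // evict every expired entry in one pass
}, maxAge)
pruneTimer.unref()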

@richardms (Contributor, Author)

Indeed, this can go given the change above.

I was being careful with unref() because the CI script didn't like it on the older Node versions.

-  , length: function(n) { return n.length }
-  , maxAge: parameters.exchangeLifetime
+  , length: function(n) { return n.buffer.byteLength }
+  , maxAge: this._lruMaxAge
@mcollina (Collaborator)

There is no need to make _lruMaxAge a property; assign it directly here.

@richardms (Contributor, Author)

OK. I'll switch to an interval, fired once every maxAge, that runs prune() across the whole LRU.

@mcollina (Collaborator)

Also, do you have any ideas for adding a unit test for this?
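One possible shape for such a test, sketched with sinon's fake timers; the require paths, the _lru internals, and the entry shape are assumptions based on the snippets in this thread, not code from the PR:

var chai = require('chai')
var expect = chai.expect
var sinon = require('sinon')
var coap = require('../')                      // assumed: the test lives in the repo's test/ directory
var parameters = require('../lib/parameters')  // assumed path to the protocol parameters

it('prunes expired entries from the LRU cache', function () {
  var clock = sinon.useFakeTimers()
  var server = coap.createServer()

  // Seed an entry shaped like the cached responses (a { buffer } wrapper).
  server._lru.set('some-key', { buffer: new Buffer('payload') })

  // Advance past both the entry's maxAge and the prune period, so the
  // interval has fired at least once after the entry expired.
  clock.tick(parameters.exchangeLifetime * 1000 + parameters.pruneTimerPeriod * 1000)

  expect(server._lru.itemCount).to.equal(0)
  clock.restore()
})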

// Start LRU pruning timer
this._lru.pruneTimer = setInterval(function () {
  that._lru.prune()
}, parameters.pruneTimerPeriod * 1000)
@mcollina (Collaborator)

Can you please unref() this?

@mcollina (Collaborator)

This is still not unref()'d.

@richardms (Contributor, Author)

It is, just below; one of the mock timer libraries doesn't support unref(), so I had to put a guard around it.
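For reference, a sketch of such a guard, using the pruneTimer field from the snippet above (the PR's exact code may differ):

// Only unref() when the timer object supports it; some mock timer
// libraries used in the tests hand back plain objects without it.
if (typeof this._lru.pruneTimer.unref === 'function') {
  this._lru.pruneTimer.unref()
}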

@mcollina (Collaborator)

LOL, thanks. Sorry.

@richardms (Contributor, Author)

Please ignore the PR for the time being; I'm having trouble getting the tests to run reliably on both Windows and Linux.

@richardms (Contributor, Author)

OK, it's working well on Linux, although Travis CI is failing for Node 0.12. Do you think that's something I've done, or is it due to a changed dependency?

@mcollina (Collaborator)

I'm OK with dropping support for Node v0.12.

Would you like to help maintain and develop this module?

@richardms (Contributor, Author)

Sorry to disappear - I went off sick for the second half of last week, and I've been catching up ever since.

Time is short for me at the moment, but I'm happy to help the maintenance effort when I can.

@mcollina merged commit 4fada62 into coapjs:master on Dec 7, 2016