Fixed the leak responsible for the accumulation of unreferenced scalars #1
The leak occurs due to the creation of an empty base object inside the XS code. This fix also happens to solve the "Attempt to free unreferenced scalar: SV 0xdeadbeef during global destruction." issue.
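Since the offending XS isn't quoted here, the following is a generic, minimal sketch of how a base object created in XS can end up leaking; `leaky_new()` is a hypothetical name, not an actual Net::Curl function:

```c
#include "EXTERN.h"
#include "perl.h"
#include "XSUB.h"

/* Generic sketch of the failure mode -- NOT the actual Curl.xs code.
 * An HV allocated in XS starts with a reference count of 1.  If the
 * wrapper reference is then created with newRV_inc(), that initial
 * count is never released, so every constructed object strands one
 * scalar that global destruction later complains about. */
SV *
leaky_new(pTHX)                        /* hypothetical constructor */
{
    HV *base = newHV();                /* refcount 1, owned by no one  */
    SV *rv   = newRV_inc((SV *)base);  /* refcount 2: rv + the stray 1 */
    return rv;                         /* the stray count never drops  */
}
```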
Modifications:

- Renamed t/old-13slowleak.t as t/96-leak.t, using Devel::Leak. It now runs 10_000 iterations instead of 200. However, it's still fast, as the downloaded URL is "file:///$Bin/$Script" (by default). This was required to spot the problem.
- Patched Makefile.PL so pkg-config is now able to detect libcurl in non-standard locations (like $HOME). This was required for me to deploy test cases on multiple perlbrew-based installations.
- Updated inc/symbols-in-versions from libcurl/7.28.0 and created a .gitignore file. Not actually required to fix the leak, but hell, why not? :)
- In Curl.xs, updated the HASHREF_BY_DEFAULT definition, reversing the order of newRV_noinc() / sv_2mortal(); see the sketch after this list.

The dismissal of the "Attempt to free unreferenced scalar..." warning can be verified via the Net::Curl::Simple test suite; while using the original Net::Curl, it repeatedly produces the aforementioned message.
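For context, here is a minimal sketch of what reversing that order changes. The before/after direction is inferred from the symptoms, not copied from Curl.xs:

```c
/* Sketch of the ordering issue; the buggy direction below is an
 * assumption inferred from the symptoms, not a quote from Curl.xs.
 *
 * If the inner HV is the mortal, FREETMPS drops its refcount to zero
 * while the RV still points at it; releasing the RV afterwards then
 * triggers "Attempt to free unreferenced scalar":
 *
 *   #define HASHREF_BY_DEFAULT  newRV_noinc( sv_2mortal( (SV *) newHV() ) )
 *
 * Making the RV itself the mortal keeps ownership straight: freeing
 * the mortal RV releases the HV exactly once, so nothing leaks and
 * nothing is double-freed. */
#define HASHREF_BY_DEFAULT  sv_2mortal( newRV_noinc( (SV *) newHV() ) )
```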
I also "smoked" the patched Net::Curl on a reasonably large crawler system (which processes over 1 million HTML-pages every 24 hours) no unexpected behavior was detected.