
Memory leak determined by valgrind #152

Closed
pengweichu opened this issue Mar 23, 2019 · 4 comments
@pengweichu

Hi, I'm using the latest code of blockingconcurrentqueue in my project, and valgrind reports the following leak:

<error>
  <unique>0x3a</unique>
  <tid>1</tid>
  <kind>Leak_DefinitelyLost</kind>
  <xwhat>
    <text>24 bytes in 1 blocks are definitely lost in loss record 58 of 325</text>
    <leakedbytes>24</leakedbytes>
    <leakedblocks>1</leakedblocks>
  </xwhat>
  <stack>
    <frame>
      <ip>0x4C2A436</ip>
      <obj>/usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so</obj>
      <fn>operator new(unsigned long, std::nothrow_t const&amp;)</fn>
      <dir>/builddir/build/BUILD/valgrind-3.13.0/coregrind/m_replacemalloc</dir>
      <file>vg_replace_malloc.c</file>
      <line>377</line>
    </frame>
    <frame>
      <ip>0xAF57725</ip>
      <obj>/usr/lib64/libstdc++.so.6.0.22</obj>
      <fn>__cxa_thread_atexit</fn>
    </frame>
    <frame>
      <ip>0x621B1D</ip>
      <obj>/home/dev/project/bin/proxy</obj>
      <fn>moodycamel::details::ThreadExitNotifier::instance()</fn>
      <dir>/home/dev/project/common</dir>
      <file>concurrentqueue.h</file>
      <line>545</line>
    </frame>
    <frame>
      <ip>0x6219B1</ip>
      <obj>/home/dev/project/bin/proxy</obj>
      <fn>moodycamel::details::ThreadExitNotifier::subscribe(moodycamel::details::ThreadExitListener*)</fn>
      <dir>/home/dev/project/common</dir>
      <file>concurrentqueue.h</file>
      <line>510</line>
    </frame>
    <frame>
      <ip>0x62423C</ip>
      <obj>/home/dev/project/bin/proxy</obj>
      <fn>moodycamel::ConcurrentQueue&lt;resip::Data, proxy::LogQueue::Traits&gt;::get_or_add_implicit_producer()</fn>
      <dir>/home/dev/project/common</dir>
      <file>concurrentqueue.h</file>
      <line>3413</line>
    </frame>
    <frame>
      <ip>0x6236F9</ip>
      <obj>/home/dev/project/bin/proxy</obj>
      <fn>bool moodycamel::ConcurrentQueue&lt;resip::Data, proxy::LogQueue::Traits&gt;::inner_enqueue&lt;(moodycamel::ConcurrentQueue&lt;resip::Data, proxy::LogQueue::Traits&gt;::AllocationMode)0, resip::Data const&amp;&gt;(resip::Data const&amp;)</fn>
      <dir>/home/dev/project/common</dir>
      <file>concurrentqueue.h</file>
      <line>1291</line>
    </frame>
    <frame>
      <ip>0x623280</ip>
      <obj>/home/dev/project/bin/proxy</obj>
      <fn>moodycamel::ConcurrentQueue&lt;resip::Data, proxy::LogQueue::Traits&gt;::enqueue(resip::Data const&amp;)</fn>
      <dir>/home/dev/project/common</dir>
      <file>concurrentqueue.h</file>
      <line>914</line>
    </frame>
    <frame>
      <ip>0x622BDC</ip>
      <obj>/home/dev/project/bin/proxy</obj>
      <fn>moodycamel::BlockingConcurrentQueue&lt;resip::Data, proxy::LogQueue::Traits&gt;::enqueue(resip::Data const&amp;)</fn>
      <dir>/home/dev/project/common</dir>
      <file>blockingconcurrentqueue.h</file>
      <line>521</line>
    </frame>
    <frame>
      <ip>0x62061F</ip>
      <obj>/home/dev/project/bin/proxy</obj>
      <fn>proxy::LogQueue::add(resip::Data const&amp;)</fn>
      <dir>/home/dev/project/common</dir>
      <file>LogQueue.cxx</file>
      <line>46</line>
    </frame>
    <frame>
      <ip>0x62A60B</ip>
      <obj>/home/dev/project/bin/proxy</obj>
      <fn>proxy::AppExternalLogger::operator()(resip::Log::Level, resip::Subsystem const&amp;, resip::Data const&amp;, char const*, int, resip::Data const&amp;, resip::Data const&amp;)</fn>
      <dir>/home/dev/project/common</dir>
      <file>proxyLogger.cxx</file>
      <line>91</line>
    </frame>
    <frame>
      <ip>0x8F4E02</ip>
      <obj>/home/dev/project/bin/proxy</obj>
      <fn>resip::Log::Guard::~Guard()</fn>
    </frame>
    <frame>
      <ip>0x56EA75</ip>
      <obj>/home/dev/project/bin/proxy</obj>
      <fn>proxy::RedisConnection::queueSetClientName()::{lambda(cpp_redis::reply const&amp;)#1}::operator()(cpp_redis::reply const&amp;) const</fn>
      <dir>/home/dev/project/proxy</dir>
      <file>RedisConnection.cxx</file>
      <line>526</line>
    </frame>
  </stack>
</error>

@cameron314
Owner

Not much I can do about that, it looks like your libstdc++ is leaking from __cxa_thread_atexit.
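Since the bookkeeping block is allocated once per thread inside libstdc++ and is never a growing leak, one option is to silence it with a valgrind suppression rather than change the queue. A sketch of such a suppression file, matching the frames in the report above (the mangled-name wildcard and the suppression name are illustrative, not from the project):

```
{
   libstdcxx_cxa_thread_atexit_bookkeeping
   Memcheck:Leak
   match-leak-kinds: definite
   fun:_Znwm*
   fun:__cxa_thread_atexit
   ...
}
```

It would be passed on the command line, e.g. `valgrind --suppressions=moodycamel.supp ./proxy`.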

@pengweichu
Author

pengweichu commented Mar 25, 2019

> Not much I can do about that, it looks like your libstdc++ is leaking from __cxa_thread_atexit.

If I remove the macro "MOODYCAMEL_CPP11_THREAD_LOCAL_SUPPORTED", will that avoid this issue?
Are there any negative consequences to removing "MOODYCAMEL_CPP11_THREAD_LOCAL_SUPPORTED"?

Thanks

@cameron314
Owner

You can remove MOODYCAMEL_CPP11_THREAD_LOCAL_SUPPORTED, and it will indeed work around this, but memory usage (and performance) will be impacted: every implicit producer will exist forever instead of being recycled when the thread it's associated with is destroyed.

@powerdevil

Hi, I came across this issue too, but I'm using Clang's AddressSanitizer. It reports the leak is from line 146: `elt *new_elt = new (std::nothrow) elt;`. No idea why.
