coverage run and multiprocessing problem #745

Closed
awaizman1 opened this issue Dec 23, 2018 · 14 comments
Labels
bug

Comments

awaizman1 commented Dec 23, 2018

Hi,

I'm facing an issue with coverage and the multiprocessing module.

I have a simple class wrapping multiprocessing.Pool:
matlab_interop/my_pool.py:

from multiprocessing import Pool


def job(a):
    return a


class MyPool:
    def __init__(self):
        self.pool = Pool(processes=1)

    def run_job_in_worker(self):
        return self.pool.apply(job, ("hello",))

and a simple unittest:
tests/test_my_pool.py

import unittest
from matlab_interop.my_pool import MyPool


class TestMyPool(unittest.TestCase):

    def test_simple(self):

        pool = MyPool()
        print(pool.run_job_in_worker())

When running:

python -m coverage run -m nose tests.test_my_pool

coverage halts, and I get errors that I don't see when running nose directly (without coverage).

======================================================================
ERROR: test_simple (tests.test_my_pool.TestMyPool)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "F:\views\g\qprism\QPrism\MatlabInterop\py\src\tests\test_my_pool.py", line 9, in test_simple
    pool = MyPool()
  File "F:\views\g\qprism\QPrism\MatlabInterop\py\src\matlab_interop\my_pool.py", line 10, in __init__
    self.pool = Pool(processes=1)
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\context.py", line 119, in Pool
    context=self.get_context())
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\pool.py", line 175, in __init__
    self._repopulate_pool()
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\pool.py", line 236, in _repopulate_pool
    self._wrap_exception)
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\pool.py", line 255, in _repopulate_pool_static
    w.start()
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\popen_spawn_win32.py", line 33, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\spawn.py", line 143, in get_preparation_data
    _check_not_importing_main()
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\spawn.py", line 136, in _check_not_importing_main
    is not going to be frozen to produce an executable.''')
RuntimeError:
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.
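For context, the idiom the error message refers to looks like this when a script starts its own processes (a minimal standalone sketch; the file name and contents are illustrative, not the original repro files):

```python
# guard_demo.py -- illustrative, not one of the attached files.
from multiprocessing import Pool


def job(a):
    return a


if __name__ == '__main__':
    # On spawn-based platforms (Windows, and macOS on Python 3.8+),
    # process creation must happen under this guard: child processes
    # re-import the main module, and without the guard that re-import
    # would try to start workers again.
    with Pool(processes=1) as pool:
        print(pool.apply(job, ("hello",)))
```

Note that when the main module is a test runner (as with python -m nose here), the guard lives inside the runner rather than in user code, which is part of what makes this failure mode surprising.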

I would appreciate your help with figuring out what could be the problem. I don't understand why all works when running nose directly (without coverage).

Thanks.

nedbat (Owner) commented Dec 27, 2018

I tried to reproduce this, but don't see the failure. Can you provide exact files to run, and the precise versions of nose and coverage? Thanks.

awaizman1 (Author) commented Dec 30, 2018

Thanks @nedbat,

Sorry for omitting the required information.
I'm running the test on Windows, with python==3.6.7, nose==1.3.7, coverage==4.5.2 (requirements.txt of my env attached).

I've attached two files, my_pool.py and test_my_pool.py. Please download them to a 'src' folder and, within 'src', run: python -m coverage run -m nose test_my_pool.py
src.zip

This should reproduce the problem.

Thanks,
Assaf.

nedbat added the windows label and removed the cant-reproduce label Jan 13, 2019
nedbat (Owner) commented Jan 14, 2019

Hmm, I tried this in a Windows VM, and the "py -3 -m nose test_my_pool.py" command hung for a while, doing nothing?

markus-wa commented Jan 26, 2019

I am experiencing the same issue, but with unittest (when I tried it with nose I got the same errors).

It appears to work on the Linux CI servers, though, so I'm not sure whether it's platform-specific or related to the environment.

.coveragerc

[run]
branch = True
concurrency = multiprocessing
parallel = True

coverage run output

$ coverage run -m unittest discover tests
...
2019-01-26 17:33:56 DEBUG    config: <tests.test_to_opus.MicroMock object at 0x000001F0759FA828>
2019-01-26 17:33:56 INFO     checking for unconverted files
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\aifc.aif" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\aifc.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\aiff.aif" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\aiff.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\flac-ogg.ogg" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\flac-ogg.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\flac.flac" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\flac.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\opus.opus" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\opus.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\vorbis.ogg" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\vorbis.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\wave.wav" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\wave.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\nested\text.txt" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\nested\text.txt"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\nested\deep\aifc.aif" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\nested\deep\aifc.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\nested\deep\aiff.aif" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\nested\deep\aiff.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\nested\deep\flac-ogg.ogg" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\nested\deep\flac-ogg.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\nested\deep\flac.flac" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\nested\deep\flac.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\nested\deep\opus.opus" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\nested\deep\opus.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\nested\deep\vorbis.ogg" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\nested\deep\vorbis.opus"
2019-01-26 17:33:56 INFO     migrating: "C:\Users\mwalt\dev\convert-to-opus\tests/source\nested\deep\wave.wav" -> "C:\Users\mwalt\dev\convert-to-opus\tests/target\nested\deep\wave.opus"
2019-01-26 17:33:56 INFO     finishing conversions
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "c:\users\mwalt\scoop\apps\python\current\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "c:\users\mwalt\scoop\apps\python\current\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "c:\users\mwalt\scoop\apps\python\current\lib\multiprocessing\spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "c:\users\mwalt\scoop\apps\python\current\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "c:\users\mwalt\scoop\apps\python\current\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "c:\users\mwalt\scoop\apps\python\current\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "c:\users\mwalt\scoop\apps\python\current\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "c:\users\mwalt\scoop\apps\python\current\lib\unittest\__main__.py", line 16, in <module>
    from .main import main
ImportError: attempted relative import with no known parent package
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "c:\users\mwalt\scoop\apps\python\current\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "c:\users\mwalt\scoop\apps\python\current\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "c:\users\mwalt\scoop\apps\python\current\lib\multiprocessing\spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "c:\users\mwalt\scoop\apps\python\current\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "c:\users\mwalt\scoop\apps\python\current\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "c:\users\mwalt\scoop\apps\python\current\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "c:\users\mwalt\scoop\apps\python\current\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "c:\users\mwalt\scoop\apps\python\current\lib\unittest\__main__.py", line 16, in <module>
    from .main import main
ImportError: attempted relative import with no known parent package
adamklein commented Jun 5, 2019

I am seeing the same as the OP with Python 3.6.8, nose 1.3.7, and coverage 4.4.1, on Linux, when the multiprocessing start method is spawn or forkserver (not fork). Here is my repro:

test_me.py

from unittest import TestCase
import multiprocessing as mp


def square(x):
    return x**2


class TestPool(TestCase):

    def setUp(self):
        ctx = mp.get_context('spawn')
        self.pool = ctx.Pool(processes=2)

    def tearDown(self):
        pass

    def test_pool(self):
        self.pool.map(square, [1,2,3])

.coveragerc

[run]
branch = True
parallel = True
concurrency = multiprocessing

command:

python -m coverage run -m nose -v test_me.py
tjwalton added a commit to tjwalton/coveragepy that referenced this issue Aug 9, 2019
Add test which triggers the problem.
tjwalton commented Aug 9, 2019

I hit this problem myself and have gotten part of the way to understanding what is happening, but there is more going on that I don't understand. Please see the pull request above for details.

ali1234 commented Nov 19, 2019

Same problem here.

Simple way to reproduce it:

import multiprocessing as mp
import unittest


def square(x):
    return x*x


def mptarget(f, work_queue, done_queue):
    x = work_queue.get()
    done_queue.put(f(x))


class TestMP(unittest.TestCase):

    def test_single(self):
        result = square(2)
        self.assertEqual(result, 4)

    def test_multi(self):
        ctx = mp.get_context('spawn')
        work_queue = ctx.Queue()
        done_queue = ctx.Queue()
        p = ctx.Process(target=mptarget, args=(square, work_queue, done_queue), daemon=True)
        p.start()
        work_queue.put(2)
        result = done_queue.get()
        p.join()
        self.assertEqual(result, 4)

Works fine with unittest:

(venv) al@al-desktop:~$ python3 -m unittest cov.py
..
----------------------------------------------------------------------
Ran 2 tests in 0.056s

OK

When running in coverage:

(venv) al@al-desktop:~$ coverage run -m unittest cov.py
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python3.6/multiprocessing/spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "/usr/lib/python3.6/multiprocessing/spawn.py", line 114, in _main
    prepare(preparation_data)
  File "/usr/lib/python3.6/multiprocessing/spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "/usr/lib/python3.6/multiprocessing/spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "/usr/lib/python3.6/runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "/usr/lib/python3.6/runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/usr/lib/python3.6/unittest/__main__.py", line 16, in <module>
    from .main import main, TestProgram
ImportError: attempted relative import with no known parent package

(The above exception is from the child process. Coverage is now deadlocked waiting for the child until you press Ctrl-C.)

^CTraceback (most recent call last):
  File "/home/al/Source/vhs-teletext/venv/bin/coverage", line 11, in <module>
    sys.exit(main())
  File "/home/al/Source/vhs-teletext/venv/lib/python3.6/site-packages/coverage/cmdline.py", line 756, in main
    status = CoverageScript().command_line(argv)
  File "/home/al/Source/vhs-teletext/venv/lib/python3.6/site-packages/coverage/cmdline.py", line 491, in command_line
    return self.do_run(options, args)
  File "/home/al/Source/vhs-teletext/venv/lib/python3.6/site-packages/coverage/cmdline.py", line 627, in do_run
    self.run_python_module(args[0], args)
  File "/home/al/Source/vhs-teletext/venv/lib/python3.6/site-packages/coverage/execfile.py", line 122, in run_python_module
    run_python_file(pathname, args, package=packagename, modulename=modulename, path0=path0)
  File "/home/al/Source/vhs-teletext/venv/lib/python3.6/site-packages/coverage/execfile.py", line 192, in run_python_file
    exec(code, main_mod.__dict__)
  File "/usr/lib/python3.6/unittest/__main__.py", line 18, in <module>
    main(module=None)
  File "/usr/lib/python3.6/unittest/main.py", line 95, in __init__
    self.runTests()
  File "/usr/lib/python3.6/unittest/main.py", line 256, in runTests
    self.result = testRunner.run(self.test)
  File "/usr/lib/python3.6/unittest/runner.py", line 176, in run
    test(result)
  File "/usr/lib/python3.6/unittest/suite.py", line 84, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python3.6/unittest/suite.py", line 122, in run
    test(result)
  File "/usr/lib/python3.6/unittest/suite.py", line 84, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python3.6/unittest/suite.py", line 122, in run
    test(result)
  File "/usr/lib/python3.6/unittest/suite.py", line 84, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python3.6/unittest/suite.py", line 122, in run
    test(result)
  File "/usr/lib/python3.6/unittest/case.py", line 653, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python3.6/unittest/case.py", line 605, in run
    testMethod()
  File "/home/al/cov.py", line 27, in test_multi
    result = done_queue.get()
  File "/usr/lib/python3.6/multiprocessing/queues.py", line 94, in get
    res = self._recv_bytes()
  File "/usr/lib/python3.6/multiprocessing/connection.py", line 216, in recv_bytes
    buf = self._recv_bytes(maxlength)
  File "/usr/lib/python3.6/multiprocessing/connection.py", line 407, in _recv_bytes
    buf = self._recv(4)
  File "/usr/lib/python3.6/multiprocessing/connection.py", line 379, in _recv
    chunk = read(handle, remaining)
KeyboardInterrupt

I tested both with and without "concurrency = multiprocessing" in .coveragerc

sam-habitat commented Nov 21, 2019

I'm seeing this on Mac, and it looks like ali1234 was using Linux. Can we get the 'Windows' label removed?

This appeared for me when changing from a Python 3.7 venv to a Python 3.8 one.

I can work-around by using coverage run -m pytest instead of coverage run -m unittest

ali1234 commented Nov 21, 2019

Spawn was made the default start method on macOS in Python 3.8, so this also points to spawn being the cause. (It has always been the default on Windows, which has no fork(2).)

Spawn requires multiprocessing to pickle functions and send them to the subprocesses to run. Pickling a function just sends its qualified name; that name must exist and be importable in the subprocess.
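That pickle-by-name behaviour is easy to observe directly (a minimal sketch; math.sqrt is just a convenient importable function):

```python
import math
import pickle

# Pickling a function serializes only its module and qualified name,
# not its code; the spawned child must be able to re-import that name.
payload = pickle.dumps(math.sqrt)
assert b"math" in payload and b"sqrt" in payload

# A lambda has no importable name, so it cannot be shipped this way.
try:
    pickle.dumps(lambda x: x * x)
except Exception as exc:
    print("cannot pickle a lambda:", type(exc).__name__)
```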

nedbat added the bug label and removed the windows label Nov 21, 2019
nedbat (Owner) commented Nov 21, 2019

Thanks everyone for the information. I'm hoping to dig into this soon. Of course, if anyone wants to debug it, I'm ready for clues :)

tjwalton commented Nov 21, 2019

I tried to fix this in https://github.com/nedbat/coveragepy/pull/836/files. That fixed the exception, but I think it broke all coverage checking in subprocesses. The ModuleSpec certainly had something to do with it. (I later gave up on this approach because I found that spawn was not suitable for my application: re-importing all modules took too long, so I switched to running a new interpreter manually using the subprocess module.)
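A workaround along the lines tjwalton describes, launching a fresh interpreter with the subprocess module instead of multiprocessing, might look roughly like this (a minimal sketch; the inline -c program stands in for a real worker script):

```python
import subprocess
import sys

# Run a small worker program in a brand-new interpreter and collect
# its stdout. Unlike spawn, nothing is pickled: the worker is an
# ordinary program with its own entry point.
result = subprocess.run(
    [sys.executable, "-c", "print(2 ** 2)"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout.strip())
```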

Repository owner deleted a comment from ali1234 Nov 24, 2019
nedbat (Owner) commented Nov 24, 2019

@tjwalton Thanks, I have a fix almost ready to go, and it didn't have to touch the horror of multiprocessing! :)
This was actually a follow-on effect of not having __spec__ defined on the module, as reported in #838.

nedbat added a commit that referenced this issue Nov 24, 2019
nedbat (Owner) commented Nov 24, 2019

This is fixed in 47d1659.

nedbat closed this Nov 24, 2019
nedbat (Owner) commented Dec 8, 2019

This was released as part of 5.0b2 today.
