# coverage run and multiprocessing problem #745

Closed
opened this issue Dec 23, 2018 · 14 comments

### awaizman1 commented Dec 23, 2018

 Hi, I'm facing an issue with coverage and the multiprocessing module. I have a simple class wrapping `multiprocessing.Pool`:

`matlab_interop/my_pool.py`:

```python
from multiprocessing import Pool


def job(a):
    return a


class MyPool:
    def __init__(self):
        self.pool = Pool(processes=1)

    def run_job_in_worker(self):
        return self.pool.apply(job, ("hello",))
```

and a simple unittest, `tests/test_my_pool.py`:

```python
import unittest

from matlab_interop.my_pool import MyPool


class TestMyPool(unittest.TestCase):
    def test_simple(self):
        pool = MyPool()
        print(pool.run_job_in_worker())
```

When running:

```
python -m coverage run -m nose tests.test_my_pool
```

coverage halts and I get errors that I don't get when running nose directly, without coverage:

```
======================================================================
ERROR: test_simple (tests.test_my_pool.TestMyPool)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "F:\views\g\qprism\QPrism\MatlabInterop\py\src\tests\test_my_pool.py", line 9, in test_simple
    pool = MyPool()
  File "F:\views\g\qprism\QPrism\MatlabInterop\py\src\matlab_interop\my_pool.py", line 10, in __init__
    self.pool = Pool(processes=1)
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\context.py", line 119, in Pool
    context=self.get_context())
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\pool.py", line 175, in __init__
    self._repopulate_pool()
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\pool.py", line 236, in _repopulate_pool
    self._wrap_exception)
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\pool.py", line 255, in _repopulate_pool_static
    w.start()
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\popen_spawn_win32.py", line 33, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\spawn.py", line 143, in get_preparation_data
    _check_not_importing_main()
  File "F:\views\g\qprism\Qrelease\3rd\Python3.6.7\lib\multiprocessing\spawn.py", line 136, in _check_not_importing_main
    is not going to be frozen to produce an executable.''')
RuntimeError:
    An attempt has been made to start a new process before the
    current process has finished its bootstrapping phase.

    This probably means that you are not using fork to start your
    child processes and you have forgotten to use the proper idiom
    in the main module:

        if __name__ == '__main__':
            freeze_support()
            ...

    The "freeze_support()" line can be omitted if the program
    is not going to be frozen to produce an executable.
```

I would appreciate your help figuring out what the problem could be. I don't understand why everything works when running nose directly (without coverage). Thanks.
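For reference, the RuntimeError above spells out the idiom multiprocessing expects on spawn platforms: any process creation must be gated behind a `__main__` check so the module can be re-imported safely in each child. A minimal standalone sketch of that idiom (not the issue's exact files, and using the same toy `job` function):

```python
from multiprocessing import Pool


def job(a):
    return a


def run():
    # Pool creation happens inside a function, so merely importing this
    # module (as the "spawn" start method does in each child) starts
    # no new processes.
    with Pool(processes=1) as pool:
        return pool.apply(job, ("hello",))


if __name__ == "__main__":
    # The guard multiprocessing's error message asks for.
    print(run())
```

This guard is unrelated to coverage itself; it is the standard multiprocessing requirement the error message quotes.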

### nedbat commented Dec 27, 2018

 I tried to reproduce this, but don't see the failure. Can you provide exact files to run, and the precise versions of nose and coverage? Thanks.

### awaizman1 commented Dec 30, 2018

 Thanks @nedbat, sorry for omitting the required information. I'm running the test on Windows with python==3.6.7, nose==1.3.7, coverage==4.5.2 (requirements.txt of my env attached). Attached are 2 files: my_pool.py and test_my_pool.py. Please download the files to a 'src' folder and within 'src' run: `python -m coverage run -m nose test_my_pool.py` (src.zip) This should reproduce the problem. Thanks, Assaf.

### nedbat commented Jan 14, 2019

 Hmm, I tried this in a Windows VM, and the `py -3 -m nose test_my_pool.py` command hung for a while, doing nothing?

### sam-habitat commented Nov 21, 2019 • edited

 I'm seeing this on Mac, and it looks like ali1234 was using Linux. Can we get the 'Windows' label removed? This appeared for me when switching from a Python 3.7 venv to a Python 3.8 one. I can work around it by using `coverage run -m pytest` instead of `coverage run -m unittest`.
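Since the default start method varies by platform and Python version (which is why this surfaced on macOS only from 3.8 on), it can help to confirm which method is actually in effect; a quick check using the stdlib's own API:

```python
import multiprocessing as mp

# "fork" is the historical default on Linux; "spawn" has always been
# the default on Windows and became the macOS default in Python 3.8.
method = mp.get_start_method()
print(method)
```

If the printed value is "spawn", the child processes re-import the main module rather than inheriting the parent's state.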

### ali1234 commented Nov 21, 2019 • edited

 `spawn` was made the default start method on macOS in 3.8, so this also points to that being the cause. (It has always been the default on Windows, since Windows doesn't have fork(2).) `spawn` requires multiprocessing to pickle functions and send them to the subprocesses to run. Pickling a function just sends its canonical name, and that name must exist and be importable in the subprocess.
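The pickling-by-name point can be seen directly: pickle serializes a module-level function as a module path plus qualified name, never as code. A small illustration using a stdlib function (chosen so it is importable anywhere):

```python
import pickle
from os.path import basename

payload = pickle.dumps(basename)
# The payload carries only the module name ("posixpath" or "ntpath")
# plus "basename" -- no bytecode -- so the receiving process must be
# able to import that exact name itself.
assert b"basename" in payload
restored = pickle.loads(payload)
print(restored is basename)
```

This is why a function defined in an un-importable `__main__` breaks under the spawn start method.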

### nedbat commented Nov 21, 2019

 Thanks everyone for the information. I'm hoping to dig into this soon. Of course, if anyone wants to debug it, I'm ready for clues :)

### tjwalton commented Nov 21, 2019 • edited

 I tried to fix this in https://github.com/nedbat/coveragepy/pull/836/files. That fixed the exception, but I think it broke all coverage checking in subprocesses. The ModuleSpec certainly had something to do with it. (I later gave up on this because spawn turned out to be unsuitable for my application: re-importing all modules took too long. I switched to running a new interpreter manually via the subprocess module.)
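The fallback mentioned at the end, running a new interpreter manually via `subprocess` instead of relying on spawn's re-import machinery, can look roughly like this (the child command here is just a placeholder):

```python
import subprocess
import sys

# Launch a fresh interpreter explicitly; nothing from the parent
# module is re-imported or pickled, unlike multiprocessing "spawn".
result = subprocess.run(
    [sys.executable, "-c", "print('hello from child')"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout.strip())
```

The trade-off is that the parent and child share no Python objects; communication has to go through pipes, files, or sockets.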

### nedbat commented Nov 24, 2019

 @tjwalton Thanks, I have a fix almost ready to go, and it didn't have to touch the horror of multiprocessing! :) This was actually a follow-on effect of not having `__spec__` defined on the module, as reported in #838.
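For context, `multiprocessing.spawn.get_preparation_data` (visible in the traceback above) inspects `sys.modules['__main__'].__spec__` to work out how to re-import the main module in the child, so a `__main__` synthesized without a spec can trip it up. A rough sketch of attaching a spec to a programmatically built module; the module and origin names here are hypothetical, not coverage's actual code:

```python
import types
from importlib.machinery import ModuleSpec

# Hypothetical: build a module the way a runner might synthesize
# __main__, then attach a spec so spawn's bootstrap can reason
# about how it was loaded.
mod = types.ModuleType("__main__")
mod.__spec__ = ModuleSpec("__main__", loader=None, origin="my_pool.py")
print(mod.__spec__.name, mod.__spec__.origin)
```

The actual fix in coverage is in commit 47d1659; this only illustrates the mechanism the comment refers to.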
added a commit that referenced this issue Nov 24, 2019: 47d1659 "Implement __spec__ for files we run. #745 #838"

### nedbat commented Nov 24, 2019

 This is fixed in 47d1659.
closed this Nov 24, 2019

### nedbat commented Dec 8, 2019

 This was released as part of 5.0b2 today.