test_multiprocessing failed on Windows #6582

Open
souptc opened this Issue May 10, 2017 · 4 comments

@souptc (Contributor) commented May 10, 2017

System: win10 / win server 2012
Python: 3.5.2
Keras: latest master code
Backend: Tensorflow 1.1
Reproduce:
pytest tests\test_multiprocessing.py

Following tests will fail:
test_multiprocessing.py::test_multiprocessing_training FAILED
test_multiprocessing.py::test_multiprocessing_training_fromfile FAILED
test_multiprocessing.py::test_multiprocessing_predicting FAILED
test_multiprocessing.py::test_multiprocessing_evaluating FAILED

Call stack:
..\keras\legacy\interfaces.py:87: in wrapper
return func(*args, **kwargs)
..\keras\models.py:1097: in fit_generator
initial_epoch=initial_epoch)
..\keras\legacy\interfaces.py:87: in wrapper
return func(*args, **kwargs)
..\keras\engine\training.py:1831: in fit_generator
enqueuer.start(max_q_size=max_q_size, workers=workers)
..\keras\engine\training.py:632: in start
thread.start()
c:\program files\anaconda3\envs\keras\lib\multiprocessing\process.py:105: in start
self._popen = self._Popen(self)
c:\program files\anaconda3\envs\keras\lib\multiprocessing\context.py:212: in _Popen
return _default_context.get_context().Process._Popen(process_obj)
c:\program files\anaconda3\envs\keras\lib\multiprocessing\context.py:313: in _Popen
return Popen(process_obj)
c:\program files\anaconda3\envs\keras\lib\multiprocessing\popen_spawn_win32.py:66: in __init__
reduction.dump(process_obj, to_child)


obj = <Process(Process-1, initial daemon)>, file = <_io.BufferedWriter name=17>
protocol = None

def dump(obj, file, protocol=None):
    '''Replacement for pickle.dump() using ForkingPickler.'''
    ForkingPickler(file, protocol).dump(obj)

E AttributeError: Can't pickle local object 'GeneratorEnqueuer.start.<locals>.data_generator_task'

I tried several machines with different systems and got the same error on all of them.
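The failure can be reproduced without Keras at all. Windows has no fork(), so multiprocessing uses the "spawn" start method, which pickles the Process target to send it to the child; a function defined inside another function cannot be pickled. A minimal sketch (the function names are hypothetical stand-ins mirroring the traceback):

```python
import pickle

def start():
    # GeneratorEnqueuer.start defines its worker as a nested (local)
    # function like this one; local objects have no importable name,
    # so pickle cannot serialize them.
    def data_generator_task():
        pass
    return data_generator_task

try:
    pickle.dumps(start())
except AttributeError as err:
    # Same class of error as in the Keras traceback:
    # Can't pickle local object 'start.<locals>.data_generator_task'
    print(err)
```

This is exactly what `reduction.dump` hits in `popen_spawn_win32.py`: the child-process bootstrap tries to pickle the whole Process object, including its local-function target.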

@fchollet (Collaborator) commented May 10, 2017
Replacement for pickle.dump() using ForkingPickler.

This looks like a Windows-specific problem with Python multiprocessing. Could someone who is on Windows look into this?
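One direction a fix could take (a sketch, not the actual Keras patch): move the worker function from inside `GeneratorEnqueuer.start` to module level. Module-level functions are pickled by qualified name, which "spawn" can resolve in the child process. The function name below is hypothetical:

```python
import pickle

# A module-level function is pickled by reference (module + qualified
# name), so the Windows "spawn" start method can send it to a child.
def data_generator_task():
    return "batch"

payload = pickle.dumps(data_generator_task)  # succeeds, unlike a local function
restored = pickle.loads(payload)
print(restored())  # -> batch
```

The same constraint applies to anything else handed to a Process on Windows: targets, arguments, and instance state must all be importable or picklable.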

@davidtellez commented Jul 29, 2017
Same here: I can't enable use_multiprocessing in fit_generator without getting this error. My stack: Windows 10, Anaconda Python 3.5.3, Theano 0.9.0, Keras 2.0.6.

I suppose we are not the only Windows users (at least for development), so how come this error is so uncommon? Any ideas? Is there any stack that works for Windows users? Thanks!

BTW: I'll test later if the exact same code works in Ubuntu.
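As an interim workaround, leaving use_multiprocessing off makes the enqueuer use threads instead of processes, and threads never pickle their target, so a local worker function is fine. A minimal sketch (names hypothetical) of why the thread path avoids the error:

```python
import queue
import threading

def start_enqueuer():
    q = queue.Queue()

    # A local function is fine as a thread target: threads share the
    # parent's memory, so nothing needs to be pickled.
    def data_generator_task():
        q.put("batch")

    t = threading.Thread(target=data_generator_task, daemon=True)
    t.start()  # no reduction.dump / ForkingPickler involved
    t.join()
    return q.get()

print(start_enqueuer())  # -> batch
```

The trade-off is that threads contend for the GIL, so Python-heavy generators load data more slowly than true worker processes would.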


@davideboschetto commented Aug 22, 2017
I have the same problem on Windows 7 64-bit.
No clue how to fix multiprocessing on Windows!


@KardoPaska commented Oct 28, 2017
I am a physicist, not a computer scientist, and I am stuck too. My environment: TensorFlow 1.0.0 backend, ipython 5.1.0, spyder 3.2.3, numpy 1.11.3, h5py 2.6.0, hdf5 1.8.15.1, patsy 0.4.1.

File "E:\Anaconda3\envs\py35_try2\lib\site-packages\keras\legacy\interfaces.py", line 87, in wrapper
return func(*args, **kwargs)

File "E:\Anaconda3\envs\py35_try2\lib\site-packages\keras\engine\training.py", line 1800, in fit_generator
enqueuer.start(workers=workers, max_queue_size=max_queue_size)

File "E:\Anaconda3\envs\py35_try2\lib\site-packages\keras\utils\data_utils.py", line 588, in start
thread.start()

File "E:\Anaconda3\envs\py35_try2\lib\multiprocessing\process.py", line 105, in start
self._popen = self._Popen(self)

File "E:\Anaconda3\envs\py35_try2\lib\multiprocessing\context.py", line 212, in _Popen
return _default_context.get_context().Process._Popen(process_obj)

File "E:\Anaconda3\envs\py35_try2\lib\multiprocessing\context.py", line 313, in _Popen
return Popen(process_obj)

File "E:\Anaconda3\envs\py35_try2\lib\multiprocessing\popen_spawn_win32.py", line 66, in __init__
reduction.dump(process_obj, to_child)

File "E:\Anaconda3\envs\py35_try2\lib\multiprocessing\reduction.py", line 59, in dump
ForkingPickler(file, protocol).dump(obj)

AttributeError: Can't pickle local object 'GeneratorEnqueuer.start.<locals>.data_generator_task'

