I'm currently running into an issue when I run the following command:

The logs of this command:
```
classification (Concatenate) (None, None, 1) 0 classification_submodel[1][0]
                                               classification_submodel[2][0]
                                               classification_submodel[3][0]
                                               classification_submodel[4][0]
                                               classification_submodel[5][0]
==================================================================================================
Total params: 36,382,957
Trainable params: 36,276,717
Non-trainable params: 106,240
__________________________________________________________________________________________________
None
Epoch 1/50
Exception in thread Thread-2:
Traceback (most recent call last):
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\threading.py", line 916, in _bootstrap_inner
    self.run()
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\site-packages\keras\utils\data_utils.py", line 565, in _run
    with closing(self.executor_fn(_SHARED_SEQUENCES)) as executor:
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\site-packages\keras\utils\data_utils.py", line 548, in <lambda>
    initargs=(seqs,))
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\multiprocessing\context.py", line 119, in Pool
    context=self.get_context())
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\multiprocessing\pool.py", line 174, in __init__
    self._repopulate_pool()
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\multiprocessing\pool.py", line 239, in _repopulate_pool
    w.start()
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
TypeError: can't pickle generator objects
Using TensorFlow backend.
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "C:\Users\****\AppData\Local\conda\conda\envs\POMP\lib\multiprocessing\spawn.py", line 115, in _main
    self = reduction.pickle.load(from_parent)
EOFError: Ran out of input
```
I already found a workaround here: #857, which suggests adding `--workers 0`.
However, since this is more of a workaround than a real solution, I would like to leave this issue open to see whether it can be fixed at the underlying level.
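For context, the root cause is independent of this repository: Python generators cannot be pickled at all, and on Windows `multiprocessing` uses the `spawn` start method, which must pickle everything it hands to worker processes. A minimal reproduction, with no Keras involved:

```python
import pickle

# Generators are never picklable in CPython. Windows' "spawn" start
# method pickles everything sent to worker processes, so any generator
# reachable from the worker setup triggers this TypeError.
def batches():
    yield "a batch"

try:
    pickle.dumps(batches())
except TypeError as exc:
    print(exc)  # e.g. "can't pickle generator objects"
```

`--workers 0` sidesteps this because no worker processes are spawned, so nothing needs to be pickled.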
> However since this is not really a 'solution' and is more of a workaround I would like to leave this issue open to see if this can be fixed at the underlying level.
The underlying issue seems to come from Keras; there's not much we can do about that in this repository.