🌟 Feature Description
Reduce the memory cost of the DoubleEnsemble model.
Motivation
I tried to use DoubleEnsembleModel, which seems to be the best-performing model on the Alpha158 dataset. However, training failed with the following error:
```
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 119. GiB for an array with shape (8174296, 1960) and data type float64
```
8174296 = sample_number * factor_number
1960 = training steps
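For reference, a quick sanity check of these numbers (a minimal sketch; the only assumption beyond the error message is that float64 takes 8 bytes per element) shows how the allocation reaches 119 GiB, and that a lower-precision dtype alone would only halve it:

```python
n_samples = 8_174_296  # first dimension of the reported array
n_steps = 1_960        # second dimension: training steps

# float64 takes 8 bytes per element
print(n_samples * n_steps * 8 / 2**30)  # ~119.4 GiB, matching the error

# float32 (4 bytes) would halve the footprint, but ~60 GiB is still large
print(n_samples * n_steps * 4 / 2**30)  # ~59.7 GiB
```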
Can we reduce the memory cost of the training process?
By the way, the feature selection and sample-weight calculation steps do not run in parallel.
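For illustration, here is a minimal sketch (not qlib's actual implementation; the toy data and correlation-based "loss" are made up) of how an independent per-feature step, such as the column shuffling used during feature selection, could be fanned out across cores with joblib:

```python
# A minimal sketch, NOT qlib's actual code: the per-feature work in
# feature selection (shuffle one column, re-measure the fit) is
# independent across features, so it is embarrassingly parallel.
import numpy as np
import pandas as pd
from joblib import Parallel, delayed

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(10_000, 20)),
                  columns=[f"f{i}" for i in range(20)])
label = df["f0"] * 0.5 + rng.normal(size=10_000)

def importance(col):
    """Toy per-feature step: how much does shuffling this column
    degrade a simple correlation with the label?"""
    orig = abs(np.corrcoef(df[col], label)[0, 1])
    shuffled = df[col].sample(frac=1.0, random_state=0).to_numpy()
    return orig - abs(np.corrcoef(shuffled, label)[0, 1])

# Each feature is evaluated independently, so joblib can spread the loop:
scores = Parallel(n_jobs=-1)(delayed(importance)(c) for c in df.columns)
```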