
Double Ensemble Model needs more than 100 GB of memory. How to reduce it? #1436

@Wendroff

Description


🌟 Feature Description

Reduce the memory cost of training the DoubleEnsemble model.

Motivation

I tried to use DoubleEnsembleModel, which seems to perform best on the Alpha158 dataset. However, training failed with the following error:

numpy.core._exceptions._ArrayMemoryError: Unable to allocate 119. GiB for an array with shape (8174296, 1960) and data type float64

Here 8174296 = sample_number × factor_number and 1960 = the number of training steps.
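
For reference, the reported size is exactly what an array of that shape and dtype requires:

```python
# Size of an (8174296, 1960) float64 array, 8 bytes per element
samples, steps = 8_174_296, 1_960
print(samples * steps * 8 / 2**30)  # ~119.4 GiB, matching the error
```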

Can we reduce the memory cost of the training process?
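One possible workaround, while waiting for an upstream fix, would be to store the loss curve in float32 and record it only every k-th boosting round. Below is a minimal standalone sketch, assuming the matrix holds each sample's squared-error loss at successive boosting rounds; the helper name `sampled_loss_curve` and the loss choice are assumptions, not qlib's actual implementation:

```python
import numpy as np
import lightgbm as lgb

def sampled_loss_curve(booster: lgb.Booster, x, y, step=10, dtype=np.float32):
    """Record the per-sample squared error only at every `step`-th
    boosting round, in float32 instead of float64. The matrix shrinks
    by a factor of 2 * step (e.g. ~119 GiB -> ~6 GiB for step=10).

    Hypothetical helper, not qlib's own loss-curve retrieval."""
    rounds = range(step, booster.num_trees() + 1, step)
    curve = np.empty((len(y), len(rounds)), dtype=dtype)
    for col, n_iter in enumerate(rounds):
        # Predict with only the first n_iter trees of the ensemble
        pred = booster.predict(x, num_iteration=n_iter)
        curve[:, col] = (pred - y) ** 2
    return curve
```

The tradeoff is extra prediction time, since each recorded round re-runs `predict`; subsampling the rounds keeps that cost bounded as well.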

By the way, feature selection and sample-weight calculation do not run in parallel.
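
If parallelizing is on the table, a per-feature shuffle loop of this kind lends itself to joblib. A rough sketch; `shuffle_importance`, `loss_fn`, and `n_jobs` are illustrative assumptions, not qlib's API:

```python
import numpy as np
from joblib import Parallel, delayed

def shuffle_importance(model, x, y, loss_fn, n_jobs=4):
    """Permutation-style feature importance with one joblib task per
    feature (hypothetical helper; the serial loop does the same work
    one feature at a time)."""
    base_loss = loss_fn(y, model.predict(x))

    def one_feature(j):
        xs = x.copy()                # per-task copy of the matrix
        np.random.shuffle(xs[:, j])  # destroy feature j's signal
        return loss_fn(y, model.predict(xs)) - base_loss

    return np.array(Parallel(n_jobs=n_jobs)(
        delayed(one_feature)(j) for j in range(x.shape[1])
    ))
```

Note that each task copies `x`, so peak memory scales with `n_jobs`; with data this large, a small `n_jobs` keeps the speedup without making the memory problem worse.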

Metadata


Labels: enhancement (New feature or request)
