
LoadModel memory overflow  #30

@the-floating-city

Description


Hi there, I trained a model on the Google News corpus and was able to successfully create an output file to load. However, when I use the loadModel function with that output, I'm getting a memory overflow from Node.
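A minimal sketch of the loading step, assuming a node word2vec-style API where loadModel takes a file path and an (err, model) callback (the package name and paths below are placeholders, not the exact code):

```js
const w2v = require('word2vec'); // assumption: a word2vec-style wrapper

// Load the previously trained embeddings file (~3 GB on disk).
// './vectors.txt' stands in for the actual output file.
w2v.loadModel('./vectors.txt', (err, model) => {
  if (err) throw err;
  console.log('loaded', model.words, 'words of size', model.size);
});
```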

I'm running Node v18 on Linux.

I also tried including the memory allocation flag in the npm run script, and while it seems to run for a longer period before overflowing, it still doesn't complete even with up to 12 GB allocated.
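For reference, a minimal sketch of raising the Node heap limit from an npm run script, assuming the flag in question is V8's --max-old-space-size (script and file names are placeholders):

```json
{
  "scripts": {
    "load": "node --max-old-space-size=12288 load-model.js"
  }
}
```

Invoked with `npm run load`.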

I'm running on 16 GB of RAM, but I was wondering if I'm missing an optimization step. The word embeddings file is only 3 GB.

If there's anything that can be done to better utilize memory, I feel like it should work, as I was able to train this model on the same machine.

Any help would be much appreciated! Thanks.
