Currently, the language models are loaded into plain maps at runtime. Although map access is fast, the maps consume a significant amount of memory. The goal is to investigate whether more suitable data structures are available that need less memory, something like NumPy for Python.
I wonder if using a SQLite database to store your frequencies would help. That way you would only need to open a database connection instead of loading full frequency maps into memory. With indexing and query optimization (SQL could handle part of the logic, such as determining the highest frequency), it seems to me it would be fast with a very low memory footprint.
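To make the idea concrete, here is a minimal sketch in Python (whose `sqlite3` module wraps the same SQLite engine a Go driver would). The table and column names are made up for illustration; the point is that an index on `(ngram, frequency)` lets SQL answer "which language has the highest frequency for this n-gram?" without loading anything but the connection into memory:

```python
import sqlite3

# In-memory database for the demo; a real setup would open a file on disk,
# so that only the connection -- not the frequency data -- lives in memory.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE frequencies (
           language  TEXT NOT NULL,
           ngram     TEXT NOT NULL,
           frequency REAL NOT NULL,
           PRIMARY KEY (language, ngram)
       )"""
)
conn.executemany(
    "INSERT INTO frequencies VALUES (?, ?, ?)",
    [("en", "the", 0.031), ("de", "the", 0.004), ("de", "der", 0.025)],
)
# This index lets SQLite serve the ORDER BY ... LIMIT 1 query below
# from the index alone instead of scanning the whole table.
conn.execute("CREATE INDEX idx_ngram ON frequencies (ngram, frequency DESC)")

# SQL determines the highest-frequency language for a given n-gram.
row = conn.execute(
    """SELECT language FROM frequencies
       WHERE ngram = ? ORDER BY frequency DESC LIMIT 1""",
    ("the",),
).fetchone()
print(row[0])  # en
```

The trade-off is that every lookup becomes a (fast, indexed) disk query rather than a hash-map access, so it swaps some lookup speed for a near-zero resident footprint.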
One promising candidate could be Gonum.
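The gain Gonum-style dense arrays would bring can be sketched in Python terms (since the issue draws the NumPy analogy): instead of one boxed key/value pair per n-gram in a hash map, keep the n-grams sorted once and pack the frequencies into a flat numeric array. All names here are illustrative, not taken from the actual code base:

```python
from array import array
from bisect import bisect_left

# Baseline: one map entry per n-gram, each key and value individually boxed.
freq_map = {"aa": 0.01, "ab": 0.03, "ba": 0.02}

# Compact alternative: n-grams sorted once, frequencies packed into a
# C-double array (8 bytes per value, no per-entry object overhead).
ngrams = sorted(freq_map)
freqs = array("d", (freq_map[n] for n in ngrams))

def lookup(ngram: str) -> float:
    """Return the stored frequency, or 0.0 for unseen n-grams."""
    i = bisect_left(ngrams, ngram)  # binary search instead of a hash probe
    if i < len(ngrams) and ngrams[i] == ngram:
        return freqs[i]
    return 0.0

print(lookup("ab"))  # 0.03
```

Lookups become O(log n) binary searches instead of O(1) hash probes, which is usually an acceptable trade for models that are built once and then only read.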