It seems glogg consumes very little memory. Memory is cheap nowadays; most machines have 32 GB. Could memory use be made configurable, or automatically increased when the system has more memory available?
Doing so might improve search/filter performance significantly.
Glogg is CPU-bound during initial indexing because of the end-of-line searching code (see discussion in #227). After the initial read, if the file is not too large, it should stay in the OS file cache, so subsequent search/filter operations read from memory, not from disk. Keeping the file in memory ourselves won't improve performance very much.
In klogg I use additional memory to keep a cache of search results, so repeated searches with the same pattern don't actually rescan the file but reuse the last results. Using multiple threads for search operations also increases memory use, but not by much.
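A minimal sketch of such a pattern-keyed result cache (the class and method names here are hypothetical, not klogg's actual API): repeating a search looks the pattern up first and only falls back to scanning the file on a miss.

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical search-result cache: matching line numbers are stored per
// search pattern, so repeating the same search returns the previous
// results instead of rescanning the whole file.
class SearchResultCache {
public:
    // Returns cached matches for `pattern`, or nullptr on a cache miss.
    const std::vector<uint64_t>* find(const std::string& pattern) const {
        auto it = cache_.find(pattern);
        return it == cache_.end() ? nullptr : &it->second;
    }

    // Stores the matching line numbers for `pattern`.
    void store(const std::string& pattern, std::vector<uint64_t> matches) {
        cache_[pattern] = std::move(matches);
    }

private:
    std::unordered_map<std::string, std::vector<uint64_t>> cache_;
};
```

The trade-off is exactly the one discussed above: each cached pattern costs memory proportional to its match count, in exchange for skipping a full-file scan on repeat searches.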
However, a simpler index data structure could be used for smaller files (perhaps made configurable). That could increase search performance at the cost of more memory. I'll try this idea.
I've done some benchmarking in klogg using a plain vector for the index, as the simplest possible structure.
For small files (about 1 GB) in single-threaded mode there is roughly a 5% search performance improvement. However, in multi-threaded mode there is almost no difference. Right now I don't see anywhere else where trading memory for performance could help.
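For illustration, a "plain vector" line index can be sketched like this (function names are mine, not klogg's): one byte offset per line, built in a single pass, giving O(1) random access to any line at a cost of 8 bytes of memory per line.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Builds the simplest possible line index: a vector holding the byte
// offset where each line starts. One pass over the buffer.
std::vector<uint64_t> buildLineIndex(const std::string& data) {
    std::vector<uint64_t> lineStarts{0};  // first line starts at offset 0
    for (uint64_t i = 0; i < data.size(); ++i) {
        if (data[i] == '\n' && i + 1 < data.size())
            lineStarts.push_back(i + 1);
    }
    return lineStarts;
}

// Returns the text of line `n` (0-based) using the index: O(1) lookup,
// no scanning for end-of-line characters at read time.
std::string getLine(const std::string& data,
                    const std::vector<uint64_t>& idx, size_t n) {
    uint64_t begin = idx[n];
    uint64_t end = (n + 1 < idx.size()) ? idx[n + 1] - 1 : data.size();
    return data.substr(begin, end - begin);
}
```

This makes the memory/performance trade concrete: a 1 GB file with ~10 million lines needs roughly 80 MB of index, which is why the gain only shows up when the per-line lookup is actually the bottleneck (single-threaded search).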