Hello,
I've encountered a specific checkpoint in the code where a warning is issued for large nseg values: "Warning, too many segments chosen, falling back to nseg = {}". This checkpoint seems to be designed to maintain 100 frames for the longest timestep.
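To make sure we're discussing the same thing, here is a minimal sketch of how I read that checkpoint. All names here (`cap_nseg`, `min_frames_per_seg`) are my own assumptions, not the actual identifiers in the code; the only anchors from the source are the warning message and the 100-frame floor:

```python
def cap_nseg(n_frames: int, nseg: int, min_frames_per_seg: int = 100) -> int:
    """Hypothetical reconstruction of the checkpoint: cap nseg so that
    each segment (and hence the longest timestep) keeps >= 100 frames."""
    max_nseg = n_frames // min_frames_per_seg
    if nseg > max_nseg:
        print("Warning, too many segments chosen, falling back to nseg = {}".format(max_nseg))
        return max_nseg
    return nseg
```

If this reading is right, a trajectory of 1000 frames would silently cap nseg at 10 even when the user asks for 20, which is exactly the case my question is about.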
However, under the default usage, where only m=20 values enter the GLS calculation, the rationale for this specific limit isn't entirely clear to me. Is there a theoretical foundation for the constraint? If not, it might be worth allowing larger nseg values, which would improve statistical accuracy from the same dataset.
I'm curious to hear your thoughts on this.
Thank you for your time and consideration.