Issue with the lambda parameter #2
Labels: bug, help wanted, low-priority
While training, the GSC tries to balance quantization and optimization. The two processes are balanced through lambda-diffusion:
In pseudo-code, the dynamics `D` at any given training step `t` are:

`D(t) = lambda_t * Optimization + (1 - lambda_t) * Quantization`
lambda progressively decreases from 1 to 0, so training starts with pure Optimization and ends at pure Quantization. The important issue here is how we calculate and update lambda.
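To make the dynamics above concrete, here is a minimal sketch of the blend and a *hypothetical* lambda schedule (a simple exponential decay — this is my illustrative assumption, not the schedule used in the original code; `lambda_schedule`, `blended_dynamics`, and `rate` are names I made up):

```python
import numpy as np

def lambda_schedule(step, rate=1e-3):
    """Hypothetical schedule: lambda decays from 1 at step 0 toward 0."""
    return float(np.exp(-rate * step))

def blended_dynamics(lam, optimization, quantization):
    """D(t) = lambda_t * Optimization + (1 - lambda_t) * Quantization."""
    return lam * optimization + (1.0 - lam) * quantization
```

At step 0 the blend is pure Optimization; as the step count grows, lambda shrinks and the blend approaches pure Quantization.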
The original code by Goldrick et al. uses this MATLAB function:

which I translated into Python as follows:
where `self.Wc` is the weight matrix and `self.domain.q` is the bowl parameter q. The authors state that this formula
I've left the formula as in the original for lack of a better idea, but I'd be glad if anyone could help improve it.