A repository containing experimental results for Fibonacci-initialized neural networks compared against randomly initialized ones. In this repository we will be adding experiments that apply Fibonacci initialization to different neural networks, or to different modules within them. We will also experiment with the golden ratio and its significance in activation layers/functions.
- As explained in the article, the Fibonacci-based initialization with low variance already performs worse than regular random initialization, never mind comparing it with Glorot (Xavier) initialization (a sketch of such an initializer follows this list).
- Also, in the high-variance initialization experiments the results deteriorate further relative to the earlier ones. Still, a decent accuracy of 95.33% is obtained with weights going up to a maximum of 1.597; beyond that it is downhill and not worth spending time on.
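
A minimal sketch, in PyTorch, of what a Fibonacci-based initializer could look like. The function name `fibonacci_init_`, the cycling and alternating-sign scheme, and the scale factor are assumptions for illustration, not this repository's actual code. With `scale=0.001`, the Fibonacci numbers 1, 1, 2, ..., 1597 map onto magnitudes from 0.001 up to the 1.597 maximum mentioned above, while smaller scales give the low-variance variant.

```python
# Hypothetical sketch of a Fibonacci-based weight initializer; the function
# name, cycling scheme, and default scale are assumptions, not this repo's code.
import torch

def fibonacci_init_(weight: torch.Tensor, scale: float = 0.001) -> torch.Tensor:
    """Fill `weight` in place by cycling through scaled Fibonacci numbers."""
    fib = [1, 1]
    while fib[-1] < 1597:                  # stop at F(17) = 1597
        fib.append(fib[-1] + fib[-2])
    values = torch.tensor(fib, dtype=weight.dtype) * scale  # 0.001 ... 1.597
    n = weight.numel()
    idx = torch.arange(n) % len(values)    # cycle through the sequence
    signs = torch.ones(n, dtype=weight.dtype)
    signs[1::2] = -1.0                     # alternate signs to keep the mean near 0
    with torch.no_grad():
        weight.copy_((values[idx] * signs).reshape(weight.shape))
    return weight

# Example: initialize a layer's weights, then train as usual.
layer = torch.nn.Linear(784, 128)
fibonacci_init_(layer.weight)
```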
Fibonacci initialization with the golden ratio as the learning rate shows no improvement, contrary to the study mentioned in the article; rather, results seem to deteriorate for this kind of regularized initialization.
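
For reference, a minimal sketch of that learning-rate setup, assuming "golden ratio as learning rate" means literally setting the optimizer's learning rate to φ ≈ 1.618 or its reciprocal; the SGD optimizer and the single-layer model are placeholders, not the study's exact setup.

```python
# Minimal sketch of the golden-ratio learning-rate experiment; SGD, the model,
# and the choice between phi and 1/phi are assumptions, not the study's setup.
import math
import torch

PHI = (1 + math.sqrt(5)) / 2               # golden ratio ~= 1.618

model = torch.nn.Linear(784, 10)
fibonacci_init_(model.weight)              # initializer sketched above
optimizer = torch.optim.SGD(model.parameters(), lr=1 / PHI)  # or lr=PHI
```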