# Weight Initialisation & Normalised Activations

NOTE: it turns out that similar ideas have already been investigated recently:

NB: After looking at the above and playing around with those initialisation methods, their results usually beat the experiments below!

## Old Experiments

Experiments based on normalising neural network activations (or weights) in a pretraining step, using random input noise.

The network is pretrained so that the activations after each layer achieve a target mean and standard deviation when fed this normally distributed noise.
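A minimal sketch of this pretraining idea in PyTorch (assuming plain linear layers for brevity; the function name, hyperparameters, and loss below are illustrative, not the repository's actual API):

```python
import torch
import torch.nn as nn

def normalise_activations(layers, steps=500, batch_size=256, in_dim=64,
                          target_mean=0.0, target_std=1.0, lr=1e-2):
    """Pretrain `layers` so the activations after each layer match the
    target mean and std when the input is standard normal noise."""
    params = [p for layer in layers for p in layer.parameters()]
    optimizer = torch.optim.Adam(params, lr=lr)
    for _ in range(steps):
        x = torch.randn(batch_size, in_dim)  # random normally distributed input
        loss = torch.tensor(0.0)
        for layer in layers:
            x = layer(x)
            # penalise deviation of this layer's activation statistics from the targets
            loss = loss + (x.mean() - target_mean) ** 2 + (x.std() - target_std) ** 2
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return layers

# hypothetical usage: a small MLP (nonlinearities omitted for simplicity)
layers = normalise_activations([nn.Linear(64, 128), nn.Linear(128, 128), nn.Linear(128, 10)])
```

After this pretraining step, the layers would be trained on the real task as usual; the normalisation pass only adjusts the initial weights so that signal statistics are well-behaved at the start of training.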

Results show that this method can improve training speed and final performance for the same number of weights.