Updated Would there be a problem computing the gradient of cross-entropy loss for the ReLU neuron? (markdown)
Created Would there be a problem computing the gradient of cross-entropy loss for the ReLU neuron? (markdown)
Updated Neural Network with ReLU and Sigmoid Activations (markdown)
Created Neural Network with ReLU and Sigmoid Activations (markdown)
Created Power of linear neural networks (markdown)
Updated Let $G ∼ N(0, σ^{2})$ be a Gaussian random variable. Let $Y = ReLU(G)$. Calculate $E[Y]$ and $Var[Y]$. (markdown)
Created Let $G ∼ N(0, σ^{2})$ be a Gaussian random variable. Let $Y = ReLU(G)$. Calculate $E[Y]$ and $Var[Y]$. (markdown)
Initial Home page
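For the ReLU-of-a-Gaussian exercise page listed above, a minimal numerical sketch (not part of the wiki pages themselves) can check the standard closed-form results $E[Y] = σ/\sqrt{2π}$ and $Var[Y] = σ^{2}(1/2 - 1/(2π))$ by Monte Carlo; the value of `sigma` below is an arbitrary assumed example.

```python
import numpy as np

# Monte-Carlo check of E[Y] and Var[Y] for Y = ReLU(G), with G ~ N(0, sigma^2).
# Compared against the standard closed-form results:
#   E[Y]   = sigma / sqrt(2*pi)
#   Var[Y] = sigma^2 * (1/2 - 1/(2*pi))
rng = np.random.default_rng(0)
sigma = 1.5                        # assumed example value, not taken from the wiki
g = rng.normal(0.0, sigma, size=1_000_000)
y = np.maximum(g, 0.0)             # ReLU applied elementwise

print("empirical   E[Y]   =", y.mean())
print("closed-form E[Y]   =", sigma / np.sqrt(2 * np.pi))
print("empirical   Var[Y] =", y.var())
print("closed-form Var[Y] =", sigma**2 * (0.5 - 1.0 / (2 * np.pi)))
```

The empirical mean and variance should agree with the closed-form expressions to within Monte-Carlo error (roughly three decimal places at one million samples).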