What are the mathematical requirements needed to get into neural networks?
You should have a good understanding of:
- linear algebra, especially of matrix operations
- calculus, including computing derivatives
- probabilities and statistics

Do you want me to detail each of these requirements?
Yes please.
Sure, let me explain each of these mathematical requirements in more detail:

    1. Linear Algebra: Neural networks rely heavily on linear algebra, specifically matrix operations. Operations such as matrix multiplication, the dot product, eigenvalues and eigenvectors, and singular value decomposition are important for understanding neural network theory and backpropagation algorithms.
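For instance, the forward pass of a single dense layer is just a matrix product plus a bias. A minimal sketch with NumPy (the shapes and values here are illustrative, not from any particular network):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))   # 4 samples, each with 3 input features
W = rng.normal(size=(3, 2))   # weight matrix: 3 inputs -> 2 outputs
b = np.zeros(2)               # bias vector, broadcast across samples

# One dense layer forward pass: a matrix multiplication plus a bias.
Z = X @ W + b
print(Z.shape)                # (4, 2): one 2-dimensional output per sample
```

Stacking such layers (with nonlinearities in between) is essentially repeated matrix multiplication, which is why linear algebra sits at the core of the field.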

    2. Calculus: Calculus is used to compute derivatives, which are important for optimizing the parameters of neural networks. The optimization of these parameters helps to reduce the error in predictions made by the network. The chain rule of calculus is also important for understanding backpropagation algorithms in neural networks.
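To make the chain rule concrete, here is a tiny hand-derived gradient for a one-parameter model, checked against a finite-difference estimate (the function and values are a made-up example, not a standard benchmark):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grad(w, x, y):
    # Composition: z = w*x, a = sigmoid(z), loss = (a - y)^2
    z = w * x
    a = sigmoid(z)
    loss = (a - y) ** 2
    # Chain rule: dloss/dw = dloss/da * da/dz * dz/dw
    grad = 2 * (a - y) * a * (1 - a) * x
    return loss, grad

w, x, y = 0.5, 2.0, 1.0
loss, grad = loss_and_grad(w, x, y)

# Sanity-check the analytic gradient with central finite differences.
eps = 1e-6
numeric = (loss_and_grad(w + eps, x, y)[0]
           - loss_and_grad(w - eps, x, y)[0]) / (2 * eps)
print(abs(grad - numeric) < 1e-6)
```

Backpropagation is this same multiplication of local derivatives, applied systematically through every layer of a network.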

    3. Probabilities and Statistics: A basic understanding of probability theory and statistics is essential for understanding the behavior of neural networks. This includes concepts such as random variables, Gaussian distributions, and maximum likelihood estimation. These concepts help to understand the uncertainty and variability in neural network predictions.
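As a small illustration of maximum likelihood estimation: for i.i.d. Gaussian data, the MLE of the mean is simply the sample average (the true parameters below are chosen arbitrarily for the demo):

```python
import numpy as np

rng = np.random.default_rng(42)
# Draw samples from a Gaussian with known "true" mean 3.0 and variance 1.0.
data = rng.normal(loc=3.0, scale=1.0, size=10_000)

mle_mean = data.mean()                      # MLE of the mean: sample average
mle_var = ((data - mle_mean) ** 2).mean()   # MLE of the variance

print(round(mle_mean, 1), round(mle_var, 1))  # close to the true 3.0 and 1.0
```

This connection runs deep: minimizing mean squared error in a regression network is equivalent to maximum likelihood estimation under a Gaussian noise assumption.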

So these are the main mathematical requirements needed to get into neural networks. Having a good foundation in these areas will help you understand the theory behind neural networks and develop your own models effectively.