What is the difference between under-fitting and over-fitting?
Underfitting and overfitting are problems that occur during the model training process.

Underfitting occurs when the model is too simple to capture the underlying pattern in the training data. As a result, the model performs poorly on both the training data and new, unseen data. Underfitting is often caused by using too few features or a model with insufficient capacity.

Overfitting occurs when a model has too many parameters relative to the number of observations in the training data, and it starts to memorize the training data instead of learning the underlying patterns. This leads to high training accuracy but poor generalization performance on unseen data.
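Both failure modes can be seen in a small, self-contained sketch (using NumPy and synthetic quadratic data, so all names and values here are illustrative): a degree-1 polynomial underfits, degree 2 matches the true pattern, and a high-degree polynomial drives training error down while fitting the noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: quadratic ground truth plus noise
x = np.linspace(-3, 3, 30)
y = x**2 + rng.normal(0, 1.0, size=x.shape)

# Interleaved train/test split for the illustration
x_train, y_train = x[::2], y[::2]
x_test, y_test = x[1::2], y[1::2]

def mse(degree):
    """Fit a polynomial of the given degree on the training split,
    return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for d in (1, 2, 10):
    tr, te = mse(d)
    print(f"degree {d:>2}: train MSE {tr:.3f}, test MSE {te:.3f}")
```

Degree 1 shows high error on both splits (underfitting), while the degree-10 fit shows that training error alone keeps shrinking as capacity grows, which is exactly the memorization behavior described above.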

To prevent underfitting and overfitting, a balance needs to be struck between model complexity and the amount of training data. Techniques such as cross-validation, regularization, and early stopping can also help to prevent overfitting.
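Of the techniques mentioned, regularization is easy to sketch in closed form. Below is a minimal L2 (ridge) regression example in NumPy on synthetic data; the variable names and the toy weights are assumptions for illustration, not from the original answer.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 5))
w_true = np.array([3.0, -2.0, 0.0, 1.5, 0.5])  # illustrative weights
y = X @ w_true + rng.normal(0, 0.1, size=50)

def ridge_fit(X, y, lam):
    """Closed-form ridge solution: w = (X^T X + lam * I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Larger lambda penalizes large weights, shrinking the solution
for lam in (0.0, 1.0, 100.0):
    w = ridge_fit(X, y, lam)
    print(f"lambda={lam:>6}: ||w|| = {np.linalg.norm(w):.3f}")
```

The penalty term `lam * np.eye(d)` shrinks the weights toward zero, which limits effective model complexity and discourages fitting noise in the training data.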
Which metric is most important to look at to detect if over-fitting is taking place?
The most useful signal for detecting overfitting is the gap between training loss and validation loss. A very small training loss by itself does not indicate overfitting; the model may simply fit the data well. Overfitting is indicated when the training loss keeps decreasing while the validation loss stops improving or starts to rise. Monitoring this gap is also the basis of early stopping: training is halted once validation performance stops improving.