Can you explain to me the gamma parameter in the XGBoost algorithm? What is its impact on the model's performance?
Gamma is a hyperparameter in XGBoost that controls model complexity and helps prevent overfitting. It is a non-negative value (exposed in the library under the alias `min_split_loss`) that sets the minimum loss reduction required to make a further split, and it is applied during tree pruning.

When building decision trees, XGBoost adds branches to split the data and improve predictions. Each split reduces training error, but adding too many branches can lead to overfitting.

Gamma penalizes the number of splits in a tree. When deciding whether to add a new branch, XGBoost considers both the improvement in error reduction (gain) and the gamma value. If the gain is less than gamma, the split is not made, pruning the tree.
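Concretely, in the XGBoost paper (Chen & Guestrin, 2016) the gain of a candidate split subtracts gamma directly, so the split is worthwhile only when the raw improvement exceeds it:

$$\text{Gain} = \frac{1}{2}\left[\frac{G_L^2}{H_L+\lambda} + \frac{G_R^2}{H_R+\lambda} - \frac{(G_L+G_R)^2}{H_L+H_R+\lambda}\right] - \gamma$$

where $G$ and $H$ are sums of the loss's first and second derivatives over the left and right child nodes, and $\lambda$ is the L2 regularization term.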

The same rule as a minimal, runnable Python sketch (illustrative only; in the real library `gain` is computed from the formula above):
```python
def should_split(gain: float, gamma: float) -> bool:
    """Decide, for one node, whether its best candidate split is kept."""
    # The split is kept only when its loss reduction (gain) exceeds gamma;
    # otherwise the node stays a leaf (the branch is pruned).
    return gain > gamma
```


A higher gamma leads to a simpler model with fewer splits, while a lower gamma allows more splits and a more complex model. The choice of gamma is important, as too high a value can cause underfitting and too low a value can lead to overfitting. Cross-validation should be used to determine the optimal gamma.
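
For example, gamma can be tuned with scikit-learn's `GridSearchCV` (a sketch; the grid values are arbitrary, and `X_train`/`y_train` are assumed to be an existing training set):
```python
from sklearn.model_selection import GridSearchCV
from xgboost import XGBRegressor

# Try a few gamma values, spanning "no pruning" to aggressive pruning.
param_grid = {"gamma": [0, 0.1, 0.5, 1.0, 5.0]}
search = GridSearchCV(XGBRegressor(n_estimators=100), param_grid, cv=5)
search.fit(X_train, y_train)
print(search.best_params_)
```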

In short, gamma controls the trade-off between error reduction and model complexity in the XGBoost tree pruning process. Tuning gamma appropriately is key to achieving the best model fit.

Code snippet (in Python):
```python
from xgboost import XGBRegressor

# gamma=0.1 applies mild pruning; n_estimators=100 builds 100 trees
xgb_reg = XGBRegressor(gamma=0.1, n_estimators=100)
xgb_reg.fit(X_train, y_train)
```


This fits an XGBoost model with a gamma of 0.1 and 100 decision trees (assuming `X_train` and `y_train` are already defined).
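
To see the pruning effect directly, one could compare total tree size at two gamma settings (a sketch using `Booster.trees_to_dataframe()`, which requires pandas; same assumed training data as above):
```python
from xgboost import XGBRegressor

for g in (0, 10):
    model = XGBRegressor(gamma=g, n_estimators=100).fit(X_train, y_train)
    # trees_to_dataframe() lists one row per node across all trees;
    # fewer rows at the higher gamma indicates more aggressive pruning.
    n_nodes = len(model.get_booster().trees_to_dataframe())
    print(f"gamma={g}: {n_nodes} nodes")
```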
Hope this helps!