Our code uses several metrics to compare accuracy, such as precision, recall, and F1 score.
First, I would like to visualize a confusion matrix for these results as a heatmap, create a precision-recall curve, and plot an ROC curve for the classification algorithms used to predict job satisfaction.
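To make the request concrete, here is a minimal sketch of how the confusion matrix and the ROC points could be computed before plotting. All function and variable names here are my own illustrations, not taken from the notebook; the matrix would then be drawn with a heatmap (e.g. seaborn), and the ROC points plotted as a line chart.

```python
# Sketch: build a 2x2 confusion matrix and ROC curve points from
# true labels and predicted scores, without any ML libraries.
# All names are illustrative, not from the notebook.

def confusion_matrix(y_true, y_pred):
    """Rows = actual class, columns = predicted class (binary 0/1)."""
    m = [[0, 0], [0, 0]]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

def roc_points(y_true, scores):
    """(FPR, TPR) pairs obtained by sweeping a threshold over the scores."""
    pos = sum(y_true)
    neg = len(y_true) - pos
    pts = []
    for thr in sorted(set(scores), reverse=True):
        tp = sum(1 for t, s in zip(y_true, scores) if t == 1 and s >= thr)
        fp = sum(1 for t, s in zip(y_true, scores) if t == 0 and s >= thr)
        pts.append((fp / neg, tp / pos))
    return pts

y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1]
cm = confusion_matrix(y_true, y_pred)
print(cm)  # [[2, 1], [1, 2]] -- this matrix feeds the heatmap
```

In practice the notebook would likely use `sklearn.metrics` for these computations; the sketch only shows what quantities the plots are built from.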
Describe each metric used in the notebook through markdown cells, or at least comments, to inform users how the accuracy is calculated.
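The metric descriptions could be stated in markdown and mirrored in code comments, along these lines (the tp/fp/fn counts below are made up for illustration, not from the notebook):

```python
# Precision, recall and F1 from confusion-matrix counts.
# The counts here are illustrative only.
tp, fp, fn = 40, 10, 5

# Precision: of everything predicted positive, the fraction truly positive.
precision = tp / (tp + fp)

# Recall: of everything truly positive, the fraction that was found.
recall = tp / (tp + fn)

# F1 score: harmonic mean of precision and recall,
# equivalently 2*tp / (2*tp + fp + fn).
f1 = 2 * precision * recall / (precision + recall)

print(round(precision, 3), round(recall, 3), round(f1, 3))
```

Each comment above would become a short markdown cell in the notebook, placed next to where the metric is computed.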
Optimize the results as much as possible.
Please assign me this task as a contributor under GSSOC'24 @sanjay-kv.