Thank you for open-sourcing your work. I am slightly confused about the metrics you used for evaluation. Here is my understanding from your README:
1. `accuracy_n`: accuracy evaluated on only the n-th task after training on the n-th task.
2. `forgetting`: average forgetting over all tasks seen up to the current task.
3. `avg_acc`: average evaluation accuracy over all seen tasks after training on the n-th task.
For (3), after training on the n-th task, we get:

```
acc_per_task = [a1, a2, ..., an]
avg_acc = average(acc_per_task)
```
Is that right?
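To make my understanding concrete, here is a small sketch of how I would compute the three metrics from a full accuracy matrix. This is my own hypothetical helper, not your implementation; for `forgetting` I am assuming the commonly used definition (best past accuracy on a task minus its current accuracy, averaged over previous tasks). Please correct me if your definitions differ.

```python
def metrics_after_task(acc, n):
    """Sketch of the three metrics as I understand them (not the repo's code).

    acc[t][i] = accuracy on task i after training on task t (0-indexed,
    lower-triangular: only t >= i is filled in).
    n = number of tasks trained so far (1-indexed count).
    """
    # accuracy_n: accuracy on the n-th task only, right after training it
    accuracy_n = acc[n - 1][n - 1]

    # avg_acc: mean accuracy over all n tasks seen so far
    avg_acc = sum(acc[n - 1][:n]) / n

    # forgetting (assumed standard definition): for each previous task i,
    # best accuracy it ever had before the final step minus its current
    # accuracy, averaged over the n-1 previous tasks
    if n > 1:
        forgetting = sum(
            max(acc[t][i] for t in range(i, n - 1)) - acc[n - 1][i]
            for i in range(n - 1)
        ) / (n - 1)
    else:
        forgetting = 0.0

    return accuracy_n, avg_acc, forgetting
```

For example, with `acc = [[0.9], [0.7, 0.8], [0.6, 0.75, 0.85]]` and `n = 3`, this gives `accuracy_n = 0.85`, `avg_acc = (0.6 + 0.75 + 0.85) / 3`, and `forgetting = ((0.9 - 0.6) + (0.8 - 0.75)) / 2 = 0.175`.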