
Draw the loss curve #6

Closed

943fansi opened this issue Dec 11, 2022 · 2 comments

Comments

@943fansi

@xuhongzuo I want to use pd.DataFrame to record loss, loss_oc, and val_loss at each epoch and save them as a CSV file. Could you add code for this feature?

@xuhongzuo
Owner

I print these loss values after each training epoch.
You can implement this personalized requirement yourself.
Issue closed.

@943fansi
Author

Adding a few lines of code in src/algorithms/couta_algo.py saves this info to a "train_log.txt" file:

...
    val_loss = torch.mean(torch.stack(val_loss)).data.cpu().item()

    # start: append this epoch's losses to a plain-text log
    loss_file = "train_log.txt"
    with open(loss_file, "a") as _f:
        print(f'epoch: {i+1:02},loss: {epoch_loss:.6f},loss_oc: {epoch_loss_oc:.6f}, val_loss: {val_loss:.6f}',
              file=_f, flush=True)
    # end

    if (i+1) % 10 == 0:
    ...
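
If you would rather skip the text log and write a CSV directly from the training loop, a minimal sketch could look like the helper below. Note that append_loss_row and the "train_log.csv" path are hypothetical names for illustration, not part of the repository; the commented call assumes the loop variables i, epoch_loss, epoch_loss_oc, and val_loss shown above.

import csv
import os

def append_loss_row(path, epoch, loss, loss_oc, val_loss):
    # Append one epoch's losses to a CSV file, writing the header on first use.
    write_header = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        if write_header:
            writer.writerow(["epoch", "loss", "loss_oc", "val_loss"])
        writer.writerow([epoch, loss, loss_oc, val_loss])

# inside the training loop, right after val_loss is computed:
# append_loss_row("train_log.csv", i + 1, epoch_loss, epoch_loss_oc, val_loss)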

The following code then transforms the text log into a CSV file:

import numpy as np
import pandas as pd

with open("train_log.txt", "r") as f:
    data = f.readlines()

# each line looks like:
# epoch: 01,loss: 0.123456,loss_oc: 0.123456, val_loss: 0.123456
n = len(data)
result = np.zeros([n, 4])
for i in range(n):
    record = data[i].split(",")
    epoch = int(record[0].split(" ")[-1])
    loss = float(record[1].split(" ")[-1])
    loss_oc = float(record[2].split(" ")[-1])
    val_loss = float(record[3].split(" ")[-1])
    result[i, :] = [epoch, loss, loss_oc, val_loss]

df = pd.DataFrame(data=result, columns=["epoch", "loss", "loss_oc", "val_loss"])
df.to_csv("loss_curve.csv", index=False)
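
To actually draw the loss curve from this CSV, a minimal matplotlib sketch (assuming the loss_curve.csv produced above, with the column names written by the script) is:

import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("loss_curve.csv")

# plot the three logged losses against the epoch number
plt.figure(figsize=(8, 5))
plt.plot(df["epoch"], df["loss"], label="loss")
plt.plot(df["epoch"], df["loss_oc"], label="loss_oc")
plt.plot(df["epoch"], df["val_loss"], label="val_loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.tight_layout()
plt.savefig("loss_curve.png", dpi=150)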
