Save, load and save again #159
Good point, maybe we don't. Let me take a look and try to remember if I had a good reason to add it. If not, I can remove it. In the meantime, feel free to submit a PR.
This messes it up for us, because it deletes our logs! @fmfn It might be good to check whether the bounds changed and only replace the logs if they did, and, critically, to rename the log file instead of deleting it, because that data is valuable and it's just getting tossed out.
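The rename-instead-of-delete idea could be sketched roughly like this (a minimal sketch, not the library's actual code; `backup_old_log` is a hypothetical helper name):

```python
import os
import time


def backup_old_log(path):
    """If a log file already exists at `path`, rename it to a
    timestamped backup instead of deleting it, so old observations
    are never thrown away. Returns the backup path, or None if
    there was nothing to back up."""
    if os.path.exists(path):
        backup = "{}.{}.bak".format(path, int(time.time()))
        os.rename(path, backup)
        return backup
    return None
```

A logger's `__init__` could then call this where it currently calls `os.remove(self._path)`, preserving previous runs.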
@fmfn Instead of rewriting the logs file and passing a filepath, is it too hard to pass a logs folder? Then we could save timestamped logs, which might be useful for reproducibility and publications. If the first line of each logs file contained the bounds information, bayesopt could easily compare the current bounds to the bounds of the previous logs in the logs folder and pick the most recent log with the same bounds. So if you pass new bounds, it starts a new log file, but if you reuse bounds from before, it picks up where it left off. Is this hard to implement?
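The selection step proposed above could be sketched as follows. This assumes a hypothetical convention (not something the library currently does) that the first line of each log file is a JSON record with a "bounds" key:

```python
import glob
import json
import os


def find_matching_log(logs_dir, bounds):
    """Return the most recently modified *.json log in `logs_dir`
    whose first line records the same bounds, or None if no log
    matches. Assumes the (hypothetical) convention that line 1 of
    each log is a JSON object like {"bounds": {"x": [-2, 2]}}."""
    candidates = []
    for path in glob.glob(os.path.join(logs_dir, "*.json")):
        with open(path) as f:
            first_line = f.readline()
        try:
            header = json.loads(first_line)
        except ValueError:
            continue  # not a log following this convention
        if header.get("bounds") == bounds:
            candidates.append(path)
    if not candidates:
        return None
    return max(candidates, key=os.path.getmtime)
```

With something like this, passing previously used bounds would resume from the matching log, while new bounds would simply find no match and start a fresh file.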
Any feedback on this? I am still unsure how to save, load, and save again. Once I load the logs and save, it clears the logs and all history of past probes. Won't this affect the underlying GP?
Hi,

For those who are wondering, the
Hi,

Thanks for this useful library!

I wonder how to do save - load - save - load multiple times. To achieve this, I use the code below:

But it seems that this line will delete the original json file, and previous records are lost. Why do we need to do os.remove(self._path) in JSONLogger.__init__()?
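One way to sidestep the deletion is to append to the log file instead of removing it on construction. Below is a minimal sketch of that idea; AppendingJSONLogger is a hypothetical class written for illustration, not part of the library, and its update method just appends one JSON record per line:

```python
import json


class AppendingJSONLogger:
    """Minimal JSON-lines logger that appends to an existing file
    instead of deleting it, so save -> load -> save keeps history
    across runs."""

    def __init__(self, path):
        # Note: no os.remove(self._path) here, so an existing log
        # from a previous run is preserved.
        self._path = path

    def update(self, record):
        # In the real library the logger receives optimization
        # events; this sketch just appends one JSON object per line.
        with open(self._path, "a") as f:
            f.write(json.dumps(record) + "\n")
```

Under this scheme, constructing a fresh logger on the same path after reloading old logs would simply continue the file rather than clearing the history of past probes.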