Update landscape plot #20

Merged (2 commits, Mar 14, 2023)
`README.md` (8 changes: 5 additions & 3 deletions)
@@ -22,14 +22,14 @@ The utility provided by this package is the function `optfuns` which returns thr
The code for this benchmark is in `runtests.jl`.

## Visualize loss landscape
-We define a plot recipe such that a loss landscape can be plotted with
+Based on the work on [loss landscape visualization [2]](https://arxiv.org/abs/1712.09913), we define a plot recipe such that a loss landscape can be plotted with
```julia
using Plots
-plot(loss, pars, l=0.1, npoints=50, seriestype=:contour)
+contourf(() -> log10(1 + loss()), pars, color=:turbo, npoints=50, lnorm=1)
```
![landscape](figs/landscape.svg)

-The landscape is plotted by selecting two random directions and extending the current point (`pars`) a distance `l*norm(pars)` (both negative and positive) along the two random directions. The number of loss evaluations will be `npoints^2`.
+The landscape is plotted by selecting two random directions and extending the current point (`pars`) a distance `lnorm * norm(pars)` (both negative and positive) along the two random directions. The number of loss evaluations will be `npoints^2`.
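The procedure described above can be sketched by hand. The following is a minimal, hypothetical implementation (not the package's actual plot recipe), assuming `loss` is a zero-argument closure that reads a mutable parameter vector `pars`; the function name `landscape` and its return layout are illustrative only:

```julia
using LinearAlgebra  # for norm

# Sketch: evaluate the loss on an npoints × npoints grid spanned by two
# random directions, each scaled to length lnorm * norm(pars).
function landscape(loss, pars; lnorm=0.1, npoints=50)
    d1, d2 = randn(length(pars)), randn(length(pars))  # two random directions
    d1 .*= lnorm * norm(pars) / norm(d1)               # rescale to lnorm * norm(pars)
    d2 .*= lnorm * norm(pars) / norm(d2)
    p0 = copy(pars)
    ts = range(-1, 1, length=npoints)                  # extend both negative and positive
    Z = [begin
             pars .= p0 .+ a .* d1 .+ b .* d2          # move to the grid point
             loss()                                    # npoints^2 evaluations in total
         end for a in ts, b in ts]
    pars .= p0                                         # restore the original parameters
    return ts, ts, Z
end
```

The result can then be handed to any plotting function, e.g. `contourf(ts, ts, Z)`.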


## Flatten and Unflatten
@@ -48,3 +48,5 @@ This is what is used under the hood in the functions returned from `optfuns` in
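The idea of flatten/unflatten, converting a collection of parameter arrays into a single vector for the optimizer and back again, can be sketched generically. This is a hypothetical illustration, not this package's API; the names `flatten` and `unflatten` and the chosen signature are assumptions:

```julia
# Hypothetical sketch: pack a vector of parameter arrays into one flat
# vector, and return a closure that reverses the packing.
function flatten(ps::Vector{<:AbstractArray})
    v = reduce(vcat, vec.(ps))          # concatenate all parameters
    shapes = size.(ps)                  # remember the original shapes
    unflatten = function (w)
        out, i = similar.(ps), 1
        for (k, s) in enumerate(shapes)
            n = prod(s)
            out[k] .= reshape(w[i:i+n-1], s)
            i += n
        end
        out
    end
    return v, unflatten
end
```

An optimizer can then work on the flat vector `v` and reconstruct the structured parameters with `unflatten(v)` when evaluating the loss.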

# References
[[1] "Stochastic quasi-Newton with adaptive step lengths for large-scale problems", Adrian Wills, Thomas Schön, 2018](https://arxiv.org/abs/1802.04310)

[[2] "Visualizing the Loss Landscape of Neural Nets", Hao Li, Zheng Xu, Gavin Taylor, Christoph Studer, Tom Goldstein, 2018](https://arxiv.org/abs/1712.09913)