Add @schubert editorial style fixes; mostly on collapsibles
Ludwig Schubert committed May 5, 2020
1 parent c87e376 commit b3de979
Showing 4 changed files with 58 additions and 15 deletions.
45 changes: 41 additions & 4 deletions public/css/styles.css
@@ -48,7 +48,7 @@
}

.collapsible {
-  background-color: #e1f2f2;
+  background-color: hsla(206, 95%, 20%, 0.2);
cursor: pointer;
padding: 1em;
border: none;
@@ -58,7 +58,7 @@

.active, .collapsible:hover {
transition: max-height 0.2s ease-out;
-  background-color: #acc2c2;
+  /* background-color: #acc2c2; */
}

.content {
@@ -68,7 +68,7 @@
transition: max-height 0.2s ease-out;
/*background-color: #e1f2f2;*/
/*border-style: ridge;*/
-  border-color: #e1f2f2;
+  border-color: hsla(206, 95%, 20%, 0.2);
}

/* Next & previous buttons */
@@ -125,7 +125,7 @@
}

button {
-  cursor:grabbing;
+  cursor: grabbing;
float: left;
width: 2.2em;
height: 2.2em;
@@ -139,3 +139,40 @@ button {
background: black;
cursor: pointer;
}

+ /* 2020-05-05 @schubert editorial style fixes */
+
+ .gif-slider button.stepper {
+   align-items: center;
+   justify-content: center;
+   cursor: pointer;
+ }
+
+ .gif-slider button:focus {
+   outline: 0;
+ }
+
+ .collapsible {
+   display: flex;
+   justify-content: space-between;
+   align-content: center;
+   border-radius: 4px;
+ }
+
+ .collapsible.active {
+   border-bottom-left-radius: 0;
+   border-bottom-right-radius: 0;
+ }
+
+ .collapsible .collapsible-indicator::after {
+   content: "+";
+ }
+
+ .collapsible.active .collapsible-indicator::after {
+   content: "-";
+ }
+
+ .collapsible h4 {
+   margin: 0;
+   line-height: inherit;
+ }
16 changes: 8 additions & 8 deletions public/index.html
@@ -572,7 +572,7 @@ <h4 id="expectedimprovementei">Expected Improvement (EI)</h4>
Is this better than before? It turns out the answer is both yes and no. We see that here we do too much exploration, given the value of <d-math>\epsilon = 3</d-math>: we quickly get close to the global maximum, but unfortunately do not exploit it to get further gains near the global maximum.
</p>
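<p>
To make the comparison concrete, here is a minimal NumPy sketch of these two acquisition functions (assuming a surrogate that supplies a posterior mean <code>mu</code> and standard deviation <code>sigma</code> at each candidate point, and that we are maximizing the objective):
</p>

<d-code block language="python">
import numpy as np
from scipy.stats import norm

def pi_and_ei(mu, sigma, f_best, epsilon=3.0):
    """Probability of Improvement and Expected Improvement over the
    best observed value f_best, with exploration parameter epsilon."""
    improvement = mu - f_best - epsilon
    z = improvement / np.maximum(sigma, 1e-12)  # guard against sigma == 0
    pi = norm.cdf(z)                            # P(f(x) > f_best + epsilon)
    ei = improvement * norm.cdf(z) + sigma * norm.pdf(z)
    return pi, np.maximum(ei, 0.0)
</d-code>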

<h4 class="collapsible">PI vs. EI</h4>
<div class="collapsible"><h4>PI vs Ei</h4><div class="collapsible-indicator"></div></div>
<div class="content">
<p>
We have seen two closely related methods: the <em>Probability of Improvement</em> and the <em>Expected Improvement</em>.
@@ -653,7 +653,7 @@ <h3>Random</h3>
<h3>Summary of Acquisition Functions</h3> <p>
Let us now summarize the core ideas associated with acquisition functions: i) they are heuristics for evaluating the utility of a point; ii) they are a function of the surrogate posterior; iii) they combine exploration and exploitation; and iv) they are inexpensive to evaluate.</p>

<h3 class="collapsible"> Other Acquisition Functions </h3>
<div class="collapsible"><h4>Other Acquisition Functions</h4> <div class="collapsible-indicator"></div></div>
<div class="content">

<p>We have seen various acquisition functions so far. One trivial way to come up with new acquisition functions is to use an explore/exploit combination.
@@ -794,7 +794,7 @@ <h3 id="comparison">Comparison</h3>



<h3 class="collapsible">Other Examples</h3>
<div class="collapsible"><h4>Other Examples</h4><div class="collapsible-indicator"></div></div>
<div class="content">
<h3>Example 2 -- Random Forest</h3>

@@ -851,7 +851,7 @@ <h3>Example 2 -- Random Forest</h3>

<h3>Example 3 -- Neural Networks</h3>
<p>
- Let us take this example to get an idea of how to apply Bayesian Optimization to train neural networks. Here we will be using <d-code language="python">scikit-optim</d-code>, which also provides us support for optimizing function with a search space of categorical, integral, and real variables. We will not be plotting the ground truth here, as it is extremely costly to do so. Below are some code snippets that show the ease of using Bayesian Optimization packages for hyperparameter tuning.
+ Let us take this example to get an idea of how to apply Bayesian Optimization to train neural networks. Here we will be using <code>scikit-optimize</code>, which also provides support for optimizing functions over a search space of categorical, integer, and real variables. We will not be plotting the ground truth here, as it is extremely costly to do so. Below are some code snippets that show the ease of using Bayesian Optimization packages for hyperparameter tuning.
</p>
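<p>
As a brief sketch of what such a mixed search space looks like in <code>scikit-optimize</code> (the dimension names and ranges below are purely illustrative):
</p>

<d-code block language="python">
from skopt.space import Real, Integer, Categorical

search_space = [
    Real(1e-6, 1e-1, prior="log-uniform", name="learning_rate"),  # real
    Integer(1, 5, name="num_dense_layers"),                       # integral
    Categorical(["relu", "sigmoid"], name="activation"),          # categorical
]
</d-code>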

<p>
Expand Down Expand Up @@ -896,7 +896,7 @@ <h3>Example 3 -- Neural Networks</h3>
</d-code>

<p>
- Now import <d-code language="python">gp-minimize</d-code><d-footnote><strong>Note</strong>: One will need to negate the accuracy values as we are using the minimizer function from <d-code language="python">scikit-optim</d-code>.</d-footnote> from <d-code language="python">scikit-optim</d-code> to perform the optimization. Below we show calling the optimizer using <em>Expected Improvement</em>, but of course we can select from a number of other acquisition functions.
+ Now import <code>gp_minimize</code><d-footnote><strong>Note</strong>: We need to negate the accuracy values, as we are using the minimizer function from <code>scikit-optimize</code>.</d-footnote> from <code>scikit-optimize</code> to perform the optimization. Below we show calling the optimizer using <em>Expected Improvement</em>, but of course we can select from a number of other acquisition functions.
</p>

<d-code block language="python">
@@ -922,7 +922,7 @@ <h3>Example 3 -- Neural Networks</h3>
</p>
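<p>
For reference, a minimal sketch of such a call is shown below, where <code>evaluate_model</code> is a hypothetical objective that trains the network and returns the negated validation accuracy, and <code>search_space</code> is the list of dimensions sketched earlier:
</p>

<d-code block language="python">
from skopt import gp_minimize

result = gp_minimize(
    func=evaluate_model,      # objective to minimize (negated accuracy)
    dimensions=search_space,  # categorical, integer, and real dimensions
    acq_func="EI",            # use Expected Improvement
    n_calls=15,               # total number of function evaluations
    random_state=0,
)
print(-result.fun, result.x)  # best accuracy and its hyper-parameters
</d-code>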

<p>
- Looking at the above example, we can see that incorporating Bayesian Optimization is not difficult and can save a lot of time. Optimizing to get an accuracy of nearly one in around seven iterations is impressive!<d-footnote>The example above has been inspired by <a href="https://github.com/Hvass-Labs/TensorFlow-Tutorials/blob/master/19_Hyper-Parameters.ipynb">Hvass Laboratories' Tutorial Notebook</a> showcasing hyperparameter optimization in TensorFlow using <d-code language="python">scikit-optim</d-code>.</d-footnote>
+ Looking at the above example, we can see that incorporating Bayesian Optimization is not difficult and can save a lot of time. Reaching an accuracy of nearly one in around seven iterations is impressive!<d-footnote>The example above was inspired by <a href="https://github.com/Hvass-Labs/TensorFlow-Tutorials/blob/master/19_Hyper-Parameters.ipynb">Hvass Laboratories' Tutorial Notebook</a> showcasing hyperparameter optimization in TensorFlow using <code>scikit-optimize</code>.</d-footnote>
</p>

<p>
Expand All @@ -933,7 +933,7 @@ <h3>Example 3 -- Neural Networks</h3>
<h1 id="conclusions">Conclusion and Summary</h1>

<p>
- In this article, we looked at Bayesian Optimization for optimizing a black-box function. Bayesian Optimization is well suited when the function evaluations are expensive, making grid or exhaustive search impractical. We looked at the key components of Bayesian Optimization. First, we looked at the notion of using a surrogate function (with a prior over the space of objective functions) to model our black-box function. Next, we looked at the "Bayes" in Bayesian Optimization - the function evaluations are used as data to obtain the surrogate posterior. We look at acquisition functions, which are functions of the surrogate posterior and are optimized sequentially. This new sequential optimization is in-expensive and thus of utility of us. We also looked at a few acquisition functions and showed how these different functions balance exploration and exploitation. Finally, we looked at some practical examples of Bayesian Optimization for optimizing hyper-parameters for machine learning models.
+ In this article, we looked at Bayesian Optimization for optimizing a black-box function. Bayesian Optimization is well suited when the function evaluations are expensive, making grid or exhaustive search impractical. We looked at the key components of Bayesian Optimization. First, we looked at the notion of using a surrogate function (with a prior over the space of objective functions) to model our black-box function. Next, we looked at the "Bayes" in Bayesian Optimization -- the function evaluations are used as data to obtain the surrogate posterior. We then looked at acquisition functions, which are functions of the surrogate posterior and are optimized sequentially. This sequential optimization is inexpensive and thus useful to us. We also looked at a few acquisition functions and showed how these different functions balance exploration and exploitation. Finally, we looked at some practical examples of Bayesian Optimization for optimizing hyper-parameters for machine learning models.
</p>

<p>
@@ -991,7 +991,7 @@ <h3 id="FurtherReading">Further Reading</h3>
</li>
<li>
<p>
- We talked about optimizing a black-box function here. If we are to perform over multiple objectives, how do these acquisition functions scale? There has been fantastic work in this domain too! We try to deal with these cases by having multi-objective acquisition functions. Have a look at <a href="https://gpflowopt.readthedocs.io/en/latest/notebooks/multiobjective.html">this excellent</a> notebook for an example using <d-code language="python">gpflowopt</d-code>.
+ We talked about optimizing a black-box function here. If we want to optimize over multiple objectives, how do these acquisition functions scale? There has been fantastic work in this domain too! We can deal with these cases by using multi-objective acquisition functions. Have a look at <a href="https://gpflowopt.readthedocs.io/en/latest/notebooks/multiobjective.html">this excellent notebook</a> for an example using <code>gpflowopt</code>.
</p>
</li>
<li>
9 changes: 7 additions & 2 deletions public/js/gif-slider.js
@@ -18,6 +18,9 @@ function preloadImages(array) {
}
}

+ const trianglePointingRight = `<svg width="10" height="10" viewBox="0 0 10 10" style="margin-top: 3px;"><path d="M 0 0 L 10 5 L 0 10 z" fill="#888"></path></svg>`
+ const trianglePointingLeft = `<svg width="10" height="10" viewBox="0 0 10 10" style="margin-top: 3px; margin-left: -1px;"><path d="M 10 0 L 0 5 L 10 10 z" fill="#888"></path></svg>`
+
function appendInputButtons() {
// get all doms with class="gif-slider"
var figs = document.getElementsByClassName("gif-slider")
@@ -44,13 +47,15 @@

// installing buttons
var button1 = document.createElement("button")
-   button1.appendChild(document.createTextNode("<"))
+   button1.innerHTML = trianglePointingLeft;
button1.setAttribute("onclick",
"changePng(this.parentNode.parentNode.parentNode, false)")
button1.classList += "stepper button-left"
var button2 = document.createElement("button")
-   button2.appendChild(document.createTextNode(">"))
+   button2.innerHTML = trianglePointingRight;
button2.setAttribute("onclick",
"changePng(this.parentNode.parentNode.parentNode, true)")
button2.classList += "stepper button-right"

div.setAttribute("class", "controls")

3 changes: 2 additions & 1 deletion public/js/hider.js
@@ -10,7 +10,8 @@ for (i = 0; i < coll.length; i++) {
content.style.borderStyle = "hidden"
} else {
content.style.maxHeight = content.scrollHeight + "px";
-     content.style.borderStyle = "ridge"
+     content.style.borderStyle = "dashed"
+     content.style.borderTop = "none"
}
});
}
