Commit

remove link to script in tutorials and notebooks
dustinvtran committed Mar 9, 2017
1 parent 384ed28 commit 25a0e29
Showing 16 changed files with 40 additions and 112 deletions.
9 changes: 2 additions & 7 deletions docs/notebooks/gan.ipynb
@@ -11,11 +11,8 @@
"They posit a deep generative model and they enable fast and accurate\n",
"inferences.\n",
"\n",
"We demonstrate with an example in Edward. A webpage version is available \n",
"[here](http://edwardlib.org/tutorials/gan),\n",
"or as a script available at\n",
"[`examples/gan.py`](https://github.com/blei-lab/edward/blob/master/examples/gan.py)\n",
"in the Github repository."
"We demonstrate with an example in Edward. A webpage version is available at\n",
"http://edwardlib.org/tutorials/gan."
]
},
{
@@ -123,7 +120,6 @@
}
],
"source": [
"# DATA. MNIST batches are fed at training time.\n",
"mnist = input_data.read_data_sets(DATA_DIR, one_hot=True)\n",
"x_ph = tf.placeholder(tf.float32, [M, 784])"
]
@@ -168,7 +164,6 @@
" x = slim.fully_connected(h1, 784, activation_fn=tf.sigmoid)\n",
" return x\n",
"\n",
"# MODEL\n",
"with tf.variable_scope(\"Gen\"):\n",
" eps = Uniform(a=tf.zeros([M, d]) - 1.0, b=tf.ones([M, d]))\n",
" x = generative_network(eps)"
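
For orientation, the gan.ipynb cells touched above assemble roughly as follows. This is a sketch against Edward's pre-1.0 API (the mu/sigma keyword style shown in the diff); the generator's hidden layer and the values of M, d, and DATA_DIR do not appear in the hunks and are assumptions here.

import tensorflow as tf
from edward.models import Uniform
from tensorflow.contrib import slim
from tensorflow.examples.tutorials.mnist import input_data

M = 128                  # batch size (assumed)
d = 100                  # latent dimension (assumed)
DATA_DIR = "data/mnist"  # assumed path

# MNIST batches are fed in at training time.
mnist = input_data.read_data_sets(DATA_DIR, one_hot=True)
x_ph = tf.placeholder(tf.float32, [M, 784])

def generative_network(eps):
  # The hidden layer is assumed; only the output layer appears in the diff.
  h1 = slim.fully_connected(eps, 128, activation_fn=tf.nn.relu)
  x = slim.fully_connected(h1, 784, activation_fn=tf.sigmoid)
  return x

# Map uniform noise through the generator to produce images.
with tf.variable_scope("Gen"):
  eps = Uniform(a=tf.zeros([M, d]) - 1.0, b=tf.ones([M, d]))
  x = generative_network(eps)
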
37 changes: 14 additions & 23 deletions docs/notebooks/getting_started.ipynb

Large diffs are not rendered by default.

8 changes: 2 additions & 6 deletions docs/notebooks/latent_space_models.ipynb
@@ -18,11 +18,8 @@
"their distance in the latent space.\n",
"\n",
"We will analyze network data from neuroscience.\n",
"A webpage version is available \n",
"[here](http://edwardlib.org/tutorials/latent-space-models),\n",
"or as a script available at\n",
"[`examples/latent_space_model.py`](https://github.com/blei-lab/edward/blob/master/examples/latent_space_model.py)\n",
"in the Github repository."
"A webpage version is available at\n",
"http://edwardlib.org/tutorials/latent-space-models."
]
},
{
@@ -119,7 +116,6 @@
},
"outputs": [],
"source": [
"# MODEL\n",
"N = x_train.shape[0] # number of data points\n",
"K = 3 # latent dimensionality\n",
"\n",
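
The latent_space_models.ipynb hunks above show only the constants; for context, a minimal latent space model in the same API follows. The distance-based Poisson rate is one standard parameterization and is my assumption, since the tutorial's exact likelihood is not visible in this diff.

import tensorflow as tf
from edward.models import Normal, Poisson

N = x_train.shape[0]  # number of data points
K = 3                 # latent dimensionality

# One latent position per node in the network.
z = Normal(mu=tf.zeros([N, K]), sigma=tf.ones([N, K]))

# Connection rates decay with squared distance in the latent space.
d2 = tf.reduce_sum(tf.square(tf.expand_dims(z, 1) - tf.expand_dims(z, 0)), 2)
x = Poisson(lam=tf.exp(-d2))
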
11 changes: 2 additions & 9 deletions docs/notebooks/mixture_density_network.ipynb
@@ -10,11 +10,8 @@
"of models obtained by combining a conventional neural network with a\n",
"mixture density model.\n",
"\n",
"We demonstrate with an example in Edward. A webpage version is available \n",
"[here](http://edwardlib.org/tutorials/mixture-density-network),\n",
"or as a script available at\n",
"[`examples/mixture_density_network.py`](https://github.com/blei-lab/edward/blob/master/examples/mixture_density_network.py)\n",
"in the Github repository."
"We demonstrate with an example in Edward. A webpage version is available at\n",
"http://edwardlib.org/tutorials/mixture-density-network."
]
},
{
@@ -136,7 +133,6 @@
"D = 1 # number of features\n",
"K = 20 # number of mixture components\n",
"\n",
"# DATA\n",
"X_train, X_test, y_train, y_test = build_toy_dataset(N)\n",
"print(\"Size of features in training data: {}\".format(X_train.shape))\n",
"print(\"Size of output in training data: {}\".format(y_train.shape))\n",
@@ -199,7 +195,6 @@
" return mus, sigmas, logits\n",
"\n",
"\n",
"# MODEL\n",
"mus, sigmas, logits = neural_network(X_ph)\n",
"cat = Categorical(logits=logits)\n",
"components = [Normal(mu=mu, sigma=sigma) for mu, sigma\n",
@@ -245,7 +240,6 @@
},
"outputs": [],
"source": [
"# INFERENCE\n",
"# There are no latent variables to infer. Thus inference is concerned\n",
"# with only training model parameters, which are baked into how we\n",
"# specify the neural networks.\n",
@@ -351,7 +345,6 @@
},
"outputs": [],
"source": [
"# CRITICISM\n",
"pred_weights, pred_means, pred_std = \\\n",
" sess.run([tf.nn.softmax(logits), mus, sigmas], feed_dict={X_ph: X_test})"
]
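
The mixture_density_network.ipynb pieces are spread across several hunks; assembled, they look roughly like the sketch below. The diff truncates the list comprehension, so the zip over unstacked per-component parameters and the final Mixture construction are assumptions in the spirit of the surrounding code.

import tensorflow as tf
from edward.models import Categorical, Mixture, Normal

# Network heads: per-component means, scales, and mixing logits.
mus, sigmas, logits = neural_network(X_ph)
cat = Categorical(logits=logits)
components = [Normal(mu=mu, sigma=sigma) for mu, sigma
              in zip(tf.unstack(tf.transpose(mus)),
                     tf.unstack(tf.transpose(sigmas)))]
y = Mixture(cat=cat, components=components)
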
11 changes: 2 additions & 9 deletions docs/notebooks/probabilistic_pca.ipynb
@@ -12,11 +12,8 @@
"used when there are missing values in the data or for multidimensional\n",
"scaling.\n",
"\n",
"We demonstrate with an example in Edward. A webpage version is available \n",
"[here](http://edwardlib.org/tutorials/probabilistic-pca),\n",
"or as a script available at\n",
"[`examples/probabilistic_pca.py`](https://github.com/blei-lab/edward/blob/master/examples/probabilistic_pca.py)\n",
"in the Github repository."
"We demonstrate with an example in Edward. A webpage version is available at\n",
"http://edwardlib.org/tutorials/probabilistic-pca."
]
},
{
@@ -89,8 +86,6 @@
"D = 2 # data dimensionality\n",
"K = 1 # latent dimensionality\n",
"\n",
"# DATA\n",
"\n",
"x_train = build_toy_dataset(N, D, K)"
]
},
@@ -175,8 +170,6 @@
},
"outputs": [],
"source": [
"# MODEL\n",
"\n",
"w = Normal(mu=tf.zeros([D, K]), sigma=2.0 * tf.ones([D, K]))\n",
"z = Normal(mu=tf.zeros([N, K]), sigma=tf.ones([N, K]))\n",
"x = Normal(mu=tf.matmul(w, z, transpose_b=True), sigma=tf.ones([D, N]))"
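
The probabilistic_pca.ipynb model above is complete as shown; to run it end to end one would pair it with a variational approximation, as in this sketch (the factorized normal family is my assumption, mirroring the other tutorials).

import edward as ed
import tensorflow as tf
from edward.models import Normal

# Fully factorized normal approximations to the principal axes and latents.
qw = Normal(mu=tf.Variable(tf.random_normal([D, K])),
            sigma=tf.nn.softplus(tf.Variable(tf.random_normal([D, K]))))
qz = Normal(mu=tf.Variable(tf.random_normal([N, K])),
            sigma=tf.nn.softplus(tf.Variable(tf.random_normal([N, K]))))

# Bind priors to approximations and condition on the observed data.
inference = ed.KLqp({w: qw, z: qz}, data={x: x_train})
inference.run(n_iter=500)
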
8 changes: 2 additions & 6 deletions docs/notebooks/supervised_classification.ipynb
@@ -10,11 +10,8 @@
"labeled data, comprised of training examples $\\{(x_n, y_n)\\}$.\n",
"Classification means the output $y$ takes discrete values.\n",
"\n",
"We demonstrate with an example in Edward. A webpage version is available \n",
"[here](http://edwardlib.org/tutorials/supervised-classification),\n",
"or as a script available at\n",
"[`examples/gp_classification.py`](https://github.com/blei-lab/edward/blob/master/examples/gp_classification.py)\n",
"in the Github repository."
"We demonstrate with an example in Edward. A webpage version is available at\n",
"http://edwardlib.org/tutorials/supervised-classification."
]
},
{
@@ -57,7 +54,6 @@
"source": [
"ed.set_seed(42)\n",
"\n",
"# DATA\n",
"df = np.loadtxt('data/crabs_train.txt', dtype='float32', delimiter=',')\n",
"df[df[:, 0] == -1, 0] = 0 # replace -1 label with 0 label\n",
"N = 25 # number of data points\n",
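
The supervised_classification.ipynb hunk touches only data loading. As a pointer to where the data feeds, here is a minimal Bayesian classifier in the same API: a plain logistic regression sketch, not the Gaussian process classifier that gp_classification.py actually implements, and D here is an assumed feature count.

import edward as ed
import tensorflow as tf
from edward.models import Bernoulli, Normal

D = 2  # number of features (assumed)

X = tf.placeholder(tf.float32, [N, D])
w = Normal(mu=tf.zeros(D), sigma=tf.ones(D))
b = Normal(mu=tf.zeros(1), sigma=tf.ones(1))
y = Bernoulli(logits=ed.dot(X, w) + b)
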
11 changes: 2 additions & 9 deletions docs/notebooks/supervised_regression.ipynb
@@ -12,11 +12,8 @@
"labeled data, comprised of training examples $\\{(x_n, y_n)\\}$.\n",
"Regression typically means the output $y$ takes continuous values.\n",
"\n",
"We demonstrate with an example in Edward. A webpage version is available \n",
"[here](http://edwardlib.org/tutorials/supervised-regression),\n",
"or as a script available at\n",
"[`examples/bayesian_linear_regression.py`](https://github.com/blei-lab/edward/blob/master/examples/bayesian_linear_regression.py)\n",
"in the Github repository."
"We demonstrate with an example in Edward. A webpage version is available at\n",
"http://edwardlib.org/tutorials/supervised-regression."
]
},
{
@@ -74,7 +71,6 @@
"N = 40 # number of data points\n",
"D = 10 # number of features\n",
"\n",
"# DATA\n",
"w_true = np.random.randn(D) * 0.5\n",
"X_train, y_train = build_toy_dataset(N, w_true)\n",
"X_test, y_test = build_toy_dataset(N, w_true)"
@@ -123,7 +119,6 @@
},
"outputs": [],
"source": [
"# MODEL\n",
"X = tf.placeholder(tf.float32, [N, D])\n",
"w = Normal(mu=tf.zeros(D), sigma=tf.ones(D))\n",
"b = Normal(mu=tf.zeros(1), sigma=tf.ones(1))\n",
@@ -157,7 +152,6 @@
},
"outputs": [],
"source": [
"# INFERENCE\n",
"qw = Normal(mu=tf.Variable(tf.random_normal([D])),\n",
" sigma=tf.nn.softplus(tf.Variable(tf.random_normal([D]))))\n",
"qb = Normal(mu=tf.Variable(tf.random_normal([1])),\n",
@@ -231,7 +225,6 @@
},
"outputs": [],
"source": [
"# CRITICISM\n",
"y_post = ed.copy(y, {w: qw, b: qb})\n",
"# This is equivalent to\n",
"# y_post = Normal(mu=ed.dot(X, qw) + qb, sigma=tf.ones(N))"
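
The supervised_regression.ipynb hunks (model, inference, criticism) chain together as below. ed.KLqp, ed.copy, and ed.evaluate are Edward's standard calls for this workflow; the sample and iteration counts are arbitrary choices.

import edward as ed

# Inference: fit the variational factors qw, qb to the posterior.
inference = ed.KLqp({w: qw, b: qb}, data={X: X_train, y: y_train})
inference.run(n_samples=5, n_iter=250)

# Criticism: form the posterior predictive and score it on held-out data.
y_post = ed.copy(y, {w: qw, b: qb})
print(ed.evaluate('mean_squared_error', data={X: X_test, y_post: y_test}))
print(ed.evaluate('mean_absolute_error', data={X: X_test, y_post: y_test}))
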
10 changes: 2 additions & 8 deletions docs/notebooks/unsupervised.ipynb
@@ -9,11 +9,8 @@
"In unsupervised learning, the task is to infer hidden structure from\n",
"unlabeled data, comprised of training examples $\\{x_n\\}$.\n",
"\n",
"We demonstrate with an example in Edward. A webpage version is available \n",
"[here](http://edwardlib.org/tutorials/unsupervised),\n",
"or as a script available at\n",
"[`examples/mixture_gaussian_collapsed.py`](https://github.com/blei-lab/edward/blob/master/examples/mixture_gaussian_collapsed.py)\n",
"in the Github repository."
"We demonstrate with an example in Edward. A webpage version is available at\n",
"http://edwardlib.org/tutorials/unsupervised."
]
},
{
@@ -77,7 +74,6 @@
"D = 2 # dimensionality of data\n",
"ed.set_seed(42)\n",
"\n",
"# DATA\n",
"x_train = build_toy_dataset(N)"
]
},
@@ -194,7 +190,6 @@
},
"outputs": [],
"source": [
"# MODEL\n",
"mu = Normal(mu=tf.zeros([K, D]), sigma=tf.ones([K, D]))\n",
"sigma = InverseGamma(alpha=tf.ones([K, D]), beta=tf.ones([K, D]))\n",
"cat = Categorical(logits=tf.zeros([N, K]))\n",
@@ -241,7 +236,6 @@
},
"outputs": [],
"source": [
"# INFERENCE\n",
"qmu = Normal(\n",
" mu=tf.Variable(tf.random_normal([K, D])),\n",
" sigma=tf.nn.softplus(tf.Variable(tf.zeros([K, D]))))\n",
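
The unsupervised.ipynb hunks stop partway through the variational family; a completion sketch under the same API follows. The inverse-gamma factor qsigma and the binding of the data to a likelihood node named x are assumptions, since neither appears in the rendered hunks.

import edward as ed
import tensorflow as tf
from edward.models import InverseGamma, Normal

qmu = Normal(
    mu=tf.Variable(tf.random_normal([K, D])),
    sigma=tf.nn.softplus(tf.Variable(tf.zeros([K, D]))))
# qsigma is assumed, mirroring the InverseGamma prior on sigma.
qsigma = InverseGamma(
    alpha=tf.nn.softplus(tf.Variable(tf.random_normal([K, D]))),
    beta=tf.nn.softplus(tf.Variable(tf.random_normal([K, D]))))

inference = ed.KLqp({mu: qmu, sigma: qsigma}, data={x: x_train})
inference.run(n_iter=4000)
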
4 changes: 1 addition & 3 deletions docs/tex/getting-started.tex
@@ -28,9 +28,7 @@ \subsubsection{Your first Edward program}
Here we will show a Bayesian neural network. It is a neural network
with a prior distribution on its weights.
(This example is abridged; an interactive version with Jupyter notebook is available
-\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/getting_started.ipynb}{here},
-or as a script available
-\href{https://github.com/blei-lab/edward/blob/master/examples/getting_started_example.py}{here}.)
+\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/getting_started.ipynb}{here}.)

First, simulate a toy dataset of 50 observations with a cosine relationship.

5 changes: 1 addition & 4 deletions docs/tex/tutorials/gan.tex
@@ -9,10 +9,7 @@ \subsection{Generative Adversarial Networks}

We demonstrate with an example in Edward.
An interactive version with Jupyter notebook is available
-\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/gan.ipynb}{here},
-or as a script available at
-\href{https://github.com/blei-lab/edward/blob/master/examples/gan.py}
-{\texttt{examples/gan.py}} in the Github repository.
+\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/gan.ipynb}{here}.

\begin{lstlisting}[language=Python]
M = 128 # batch size during training
5 changes: 1 addition & 4 deletions docs/tex/tutorials/latent-space-models.tex
@@ -15,10 +15,7 @@ \subsection{Latent Space Models for Neural Data}

We will analyze network data from neuroscience.
An interactive version with Jupyter notebook is available
-\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/latent_space_models.ipynb}{here},
-or as a script available at
-\href{https://github.com/blei-lab/edward/blob/master/examples/latent_space_model.py}
-{\texttt{examples/latent_space_model.py}} in the Github repository.
+\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/latent_space_models.ipynb}{here}.

\subsubsection{Data}

5 changes: 1 addition & 4 deletions docs/tex/tutorials/mixture-density-network.tex
@@ -8,10 +8,7 @@ \subsection{Mixture Density Networks}

We demonstrate with an example in Edward.
An interactive version with Jupyter notebook is available
-\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/mixture_density_network.ipynb}{here},
-or as a script available at
-\href{https://github.com/blei-lab/edward/blob/master/examples/mixture_density_network.py}
-{\texttt{examples/mixture_density_network.py}} in the Github repository.
+\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/mixture_density_network.ipynb}{here}.

\subsubsection{Data}

5 changes: 1 addition & 4 deletions docs/tex/tutorials/probabilistic-pca.tex
@@ -10,10 +10,7 @@ \subsection{Probabilistic PCA}

We demonstrate with an example in Edward.
An interactive version with Jupyter notebook is available
-\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/probabilistic_pca.ipynb}{here},
-or as a script available at
-\href{https://github.com/blei-lab/edward/blob/master/examples/probabilistic_pca.py}
-{\texttt{examples/probabilistic_pca.py}} in the Github repository.
+\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/probabilistic_pca.ipynb}{here}.

\subsubsection{Data}

5 changes: 1 addition & 4 deletions docs/tex/tutorials/supervised-classification.tex
@@ -8,10 +8,7 @@ \subsection{Supervised Learning (Classification)}

We demonstrate with an example in Edward.
An interactive version with Jupyter notebook is available
-\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/supervised_classification.ipynb}{here},
-or as a script available at
-\href{https://github.com/blei-lab/edward/blob/master/examples/gp_classification.py}
-{\texttt{examples/gp_classification.py}} in the Github repository.
+\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/supervised_classification.ipynb}{here}.

\subsubsection{Data}

13 changes: 5 additions & 8 deletions docs/tex/tutorials/supervised-regression.tex
@@ -8,10 +8,7 @@ \subsection{Supervised Learning (Regression)}

We demonstrate with an example in Edward.
An interactive version with Jupyter notebook is available
-\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/supervised_regression.ipynb}{here},
-or as a script available at
-\href{https://github.com/blei-lab/edward/blob/master/examples/bayesian_linear_regression.py}
-{\texttt{examples/bayesian_linear_regression.py}} in the Github repository.
+\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/supervised_regression.ipynb}{here}.

\subsubsection{Data}

@@ -120,10 +117,10 @@ \subsubsection{Criticism}
\end{lstlisting}

\begin{lstlisting}
-Mean squared error on test data:
-0.0300492
-Mean absolute error on test data:
-0.123616
+## Mean squared error on test data:
+## 0.0300492
+## Mean absolute error on test data:
+## 0.123616
\end{lstlisting}

The trained model makes predictions with low error
5 changes: 1 addition & 4 deletions docs/tex/tutorials/unsupervised.tex
@@ -7,10 +7,7 @@ \subsection{Unsupervised Learning}

We demonstrate with an example in Edward.
An interactive version with Jupyter notebook is available
-\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/unsupervised.ipynb}{here},
-or as a script available at
-\href{https://github.com/blei-lab/edward/blob/master/examples/mixture_gaussian_collapsed.py}
-{\texttt{examples/mixture_gaussian_collapsed.py}} in the Github repository.
+\href{http://nbviewer.jupyter.org/github/blei-lab/edward/blob/master/docs/notebooks/unsupervised.ipynb}{here}.

\subsubsection{Data}

