From 085239c44c9ca7ee18986c7e967e822800449571 Mon Sep 17 00:00:00 2001
From: mmcky
Date: Mon, 8 May 2023 21:32:55 +1000
Subject: [PATCH 1/2] Remove backend check, add to status page, add GPU admonition

---
 lectures/aiyagari_jax.md       | 21 +++++++++------------
 lectures/arellano.md           | 10 +++++++++-
 lectures/inventory_dynamics.md | 10 ++++++++++
 lectures/jax_intro.md          | 12 +++++++-----
 lectures/kesten_processes.md   | 21 ++++++++++-----------
 lectures/newtons_method.md     | 20 ++++++++++----------
 lectures/opt_invest.md         | 26 +++++++++++++------------
 lectures/opt_savings.md        | 10 ++++++++++
 lectures/short_path.md         | 22 +++++++++------------
 lectures/status.md             | 10 +++++++++-
 lectures/wealth_dynamics.md    | 14 +++++++++++++-
 11 files changed, 109 insertions(+), 67 deletions(-)

diff --git a/lectures/aiyagari_jax.md b/lectures/aiyagari_jax.md
index ce70d5e3..e180e65f 100644
--- a/lectures/aiyagari_jax.md
+++ b/lectures/aiyagari_jax.md
@@ -13,6 +13,15 @@ kernelspec:
 
 # The Aiyagari Model
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, using JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only, you can use `pip install jax[cpu]`.
+```
 
 ## Overview
 
@@ -55,18 +64,6 @@ import jax
 import jax.numpy as jnp
 ```
 
-
-Let’s check the backend used by JAX and the devices available.
-
-```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
-```
-
-
 We will use 64 bit floats with JAX in order to increase the precision.
 
 ```{code-cell} ipython3

diff --git a/lectures/arellano.md b/lectures/arellano.md
index 06cb868d..cf06bbf2 100644
--- a/lectures/arellano.md
+++ b/lectures/arellano.md
@@ -13,7 +13,15 @@ kernelspec:
 
 # Default Risk and Income Fluctuations
 
-+++
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, using JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only, you can use `pip install jax[cpu]`.
+```
 
 In addition to what's in Anaconda, this lecture will need the following libraries:
 
diff --git a/lectures/inventory_dynamics.md b/lectures/inventory_dynamics.md
index fa100b68..213b2e21 100644
--- a/lectures/inventory_dynamics.md
+++ b/lectures/inventory_dynamics.md
@@ -23,6 +23,16 @@ kernelspec:
 
 # Inventory Dynamics
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, using JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only, you can use `pip install jax[cpu]`.
+```
+
 ```{index} single: Markov process, inventory
 ```
 
diff --git a/lectures/jax_intro.md b/lectures/jax_intro.md
index 02ce0b1e..0ea57cbc 100644
--- a/lectures/jax_intro.md
+++ b/lectures/jax_intro.md
@@ -14,12 +14,14 @@ kernelspec:
 
 # JAX
 
-```{note}
-This lecture is built using [hardware](status:machine-details) that
-has access to a GPU. This means that
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, using JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.
 
-1. the lecture might be significantly slower when running on your machine, and
-2. the code is well-suited to execution with [Google colab](https://colab.research.google.com/github/QuantEcon/lecture-python-programming.notebooks/blob/master/jax_intro.ipynb)
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only, you can use `pip install jax[cpu]`.
 ```
 
 This lecture provides a short introduction to [Google JAX](https://github.com/google/jax).
 
diff --git a/lectures/kesten_processes.md b/lectures/kesten_processes.md
index cfee2830..f95b1912 100644
--- a/lectures/kesten_processes.md
+++ b/lectures/kesten_processes.md
@@ -29,6 +29,16 @@ kernelspec:
 :depth: 2
 ```
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, using JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only, you can use `pip install jax[cpu]`.
+```
+
 In addition to what's in Anaconda, this lecture will need the following libraries:
 
 ```{code-cell} ipython3
@@ -64,17 +74,6 @@ from jax import random
 ```
 
-Let’s check the backend used by JAX and the devices available
-
-```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
-```
-
-
 ## Kesten processes
 
 ```{index} single: Kesten processes; heavy tails
 ```
 
diff --git a/lectures/newtons_method.md b/lectures/newtons_method.md
index 41a52a22..0a38324f 100644
--- a/lectures/newtons_method.md
+++ b/lectures/newtons_method.md
@@ -14,6 +14,16 @@ kernelspec:
 
 # Newton’s Method via JAX
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, using JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only, you can use `pip install jax[cpu]`.
+```
+
 ## Overview
 
 Continuing from the [Newton's Method lecture](https://python.quantecon.org/newton_method.html), we are going to solve the multidimensional problem with `JAX`.
@@ -28,16 +38,6 @@ import jax.numpy as jnp
 from scipy.optimize import root
 ```
 
-Let’s check the backend used by JAX and the devices available.
-
-```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
-```
-
 ## The Two Goods Market Equilibrium
 
 Let's have a quick recap of this problem -- a more detailed explanation and derivation can be found at [A Two Goods Market Equilibrium](https://python.quantecon.org/newton_method.html#two-goods-market).
 
diff --git a/lectures/opt_invest.md b/lectures/opt_invest.md
index f645d952..5d87af87 100644
--- a/lectures/opt_invest.md
+++ b/lectures/opt_invest.md
@@ -14,6 +14,16 @@ kernelspec:
 
 # Optimal Investment
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, using JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only, you can use `pip install jax[cpu]`.
+```
+
 We require the following library to be installed.
 
 ```{code-cell} ipython3
@@ -26,7 +36,9 @@ We require the following library to be installed.
 
 A monopolist faces inverse demand curve
 
-$$ P_t = a_0 - a_1 Y_t + Z_t, $$
+$$
+P_t = a_0 - a_1 Y_t + Z_t,
+$$
 
 where
 
@@ -61,18 +73,6 @@ import jax.numpy as jnp
 import matplotlib.pyplot as plt
 ```
 
-
-Let’s check the backend used by JAX and the devices available
-
-```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
-```
-
-
 We will use 64 bit floats with JAX in order to increase the precision.
 
 ```{code-cell} ipython3

diff --git a/lectures/opt_savings.md b/lectures/opt_savings.md
index 61ed65af..694b1751 100644
--- a/lectures/opt_savings.md
+++ b/lectures/opt_savings.md
@@ -13,6 +13,16 @@ kernelspec:
 
 # Optimal Savings
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, using JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only, you can use `pip install jax[cpu]`.
+```
+
 In addition to what’s in Anaconda, this lecture will need the following libraries:
 
 ```{code-cell} ipython3
diff --git a/lectures/short_path.md b/lectures/short_path.md
index 8aa75eba..53b3c776 100644
--- a/lectures/short_path.md
+++ b/lectures/short_path.md
@@ -15,6 +15,15 @@ kernelspec:
 
 # Shortest Paths
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, using JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only, you can use `pip install jax[cpu]`.
+```
 
 ## Overview
 
@@ -28,19 +37,6 @@ import jax.numpy as jnp
 import jax
 ```
 
-
-
-Let’s check the backend used by JAX and the devices available.
-
-```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
-```
-
 ## Solving for Minimum Cost-to-Go
 
 Let $J(v)$ denote the minimum cost-to-go from node $v$,
 
diff --git a/lectures/status.md b/lectures/status.md
index 8309f510..29fd71ca 100644
--- a/lectures/status.md
+++ b/lectures/status.md
@@ -20,4 +20,12 @@ This table contains the latest execution statistics.
 These lectures are built on `linux` instances through `github actions` and `amazon web services (aws)` to enable access to a `gpu`.
 
 These lectures are built on a [p3.2xlarge](https://aws.amazon.com/ec2/instance-types/p3/)
-that has access to `8 vcpu's`, a `V100 NVIDIA Tesla GPU`, and `61 Gb` of memory.
\ No newline at end of file
+that has access to `8 vCPUs`, an `NVIDIA Tesla V100 GPU`, and `61 GB` of memory.
+
+You can check which backend JAX is using with:
+
+```{code-cell} ipython3
+import jax
+# Check if JAX is using GPU
+print(f"JAX backend: {jax.devices()[0].platform}")
+```
\ No newline at end of file
diff --git a/lectures/wealth_dynamics.md b/lectures/wealth_dynamics.md
index 18c0689c..0ec5631d 100644
--- a/lectures/wealth_dynamics.md
+++ b/lectures/wealth_dynamics.md
@@ -14,7 +14,19 @@ kernelspec:
 
 # Wealth Distribution Dynamics
 
-This lecture is the extended JAX implementation of [this lecture](https://python.quantecon.org/wealth_dynamics.html). Please refer that lecture for all background and notation.
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, using JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only, you can use `pip install jax[cpu]`.
+```
+
+This lecture is the extended JAX implementation of [this lecture](https://python.quantecon.org/wealth_dynamics.html).
+
+Please refer to that lecture for all background and notation.
 
 We will use the following imports.
 

From ea5bad741a42469a52321be9d7e93d81585fb723 Mon Sep 17 00:00:00 2001
From: mmcky
Date: Mon, 8 May 2023 21:33:49 +1000
Subject: [PATCH 2/2] add nvidia-smi to view hardware

---
 lectures/wealth_dynamics.md | 8 ++------
 1 file changed, 2 insertions(+), 6 deletions(-)

diff --git a/lectures/wealth_dynamics.md b/lectures/wealth_dynamics.md
index 0ec5631d..5d260687 100644
--- a/lectures/wealth_dynamics.md
+++ b/lectures/wealth_dynamics.md
@@ -40,14 +40,10 @@ from collections import namedtuple
 ```
 
-Let's check the backend used by JAX and the devices available
+Let's check the hardware we are running on:
 
 ```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
+!nvidia-smi
 ```