
Commit

misc
jstac committed Aug 12, 2024
1 parent 5d7f59c commit d5c625b
Showing 3 changed files with 21 additions and 12 deletions.
13 changes: 10 additions & 3 deletions lectures/autodiff.md
@@ -13,6 +13,12 @@ kernelspec:

# Adventures with Autodiff


```{include} _admonition/gpu.md
```

## Overview

This lecture gives a brief introduction to automatic differentiation using
Google JAX.

@@ -25,14 +31,15 @@ powerful implementations available.
Among the best of these are the automatic differentiation routines contained
in JAX.

While other software packages also offer this feature, the JAX version is
particularly powerful because it integrates so well with other core
components of JAX (e.g., JIT compilation and parallelization).

As we will see in later lectures, automatic differentiation can be used not only
for AI but also for many problems faced in mathematical modeling, such as
multi-dimensional nonlinear optimization and root-finding problems.



We need the following imports

```{code-cell} ipython3
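# NOTE: the original import cell is collapsed in this diff view; the
# following minimal set is an assumption, not the author's exact cell
import jax
import jax.numpy as jnp
import matplotlib.pyplot as plt
```

As a quick, self-contained illustration of the feature described above — a minimal sketch, not the lecture's own code — `jax.grad` turns a scalar-valued function into a function that computes its gradient, and the result composes with JIT compilation:

```{code-cell} ipython3
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sum(x**2)        # f(x) = x'x, so the gradient is 2x

grad_f = jax.grad(f)            # autodiff builds the gradient function
fast_grad = jax.jit(grad_f)     # gradients compose with jax.jit

x = jnp.array([1.0, 2.0, 3.0])
print(fast_grad(x))             # -> [2. 4. 6.]
```
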
10 changes: 8 additions & 2 deletions lectures/job_search.md
@@ -20,8 +20,12 @@ kernelspec:
In this lecture we study a basic infinite-horizon job search problem with Markov wage
draws.

```{note}
For background on infinite-horizon job search see, e.g., [DP1](https://dp.quantecon.org/).
```

The exercise at the end asks you to add risk-sensitive preferences and see how
the main results change.

In addition to what’s in Anaconda, this lecture will need the following libraries:

Expand Down Expand Up @@ -238,6 +242,8 @@ res_wage_index = min(stop_indices[0])
res_wage = w_vals[res_wage_index]
```
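
To see what this step is doing, here is a toy, self-contained version with synthetic numbers (the `_demo` names are hypothetical, not from the lecture): `stop_indices` holds the wage indices at which stopping is optimal, and the reservation wage is the lowest such wage.

```{code-cell} ipython3
import jax.numpy as jnp

# synthetic example: accepting becomes optimal from the third wage onward
w_vals_demo = jnp.linspace(10.0, 60.0, 6)
stopping_demo = jnp.array([False, False, True, True, True, True])

stop_indices_demo = jnp.where(stopping_demo)            # indices where stopping is optimal
res_wage_demo = w_vals_demo[min(stop_indices_demo[0])]  # lowest such wage
print(res_wage_demo)                                    # 30.0
```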

Here's a joint plot of the value function and the reservation wage.

```{code-cell} ipython3
fig, ax = plt.subplots()
ax.plot(w_vals, v_star, alpha=0.8, label="value function")
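# The rest of this cell is collapsed in the diff view; a hedged completion
# consistent with the text above (assumes res_wage from the previous cell)
ax.axvline(res_wage, ls="--", color="black", alpha=0.6,
           label="reservation wage")
ax.legend()
ax.set_xlabel("wage")
plt.show()
```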
10 changes: 3 additions & 7 deletions lectures/newtons_method.md
@@ -20,18 +20,14 @@

One of the key features of JAX is automatic differentiation.

We introduced this feature in {doc}`autodiff`.

In this lecture we apply automatic differentiation to the problem of computing economic equilibria via Newton's method.

Newton's method is a relatively simple root-finding and fixed-point algorithm, which we discussed
in [a more elementary QuantEcon lecture](https://python.quantecon.org/newton_method.html).

JAX is ideally suited to implementing Newton's method efficiently, even in high dimensions.
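
As a minimal, self-contained sketch (illustrative, not the lecture's implementation), each Newton step solves a linear system built from the Jacobian, which autodiff supplies via `jax.jacobian`:

```{code-cell} ipython3
import jax
import jax.numpy as jnp

def newton(f, x0, tol=1e-8, max_iter=50):
    # Newton iteration: x <- x - J(x)^{-1} f(x), Jacobian via autodiff
    jac = jax.jacobian(f)
    x = x0
    for _ in range(max_iter):
        step = jnp.linalg.solve(jac(x), f(x))
        x = x - step
        if jnp.max(jnp.abs(step)) < tol:
            break
    return x

# toy root-finding problem: x_i^3 = c_i, with known roots (1, 2, 3)
c = jnp.array([1.0, 8.0, 27.0])
print(newton(lambda x: x**3 - c, jnp.full(3, 1.5)))
```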

We use the following imports in this lecture

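The import cell itself is collapsed in the diff; a plausible minimal version (an assumption, not the author's exact code) is:

```{code-cell} ipython3
import jax
import jax.numpy as jnp
```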
