diff --git a/lectures/autodiff.md b/lectures/autodiff.md
index 37bef07..aa88920 100644
--- a/lectures/autodiff.md
+++ b/lectures/autodiff.md
@@ -13,6 +13,12 @@ kernelspec:
 
 # Adventures with Autodiff
 
+
+```{include} _admonition/gpu.md
+```
+
+## Overview
+
 This lecture gives a brief introduction to automatic differentiation using
 Google JAX.
 
@@ -25,14 +31,15 @@ powerful implementations available.
 
 One of the best of these is the automatic differentiation routines contained
 in JAX.
 
+While other software packages also offer this feature, the JAX version is
+particularly powerful because it integrates so well with other core
+components of JAX (e.g., JIT compilation and parallelization).
+
 As we will see in later lectures, automatic differentiation can be used not only
 for AI but also for many problems faced in mathematical modeling, such as
 multi-dimensional nonlinear optimization and root-finding problems.
 
-```{include} _admonition/gpu.md
-```
-
 We need the following imports
 
 ```{code-cell} ipython3
diff --git a/lectures/job_search.md b/lectures/job_search.md
index d80c1db..33a05df 100644
--- a/lectures/job_search.md
+++ b/lectures/job_search.md
@@ -20,8 +20,12 @@ kernelspec:
 In this lecture we study a basic infinite-horizon job search problem with Markov wage
 draws
 
-The exercise at the end asks you to add recursive preferences and compare
-the result.
+```{note}
+For background on infinite horizon job search see, e.g., [DP1](https://dp.quantecon.org/).
+```
+
+The exercise at the end asks you to add risk-sensitive preferences and see how
+the main results change.
 
 In addition to what’s in Anaconda, this lecture will need the following libraries:
 
@@ -238,6 +242,8 @@ res_wage_index = min(stop_indices[0])
 res_wage = w_vals[res_wage_index]
 ```
 
+Here's a joint plot of the value function and the reservation wage.
+
 ```{code-cell} ipython3
 fig, ax = plt.subplots()
 ax.plot(w_vals, v_star, alpha=0.8, label="value function")
diff --git a/lectures/newtons_method.md b/lectures/newtons_method.md
index 0c0d5f0..4da4fe5 100644
--- a/lectures/newtons_method.md
+++ b/lectures/newtons_method.md
@@ -20,18 +20,14 @@ kernelspec:
 
 One of the key features of JAX is automatic differentiation.
 
-While other software packages also offer this feature, the JAX version is
-particularly powerful because it integrates so closely with other core
-components of JAX, such as accelerated linear algebra, JIT compilation and
-parallelization.
+We introduced this feature in {doc}`autodiff`.
 
-The application of automatic differentiation we consider is computing economic equilibria via Newton's method.
+In this lecture we apply automatic differentiation to the problem of computing economic equilibria via Newton's method.
 
 Newton's method is a relatively simple root and fixed point solution algorithm, which we discussed
 in [a more elementary QuantEcon lecture](https://python.quantecon.org/newton_method.html).
 
-JAX is almost ideally suited to implementing Newton's method efficiently, even
-in high dimensions.
+JAX is ideally suited to implementing Newton's method efficiently, even in high dimensions.
 
 We use the following imports in this lecture