From 37966569af6b81980702fd4d6b6e25a660167ad0 Mon Sep 17 00:00:00 2001
From: Matt McKay
Date: Tue, 19 Nov 2024 21:02:08 +1100
Subject: [PATCH] FIX: pdf build for keras lecture (#197)

* FIX: pdf build for keras lecture

* remove ipython checkpoints

* add to git ignore

* remove virtual documents
---
 .gitignore        | 2 ++
 lectures/keras.md | 9 ++++-----
 2 files changed, 6 insertions(+), 5 deletions(-)

diff --git a/.gitignore b/.gitignore
index 98795e75..434be6f0 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,3 +1,5 @@
 .DS_Store
 _build/
 lectures/_build/
+.ipython_checkpoints
+.virtual_documents
\ No newline at end of file
diff --git a/lectures/keras.md b/lectures/keras.md
index 932636c0..1702af98 100644
--- a/lectures/keras.md
+++ b/lectures/keras.md
@@ -211,13 +211,13 @@ Let's print the final MSE on the cross-validation data.
 
 ```{code-cell} ipython3
 print("Testing loss on the validation set.")
-regression_model.evaluate(x_validate, y_validate)
+regression_model.evaluate(x_validate, y_validate, verbose=2)
 ```
 
 Here's our output predictions on the cross-validation data.
 
 ```{code-cell} ipython3
-y_predict = regression_model.predict(x_validate)
+y_predict = regression_model.predict(x_validate, verbose=2)
 ```
 
 We use the following function to plot our predictions along with the data.
@@ -265,7 +265,7 @@ Here's the final MSE for the deep learning model.
 
 ```{code-cell} ipython3
 print("Testing loss on the validation set.")
-nn_model.evaluate(x_validate, y_validate)
+nn_model.evaluate(x_validate, y_validate, verbose=2)
 ```
 
 You will notice that this loss is much lower than the one we achieved with
@@ -274,7 +274,7 @@ linear regression, suggesting a better fit.
 To confirm this, let's look at the fitted function.
 
 ```{code-cell} ipython3
-y_predict = nn_model.predict(x_validate)
+y_predict = nn_model.predict(x_validate, verbose=2)
 ```
 
 ```{code-cell} ipython3
@@ -290,4 +290,3 @@ fig, ax = plt.subplots()
 plot_results(x_validate, y_validate, y_predict, ax)
 plt.show()
 ```
-