and sometimes
-example.com (but not on GitHub, for example).
-
-Some text to show that the reference links can follow later.
-
-[arbitrary case-insensitive reference text]: https://www.mozilla.org
-[1]: http://slashdot.org
-[link text itself]: http://www.reddit.com
-
-Here's our logo (hover to see the title text):
-
-Inline-style:
-![alt text](https://github.com/adam-p/markdown-here/raw/master/src/common/images/icon48.png "Logo Title Text 1")
-
-Reference-style:
-![alt text][logo]
-
-[logo]: https://github.com/adam-p/markdown-here/raw/master/src/common/images/icon48.png "Logo Title Text 2"
-
-Inline `code` has `back-ticks around` it.
-
-```javascript
-var s = "JavaScript syntax highlighting";
-alert(s);
-```
-
-```python
-s = "Python syntax highlighting"
-print(s)
-```
-
-```
-No language indicated, so no syntax highlighting.
-But let's throw in a <b>tag</b>.
-```
-
-Colons can be used to align columns.
-
-| Tables | Are | Cool |
-| ------------- |:-------------:| -----:|
-| col 3 is | right-aligned | $1600 |
-| col 2 is | centered | $12 |
-| zebra stripes | are neat | $1 |
-
-There must be at least 3 dashes separating each header cell.
-The outer pipes (|) are optional, and you don't need to make the
-raw Markdown line up prettily. You can also use inline Markdown.
-
-Markdown | Less | Pretty
---- | --- | ---
-*Still* | `renders` | **nicely**
-1 | 2 | 3
-
-> Blockquotes are very handy in email to emulate reply text.
-> This line is part of the same quote.
-
-Quote break.
-
-> This is a very long line that will still be quoted properly when it wraps. Oh boy let's keep writing to make sure this is long enough to actually wrap for everyone. Oh, you can *put* **Markdown** into a blockquote.
-
-
-Here's a line for us to start with.
-
-This line is separated from the one above by two newlines, so it will be a *separate paragraph*.
-
-This line is also a separate paragraph, but...
-This line is only separated by a single newline, so it's a separate line in the *same paragraph*.
diff --git a/_posts/2023-11-09-adaptive-controller-graph-eom.md b/_posts/2023-11-09-adaptive-controller-graph-eom.md
index 54008317..c7c7c01b 100644
--- a/_posts/2023-11-09-adaptive-controller-graph-eom.md
+++ b/_posts/2023-11-09-adaptive-controller-graph-eom.md
@@ -103,7 +103,7 @@ We simulated the arm moving from one random configuration to another—marked in
### Attempt 1: Graph Neural Net
Inspired by Bhatoo, we restructure the dataset as a graph dataset using the PyTorch Geometric library. Each node contains the 10 physical property parameters plus the joint angle, angular velocity, and torque input, for 13 features per node in total. The output is the angular acceleration of the 7 joints (a 1x7 vector). As for the edge index, the graph is directed: information flows either from the last node to the first or from the first node to the last. This mirrors the physical intuition that forces propagate sequentially from one body to the next, and that each link's motion with respect to the global coordinate frame likewise depends sequentially on the preceding body link.
-{% include figure.html path="assets/img/2023-11-09-adaptive-controller-graph-eom/node.jpg" class="img-fluid" %}
+{% include figure.html path="assets/img/2023-11-09-adaptive-controller-graph-eom/nodes.jpg" class="img-fluid" %}
We applied nine iterations of the Graph Convolution Layer, ensuring information flow from one end of the arm to the other.
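The node and edge construction described above can be sketched as follows. This is a minimal illustration in plain NumPy rather than the post's actual PyTorch Geometric code; the feature values, hidden size, and weight initialization are placeholders, and the nine propagation steps simply mirror the nine graph-convolution iterations mentioned above.

```python
import numpy as np

NUM_JOINTS = 7     # one node per joint
NUM_FEATURES = 13  # 10 physical parameters + angle + angular velocity + torque

# Directed chain adjacency: information flows from joint i to joint i+1,
# plus self-loops so each node keeps its own features.
A = np.zeros((NUM_JOINTS, NUM_JOINTS))
for i in range(NUM_JOINTS - 1):
    A[i + 1, i] = 1.0
A += np.eye(NUM_JOINTS)

rng = np.random.default_rng(0)
H = rng.normal(size=(NUM_JOINTS, NUM_FEATURES))  # placeholder node features

HIDDEN = 32  # hypothetical hidden width
W_in = rng.normal(size=(NUM_FEATURES, HIDDEN)) * 0.1
W_h = rng.normal(size=(HIDDEN, HIDDEN)) * 0.1

# Nine propagation steps: enough for information to traverse the 7-node chain.
h = np.maximum(A @ H @ W_in, 0.0)  # first graph-conv layer + ReLU
for _ in range(8):                 # eight more, nine in total
    h = np.maximum(A @ h @ W_h, 0.0)

# Per-node readout: one predicted angular acceleration per joint (1x7 overall).
w_out = rng.normal(size=(HIDDEN,))
accel = h @ w_out  # shape (7,)
```

With a directed chain and self-loops, each convolution moves information one link down the arm, which is why several stacked layers are needed before the last joint "sees" the first.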
diff --git a/_posts/2023-11-09-uncertainty.md b/_posts/2023-11-09-uncertainty.md
index ac6329e1..9cc46e98 100644
--- a/_posts/2023-11-09-uncertainty.md
+++ b/_posts/2023-11-09-uncertainty.md
@@ -99,10 +99,10 @@ We train a model with SWAG on the MNIST and CIFAR10 datasets. First, we only tra
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/violin_mnist_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/violin_mnist_swag.png" class="img-fluid rounded z-depth-1" %}
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/violin_cifar_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/violin_cifar_swag.png" class="img-fluid rounded z-depth-1" %}
@@ -110,16 +110,16 @@ We can also take a look at the data itself and identify the images which have th
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_hard_id_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_hard_id_swag.png" class="img-fluid rounded z-depth-1" %}
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_easy_id_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_easy_id_swag.png" class="img-fluid rounded z-depth-1" %}
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_hard_id_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_hard_id_swag.png" class="img-fluid rounded z-depth-1" %}
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_easy_id_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_easy_id_swag.png" class="img-fluid rounded z-depth-1" %}
The above pictures correspond to the highest and lowest scores from in-distribution training data. The major contributors to the high scores for MNIST are digits that are so poorly written it is hard to tell what they are, or that resemble another digit too closely. For CIFAR, the high-score images seem to cause confusion through their color scheme or background: many images with a blue or sky background, such as those of birds, do seem to be mistaken for planes at times. The low-score images, on the other hand, are all extremely similar to one another; they are very well-written digits (usually 0) or something that is obviously a car (usually red).
@@ -127,16 +127,16 @@ The above pictures correspond to the highest and lowest scores from in-distribut
Next, we take a look at how these scores fare on new out-of-distribution images.
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_hard_ood_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_hard_ood_swag.png" class="img-fluid rounded z-depth-1" %}
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_easy_ood_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_easy_ood_swag.png" class="img-fluid rounded z-depth-1" %}
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_hard_ood_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_hard_ood_swag.png" class="img-fluid rounded z-depth-1" %}
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_easy_ood_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_easy_ood_swag.png" class="img-fluid rounded z-depth-1" %}
@@ -148,11 +148,11 @@ Now that we've seen that we can use our measure of uncertainty as how well the o
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_correlation_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_correlation_swag.png" class="img-fluid rounded z-depth-1" %}
Spearman Correlation: -.9923
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_correlation_swag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_correlation_swag.png" class="img-fluid rounded z-depth-1" %}
Spearman Correlation: -.3867
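The Spearman correlations quoted here measure how monotonically the uncertainty score tracks accuracy (a strong negative value means higher score reliably implies lower accuracy). They can be computed with `scipy.stats.spearmanr`; the score and accuracy values below are illustrative placeholders, not the post's data.

```python
import numpy as np
from scipy.stats import spearmanr

# Illustrative bins of increasing uncertainty score and the classification
# accuracy observed in each bin (placeholder numbers).
scores = np.array([0.05, 0.12, 0.21, 0.33, 0.48, 0.61, 0.79, 0.90])
accuracy = np.array([0.99, 0.98, 0.97, 0.95, 0.92, 0.88, 0.81, 0.70])

rho, pvalue = spearmanr(scores, accuracy)
print(round(rho, 4))  # perfectly monotone-decreasing data gives -1.0
```

Because Spearman ranks the data before correlating, a score that is any monotone function of accuracy yields rho near -1, which is why it suits uncertainty calibration better than Pearson correlation.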
@@ -167,11 +167,11 @@ This model is very simple and our weight "perturbations" are not too mathematical
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_correlation_mc.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_correlation_mc.png" class="img-fluid rounded z-depth-1" %}
Spearman Correlation: -.9944
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_correlation_mc.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_correlation_mc.png" class="img-fluid rounded z-depth-1" %}
Spearman Correlation: -.2936
@@ -239,11 +239,11 @@ We do this by introducing a wrapper model that takes in a base model as well as
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_scwag_correlations.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_scwag_correlations.png" class="img-fluid rounded z-depth-1" %}
Spearman Correlation: -.9897
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_scwag_correlations.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_scwag_correlations.png" class="img-fluid rounded z-depth-1" %}
Spearman Correlation: -.8484
@@ -252,16 +252,16 @@ With MNIST, we already had near perfect correlation so this slight decrease isn'
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_hard_scwag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_hard_scwag.png" class="img-fluid rounded z-depth-1" %}
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_easy_scwag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/mnist_easy_scwag.png" class="img-fluid rounded z-depth-1" %}
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_hard_scwag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_hard_scwag.png" class="img-fluid rounded z-depth-1" %}
- {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_easy_scwag.jpg" class="img-fluid rounded z-depth-1" %}
+ {% include figure.html path="assets/img/2023-12-12-uncertainty-detection-project/cifar_easy_scwag.png" class="img-fluid rounded z-depth-1" %}
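The excerpt cuts off the wrapper model's full signature, so the following is only a hedged sketch of the general pattern it describes: a wrapper that holds a base model, perturbs its weights, and turns prediction disagreement into an uncertainty score. All names here (`UncertaintyWrapper`, `predict_fn`, `noise_scale`) are hypothetical, not the post's actual API.

```python
import numpy as np

class UncertaintyWrapper:
    """Sketch: wraps a base predictor and estimates uncertainty by averaging
    predictions under small random weight perturbations."""

    def __init__(self, base_weights, predict_fn, noise_scale=0.01,
                 n_samples=10, seed=0):
        self.base_weights = base_weights  # dict of weight arrays
        self.predict_fn = predict_fn      # predict_fn(weights, x) -> probabilities
        self.noise_scale = noise_scale
        self.n_samples = n_samples
        self.rng = np.random.default_rng(seed)

    def __call__(self, x):
        preds = []
        for _ in range(self.n_samples):
            noisy = {k: w + self.rng.normal(0.0, self.noise_scale, w.shape)
                     for k, w in self.base_weights.items()}
            preds.append(self.predict_fn(noisy, x))
        preds = np.stack(preds)                 # (n_samples, batch, classes)
        mean = preds.mean(axis=0)               # averaged prediction
        score = preds.var(axis=0).sum(axis=-1)  # disagreement as uncertainty
        return mean, score

def linear_softmax(weights, x):
    # Toy base model standing in for the real network.
    logits = x @ weights["W"] + weights["b"]
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

weights = {"W": np.zeros((4, 3)), "b": np.zeros(3)}
model = UncertaintyWrapper(weights, linear_softmax)
mean, score = model(np.ones((5, 4)))
print(mean.shape, score.shape)  # (5, 3) (5,)
```

The design point is that the wrapper needs nothing from the base model beyond a forward pass over a given set of weights, so any trained network can be plugged in unchanged.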