Releases: femtomc/Jaynes.jl

v0.1.32 Photogenic Protozoa

05 Dec 01:42

No new features. Refactored and simplified the ExecutionContext type hierarchy. Jaynes now works on Julia > 1.6.

v0.1.30 Photogenic Protozoa

26 Nov 17:17

Refactors the core Jaynes architecture by introducing pipelines, a new unifying concept that brings the overall goals of Jaynes into focus and should encourage the use of Jaynes as modular infrastructure for trying out probabilistic programming compiler techniques.

A pipeline is a compiler pipeline - it describes how code wrapped with the @jaynes macro:

  1. is transformed into a generative function at model compilation time,
  2. is type-checked and support-checked at model compilation time, and
  3. is transformed at generative function interface (GFI) execution time (e.g. there are hooks which allow customization of the method body of model programs before the GFI is executed).

In short, I've tried to make the entire Jaynes.jl toolchain modular - so that a new user can come in, implement their own compiler passes, and have the rest of the infrastructure just work (assuming the user implements everything correctly).
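To make the pass structure concrete, here is a minimal self-contained sketch of the idea - the Pipeline struct and run_passes helper below are illustrative stand-ins defined inline, not the actual Jaynes types:

# Illustrative sketch only: `Pipeline` is defined here for demonstration
# and is not the actual Jaynes type.
struct Pipeline
    compile_passes::Vector{Function}  # rewrite model IR at compilation time
    gfi_hooks::Vector{Function}       # adjust the method body before GFI execution
end

# A compiler pass is just a function from IR to IR; the identity pass
# stands in for a real transformation (e.g. a support-checking pass).
identity_pass(ir) = ir

# Run every compilation-time pass in order over the model IR.
run_passes(p::Pipeline, ir) = foldl((acc, pass) -> pass(acc), p.compile_passes; init = ir)

pl = Pipeline([identity_pass], Function[])

A user-defined pass slots in by being appended to compile_passes; the rest of the toolchain is unchanged.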

v0.1.25 Photogenic Protozoa

21 Sep 14:36

The microJaynes branch has been merged into master. Moving forward, the "core" relies on Gen for inference.

The full version (pre-0.1.25) is still available on the full-Jaynes branch.

v0.1.24 Howling Haddie

08 Sep 20:42

v0.1.21 Chronic Caterpillar

26 Aug 15:33

This release provides initial support for learning amortized proposal distributions with deep neural network components, via a set of backpropagation interfaces and a set of neural variational inference algorithms.
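The release notes don't show the interface itself; as a self-contained illustration of the underlying idea, the sketch below uses plain Flux (an assumption - this is not the Jaynes backpropagation interface). An amortized proposal is a network mapping an observation to the parameters of a Gaussian proposal, trained by stochastic gradients on a single-sample ELBO estimate for a toy Gaussian model:

using Flux

# Log-density of a Normal, written out so it differentiates cleanly.
lognorm(x, μ, σ) = -0.5 * ((x - μ) / σ)^2 - log(σ) - 0.5 * log(2π)

# The amortized proposal: observation ↦ (μ, log σ) of a Gaussian q(z | obs).
net = Chain(Dense(1 => 16, tanh), Dense(16 => 2))

function neg_elbo(m, obs)
    μ, logσ = m([obs])
    σ = exp(logσ)
    z = μ + σ * randn()                                  # reparameterized sample
    logq = lognorm(z, μ, σ)                              # proposal density q(z | obs)
    logp = lognorm(z, 0.0, 1.0) + lognorm(obs, z, 1.0)   # toy model joint p(z, obs)
    return logq - logp                                   # single-sample negative ELBO
end

opt = Flux.setup(Adam(1e-3), net)
for _ in 1:1000
    obs = randn()
    g = Flux.gradient(m -> neg_elbo(m, obs), net)[1]
    Flux.update!(opt, net, g)
end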

v0.1.12 Killer Koala

18 Aug 01:12

v0.1.11 Dangerous Dolphin

15 Aug 13:09

Merge of address map refactor.

Numerous bug fixes - one important one: gradients were not accumulating properly, which meant the ADVI class of algorithms was not working correctly.

v0.1.8 - Parabolic Protozoa

02 Aug 01:33

This release provides initial support for interfacing with Gen.jl and Soss.jl through foreign model interfaces.

v0.1.7 Mystic Monca

30 Jul 19:04

This release brings support for learnable parameters in specialized call sites.

v0.1.6 Yodeling Yellowjacket

29 Jul 14:45

Large learnable parameter refactor - learnable parameters now support arrays of all shapes and sizes.

Using parameters has changed slightly - they are now completely separated from traces and call sites. You have to pass them into contexts and manage them yourself, as sketched below. This cleanly separates a number of concerns, and I think it is a good design choice in the long run.
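A rough sketch of the new usage pattern - the Dict-based store and the commented simulate call below are illustrative assumptions, not necessarily the exact Jaynes interface:

# The caller owns the parameter store: parameters live outside traces
# and call sites, keyed by address, with arrays of any shape allowed.
params = Dict{Any, Any}(
    (:μ,)         => 0.0,          # scalar learnable
    (:layer1, :W) => randn(4, 4),  # array-shaped learnable
)

# Hypothetical execution call: the context receives the store explicitly.
# ret, cl = simulate(params, model, args...)

# After a gradient step, the caller updates the store and re-passes it.
params[(:μ,)] += 0.1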

Furthermore, this is the first official release which supports the new tuple indexing notation/selection specification for addresses.

To specify a selection, there is a universal interface:

sel = selection([(:x, ) => 5.0, (:x, :y, 1) => 10.0])

Here, commas in the tuple separate different levels of the call stack.

Indexing is done in a similar way:

# Some call site
cl[:x, :y, 10]

says: access the choice in call :x, within call :y, at address 10.

This change has also significantly cleaned up some of the weirdness associated with using Pairs in addressing.

Addresses can now be of any type (Any) as well. So go wild.
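Because addresses are untyped, key types can be mixed freely within a single selection (the values here are illustrative):

# Symbols, strings, and integers mixed at different call-stack levels.
sel = selection([("layer", 1, :W) => randn(4, 4), (:x, "obs", 10) => 5.0])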