Related: JuliaDiff/ForwardDiff.jl#165
Maybe I should have posted there, but my question seemed a little off topic, so I figured I'd file an issue with the package hosting the diffrule API documentation: http://www.juliadiff.org/DiffBase.jl/latest/diffrule_api.html
If one wants to use the diffrules with ForwardDiff, one also has to run the appropriate variant of ForwardDiff's dual-definition generators:
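A sketch of what that looks like (assuming the DiffBase-era API, where `unary_dual_definition`/`binary_dual_definition` return method definitions to `eval`; the module/function names and the exact call form here are my reconstruction, not verbatim from the docs):

```julia
using DiffBase, ForwardDiff

# After registering a diffrule for a two-argument function MyMod.myf,
# generate and eval the corresponding Dual-number methods:
eval(ForwardDiff.binary_dual_definition(:MyMod, :myf))

# One-argument functions have a unary counterpart:
eval(ForwardDiff.unary_dual_definition(:MyMod, :myg))
```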
What about functions with a greater number of arguments?
I haven't looked into ReverseDiff; can it also make use of these diffrules?
Furthermore, what if one wanted to define a diffrule like:
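For instance, a rule for the two-argument `Distributions.cdf` (a sketch using the `@define_diffrule` syntax from the docs; the `:NaN` placeholder for the distribution argument is my assumption):

```julia
using DiffBase, Distributions

# d/dd is meaningless for a distribution argument; d/dx is the density.
DiffBase.@define_diffrule Distributions.cdf(d, x) = :NaN, :(pdf($d, $x))
```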
Then running ForwardDiff.binary_dual_definition adds a lot of methods to Distributions.cdf, but they all involve d::T where T <: Number, e.g. AbstractFloat, Irrational, Real, etc.
Besides defining one-argument functions for each distribution, like `cdf_of_beta_2_3(x)`, how could we define a derivative that ignores the first argument?
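To make that concrete: the wrapper fixes the distribution per function, while a hand-written Dual method sidesteps the diffrule machinery and differentiates only in x (a sketch against ForwardDiff's current tagged `Dual{T}` type, not necessarily the API as it stood when this was filed):

```julia
using Distributions, ForwardDiff
using ForwardDiff: Dual, value, partials

# The per-distribution wrapper mentioned above -- one per distribution:
cdf_of_beta_2_3(x) = cdf(Beta(2, 3), x)

# One possible alternative: a Dual method on the second argument only,
# using d/dx cdf(d, x) = pdf(d, x); the first argument stays untouched.
function Distributions.cdf(d::UnivariateDistribution, x::Dual{T}) where {T}
    x0 = value(x)
    return Dual{T}(cdf(d, x0), pdf(d, x0) * partials(x))
end
```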
Perhaps more related to that pull request (and the Tensors.jl package, https://github.com/KristofferC/Tensors.jl) -- what should I look into for trying to efficiently define higher-order derivatives?
Like, the Hessian of the quadratic form x' A x / 2 with respect to x (for symmetric A) is simply A. This Hessian is slow to compute via ForwardDiff*, but analytically requires no computation at all.
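To illustrate the gap (sizes and names here are just for the example):

```julia
using ForwardDiff, LinearAlgebra

A = Symmetric(rand(100, 100))
quadform(x) = dot(x, A * x) / 2

x = rand(100)
H = ForwardDiff.hessian(quadform, x)  # many chunked dual-number evaluations
H ≈ A                                 # true -- the answer was known up front
```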
To what extent can we define custom diffrules (especially considering how Hessians are calculated as Jacobians of gradients) to try to make something like that more efficient?
Ideally, I want to allow people to freely specify probability models, but offer highly efficient first and second derivatives (of the log density) for distributions that come up a lot, e.g. the multivariate normal.
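For the multivariate normal specifically, both derivatives are standard closed forms, so hand-written rules would cost almost nothing (a sketch; these helper names are hypothetical, not from any package):

```julia
using LinearAlgebra

# Hypothetical hand-written derivatives of the MvNormal log density,
# for mean μ and covariance Σ (standard closed-form results):
mvnormal_logpdf_grad(x, μ, Σ) = -(Σ \ (x - μ))  # gradient
mvnormal_logpdf_hess(x, μ, Σ) = -inv(Σ)         # Hessian, constant in x
```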
*I haven't tried ReverseDiff, but the inability to mutate array inputs being differentiated is a bothersome limitation. Given ReverseDiff's fast gradients, and the fact that I intend to repeatedly calculate Hessians, taking ForwardDiff Jacobians of ReverseDiff gradients may be a good idea.
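A sketch of that forward-over-reverse combination (assuming ReverseDiff's tracked types accept ForwardDiff's Dual elements, which is worth verifying; `f` is just a stand-in target):

```julia
using ForwardDiff, ReverseDiff

f(x) = sum(abs2, x) / 2               # stand-in for a log density
g(x) = ReverseDiff.gradient(f, x)     # fast reverse-mode gradient
hess(x) = ForwardDiff.jacobian(g, x)  # forward-mode Jacobian of the gradient

hess(rand(5))  # for this f, should be ≈ the 5x5 identity
```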