Problems with grad after grad #120

Open
gforchini opened this issue Jul 9, 2020 · 2 comments

Comments

@gforchini

Hi,
I am trying to implement a WGAN and I am having problems applying grad to a function to which grad has already been applied (although with respect to a different variable). An example of the problem and the error message are below.
Giovanni

using Knet, StatsBase, Random, LinearAlgebra, AutoGrad

function D(w, x)
    x = elu.(w[1]*x .+ w[2])
    return sigm.(w[3]*x .+ w[4])[1]
end

w = Array{Float64}[ randn(12, 12), randn(12, 1),
                    randn(1, 12),  randn(1, 1) ]

∇D = grad(D)
x = randn(12, 1)

D(w, x)
typeof(D(w, x))

y = ∇D(w, x)

typeof(∇D(w, x))
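
For reference, the same first-order gradient can also be written with AutoGrad's tape interface (Param, @diff, grad). The sketch below is only illustrative (wp and tape are just local names I chose) and should be equivalent to calling ∇D(w, x) above:

# Sketch: tape-based equivalent of ∇D(w, x)
wp   = Param(w)          # mark w as the differentiation target
tape = @diff D(wp, x)    # record the computation of D(w, x)
value(tape)              # the scalar value D(w, x)
grad(tape, wp)           # gradient of D with respect to w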

No problem up to now: the function D is differentiable in both w and x.
Now I construct a penalty as a function of the gradient of D with respect to x:

function penalty(w, x)
    g = grad(x -> D(w, x))
    return (norm(g(x)) - 1)^2
end

penalty(w, x)
typeof(penalty(w, x))
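
For context, this is the usual WGAN-GP style gradient penalty, (‖∇ₓD(x)‖ − 1)², written with a nested grad call. The inner gradient could equally be taken with the tape interface; the sketch below (penalty2, xp, t are hypothetical names) computes the same value, though I have not verified whether it avoids the error shown below:

# Hypothetical rewrite of penalty using the tape interface for the inner
# gradient; only x is wrapped in a Param, so @diff differentiates w.r.t. x.
function penalty2(w, x)
    xp = Param(x)
    t  = @diff D(w, xp)
    return (norm(grad(t, xp)) - 1)^2
end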

Taking the gradient of penalty with respect to w gives an error:

∇penalty = grad(penalty)

∇penalty(w, x)

MethodError: Cannot convert an object of type
AutoGrad.Result{Array{Float64,2}} to an object of type
Array{Float64,N} where N
Closest candidates are:
convert(::Type{T}, !Matched::AbstractArray) where T<:Array at array.jl:533
convert(::Type{T}, !Matched::T) where T<:AbstractArray at abstractarray.jl:14
convert(::Type{T}, !Matched::Factorization) where T<:AbstractArray at /Users/julia/buildbot/worker/package_macos64/build/usr/share/julia/stdlib/v1.4/LinearAlgebra/src/factorization.jl:55
...
differentiate(::Function, ::Param{Array{Array{Float64,N} where N,1}}, ::Vararg{Any,N} where N; o::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at core.jl:148
differentiate(::Function, ::Param{Array{Array{Float64,N} where N,1}}, ::Vararg{Any,N} where N) at core.jl:135
(::AutoGrad.var"#gradfun#7"{AutoGrad.var"#gradfun#6#8"{typeof(penalty),Int64,Bool}})(::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N; kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at core.jl:225
(::AutoGrad.var"#gradfun#7"{AutoGrad.var"#gradfun#6#8"{typeof(penalty),Int64,Bool}})(::Array{Array{Float64,N} where N,1}, ::Vararg{Any,N} where N) at core.jl:221
top-level scope at Problems.jl:38

Notice that w is a vector of arrays, which works in other situations but which I suspect is what is causing the problem here. In other examples, where w is a plain vector, the second gradient works.
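
A minimal sketch of that flat-vector case (a made-up toy example, not the model above) might look like the following, with w a plain Vector{Float64} and the same grad-of-grad pattern applied:

# Illustrative only: w is a flat Vector{Float64} instead of a vector of arrays.
f(w, x)   = sum(elu.(w .* x))
pen(w, x) = (norm(grad(x -> f(w, x))(x)) - 1)^2
∇pen      = grad(pen)
∇pen(randn(3), randn(3))   # the second gradient is reported to work in this setting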

@denizyuret
Owner

denizyuret commented Jul 10, 2020 via email

@gforchini
Author

Thank you for getting back to me. Unfortunately, the problem I am looking at is almost (but not quite) a WGAN, and it doesn't quite fit the implementation in the link.
Giovanni
