Slow lenses for nested named tuples #151
Thanks for reporting. It is interesting indeed. I never used such deep compositions. Here are a couple of thoughts:
julia> l1 = (@lens(_.a) ∘ @lens(_.b)) ∘ @lens(_.c)
(@lens _.a.b.c)
julia> l2 = @lens(_.a) ∘ (@lens(_.b) ∘ @lens(_.c))
(@lens _.a.b.c)
julia> l1 === l2
false
julia> typeof(l1)
Setfield.ComposedLens{Setfield.ComposedLens{Setfield.PropertyLens{:a},Setfield.PropertyLens{:b}},Setfield.PropertyLens{:c}}
julia> typeof(l2)
Setfield.ComposedLens{Setfield.PropertyLens{:a},Setfield.ComposedLens{Setfield.PropertyLens{:b},Setfield.PropertyLens{:c}}}

The semantics of a composed lens are independent of the composition order, but performance is not. I expect that in a practical use case you won't build such a huge lens in one step (@lens _.a ... z) but combine it out of smaller lenses: (@lens _.a....m) ∘ (@lens _.n... z). Is this issue inspired by an existing or planned use case? I would be happy to learn about it. It might be worth benchmarking with a more practical composition order.
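The point can be illustrated with a self-contained toy model of lens composition. PLens, CLens, and lget here are hypothetical stand-ins for Setfield's PropertyLens, ComposedLens, and get, assuming the outer/inner structure shown in the types above:

```julia
# Toy stand-ins for Setfield's lens types: apply `outer` first, then `inner`.
struct PLens{name} end
struct CLens{O,I}
    outer::O
    inner::I
end
lget(obj, ::PLens{name}) where {name} = getproperty(obj, name)
lget(obj, l::CLens) = lget(lget(obj, l.outer), l.inner)

a, b, c = PLens{:a}(), PLens{:b}(), PLens{:c}()
left  = CLens(CLens(a, b), c)   # ((a ∘ b) ∘ c)
right = CLens(a, CLens(b, c))   # (a ∘ (b ∘ c))

x = (a = (b = (c = 1,),),)
lget(x, left) == lget(x, right)  # same answer, despite distinct wrapper types
```

Both associations return the same value, but `typeof(left) !== typeof(right)`, so the compiler sees two different recursion shapes.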
Thanks for the response. I'll give some more context on the problem.

A sample in Soss.jl is a named tuple. You can have models within models, or some parameters could themselves be structs or named tuples, so in large models I'd expect this much nesting to come up. That's for a single sample, but we usually want many samples, and storing those naively gets inefficient, so I was looking into StructArrays. As it turns out, these already allow nesting, but that's not well documented, so I started trying to implement it. My work-in-progress PR is here:

In this case I'm not too worried about whether matching performance is possible. All the information is in the type, so a generated function should do it. The lowered code:

julia> @code_lowered get(x, ℓ14)
CodeInfo(
1 ─ %1 = Base.getproperty(l, :outer)
│ inner_obj = Setfield.get(obj, %1)
│ %3 = inner_obj
│ %4 = Base.getproperty(l, :inner)
│ %5 = Setfield.get(%3, %4)
└── return %5
)
julia> @code_lowered f(x)
CodeInfo(
1 ─ %1 = Base.getproperty(x, :b)
│ %2 = Base.getproperty(%1, :d)
│ %3 = Base.getproperty(%2, :f)
│ %4 = Base.getproperty(%3, :h)
│ %5 = Base.getproperty(%4, :j)
│ %6 = Base.getproperty(%5, :l)
│ %7 = Base.getproperty(%6, :n)
│ %8 = Base.getproperty(%7, :p)
│ %9 = Base.getproperty(%8, :r)
│ %10 = Base.getproperty(%9, :t)
│ %11 = Base.getproperty(%10, :v)
│ %12 = Base.getproperty(%11, :x)
│ %13 = Base.getproperty(%12, :z)
└── return %13
)

Anyway, I could call
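The "generated function should do it" idea can be sketched as follows. This is a hypothetical illustration, not Setfield's implementation: PropLens, Composed, propnames, and fastget are made-up names. The trick is to flatten the lens type into a tuple of field names at compile time and emit a straight chain of getproperty calls, exactly like the second lowered listing above:

```julia
# Made-up lens types mirroring the outer/inner structure seen earlier.
struct PropLens{name} end
struct Composed{O,I}
    outer::O
    inner::I
end

# Flatten a lens *type* into its property-name path, at compile time.
propnames(::Type{PropLens{n}}) where {n} = (n,)
propnames(::Type{Composed{O,I}}) where {O,I} = (propnames(O)..., propnames(I)...)

# Inside a @generated function, `obj` and `l` are bound to the argument types,
# so we can unroll the whole path into nested getproperty calls.
@generated function fastget(obj, l)
    ex = :obj
    for n in propnames(l)
        ex = :(getproperty($ex, $(QuoteNode(n))))
    end
    return ex
end

l = Composed(Composed(PropLens{:b}(), PropLens{:d}()), PropLens{:f}())
x = (b = (d = (f = 7,),),)
fastget(x, l)  # lowers to getproperty(getproperty(getproperty(x, :b), :d), :f)
```

The generated body contains no recursion at all, so there is nothing left for the optimizer to inline away.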
@jw3126 I asked about this on Discourse; it seems it might be a non-issue:
Ahh, good catch. Yeah
Here's an interesting benchmark:
Then compare:
My big worry here was that the lookup time might grow linearly with nesting depth. That doesn't seem to be the case, which is great! But Setfield still takes about 50x as long as the "dotted path" approach. Do you think it's possible to close this gap?
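The benchmark code itself isn't preserved in this thread, but a plausible shape for the comparison, assuming Setfield and BenchmarkTools are installed and using a shallower hypothetical nesting than the real 13-level example, would be:

```julia
using Setfield, BenchmarkTools  # assumed to be installed

# Hypothetical 4-level nesting; the thread's real example goes much deeper.
x = (b = (d = (f = (h = 1,),),),)
ℓ = @lens _.b.d.f.h      # lens built in one macro call
dotted(x) = x.b.d.f.h    # the "dotted path" baseline

@btime get($x, $ℓ)       # lens-based access
@btime dotted($x)        # direct field access
```

Interpolating `x` and `ℓ` with `$` keeps BenchmarkTools from measuring global-variable overhead, so any remaining gap is attributable to the lens machinery itself.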