Description of new feature

Similar to ak.local_index, this function would give what we have been calling "parents" at some chosen axis. Given a ragged dimension (a ListOffsetArray node or ListArray node), the "parents" are an integer array that labels each element of the list contents with the index of the list it belongs to.
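For example, with a small layout like this (an illustrative choice of offsets and values):

>>> import numpy as np
>>> import awkward as ak
>>> layout = ak.contents.ListOffsetArray(
...     ak.index.Index64(np.array([0, 3, 3, 5])),
...     ak.contents.NumpyArray(np.array([1.1, 2.2, 3.3, 4.4, 5.5])),
... )
>>> ak.Array(layout).to_list()
[[1.1, 2.2, 3.3], [], [4.4, 5.5]]

the "parents" are np.r_[0, 0, 0, 2, 2]: element i of the content belongs to list parents[i], and the empty list contributes nothing.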
(I don't think nplike has an r_, but something can be arranged. We have implementations of "parents" in the reducer code, for instance.)
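For a single ragged dimension, for instance, the parents can already be arranged from the offsets with plain NumPy (a sketch of the kind of kernel that would be needed):

>>> offsets = np.array([0, 3, 3, 5])
>>> np.repeat(np.arange(len(offsets) - 1), np.diff(offsets))
array([0, 0, 0, 2, 2])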
The reason we want this to be a high-level interface now is that it's a step in preparing data for DeepSets and GNNs; PyTorch-Geometric has a DeepSetsAggregation and an aggr.MeanAggregation that do vectorized segmented reduction using "parents" as input (called "index").
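As a rough sketch of that downstream use (assuming the flattened values are already in a tensor x and the parents in index; only MeanAggregation is taken from PyTorch-Geometric here):

>>> import torch
>>> from torch_geometric.nn.aggr import MeanAggregation
>>> x = torch.tensor([[1.1], [2.2], [3.3], [4.4], [5.5]])
>>> index = torch.tensor([0, 0, 0, 2, 2])    # the "parents"
>>> out = MeanAggregation()(x, index=index)  # one segmented mean per list: 0, 1 (empty), 2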
But to be a normal Awkward function, it should take axis as an argument, like ak.local_index. That can be implemented with our internal recursively_apply. Suppose we have
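a deeply nested array like the following (reconstructed here to be consistent with the outputs below):

>>> deep = ak.Array([[[[[[1.1, 2.2, 3.3], [4.4]]]], []]])

Functions like ak.num and ak.local_index do this: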
>>> ak.num(deep, axis=1)
<Array [2] type='1 * int64'>
>>> ak.num(deep, axis=2)
<Array [[1, 0]] type='1 * var * int64'>
>>> ak.num(deep, axis=3)
<Array [[[1], []]] type='1 * var * var * int64'>
>>> ak.num(deep, axis=4)
<Array [[[[2]], []]] type='1 * var * var * var * int64'>
>>> ak.num(deep, axis=5)
<Array [[[[[3, 1]]], []]] type='1 * var * var * var * var * int64'>
>>> ak.local_index(deep, axis=1)
<Array [[0, 1]] type='1 * var * int64'>
>>> ak.local_index(deep, axis=2)
<Array [[[0], []]] type='1 * var * var * int64'>
>>> ak.local_index(deep, axis=3)
<Array [[[[0]], []]] type='1 * var * var * var * int64'>
>>> ak.local_index(deep, axis=4)
<Array [[[[[0, 1]]], []]] type='1 * var * var * var * var * int64'>
>>> ak.local_index(deep, axis=5)
<Array [[[[[[0, 1, 2], [0]]]], []]] type='1 * var * var * var * var * var *...'>
We would want ak.parents_index(deep, axis=5) to return
<Array [[[[[0, 0, 0, 1]]], []]] type='1 * var * var * var * var * int64'>
Note that this is the same axis interpretation as ak.flatten:
>>> ak.flatten(deep, axis=5)
<Array [[[[[1.1, 2.2, 3.3, 4.4]]], []]] type='1 * var * var * var * var * float64'>
and it makes an array with the same list-lengths. I think that ak.parents_index and ak.flatten would be used together a lot.
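In the meantime, here is a sketch of how ak.parents_index could be emulated with existing public functions (the name and this particular approach are only a sketch, not a committed implementation):

>>> def parents_index(array, axis):
...     # index of each list at axis-1, broadcast down to the elements of
...     # the lists at `axis`, then flattened at `axis`: every element ends
...     # up labeled with the index of the list it came from
...     index = ak.local_index(array, axis=axis - 1)
...     index, _ = ak.broadcast_arrays(index, array)
...     return ak.flatten(index, axis=axis)
...
>>> parents_index(deep, axis=5)
<Array [[[[[0, 0, 0, 1]]], []]] type='1 * var * var * var * var * int64'>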