Adding two sparse tensors does not work #113

Open
brechtmann opened this issue Feb 17, 2021 · 7 comments
Labels
enhancement (New feature or request)

Comments

@brechtmann
I tried to add two sparse tensors and failed with the error below.
I am using PyTorch 1.6.0 and pytorch_sparse 0.6.8.

Here is a minimal example which reproduces my error:

>>> import torch
>>> from torch_sparse import SparseTensor, add
>>> a = SparseTensor.from_dense(torch.ones([2,3]))
>>> a
SparseTensor(row=tensor([0, 0, 0, 1, 1, 1]),
             col=tensor([0, 1, 2, 0, 1, 2]),
             val=tensor([1., 1., 1., 1., 1., 1.]),
             size=(2, 3), nnz=6, density=100.00%)

>>> add(a, a)

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)

...

     15     else:
     16         raise ValueError(
---> 17             f'Size mismatch: Expected size ({src.size(0)}, 1, ...) or '
     18             f'(1, {src.size(1)}, ...), but got size {other.size()}.')
     19     if value is not None:

TypeError: size() missing 1 required positional argument: 'dim'

In the above example, I would have expected add(a, a) == 2 * a.
Is that correct, or am I using the wrong function?
I get the same error for a + a.
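
For comparison, a dense round-trip gives the result I expected (a minimal sketch; converting via to_dense() is only viable for small tensors):

import torch
from torch_sparse import SparseTensor

a = SparseTensor.from_dense(torch.ones([2, 3]))

# Dense fallback: materialize both operands, add, and convert back.
dense_sum = a.to_dense() + a.to_dense()
b = SparseTensor.from_dense(dense_sum)  # every stored value is now 2.0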

@rusty1s
Owner

rusty1s commented Feb 17, 2021

Addition of two SparseTensors is currently not supported, sorry!

@brechtmann
Author

Is this on your roadmap?

@rusty1s
Copy link
Owner

rusty1s commented Feb 22, 2021

Yes, it's on the roadmap.

@brechtmann
Author

For the moment, I implemented addition and subtraction the following way:

import torch

from torch_sparse import SparseTensor, coalesce


def add_sparse(a, b):
    assert a.sizes() == b.sizes(), "The tensor dimensions do not match"
    row_a, col_a, values_a = a.coo()
    row_b, col_b, values_b = b.coo()

    # Concatenate the COO indices of both tensors; coalesce sums duplicate entries.
    index = torch.stack([torch.cat([row_a, row_b]), torch.cat([col_a, col_b])])
    value = torch.cat([values_a, values_b])

    m, n = a.sizes()
    index, value = coalesce(index, value, m=m, n=n)
    res = SparseTensor.from_edge_index(index, value, sparse_sizes=(m, n))
    return res


def sub_sparse(a, b):
    assert a.sizes() == b.sizes(), "The tensor dimensions do not match"
    row_a, col_a, values_a = a.coo()
    row_b, col_b, values_b = b.coo()

    # Same as add_sparse, but negate the second tensor's values before coalescing.
    index = torch.stack([torch.cat([row_a, row_b]), torch.cat([col_a, col_b])])
    value = torch.cat([values_a, -1 * values_b])

    m, n = a.sizes()
    index, value = coalesce(index, value, m=m, n=n)
    res = SparseTensor.from_edge_index(index, value, sparse_sizes=(m, n))
    return res

It does the job for me and autograd works this way.
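
For example, a quick sanity check against the tensor from my original report (assuming add_sparse from above is in scope):

import torch
from torch_sparse import SparseTensor

a = SparseTensor.from_dense(torch.ones([2, 3]))
s = add_sparse(a, a)

# The result should match 2 * a element-wise.
assert torch.equal(s.to_dense(), 2 * a.to_dense())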

@rusty1s
Owner

rusty1s commented Mar 16, 2021

Looks good! I think this can be made more efficient with custom CUDA kernels, but it's good to support it nonetheless. Are you interested in contributing your solution?

@brechtmann
Author

Hi,

If a simple function is sufficient for you, I can contribute that.

@rusty1s
Owner

rusty1s commented Mar 25, 2021

Yes, please :) Really appreciate it.

rusty1s mentioned this issue Apr 21, 2021
rusty1s added the enhancement label Sep 16, 2021