
Tutorial 7: Issues with using a custom neuromorphic dataset with the CSNN model #337

Open
mhafiz95 opened this issue Jun 26, 2024 · 0 comments

Comments

@mhafiz95

  • snntorch version: 0.9.1
  • Python version: 3.8.10
  • Operating System: Ubuntu 20.04

Description

I am trying to use my own neuromorphic dataset (pre-converted event frames, so without Tonic) with the CSNN model from Tutorial 7. The DataLoader output has no time-step dimension, and training fails with a shape-mismatch RuntimeError in the Linear layer.

What I Did

Hi, I'm pretty new to torch. I'm using my own neuromorphic dataset with labels. I am not using Tonic because my dataset has already been converted into a suitable event representation: I have the data as event frames. I loaded them, converted them to tensors, wrapped them in a TensorDataset, and then used a DataLoader. After the TensorDataset step the shape is [1446, 1, 260, 346], which is correct, but when I check the shape by iterating over the trainloader as shown in the tutorial:
event_tensor, target = next(iter(trainloader))
print(event_tensor.shape)
torch.Size([311, 128, 2, 34, 34])

I don't get 5 dimensions for my event tensor; instead it comes out as [32, 1, 260, 346], where 32 is the batch size. I should be getting [1446, 32, 1, 260, 346], since 1446 is the sequence length, which should be my number of time steps. So when I train, I get an error from the sequential model:
RuntimeError: mat1 and mat2 shapes cannot be multiplied (32x5146 and 164672x14)
and it points to my forward-pass function, which is exactly the one shown in the tutorial. The only thing I had to change was the model, and here it is:
net = nn.Sequential(nn.Conv2d(1, 12, 5),
                    nn.MaxPool2d(2),
                    snn.Leaky(beta=beta, spike_grad=spike_grad, init_hidden=True),
                    nn.Conv2d(12, 32, 5),
                    nn.MaxPool2d(2),
                    snn.Leaky(beta=beta, spike_grad=spike_grad, init_hidden=True),
                    nn.Flatten(),
                    nn.Linear(326283, 14),
                    snn.Leaky(beta=beta, spike_grad=spike_grad, init_hidden=True, output=True)
                    ).to(device)
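
For reference, here is a small sketch (my own check, not from the tutorial) of how the in_features of the final Linear layer could be derived for a single 260x346 frame: pass a dummy tensor through just the convolutional part and read off the flattened size.

# Minimal sketch (my own assumption, not tutorial code): run a dummy frame
# through the convolutional part of the network to find the flattened size
# that nn.Linear should expect for a 260x346 input.
import torch
import torch.nn as nn

conv_part = nn.Sequential(
    nn.Conv2d(1, 12, 5),   # [1, 260, 346] -> [12, 256, 342]
    nn.MaxPool2d(2),       # -> [12, 128, 171]
    nn.Conv2d(12, 32, 5),  # -> [32, 124, 167]
    nn.MaxPool2d(2),       # -> [32, 62, 83]
    nn.Flatten(),
)

dummy = torch.zeros(1, 1, 260, 346)   # one frame, batch size 1
print(conv_part(dummy).shape)         # torch.Size([1, 164672]), i.e. 32*62*83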

So how can we use neuromorphic datasets without Tonic in snnTorch? As far as I understand, Tonic is mainly used to convert the data into a suitable event representation before passing it to the CSNN model.
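
To make the question concrete, this is my understanding of what the pipeline would have to look like without Tonic (the sizes and names below, e.g. num_samples and num_steps, are placeholders I made up): each dataset item must carry its own time dimension [T, C, H, W], so the DataLoader yields [batch, T, C, H, W], which can then be permuted to the time-first layout the tutorial's forward_pass expects.

# Minimal sketch of the intended pipeline (my own assumption, not tutorial code):
# each dataset item is a full sequence of event frames [T, C, H, W], so that the
# DataLoader yields [batch, T, C, H, W], which is then permuted to time-first.
import torch
from torch.utils.data import TensorDataset, DataLoader

num_samples, num_steps = 8, 10                              # placeholder sizes
frames = torch.rand(num_samples, num_steps, 1, 260, 346)    # [N, T, C, H, W]
labels = torch.randint(0, 14, (num_samples,))

trainloader = DataLoader(TensorDataset(frames, labels), batch_size=4, shuffle=True)

event_tensor, target = next(iter(trainloader))
event_tensor = event_tensor.permute(1, 0, 2, 3, 4)          # -> [T, batch, C, H, W]
print(event_tensor.shape)                                   # torch.Size([10, 4, 1, 260, 346])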

Here is the traceback from the crash:
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[25], line 21
     19 # forward pass
     20 net.train()
---> 21 spk_rec = forward_pass(net, data)
     23 # initialize the loss & sum over time
     24 loss_val = loss_fn(spk_rec, tr_label)

Cell In[24], line 7
      4 utils.reset(net)  # resets hidden states for all LIF neurons in net
      6 for step in range(data.size(0)):  # data.size(0) = number of time steps
----> 7     spk_out, mem_out = net(data[step])
      8     spk_rec.append(spk_out)
     10 return torch.stack(spk_rec)
File /data/m/event/lib/python3.8/site-packages/torch/nn/modules/module.py:1532, in Module._wrapped_call_impl(self, *args, **kwargs)
   1530     return self._compiled_call_impl(*args, **kwargs)  # type: ignore[misc]
   1531 else:
-> 1532     return self._call_impl(*args, **kwargs)

File /data/m/event/lib/python3.8/site-packages/torch/nn/modules/module.py:1541, in Module._call_impl(self, *args, **kwargs)
   1536 # If we don't have any hooks, we want to skip the rest of the logic in
   1537 # this function, and just call forward.
   1538 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
   1539         or _global_backward_pre_hooks or _global_backward_hooks
   1540         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1541     return forward_call(*args, **kwargs)
 1543 try:
   1544     result = None

File /data/m/event/lib/python3.8/site-packages/torch/nn/modules/container.py:217, in Sequential.forward(self, input)
    215 def forward(self, input):
    216     for module in self:
--> 217         input = module(input)
    218     return input

File /data/m/event/lib/python3.8/site-packages/torch/nn/modules/module.py:1532, in Module._wrapped_call_impl(self, *args, **kwargs)
   1530     return self._compiled_call_impl(*args, **kwargs)  # type: ignore[misc]
   1531 else:
-> 1532     return self._call_impl(*args, **kwargs)

File /data/m/event/lib/python3.8/site-packages/torch/nn/modules/module.py:1541, in Module._call_impl(self, *args, **kwargs)
   1536 # If we don't have any hooks, we want to skip the rest of the logic in
   1537 # this function, and just call forward.
   1538 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
   1539         or _global_backward_pre_hooks or _global_backward_hooks
   1540         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1541     return forward_call(*args, **kwargs)
   1543 try:
   1544     result = None

File /data/m/event/lib/python3.8/site-packages/torch/nn/modules/linear.py:116, in Linear.forward(self, input)
    115 def forward(self, input: Tensor) -> Tensor:
--> 116     return F.linear(input, self.weight, self.bias)

RuntimeError: mat1 and mat2 shapes cannot be multiplied (32x5146 and 164672x14)
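
One observation from the shapes above (my own reasoning, so it may be wrong): since data from the loader is [32, 1, 260, 346], data[step] inside forward_pass is [1, 260, 346], so the loop indexes the batch dimension instead of a time dimension and Conv2d then treats each slice as an unbatched input. A quick shape-printing sketch (assuming net and data as defined above) shows where the 32x5146 matrix in the error comes from:

# Debugging sketch (my own helper, not from snnTorch): print the output shape of
# every layer for a single slice of the batch to see where the Linear input goes wrong.
utils.reset(net)                  # reset hidden states of all LIF neurons
x = data[0]                       # [1, 260, 346] when data is [32, 1, 260, 346]
for module in net:
    x = module(x)
    if isinstance(x, tuple):      # snn.Leaky with output=True returns (spk, mem)
        x = x[0]
    print(type(module).__name__, tuple(x.shape))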

     