[Question] How to add non-uniform transmission delays? #690
Thank you for your question! A contributor proposed a solution in a pull request (#595) to address non-uniform transmission delays. Although this PR hasn't been merged, it provides a useful approach.
However, we do not recommend using this method for GPU implementations, because pure heterogeneous delay retrieval is very expensive. Instead, we suggest using the Taichi custom interface. Hope this helps! Feel free to reach out if you have further questions.
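For readers who want the general idea behind per-synapse (heterogeneous) delays, here is a minimal, framework-agnostic sketch in plain NumPy; it is not the BrainPy API, and all names are illustrative. Presynaptic spikes are kept in a ring buffer, each synapse stores its own integer delay in steps, and the delayed spikes are gathered per synapse before being accumulated onto the postsynaptic targets. This is the "pull"-style retrieval described above as expensive on GPU, since every synapse performs its own delayed read.

```python
import numpy as np

# Illustrative sketch only (hypothetical names, not a BrainPy API).
rng = np.random.default_rng(0)

n_pre, n_post, n_syn = 100, 100, 1000
dt = 0.1                                    # integration step in ms
pre_ids   = rng.integers(0, n_pre,  n_syn)  # presynaptic index of each synapse
post_ids  = rng.integers(0, n_post, n_syn)  # postsynaptic index of each synapse
weights   = rng.uniform(0.1, 0.5, n_syn)
delays_ms = rng.uniform(1.0, 10.0, n_syn)   # non-uniform delay drawn per synapse
delay_steps = np.round(delays_ms / dt).astype(int)

max_steps = int(delay_steps.max()) + 1
spike_buffer = np.zeros((max_steps, n_pre), dtype=bool)  # ring buffer of past pre spikes
head = 0                                                 # slot holding the current step

def delayed_input(pre_spikes):
    """Store the current pre spikes, then gather each synapse's delayed spike."""
    global head
    spike_buffer[head] = pre_spikes
    read_idx = (head - delay_steps) % max_steps          # slot written delay_steps ago
    delayed = spike_buffer[read_idx, pre_ids]            # one delayed spike per synapse
    post_input = np.zeros(n_post)
    np.add.at(post_input, post_ids, delayed * weights)   # accumulate onto targets
    head = (head + 1) % max_steps
    return post_input
```

The per-synapse delays here are drawn from a uniform distribution, but any initializer or distribution could be substituted before quantizing to integer steps with `dt`.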
Hi Brendan! Here are some additional remarks.
Hi, I am interested in implementing this, perhaps slowly as my time permits. At first glance, the AlignPre and AlignPost assumptions do not seem strong enough to prohibit heterogeneous delays like this.
Can you elaborate a bit more on this? I am imagining merging delay with …
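The comment above is truncated, but one possible reading of merging the delay with the communication step is a push-based scheme: rather than retrieving delayed spikes per synapse, each spike immediately schedules its weighted contribution into a future-input buffer at slot `t + delay`, and the communication step only pops the current slot. A hedged, framework-agnostic sketch in plain NumPy (illustrative names only, not BrainPy's actual implementation):

```python
import numpy as np

# Illustrative sketch only (hypothetical names, not BrainPy's implementation).
rng = np.random.default_rng(0)

n_pre, n_post, n_syn = 100, 100, 1000
dt = 0.1
pre_ids  = rng.integers(0, n_pre,  n_syn)
post_ids = rng.integers(0, n_post, n_syn)
weights  = rng.uniform(0.1, 0.5, n_syn)
delay_steps = np.round(rng.uniform(1.0, 10.0, n_syn) / dt).astype(int)

max_steps = int(delay_steps.max()) + 1
future = np.zeros((max_steps, n_post))    # future[s]: input delivered when head == s
head = 0

def comm(pre_spikes):
    """Pop inputs whose delay expires now, then schedule inputs from current spikes."""
    global head
    out = future[head].copy()             # deliver what was scheduled for this step
    future[head] = 0.0                    # free the slot for reuse
    active = pre_spikes[pre_ids]          # synapses whose presynaptic neuron spiked
    slots = (head + delay_steps[active]) % max_steps
    np.add.at(future, (slots, post_ids[active]), weights[active])
    head = (head + 1) % max_steps
    return out
```

Scheduling at spike time replaces the per-synapse delayed read with a scatter that only touches synapses whose presynaptic neuron actually fired, which is usually much cheaper when spiking activity is sparse.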
Hi brainpy!
I am hoping to incorporate non-uniform transmission delays into a LIF network (i.e. each synapse has a different delay), but I can't find a straightforward way to implement this from the documentation.
For instance, in the example below I have tried passing an initializer for the delay to the synaptic projection classes with reduction and merging, which fails (as expected).
Is there a recommended way of constructing non-uniform delays, by drawing delays for each synapse from the initializer?