[MRG] Find end pts of section and a numpy version of h.distance() #661
Conversation
hnn_core/find_coords.py
Outdated
for section_name, section in cell.sections.items():
    print(section_name)
    print(section.end_pts)
You need to build the cell and get the coords from that
When any new cell is created using the `Cell` class, a private function `__update_end_pts()` is called at the end of `__init__`. This function creates NEURON sections by calling `_create_sections()`, then updates the `end_pts` of each section. The NEURON objects are then deleted using `del self._nrn_sections[name]`.
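The create-then-delete flow described above can be sketched with stand-in objects. Note that `FakeSection`, the dict-valued `sections`, and the coordinate values below are illustrative only, not the actual hnn-core or NEURON API:

```python
class FakeSection:
    """Stand-in for a heavyweight backend (NEURON-like) section object."""
    def __init__(self, name):
        self.name = name
        # fixed dummy endpoints for the sketch
        self.pts = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.0)]


class Cell:
    def __init__(self, names):
        # plain-Python state that should survive
        self.sections = {name: {'end_pts': None} for name in names}
        self._nrn_sections = {}
        self._update_end_pts()

    def _create_sections(self):
        # build the heavyweight backend objects
        for name in self.sections:
            self._nrn_sections[name] = FakeSection(name)

    def _update_end_pts(self):
        self._create_sections()
        for name in list(self._nrn_sections):
            # copy coordinates into plain-Python state ...
            self.sections[name]['end_pts'] = self._nrn_sections[name].pts
            # ... then request deletion of the backend object
            del self._nrn_sections[name]


cell = Cell(['soma', 'apical_trunk'])
print(cell._nrn_sections)  # {} -- backend objects are gone after __init__
```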
I have a few questions regarding this flow:
- NEURON objects are created and destroyed in this flow, which, according to the discussion in the last meeting, should be avoided outside of simulation.
- If this functionality is already there and the `end_pts` are automatically updated, is manually updating the `end_pts` required?
- Should the function `_update_end_pts()` be removed?
Also, if NEURON objects can be cleared using `del`, can't we follow a similar scheme while saving the network? After calling `build` on the copy of a cell, can we just iteratively delete the `_nrn_sections` and `_nrn_synapses` of the built cell?
The problem is that NEURON objects are wrappers around C objects ... when `del` is called, it makes a request to the garbage collector to clear the object from memory. It's a request, not an order. If a pointer is not cleared from memory because of how it is coded in C, it will lead to a segmentation fault and/or slow down the following simulation. The latter would be very hard to debug ... we do have a scheme to clear NEURON objects in `NetworkBuilder`, but it was contributed by someone really in the weeds who understands NEURON.
But you have a good point about `_update_end_pts` ... I think @ntolley implemented that. Not sure how well it's been tested that the `del` in that function does not lead to a performance regression. Any thoughts, Nick?
Codecov Report
@@ Coverage Diff @@
## master #661 +/- ##
==========================================
+ Coverage 91.55% 91.62% +0.06%
==========================================
Files 22 22
Lines 4405 4488 +83
==========================================
+ Hits 4033 4112 +79
- Misses 372 376 +4
As discussed in the meeting, I stored a base network and dipole and checked equality of them after removing
I think if all
@raj1701 let me know when I should look!
I spent some time reading the code ... everything isn't fully clear to me yet, but it's getting there! Let's iterate a bit more so we can clean it up nicely. I'm passing the ball back to you @raj1701. Let me know as soon as you do your pass (in a day or two?) and I'll read it one more time.
hnn_core/cell.py
Outdated
dist_1 = self.distance_section(target_sec_name,
                               sec_name, sec_end_pt)
Suggested change:
- dist_1 = self.distance_section(target_sec_name,
-                                sec_name, sec_end_pt)
+ dist_1 = self.distance_section(target_sec_name, node)
I don't fully follow the logic of these if/else statements
The algorithm searches for the destination nodes (end 0 or 1 of the target section) in all available paths. If the target section is in the subtree of the current node, `dist_1` (now renamed `dist_temp`) will contain the distance between the current node and the centre of the target section.
This looks like a recreation of the NEURON functionality, where the cell sections are first defined with endpoints, attached to each other into a "tree", and then morphed according to a specified length parameter for each section. This is exactly what we're trying to avoid, which can be accomplished by specifying the final endpoints to begin with.
@rythorpe this was done so as to avoid calling NEURON functions when saving the network ... right now, it's all implemented in pure Python, so the ionic concentrations are known without using NEURON. I remember you were concerned about dangling NEURON objects. This takes care of that issue. Furthermore, we now better understand how the NEURON distance function works :)
Perhaps you imagined hardcoding the lengths/coordinates ... but that's not possible, because we already had functionality to change the lengths (implemented by @ntolley). The only way to match everything up was to reimplement it in Python.
Yes, I thought the whole goal was just to update the hardcoded endpoints because it'd be the most lightweight solution. I don't understand why we needed to re-implement everything in Python.
Hardcoding wouldn't have achieved what you wanted because of: Lines 682 to 697 in c81e5b8
See, it already calls NEURON functions. You'd have to deprecate this ...
hnn_core/tests/test_cell.py
Outdated
cell1.update_end_pts()
for sec_name in cell1.sections.keys():
    section = cell1.sections[sec_name]
    section._L = section._L / 2
Why are we dividing by two after calling `update_end_pts`? Wouldn't this lead to a mismatch?
Definitely think this is the right format for the test, and the explicit checks below are good too, but not sure I totally follow all the logic
We divide all section lengths by 2 to get back the original end points, because earlier we had doubled them and updated `end_pts` in line 117.
hnn_core/cell.py
Outdated
if hasattr(val, '__call__'):
    seg_xs, seg_vals = list(), list()
    section_distance = self.distance_section(sec_name,
                                             ('soma', 0))
    adjacent_seg_dist = 1 / section.nseg
    first_seg_centre = (0.5 - (((section.nseg - 1) / 2) *
                               adjacent_seg_dist))
    seg_centres = list()

    # Finding centres of all segments in the section.
    # If number of segments is 5 then seg_centres will
    # be 0.1, 0.3, 0.5, 0.7 and 0.9.
    for i in range(0, section.nseg):
        seg_centres.append(first_seg_centre +
                           (i * adjacent_seg_dist))

    for seg_x in seg_centres:
        # sec_end_dist is the distance between the 0 end of soma
        # and the 0 or 1 end of section (whichever is closer)
        sec_end_dist = section_distance - (section.L / 2)
        seg_xs.append(seg_x)
        seg_vals.append(val(sec_end_dist +
                            (seg_x * section.L)))
    p_mech[attr] = [seg_xs, seg_vals]
Can this logic be moved here: https://github.com/jonescompneurolab/hnn-core/blob/master/hnn_core/cells_default.py ? So we don't use the `partial` in the first place and directly store the values when creating the cell.
hnn_core/cell.py
Outdated
adjacent_seg_dist = 1 / section.nseg
first_seg_centre = (0.5 - (((section.nseg - 1) / 2) *
                           adjacent_seg_dist))
seg_centres = list()

# Finding centres of all segments in the section.
# If number of segments is 5 then seg_centres will
# be 0.1, 0.3, 0.5, 0.7 and 0.9.
for i in range(0, section.nseg):
    seg_centres.append(first_seg_centre +
                       (i * adjacent_seg_dist))
Any chance we can use `np.linspace`? That seems made exactly for this purpose.
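For what it's worth, a sketch of how `np.linspace` could reproduce the segment centres described in the comment above (assuming `nseg` equal-width segments on [0, 1]; variable names are illustrative):

```python
import numpy as np

nseg = 5
# Each segment has width 1/nseg; its centre sits half a width
# in from the segment's left edge. For nseg = 5 this yields
# 0.1, 0.3, 0.5, 0.7, 0.9 -- matching the loop-based version.
half_width = 1 / (2 * nseg)
seg_centres = np.linspace(half_width, 1 - half_width, nseg)
print(seg_centres)  # -> 0.1, 0.3, 0.5, 0.7, 0.9
```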
There are very slight precision errors, but it passes the check in `_set_biophysics()`.
hnn_core/cell.py
Outdated
if isinstance(val, list):
    seg_xs, seg_vals = val[0], val[1]
    for seg, seg_x, seg_val in zip(sec, seg_xs, seg_vals):
        # Checking equality to 5 decimal places
        np.testing.assert_almost_equal(seg.x, seg_x, 5)
        setattr(seg, attr, seg_val)
let's be brave and delete _set_biophysics
Don't we still need this function for inserting nrn mechanisms and such?
I think `_set_section_mechs` is doing that now. This method only exists for checking equality with the old stuff.
I think it is still required, as it is inserting mechanisms into the NEURON section objects. `_set_section_mechs()` is just expanding the partial functions for now. If we included the `_set_biophysics()` logic there, then NEURON objects might still be created before simulating the network (in the case of a network write).
hnn_core/cell.py
Outdated
                         self.sections[node[0]].L)
else:
    dist_temp = self.distance_section(target_sec_name, node)
dist = min(dist, dist_temp)
Suggested change:
- dist = min(dist, dist_temp)
+ dist = np.nanmin([dist, dist_temp])
It throws a warning when all-NaN values are passed to the function. The results are correct. Is this warning fine?
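For reference, a small demonstration of the warning in question. This is standard NumPy behaviour, not hnn-core-specific: `np.nanmin` ignores NaNs, but when every value is NaN it still returns `nan` and emits a `RuntimeWarning`:

```python
import math
import warnings
import numpy as np

# Capture warnings so we can inspect what np.nanmin emits
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = np.nanmin([np.nan, np.nan])

print(math.isnan(result))  # True -- the result itself is still correct
print(len(caught) > 0)     # True -- a RuntimeWarning was recorded
```

A cheap alternative to silencing the warning is to check for the all-NaN case before calling `np.nanmin`, which is what the added check mentioned below does in spirit.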
For now I have added a check to resolve this issue.
hnn_core/cell.py
Outdated
raise TypeError("distance_section() "
                "cannot work with cell_tree as None.")
if curr_node not in self.cell_tree:
    return 1000000
Suggested change:
- return 1000000
+ return np.nan
@raj1701 let us know when we can look again!
492b6dc to 6c55cb7
Hey @jasmainak @ntolley @rythorpe, I have rebased and pushed the final code. Please take a look!
I won't be able to look this over until tomorrow afternoon. If we don't feel like waiting until then, feel free to merge without me @ntolley.
I went ahead and merged to keep the momentum! @raj1701 let's bring your other PR to merge state as soon as possible :)
No description provided.