generalized_opt_routine #652
Conversation
Nice! I think we should explore how these plots change with: a) increasing tstop. Also, I would suggest overlaying the convergence plots in a single figure and reporting the relative tolerance instead of the absolute RMSE (which depends on the length of the signal). You would basically divide the RMSE by the norm of the target signal.
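The normalization suggested above can be sketched as follows; `relative_rmse` is a hypothetical helper name, not part of hnn-core:

```python
import numpy as np

def relative_rmse(predicted, target):
    """RMSE divided by the norm of the target signal, so the error no
    longer scales with the length or amplitude of the recording."""
    predicted = np.asarray(predicted, dtype=float)
    target = np.asarray(target, dtype=float)
    rmse = np.sqrt(np.mean((predicted - target) ** 2))
    return rmse / np.linalg.norm(target)
```

With this, a run that halves the relative error is an improvement regardless of how long the simulated dipole is.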
cons.append(lambda x, ind=0: x[0] - 0.01)
x0.append(mu_offset)
error_values_cobyla.clear()
cobyla_optimization_results = fmin_cobyla(calculate_rmse,
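For context, a minimal self-contained example of how scipy's `fmin_cobyla` consumes an objective and inequality constraints in the style of the snippet above (the quadratic objective is a toy stand-in for `calculate_rmse`):

```python
from scipy.optimize import fmin_cobyla

def objective(x):
    # toy stand-in for calculate_rmse: minimized at x[0] == 0.5
    return (x[0] - 0.5) ** 2

# COBYLA treats a point as feasible when each constraint function
# returns a value >= 0; this keeps x[0] above 0.01 as in the snippet
cons = [lambda x: x[0] - 0.01]

x_opt = fmin_cobyla(objective, [0.2], cons, rhoend=1e-7)
```

Each lambda appended to `cons` in the PR plays exactly this role: a bound expressed as a function that must stay non-negative.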
Could we also add the current optimization function to the comparisons? It has an argument called return_rmse which you can plot ... @carolinafernandezp do you have a
hnn_core/opt_toy_example/general.py
Outdated
elif metric == 'rhythmic':
    self.metric = _rmse_rhythmic
self.net_ = None
self.error_values = None
Suggested change:
- self.error_values = None
+ self.obj = list()
to be clear, what is the data type?
obj = objective value. I think it's more generalizable than "error".
hnn_core/opt_toy_example/general.py
Outdated
if metric == 'evoked':
    self.metric = _rmse_evoked
elif metric == 'rhythmic':
    self.metric = _rmse_rhythmic
I like the metric argument here, but how about self.obj_func instead of self.metric, so that it's clear this translates to a function?
hnn_core/opt_toy_example/general.py
Outdated
def fit(self, target_statistic, window_len = None):
    """ ...
    Parameters
    ----------
    target_statistic : dpl
    window_len : for smoothing dpl
I think window_len should be specified somehow with the metric, or possibly as an argument of the objective function, since it is part of the cost calculation.
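One way to tie window_len to the objective rather than to fit(), sketched with functools.partial; the objective here is a hypothetical stand-in, not the PR's real _rmse_evoked:

```python
from functools import partial

def _rmse_evoked(net, target_statistic, window_len=None):
    # hypothetical objective: the smoothing window participates in
    # the cost calculation, so it is bound here rather than in fit()
    return f'cost computed with window_len={window_len}'

# bind the metric-specific smoothing window at construction time,
# so fit() no longer needs a separate window_len argument
obj_func = partial(_rmse_evoked, window_len=30)
```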
hnn_core/opt_toy_example/general.py
Outdated
for drive_name in net.external_drives.keys():
    for cons_idx, cons_key in enumerate(constraints[drive_name]):
        if cons_key == 'mu' or cons_key == 'sigma':
Suggested change:
- if cons_key == 'mu' or cons_key == 'sigma':
+ if cons_key in ('mu', 'sigma'):
hnn_core/opt_toy_example/general.py
Outdated
return opt_params, error_values

def _get_fixed_params(net):
I don't quite get the logic of extracting the fixed params. Why can't we just pass around the network object and never worry about the fixed params?
This is getting at the same issue I was pointing out in #652 (comment).
hnn_core/opt_toy_example/general.py
Outdated
# gets called only once
def _get_params_bayesian(net, constraints):
it might be cleaner to do:

def _get_params(net, constraints):
    # insert logic of flattening network
    # into a dictionary
    return params, constraints

def _get_params_bayesian(net, constraints):
    params, constraints = _get_params(net, constraints)
    for param in params:
        # now convert into solver-specific format
        continue

def _get_params_cobyla(net, constraints):
    params, constraints = _get_params(net, constraints)
    for param in params:
        # now convert into solver-specific format
        continue
so you don't repeat the logic of flattening multiple times and introduce potential sources of error
hnn_core/opt_toy_example/general.py
Outdated
for drive_name in net.external_drives.keys():
    # clear drive
    del net_new.external_drives[drive_name]
    # clear connectivity
    conn_idxs = pick_connection(net_new, src_gids=drive_name)
    net_new.connectivity = [conn for conn_idx, conn
                            in enumerate(net_new.connectivity)
                            if conn_idx not in conn_idxs]
    net_new.add_evoked_drive(
        name=drive_name,
        mu=predicted_params[drive_name]['mu'],
        sigma=predicted_params[drive_name]['sigma'],
        numspikes=fixed_params[drive_name]['numspikes'],
        location=fixed_params[drive_name]['location'],
        n_drive_cells=fixed_params[drive_name]['n_drive_cells'],
        cell_specific=fixed_params[drive_name]['cell_specific'],
        weights_ampa=predicted_params[drive_name]['ampa_weights'],
        weights_nmda=predicted_params[drive_name]['nmda_weights'],
        space_constant=fixed_params[drive_name]['space_constant'],
        synaptic_delays=fixed_params[drive_name]
I think the more generalizable way is to update the network attributes directly ... basically do the opposite of what you did in _get_params
Shouldn't we strive to use the established code path via the public API, since setting Network attributes can be much trickier than getting them? Mostly just because there are a bunch of checks and stuff in Network.add_evoked_drive() and Network._attach_drive(). For instance, if the user should decide to optimize the n_drive_cells parameter (not sure why one would want to do this, but it's theoretically an option), we'd need to make sure the gid_ranges and connectivity attributes are updated accordingly.
but that seemed to be different from what you proposed here?
I meant that the general strategy is to retrieve fixed parameters from Network attributes that should be treated more-or-less as immutable (e.g., like an external drive), and then re-instantiate them using the standard API with their original fixed params and updated params. If the Network attribute getting modified during optimization is fair game for direct modification, the _get_params() method is irrelevant (right?). Some Network attributes can be overwritten directly (particularly those without a complex instantiation API written for them), but others are a bit error prone.
In that case, maybe it's better if we initially restrict the params to be amongst those that can be instantiated through the API. If we assemble the dictionary cleverly, we might even be able to just say:
net.add_evoked_drive(**updated_params)
Where updated_params are the arguments relevant for the evoked drive. You can get the list of arguments a function accepts and make the dictionary keys match those.
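Matching dictionary keys to a function's accepted arguments can be done with the standard library's inspect module; `_filter_kwargs` and the toy drive function below are made-up names illustrating the idea:

```python
import inspect

def _filter_kwargs(func, params):
    """Keep only the entries of params that func accepts by name."""
    accepted = set(inspect.signature(func).parameters)
    return {k: v for k, v in params.items() if k in accepted}

def add_evoked_drive(name, mu, sigma, numspikes=1):
    # toy stand-in for Network.add_evoked_drive
    return (name, mu, sigma, numspikes)

updated_params = {'name': 'evprox1', 'mu': 20.0, 'sigma': 3.0,
                  'numspikes': 1, 'tstop': 170.0}
kwargs = _filter_kwargs(add_evoked_drive, updated_params)
result = add_evoked_drive(**kwargs)
```

Entries the drive function doesn't accept (here, tstop) are silently dropped, so one flat parameter dict can feed several API calls.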
> In that case, maybe it's better if we initially restrict the params to be amongst those that can be instantiated through the API. If we assemble the dictionary cleverly, we might even be able to just say:

You mean, restrict optimization params to just drives? Or drives and connectivity maybe?

> net.add_evoked_drive(**updated_params)
> Where updated_params are the arguments relevant for the evoked drive. You can get the list of arguments a function accepts and make the dictionary keys match those.
Start with drives and drive connectivity. We can always expand to other parameters.
hnn_core/opt_toy_example/general.py
Outdated
return

def _get_params(net, constraints):
Suggested change:
- def _get_params(net, constraints):
+ def _set_constraints(net, constraints):

?
hnn_core/opt_toy_example/general.py
Outdated
target_statistic : Dipole
    The target statistic.
max_iter : int
    Max number of calls.
@carolinafernandezp please feel free to respond/interact on GitHub. Communications in meetings are not documented, and the rationale for a certain decision may not be obvious to a new outside contributor. GitHub is the more open communication method.
@carolinafernandezp let's go ahead and move forward with this PR without the ability to optimize local network connectivity. This PR is already pretty big, so let's work on cleaning up what you have for evoked drives and keep it as simple as possible. Once you merge this one, it'll be much easier to expand to local network connectivity, as well as incorporate a new workflow/tutorial for optimizing bursty (rhythmic) drives to a targeted frequency range based on your most recent work.
remove sklearn soft dependency
Force-pushed from ec0fd0d to 9f92685
Your tests aren't passing @carolinafernandezp because the CI workflows aren't importing scikit-learn. This can be fixed by updating unit_tests.yml L46 with pip install --verbose .'[gui,opt]'. Don't forget to update the Windows version as well.
Codecov Report
@@            Coverage Diff             @@
##           master     #652      +/-   ##
==========================================
- Coverage   91.62%   91.42%   -0.21%
==========================================
  Files          22       25       +3
  Lines        4488     4604     +116
==========================================
+ Hits         4112     4209      +97
- Misses        376      395      +19
@carolinafernandezp looks like a rebase problem. I think @rythorpe already fixed the error that you see in the Windows CI.
I manually merged the PR because GitHub was still complaining about review comments not being addressed. Thanks a ton @carolinafernandezp! This is a great step towards making the optimization code easier to maintain for us.
Yay! 👏🏻 Thank you guys for all your help! @jasmainak @ntolley @rythorpe
No description provided.