The LMU hasn't received the attention due it since its introduction. Nor has Nengo's spiking approach received its due. IMHO, ABR's priorities have put the spiking cart before the LMU horse. The LMU's superiority to the LSTM is a strength of that horse that may well draw greater attention to spiking deployments.
Since the LMU's primary strength is dynamical systems modeling (and having worked casually with the LMU code off and on for a couple of years), one thing that has become apparent to me as a casual user is the need for more examples involving dynamics and, more specifically, time series prediction. The most obvious omission is the original repository's Mackey-Glass example. But even that example, although it does demonstrate superiority to the LSTM (except in the hybrid architecture), doesn't really get to the heart of the dynamical systems identification task where the LMU is likely to really shine:
Online identification of nonstationary dynamical systems.
Something that would accomplish this is an algorithm generating multiple, dynamically interdependent waveforms on the CPU and feeding them to the GPU(s), with the dependency parameters changing continuously in time, for the LMU to learn online on the GPU/TPU and predict. Something along the lines of the sketch below.
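To make the proposal concrete, here is a minimal sketch. The `keras_lmu.LMU` layer and its `memory_d`/`order`/`theta` arguments are the library's actual API; the coupled-oscillator generator, the window size, and all the layer sizes are illustrative choices of mine, not a prescription:

```python
import numpy as np
import tensorflow as tf
import keras_lmu

WINDOW, DIMS = 100, 2  # sliding-window length and number of waveforms

def coupled_waves(n_steps, dt=0.02):
    """Two oscillators whose mutual coupling k(t) drifts slowly, so the
    generating system is nonstationary (hypothetical toy dynamics)."""
    x = np.zeros((n_steps, DIMS))
    phase = np.array([0.0, 1.0])
    for t in range(n_steps):
        k = 0.5 + 0.4 * np.sin(2 * np.pi * t / 5000)  # drifting dependency
        phase = phase + dt * np.array(
            [1.0 + k * np.sin(phase[1] - phase[0]),
             1.3 + k * np.sin(phase[0] - phase[1])])
        x[t] = np.sin(phase)
    return x

# One-step-ahead predictor built around the real keras_lmu.LMU layer.
model = tf.keras.Sequential([
    keras_lmu.LMU(memory_d=DIMS, order=32, theta=WINDOW,
                  hidden_cell=tf.keras.layers.SimpleRNNCell(64),
                  return_sequences=False,
                  input_shape=(WINDOW, DIMS)),
    tf.keras.layers.Dense(DIMS),
])
model.compile(optimizer="adam", loss="mse")

# Online learning: generate on the CPU, feed each window to the accelerator,
# and update the weights as the stream arrives -- no offline training set.
series = coupled_waves(20000)
for start in range(0, len(series) - WINDOW - 1, WINDOW):
    window = series[start:start + WINDOW][None]  # shape (1, WINDOW, DIMS)
    target = series[start + WINDOW][None]        # the next step to predict
    model.train_on_batch(window, target)
```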
Particular attention to illustrating the function of the LMU-unique parameters (theta, etc.), especially in contrast to the LSTM in this environment, would help the outreach a great deal.
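Continuing the sketch above, illustrating theta could be as simple as sweeping it against an LSTM baseline of comparable size. theta is the length, in timesteps, of the sliding window the LMU's memory is optimized to represent; the particular values and sizes here are again just placeholders:

```python
def make_lmu(theta):
    """Same architecture as above with only theta varied (sizes illustrative)."""
    return tf.keras.Sequential([
        keras_lmu.LMU(memory_d=DIMS, order=32, theta=theta,
                      hidden_cell=tf.keras.layers.SimpleRNNCell(64),
                      input_shape=(WINDOW, DIMS)),
        tf.keras.layers.Dense(DIMS),
    ])

models = {f"lmu_theta_{t}": make_lmu(t) for t in (25, 100, 400)}
models["lstm"] = tf.keras.Sequential([  # comparably sized baseline
    tf.keras.layers.LSTM(64, input_shape=(WINDOW, DIMS)),
    tf.keras.layers.Dense(DIMS),
])
for m in models.values():
    m.compile(optimizer="adam", loss="mse")
# Run each model through the same online loop as above and plot one-step
# error against time as the coupling k(t) drifts; theta's effect on how much
# history the memory faithfully represents should then be visible directly.
```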
PS: One thing to avoid in this kind of outreach is reliance on interpolative test sets. That is, avoid the usual Keras training/testing regime of chopping a time series into training and test sets in a way where what the model actually learns to do is interpolate rather than extrapolate.
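Concretely (and again just as a hedged sketch building on the code above), the evaluation should score the model only on a strictly later segment of the stream, never on shuffled windows drawn from inside the training span:

```python
# Chronological split: train only on the past, score only on the unseen future.
# (Restrict the online loop above to series[:split] so training never sees it.)
split = int(0.8 * len(series))
test_stream = series[split:]

def windows(stream):
    """Stack (window, next-step) pairs without shuffling across the split."""
    xs = np.stack([stream[i:i + WINDOW]
                   for i in range(len(stream) - WINDOW - 1)])
    ys = np.stack([stream[i + WINDOW]
                   for i in range(len(stream) - WINDOW - 1)])
    return xs, ys

x_test, y_test = windows(test_stream)
print("extrapolation MSE:", model.evaluate(x_test, y_test, verbose=0))
```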