For very large models, memory requirements become demanding. Would it be possible to state the scaling laws formally in the documentation, or even provide functions to estimate absolute memory usage? Of particular interest is the impact of the total number of bands being solved for, which came as a surprise to us. If this is not possible analytically, I would be happy to contribute empirical results from a research cluster toward a proposed test suite.
Scaling laws are discussed in Sections 3.4–3.5 of the MPB Optics Express paper.
I also think it would be nice to have a simple variant of those in the documentation, for quick reference.
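For illustration, here is a minimal sketch of such an estimator, assuming the O(N·p) storage scaling described in the paper (N grid points, p bands, complex-double fields). The function name `estimate_mpb_memory_bytes` and the `field_copies` prefactor are hypothetical, not part of MPB; the prefactor in particular is a guess that would need to be calibrated against measured runs.

```python
# Hypothetical helper -- not part of MPB. Gives a rough, order-of-magnitude
# estimate based on the O(N * p) storage scaling; the field_copies prefactor
# is an assumption and should be calibrated empirically.

def estimate_mpb_memory_bytes(grid_size, num_bands, field_copies=8):
    """Estimate peak memory for an MPB run.

    grid_size    -- grid points per axis, e.g. (128, 128, 128)
    num_bands    -- number of bands p requested
    field_copies -- assumed number of N x p complex-double blocks the
                    block eigensolver keeps live (workspace, residuals,
                    preconditioned vectors, ...); a guess, not measured
    """
    n_points = 1
    for n in grid_size:
        n_points *= n
    bytes_per_complex = 16  # one complex double per grid point per band
    return field_copies * n_points * num_bands * bytes_per_complex


if __name__ == "__main__":
    est = estimate_mpb_memory_bytes((256, 256, 256), num_bands=40)
    print(f"~{est / 2**30:.1f} GiB")
```

Even with an uncertain prefactor, a sketch like this makes the linear dependence on the number of bands explicit, which is the part that tends to surprise users.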