
What to do when VectorOfVectors is too big #70

Open
iguinn opened this issue Apr 30, 2024 · 1 comment


iguinn commented Apr 30, 2024

To handle VoV in the processing chain, we use a 2D array as a buffer (effectively an AoESA). We NaN-/zero-pad the end of this array and compute the vector length as a separate variable, which is then used to set the VoV lengths. The question is what to do when the vector length exceeds the second array dimension. Right now we don't have a standard solution, but there are some options:

  1. Raise an exception. This will cause the file to fail, which I don't think we want.
  2. Have processors stop counting once they hit the array size. This will make it non-obvious that something has gone amiss.
  3. Have processors keep counting but stop filling the AoESA buffers (this is the current solution for bi_level_zero_crossing_time_points). In this case we need to choose how to handle copying into the VoV. One option is to copy NaNs into the array (although for int types it would have to be 0s or 0xDEADBEEFs or something else much less obviously wrong than NaN). Another option is to copy only up to the array size and emit a warning into the log.
  4. Maybe someone has a better idea?
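For concreteness, here is a minimal sketch of option 3 with the "copy up to the array size and warn" variant. This is not the dspeed/lgdo API; the function names (`fill_buffer`, `aoesa_to_vov`) and the flat-data/cumulative-length layout are illustrative assumptions:

```python
import warnings
import numpy as np

def fill_buffer(values, buf_row):
    """Count every value but only store up to the buffer width.
    Returns the true count, which may exceed buf_row.size (option 3)."""
    n = 0
    for v in values:
        if n < buf_row.size:
            buf_row[n] = v
        n += 1
    return n

def aoesa_to_vov(buf, lens):
    """Flatten a padded 2D buffer into (flattened_data, cumulative_length),
    clipping any length that overflowed the buffer width and warning."""
    lens = np.asarray(lens)
    n_over = int(np.sum(lens > buf.shape[1]))
    if n_over > 0:
        warnings.warn(
            f"{n_over} vector(s) exceeded buffer width {buf.shape[1]}; "
            "output was truncated"
        )
        lens = np.minimum(lens, buf.shape[1])
    flat = np.concatenate([row[:n] for row, n in zip(buf, lens)])
    return flat, np.cumsum(lens)

# Two events, buffer width 3; the second event overflows with 5 entries.
buf = np.full((2, 3), np.nan)
events = [[1.0, 2.0], [1.0, 2.0, 3.0, 4.0, 5.0]]
lens = [fill_buffer(vals, buf[i]) for i, vals in enumerate(events)]
# lens == [2, 5]: the true count survives, so the overflow is detectable
flat, cumlen = aoesa_to_vov(buf, lens)
# flat == [1., 2., 1., 2., 3.]; cumlen == [2, 5]
```

One nicety of keeping the true count is that the overflow stays visible to whatever does the copy, so the same machinery could instead pad with NaN (or a sentinel for int types) out to the recorded length rather than truncating.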

gipert commented May 8, 2024
