min_n_subfiles could be better #783

Open
christinawlindberg opened this issue Jan 23, 2023 · 0 comments
For ACCESS BEAST runs, subdividing files based on min_n_subfiles can be problematic: if min_n_subfiles > 1 and a catalog bin has very few sources, there is a decent chance that one of the subfiles ends up with just a single source, which causes an error when fitting (see #762).

On the other hand, if min_n_subfiles = 1, then bins with few stars (fewer than 1000, based on n_per_file) are never split into subfiles, so the trimmed observational physics grid for those bins remains huge.
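
For illustration, here is a minimal sketch of the two failure modes (this is not BEAST's actual splitting code; n_per_file = 1000 and the even-split logic are assumptions):

```python
# Illustrative sketch (not BEAST's actual code): splitting a small bin into
# min_n_subfiles pieces can leave a subfile with a single source, while
# min_n_subfiles = 1 can leave a small bin unsplit entirely.
import numpy as np

def split_bin(n_sources, n_per_file=1000, min_n_subfiles=3):
    # assume at least min_n_subfiles pieces, or more if the bin needs
    # n_per_file-sized pieces
    n_subfiles = max(min_n_subfiles, int(np.ceil(n_sources / n_per_file)))
    # even split of the sorted catalog indices
    return [len(chunk) for chunk in np.array_split(np.arange(n_sources), n_subfiles)]

print(split_bin(4))                      # [2, 1, 1] -> single-source subfiles, fitting fails (#762)
print(split_bin(400, min_n_subfiles=1))  # [400]     -> no split, grid stays huge
```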

Potential solutions:

  1. Implement a max_subfile_size limit in the BEAST settings: after splitting the files, check whether any of the noise models or trimmed SED grids exceed max_subfile_size.
  2. Subfiles are currently sorted and split according to "F475W". To optimize the splitting of the catalog, it might be beneficial to re-sort the subfile catalogs by a second filter (e.g., F336W or F814W) before splitting a second time. A rough sketch combining both ideas follows below.
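
A rough sketch of how the two ideas could fit together. The names max_subfile_size, split_catalog_bin, and the *_VEGA column names are hypothetical, and max_subfile_size is treated here as a source-count proxy for the on-disk grid size rather than a file size in bytes:

```python
# Hypothetical sketch of the proposed behavior, not existing BEAST code.
import numpy as np
from astropy.table import Table

def split_catalog_bin(cat, n_per_file=1000, min_n_subfiles=1, max_subfile_size=500,
                      primary="F475W_VEGA", secondary="F814W_VEGA"):
    """Split one source-density bin, then re-split any subfile that is still
    larger than max_subfile_size after re-sorting it by a second filter."""
    # first pass: current behavior, sort by the primary filter and split evenly
    cat = cat[np.argsort(cat[primary])]
    n_sub = max(min_n_subfiles, int(np.ceil(len(cat) / n_per_file)))
    first_pass = [cat[idx] for idx in np.array_split(np.arange(len(cat)), n_sub)]

    # second pass: enforce the proposed max_subfile_size limit by re-sorting
    # oversized subfiles on a second filter and splitting them again
    subfiles = []
    for sub in first_pass:
        if len(sub) > max_subfile_size:
            sub = sub[np.argsort(sub[secondary])]
            n_again = int(np.ceil(len(sub) / max_subfile_size))
            subfiles.extend(sub[idx] for idx in np.array_split(np.arange(len(sub)), n_again))
        else:
            subfiles.append(sub)
    return subfiles
```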