For ACCESS BEAST runs, subdividing files based on `min_n_subfiles` can be problematic. If `min_n_subfiles > 1` and a catalog bin has very few sources, there's a decent chance that one of the subfiles will end up with just a single source, which triggers an error when fitting (see #762).
On the other hand, if `min_n_subfiles = 1`, then bins with few stars (fewer than 1000, based on `n_per_file`) won't be split into subfiles at all, meaning that the observational physics grid size remains huge.
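A toy sketch of this arithmetic (the `n_subfiles` helper is hypothetical, standing in for BEAST's actual file-splitting rule; the `n_per_file` default of 1000 is taken from the description above):

```python
import numpy as np

def n_subfiles(n_sources, n_per_file=1000, min_n_subfiles=1):
    # Hypothetical stand-in for the subfile-count rule: enough files to keep
    # each under n_per_file sources, but never fewer than min_n_subfiles.
    return max(min_n_subfiles, int(np.ceil(n_sources / n_per_file)))

# min_n_subfiles > 1: a sparse bin can yield single-source (or empty) subfiles
for n in (3, 5, 50):
    k = n_subfiles(n, min_n_subfiles=4)
    sizes = [len(chunk) for chunk in np.array_split(np.arange(n), k)]
    print(f"{n} sources -> {k} subfiles with sizes {sizes}")

# min_n_subfiles = 1: a 900-source bin is never split, so the trimmed grids
# for that bin are never shrunk by subdividing
print(n_subfiles(900, min_n_subfiles=1))  # -> 1
```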
Potential Solution:
Implement a `max_subfile_size` limit in the BEAST settings: after splitting files, the code would check whether any of the noise models or trimmed SED grids exceed the `max_subfile_size` limit and, if so, split those subfiles further. A sketch of this check follows.
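A minimal sketch of what the check might look like. Only the `max_subfile_size` name comes from the proposal above; the helper, the file names, the size threshold, and the re-split step are all hypothetical:

```python
import os

def oversized_subfiles(paths, max_subfile_size):
    """Return the files whose on-disk size exceeds max_subfile_size (bytes)."""
    return [p for p in paths
            if os.path.exists(p) and os.path.getsize(p) > max_subfile_size]

# hypothetical per-bin outputs from the first round of splitting
subfiles = [
    "proj_bin0_sub0_seds_trim.grid.hd5",
    "proj_bin0_sub0_noisemodel_trim.grid.hd5",
]
max_subfile_size = 2 * 1024**3  # e.g. 2 GB; the right default is an open question

for path in oversized_subfiles(subfiles, max_subfile_size):
    print(f"{path} exceeds max_subfile_size -> split this subfile again")
```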
Subfiles are currently sorted and split according to `F475W`. To optimize the splitting of the catalog, it might be beneficial to re-sort the subfile catalogs by a second filter (e.g. `F336W` or `F814W`) before splitting a second time.
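A sketch of that two-pass sort/split on a synthetic catalog (the column names and the `split_table` helper are illustrative, not BEAST's actual routines):

```python
import numpy as np
from astropy.table import Table

def split_table(tab, k):
    # Slice a catalog into k nearly equal, contiguous subfiles.
    edges = np.linspace(0, len(tab), k + 1).astype(int)
    return [tab[lo:hi] for lo, hi in zip(edges[:-1], edges[1:])]

rng = np.random.default_rng(0)
cat = Table({"F475W_VEGA": rng.uniform(20.0, 28.0, 2000),
             "F336W_VEGA": rng.uniform(21.0, 29.0, 2000)})

cat.sort("F475W_VEGA")                    # first pass: current behavior
subfiles = split_table(cat, 4)

resplit = []
for sub in subfiles:
    sub.sort("F336W_VEGA")                # second pass: re-sort on another filter
    resplit.extend(split_table(sub, 2))

print(len(resplit), "subfiles:", [len(t) for t in resplit])
```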