Datasets for validation should be in the same format as the "Upload data" format generated by Dafne, i.e. an npz file containing the data, the masks, and the resolution. This allows validation datasets to be generated directly from Dafne.
The relevant code from the client that generates the upload data format is the following:
which can be loaded with `np.load(allow_pickle=False)` (for security).
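For reference, a minimal sketch of what such an npz bundle could look like and how to load it safely. The key names (`data`, `resolution`, `mask_*`) and array shapes are illustrative assumptions here, not necessarily Dafne's actual keys:

```python
import numpy as np

# Illustrative "Upload data" layout: image volume, voxel resolution,
# and one binary mask per ROI stored under a "mask_" prefix.
data = np.random.rand(128, 128, 20).astype(np.float32)  # image volume (rows, cols, slices)
resolution = np.array([1.0, 1.0, 5.0])                   # voxel size in mm
mask_quad = np.zeros((128, 128, 20), dtype=np.uint8)     # binary mask for a hypothetical "quad" ROI
mask_quad[40:80, 40:80, 5:10] = 1

np.savez_compressed("validation_case.npz",
                    data=data,
                    resolution=resolution,
                    mask_quad=mask_quad)

# allow_pickle=False refuses to deserialize arbitrary Python objects,
# so a malicious npz file cannot execute code on load.
bundle = np.load("validation_case.npz", allow_pickle=False)
masks = {k[len("mask_"):]: bundle[k]
         for k in bundle.files if k.startswith("mask_")}
```

Storing each mask as a plain array (rather than a pickled dict of masks) is what makes `allow_pickle=False` possible.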
Additionally, the following should be taken care of:
- The Dice score is only calculated if the mask contains more than a defined number of voxels (the Dice score is very unstable with very few voxels).
- Only the slices where the masks are defined are used for the validation. A dataset might contain only a few segmented slices, so we shouldn't assume that a slice with all-zero masks contains no features. Maybe we should check that enough masks are defined? (In the incremental learning, we only perform the learning if more than half of the ROIs are defined.)
- Merge the _L and _R ROIs for the validation (this code is already there, if I remember correctly).
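The three points above could be sketched roughly like this. The voxel threshold, the "more than half" slice criterion, and the helper names are all assumptions for illustration, not the existing Dafne code:

```python
import numpy as np

MIN_VOXELS = 100  # illustrative threshold below which Dice is considered unstable


def dice_score(pred, truth, min_voxels=MIN_VOXELS):
    """Dice coefficient; returns None if the ground truth has too few voxels."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    if truth.sum() < min_voxels:
        return None  # too few voxels: score would be unstable
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum())


def defined_slices(masks, min_fraction=0.5):
    """Indices of slices (last axis) where more than min_fraction of ROIs are non-empty."""
    n_rois = len(masks)
    n_slices = next(iter(masks.values())).shape[-1]
    return [s for s in range(n_slices)
            if sum(1 for m in masks.values() if m[..., s].any())
            > min_fraction * n_rois]


def merge_lr(masks):
    """Merge *_L and *_R ROI masks into a single ROI via logical OR."""
    merged = {}
    for name, m in masks.items():
        base = name[:-2] if name.endswith(("_L", "_R")) else name
        m = m.astype(bool)
        merged[base] = np.logical_or(merged[base], m) if base in merged else m
    return merged
```

With `min_fraction=0.5`, a slice is used only when more than half of the ROIs have a non-empty mask there, mirroring the incremental-learning criterion.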