Implement option to export proofreading as segmentation #8286

Open · wants to merge 14 commits into base: master
Changes from 7 commits
1 change: 1 addition & 0 deletions CHANGELOG.unreleased.md
@@ -13,6 +13,7 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
### Added
- Added the total volume of a dataset to a tooltip in the dataset info tab. [#8229](https://github.com/scalableminds/webknossos/pull/8229)
- Optimized performance of data loading with “fill value” chunks. [#8271](https://github.com/scalableminds/webknossos/pull/8271)
- Added the option to export proofreading as segmentation. [#8286](https://github.com/scalableminds/webknossos/pull/8286)

### Changed
- Renamed "resolution" to "magnification" in more places within the codebase, including local variables. [#8168](https://github.com/scalableminds/webknossos/pull/8168)
11 changes: 9 additions & 2 deletions app/controllers/JobController.scala
@@ -379,7 +379,9 @@ class JobController @Inject()(
newDatasetName: String,
outputSegmentationLayerName: String,
mergeSegments: Boolean,
volumeLayerName: Option[String]): Action[AnyContent] =
volumeLayerName: Option[String],
includesProofreading: Boolean,
selectedBoundingBox: Option[String]): Action[AnyContent] =
sil.SecuredAction.async { implicit request =>
log(Some(slackNotificationService.noticeFailedJobRequest)) {
for {
@@ -393,6 +395,9 @@
command = JobCommand.materialize_volume_annotation
_ <- datasetService.assertValidDatasetName(newDatasetName)
_ <- datasetService.assertValidLayerNameLax(outputSegmentationLayerName)
multiUser <- multiUserDAO.findOne(request.identity._multiUser)
_ <- Fox.runIf(!multiUser.isSuperUser && includesProofreading)(Fox.runOptional(selectedBoundingBox)(bbox =>
jobService.assertBoundingBoxLimits(bbox, None)))
commandArgs = Json.obj(
"organization_id" -> organization._id,
"dataset_name" -> dataset.name,
@@ -403,7 +408,9 @@
"annotation_type" -> annotationType,
"new_dataset_name" -> newDatasetName,
"merge_segments" -> mergeSegments,
"volume_layer_name" -> volumeLayerName
"volume_layer_name" -> volumeLayerName,
"includes_proofreading" -> includesProofreading,
"selected_bounding_box" -> selectedBoundingBox
)
job <- jobService.submitJob(command, commandArgs, request.identity, dataset._dataStore) ?~> "job.couldNotRunApplyMergerMode"
js <- jobService.publicWrites(job)
2 changes: 1 addition & 1 deletion conf/webknossos.latest.routes
@@ -268,7 +268,7 @@ POST /jobs/run/inferNuclei/:datasetId
POST /jobs/run/inferNeurons/:datasetId controllers.JobController.runInferNeuronsJob(datasetId: String, layerName: String, bbox: String, newDatasetName: String)
POST /jobs/run/inferMitochondria/:datasetId controllers.JobController.runInferMitochondriaJob(datasetId: String, layerName: String, bbox: String, newDatasetName: String)
POST /jobs/run/alignSections/:datasetId controllers.JobController.runAlignSectionsJob(datasetId: String, layerName: String, newDatasetName: String, annotationId: Option[String])
POST /jobs/run/materializeVolumeAnnotation/:datasetId controllers.JobController.runMaterializeVolumeAnnotationJob(datasetId: String, fallbackLayerName: String, annotationId: String, annotationType: String, newDatasetName: String, outputSegmentationLayerName: String, mergeSegments: Boolean, volumeLayerName: Option[String])
POST /jobs/run/materializeVolumeAnnotation/:datasetId controllers.JobController.runMaterializeVolumeAnnotationJob(datasetId: String, fallbackLayerName: String, annotationId: String, annotationType: String, newDatasetName: String, outputSegmentationLayerName: String, mergeSegments: Boolean, volumeLayerName: Option[String], includesProofreading: Boolean, selectedBoundingBox: Option[String])
POST /jobs/run/findLargestSegmentId/:datasetId controllers.JobController.runFindLargestSegmentIdJob(datasetId: String, layerName: String)
POST /jobs/run/renderAnimation/:datasetId controllers.JobController.runRenderAnimationJob(datasetId: String)
GET /jobs/:id controllers.JobController.get(id: String)
12 changes: 12 additions & 0 deletions frontend/javascripts/admin/api/jobs.ts
@@ -208,6 +208,8 @@ function startSegmentationAnnotationDependentJob(
annotationId: string,
annotationType: APIAnnotationType,
mergeSegments?: boolean,
includesProofreading?: boolean,
selectedBoundingBox?: Vector6,
): Promise<APIJob> {
const requestURL = new URL(`/api/jobs/run/${jobURLPath}/${datasetId}`, location.origin);
if (volumeLayerName != null) {
@@ -222,6 +224,12 @@
if (mergeSegments != null) {
requestURL.searchParams.append("mergeSegments", mergeSegments.toString());
}
if (includesProofreading != null) {
requestURL.searchParams.append("includesProofreading", includesProofreading.toString());
}
if (selectedBoundingBox) {
requestURL.searchParams.append("selectedBoundingBox", selectedBoundingBox.toString());
Review comment (Member):

Suggested change:
    - requestURL.searchParams.append("selectedBoundingBox", selectedBoundingBox.toString());
    + requestURL.searchParams.append("selectedBoundingBox", selectedBoundingBox.join(","));

Should be identical, but in your variant one has to know how JS serializes number arrays. I find it better to be explicit about it.
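For what it's worth, the two variants do produce the same string for plain number arrays, since `Array.prototype.toString` delegates to `join(",")`. A minimal TypeScript sketch (a standalone illustration, not part of the PR; the `bbox` value is made up):

```ts
// Array.prototype.toString() delegates to join(","), so both calls
// serialize a Vector6 to the same comma-separated string.
const bbox: [number, number, number, number, number, number] = [0, 0, 0, 512, 512, 64];

console.log(bbox.toString()); // "0,0,0,512,512,64"
console.log(bbox.join(","));  // "0,0,0,512,512,64"

// Using join(",") makes the wire format explicit instead of relying on
// implicit array-to-string conversion.
```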

}
return Request.receiveJSON(requestURL.href, {
method: "POST",
});
@@ -235,6 +243,8 @@ export function startMaterializingVolumeAnnotationJob(
annotationId: string,
annotationType: APIAnnotationType,
mergeSegments: boolean,
includesProofreading: boolean,
selectedBoundingBox?: Vector6,
): Promise<APIJob> {
return startSegmentationAnnotationDependentJob(
"materializeVolumeAnnotation",
Expand All @@ -245,6 +255,8 @@ export function startMaterializingVolumeAnnotationJob(
annotationId,
annotationType,
mergeSegments,
includesProofreading,
selectedBoundingBox,
);
}

@@ -891,6 +891,7 @@ export function MaterializeVolumeAnnotationModal({
}: MaterializeVolumeAnnotationModalProps) {
const dataset = useSelector((state: OxalisState) => state.dataset);
const tracing = useSelector((state: OxalisState) => state.tracing);
let includesProofreading = false;
const activeSegmentationTracingLayer = useSelector(getActiveSegmentationTracingLayer);
const fixedSelectedLayer = selectedVolumeLayer || activeSegmentationTracingLayer;
const readableVolumeLayerName =
@@ -925,6 +926,8 @@
output dataset and the output segmentation layer.
</p>
);
} else {
includesProofreading = tracing.volumes.some((v) => v.hasEditableMapping === true);
Review comment (Member):

If my annotation contains two volume layers and only one has an editable mapping, that property will be true. However, if I want to export the layer that has no editable mapping, true will still be used in this UI and sent to the backend, which seems wrong?
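One way to address this would be to derive the flag from the volume layer that is actually being exported rather than from any volume layer. A rough sketch (assuming the volume tracings and the selected layer both expose a `tracingId`; the names are illustrative, not taken from the PR):

```ts
// Hypothetical alternative: treat the export as proofreading-based only if
// the *selected* volume layer has an editable mapping, not just any layer.
const selectedVolumeTracing = tracing.volumes.find(
  (volume) => volume.tracingId === fixedSelectedLayer?.tracingId,
);
const includesProofreading = selectedVolumeTracing?.hasEditableMapping === true;
```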

}
const jobImage =
jobNameToImagePath[jobName] != null ? (
@@ -954,8 +957,13 @@
jobName={"materialize_volume_annotation"}
suggestedDatasetSuffix="with_merged_segmentation"
chooseSegmentationLayer
isBoundingBoxConfigurable={includesProofreading}
Review comment (Member):

Only proofread layers support configuring the bbox? Would it be easy to always support it when exporting volume annotations? No prio though.

Review comment (Author):

The question then would be what the consequences of this bounding box should be. We currently use Zarr streaming (where one can specify a bounding box to be streamed) for merging the annotations only when the annotation includes proofreading. In other cases the worker just downloads the whole annotation, because that has proven to be faster than streaming it. I could not find an easy option to specify a bounding box in that case. However, it would be easily possible to use the streaming approach whenever a bounding box is specified. This could then be faster or slower, depending on the relationship between the dataset size and the size of the selected bounding box.

Review comment (Member):

> In other cases the worker currently just downloads the whole annotation because that has proven to be faster than streaming it.

This could stay as is, I think. After downloading, one could read only the bbox from the annotation. However, as I said, it doesn't have a high priority, so if it's not a quick thing to do, never mind :)
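To summarize the behavior discussed above, the worker's choice could be sketched roughly as follows. This is a hypothetical illustration in TypeScript, not the actual worker code; the function and field names are assumptions:

```ts
// Rough sketch of the discussed strategy: proofread annotations are merged
// via Zarr streaming restricted to the selected bounding box (whose size the
// backend limits for non-superusers), while other annotations are downloaded
// in full because that has proven faster.
function chooseMaterializationStrategy(params: {
  includesProofreading: boolean;
  selectedBoundingBox?: string; // serialized as "x,y,z,width,height,depth"
}): "zarr-streaming" | "full-download" {
  return params.includesProofreading ? "zarr-streaming" : "full-download";
}
```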

fixedSelectedLayer={fixedSelectedLayer}
jobApiCall={async ({ newDatasetName, selectedLayer: segmentationLayer }) => {
jobApiCall={async ({
newDatasetName,
selectedLayer: segmentationLayer,
selectedBoundingBox,
}) => {
// There are 3 cases for the value assignments to volumeLayerName and baseSegmentationName for the job:
// 1. There is a volume annotation with a fallback layer. volumeLayerName will reference the volume layer
// and baseSegmentationName will reference the fallback layer. The job will merge those layers.
@@ -968,6 +976,9 @@
? getReadableNameOfVolumeLayer(segmentationLayer, tracing)
: null;
const baseSegmentationName = getBaseSegmentationName(segmentationLayer);
const bbox = selectedBoundingBox?.boundingBox
? computeArrayFromBoundingBox(selectedBoundingBox.boundingBox)
: undefined;
return startMaterializingVolumeAnnotationJob(
dataset.id,
baseSegmentationName,
Expand All @@ -976,6 +987,8 @@ export function MaterializeVolumeAnnotationModal({
tracing.annotationId,
tracing.annotationType,
isMergerModeEnabled,
includesProofreading,
bbox,
);
}}
description={