Merge branch 'master' of github.com:scalableminds/webknossos into no_type

* 'master' of github.com:scalableminds/webknossos:
  fix error when loading agglomerate skeleton for single-segment agglomerate (#6294)
  Editable Mappings aka Supervoxel Proofreading (#6195)
  Increase maximum interpolation depth to 100 (#6292)
  Add download modal to dataset view actions (#6283)
  Drop "Explorational" from info tab (#6290)
  Allow version history view in annotations not owned by you (#6274)
  Bucket loading meter (#6269)
  Revert "Merge "Shared Annotations" with "My annotations" (#6230)" (#6286)
  Merge "Shared Annotations" with "My annotations" (#6230)
hotzenklotz committed Jun 23, 2022
2 parents 6aed61a + 5156a4d commit 07fb7e8
Showing 134 changed files with 3,854 additions and 1,031 deletions.
8 changes: 6 additions & 2 deletions CHANGELOG.unreleased.md
@@ -8,21 +8,25 @@ and this project adheres to [Calendar Versioning](http://calver.org/) `0Y.0M.MICRO`.
For upgrade instructions, please check the [migration guide](MIGRATIONS.released.md).

## Unreleased

[Commits](https://github.com/scalableminds/webknossos/compare/22.06.1...HEAD)

### Added

- Added an image data download speed indicator to the status bar. On hover, a tooltip shows the total amount of downloaded shard data. [#6269](https://github.com/scalableminds/webknossos/pull/6269)
- Added a warning for when the resolution in the XY viewport on z=1-downsampled datasets becomes too low, explaining the problem and how to mitigate it. [#6255](https://github.com/scalableminds/webknossos/pull/6255)
- Added a UI to download/export a dataset in view mode. The UI explains how to access the data with the Python library. [#6283](https://github.com/scalableminds/webknossos/pull/6283)
- Added the possibility to view and download older versions of read-only annotations. [#6274](https://github.com/scalableminds/webknossos/pull/6274)
- Added a proofreading tool which can be used to edit agglomerate mappings. After activating an agglomerate mapping, the proofreading tool can be selected. While the tool is active, agglomerates can be clicked to load their agglomerate skeletons. Use the context menu to delete or create edges for those agglomerate skeletons in order to split or merge agglomerates. The changes are immediately reflected in the segmentation and the meshes. [#6195](https://github.com/scalableminds/webknossos/pull/6195)
- Added new backend API routes for working with annotations without having to provide a 'type' argument. [#6285](https://github.com/scalableminds/webknossos/pull/6285)

### Changed

- For the API routes that return annotation info objects, the `user` field was renamed to `owner`. `user` still exists as an alias, but will be removed in a future release (see the sketch after this file's diff). [#6250](https://github.com/scalableminds/webknossos/pull/6250)
- Slimmed the URLs for annotations by removing `Explorational` and `Task`. The old URLs are still supported, but will be redirected to the new format. [#6208](https://github.com/scalableminds/webknossos/pull/6208)
- When creating a task from a base annotation, the starting position/rotation and bounding box as specified during task creation are now used and overwrite the ones from the original base annotation. [#6249](https://github.com/scalableminds/webknossos/pull/6249)
- Increased maximum interpolation depth from 8 to 100. [#6292](https://github.com/scalableminds/webknossos/pull/6292)

### Fixed

- Fixed that bounding boxes were deletable in read-only tracings although the delete button was disabled. [#6273](https://github.com/scalableminds/webknossos/pull/6273)
- Fixed that (old) sharing links with tokens did not work, because the token was removed during a redirection. [#6281](https://github.com/scalableminds/webknossos/pull/6281)

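The `owner`/`user` rename from the Changed section above (#6250) mainly affects client code that reads annotation info objects. Below is a minimal sketch of how a consumer might migrate, assuming a strongly simplified object shape — only the `owner` field and the deprecated `user` alias come from the changelog, everything else is illustrative:

```ts
// Hypothetical, simplified shape of an annotation info object; not the real API schema.
type UserCompact = { id: string; firstName: string; lastName: string };

type AnnotationInfo = {
  id: string;
  name: string;
  owner?: UserCompact; // new field name
  user?: UserCompact; // deprecated alias, will be removed in a future release
};

// Prefer the new field and fall back to the alias while it still exists.
function getAnnotationOwner(info: AnnotationInfo): UserCompact | undefined {
  return info.owner ?? info.user;
}
```
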
2 changes: 2 additions & 0 deletions MIGRATIONS.unreleased.md
@@ -8,4 +8,6 @@ User-facing changes are documented in the [changelog](CHANGELOG.released.md).
## Unreleased
[Commits](https://github.com/scalableminds/webknossos/compare/22.06.1...HEAD)

- FossilDB now has to be started with two additional column families: `editableMappings`, `editableMappingUpdates`. Note that this upgrade cannot be trivially rolled back, since new RocksDB column families are added and it is not easy to remove them from an existing database. In case webKnossos needs to be rolled back, it is recommended to keep the new column families in FossilDB. [#6195](https://github.com/scalableminds/webknossos/pull/6195)

### Postgres Evolutions:
4 changes: 2 additions & 2 deletions app/controllers/AnnotationIOController.scala
@@ -428,9 +428,9 @@ Expects:

  def exportMimeTypeForAnnotation(annotation: Annotation): String =
    if (annotation.tracingType == TracingType.skeleton)
-     "application/xml"
+     xmlMimeType
    else
-     "application/zip"
+     zipMimeType

for {
annotation <- provider.provideAnnotation(typ, annotationId, issuingUser) ~> NOT_FOUND
2 changes: 1 addition & 1 deletion app/controllers/DataSetController.scala
@@ -116,7 +116,7 @@ class DataSetController @Inject()(userService: UserService,
dataLayerName) ~> NOT_FOUND
image <- imageFromCacheIfPossible(dataSet)
} yield {
-     addRemoteOriginHeaders(Ok(image)).as("image/jpeg").withHeaders(CACHE_CONTROL -> "public, max-age=86400")
+     addRemoteOriginHeaders(Ok(image)).as(jpegMimeType).withHeaders(CACHE_CONTROL -> "public, max-age=86400")
}
}

2 changes: 1 addition & 1 deletion app/controllers/SitemapController.scala
@@ -15,7 +15,7 @@ class SitemapController @Inject()(sitemapWriter: SitemapWriter, sil: Silhouette[
val downloadStream = sitemapWriter.toSitemapStream(prefix)

Ok.chunked(Source.fromPublisher(IterateeStreams.enumeratorToPublisher(downloadStream)))
.as("application/xml")
.as(xmlMimeType)
.withHeaders(CONTENT_DISPOSITION ->
"""sitemap.xml""")
}
2 changes: 1 addition & 1 deletion app/models/annotation/AnnotationService.scala
@@ -833,7 +833,7 @@ class AnnotationService @Inject()(
}

//for Explorative Annotations list
- def compactWrites(annotation: Annotation)(implicit ctx: DBAccessContext): Fox[JsObject] =
+ def compactWrites(annotation: Annotation): Fox[JsObject] =
for {
dataSet <- dataSetDAO.findOne(annotation._dataSet)(GlobalAccessContext) ?~> "dataSet.notFoundForAnnotation"
organization <- organizationDAO.findOne(dataSet._organization)(GlobalAccessContext) ?~> "organization.notFound"
2 changes: 1 addition & 1 deletion conf/application.conf
@@ -145,7 +145,7 @@ datastore {
address = "localhost"
port = 6379
}
-   agglomerateSkeleton.maxEdges = 10000
+   agglomerateSkeleton.maxEdges = 100000
}

# Proxy some routes to prefix + route (only if features.isDemoInstance, route "/" only if logged out)
2 changes: 2 additions & 0 deletions conf/messages
@@ -277,6 +277,8 @@ annotation.reopen.failed=Failed to reopen the annotation.
annotation.sandbox.skeletonOnly=Sandbox annotations are currently available as skeleton only.
annotation.multiLayers.skeleton.notImplemented=This feature is not implemented for annotations with more than one skeleton layer
annotation.multiLayers.volume.notImplemented=This feature is not implemented for annotations with more than one volume layer
annotation.noMappingSet=No mapping is pinned for this annotation, cannot generate agglomerate skeleton.
annotation.volumeBucketsNotEmpty=Cannot make mapping editable in an annotation with mutated volume data

mesh.notFound=Mesh couldn’t be found
mesh.write.failed=Failed to convert mesh info to json
2 changes: 1 addition & 1 deletion docker-compose.yml
@@ -265,7 +265,7 @@ services:
command:
- fossildb
- -c
-       - skeletons,skeletonUpdates,volumes,volumeData,volumeUpdates
+       - skeletons,skeletonUpdates,volumes,volumeData,volumeUpdates,editableMappings,editableMappingUpdates
user: ${USER_UID:-fossildb}:${USER_GID:-fossildb}

fossildb-persisted:
2 changes: 1 addition & 1 deletion fossildb/run.sh
@@ -14,6 +14,6 @@ if [ ! -f "$JAR" ] || [ ! "$CURRENT_VERSION" == "$VERSION" ]; then
wget -q --show-progress -O "$JAR" "$URL"
fi

COLLECTIONS="skeletons,skeletonUpdates,volumes,volumeData,volumeUpdates"
COLLECTIONS="skeletons,skeletonUpdates,volumes,volumeData,volumeUpdates,editableMappings,editableMappingUpdates"

exec java -jar "$JAR" -c "$COLLECTIONS" -d "$FOSSILDB_HOME/data" -b "$FOSSILDB_HOME/backup"
52 changes: 49 additions & 3 deletions frontend/javascripts/admin/admin_rest_api.ts
@@ -54,6 +54,7 @@ import type {
ServerTracing,
TracingType,
WkConnectDatasetConfig,
ServerEditableMapping,
APICompoundType,
} from "types/api_flow_types";
import { APIAnnotationTypeEnum } from "types/api_flow_types";
@@ -81,6 +82,7 @@ import Toast from "libs/toast";
import * as Utils from "libs/utils";
import messages from "messages";
import window, { location } from "libs/window";
import { SaveQueueType } from "oxalis/model/actions/save_actions";

const MAX_SERVER_ITEMS_PER_RESPONSE = 1000;

@@ -854,11 +856,11 @@ export async function getTracingForAnnotationType(
export function getUpdateActionLog(
tracingStoreUrl: string,
tracingId: string,
tracingType: "skeleton" | "volume",
versionedObjectType: SaveQueueType,
): Promise<Array<APIUpdateActionBatch>> {
return doWithToken((token) =>
Request.receiveJSON(
-       `${tracingStoreUrl}/tracings/${tracingType}/${tracingId}/updateActionLog?token=${token}`,
+       `${tracingStoreUrl}/tracings/${versionedObjectType}/${tracingId}/updateActionLog?token=${token}`,
),
);
}
@@ -1584,6 +1586,29 @@ export function fetchMapping(
);
}

export function makeMappingEditable(
tracingStoreUrl: string,
tracingId: string,
): Promise<ServerEditableMapping> {
return doWithToken((token) =>
Request.receiveJSON(
`${tracingStoreUrl}/tracings/volume/${tracingId}/makeMappingEditable?token=${token}`,
{
method: "POST",
},
),
);
}

export function getEditableMapping(
tracingStoreUrl: string,
tracingId: string,
): Promise<ServerEditableMapping> {
return doWithToken((token) =>
Request.receiveJSON(`${tracingStoreUrl}/tracings/mapping/${tracingId}?token=${token}`),
);
}

export async function getAgglomeratesForDatasetLayer(
datastoreUrl: string,
datasetId: APIDatasetId,
@@ -1871,10 +1896,12 @@ export function getMeshData(id: string): Promise<ArrayBuffer> {
// These parameters are bundled into an object to avoid that the computeIsosurface function
// receives too many parameters, since this doesn't play well with the saga typings.
type IsosurfaceRequest = {
// The position is in voxels in mag 1
position: Vector3;
mag: Vector3;
segmentId: number;
subsamplingStrides: Vector3;
// The cubeSize is in voxels in mag <mag>
cubeSize: Vector3;
scale: Vector3;
mappingName: string | null | undefined;
@@ -1938,7 +1965,26 @@ export function getAgglomerateSkeleton(
return doWithToken((token) =>
Request.receiveArraybuffer(
      `${dataStoreUrl}/data/datasets/${datasetId.owningOrganization}/${datasetId.name}/layers/${layerName}/agglomerates/${mappingId}/skeleton/${agglomerateId}?token=${token}`,
      // The webworker code cannot do proper error handling and always expects an array buffer from the server.
-     // In this case, the server sends an error json instead of an array buffer sometimes. Therefore, don't use the webworker code.
+     // However, the server might send an error json instead of an array buffer. Therefore, don't use the webworker code.
{
useWebworkerForArrayBuffer: false,
showErrorToast: false,
},
),
);
}

export function getEditableAgglomerateSkeleton(
tracingStoreUrl: string,
tracingId: string,
agglomerateId: number,
): Promise<ArrayBuffer> {
return doWithToken((token) =>
Request.receiveArraybuffer(
`${tracingStoreUrl}/tracings/volume/${tracingId}/agglomerateSkeleton/${agglomerateId}?token=${token}`,
// The webworker code cannot do proper error handling and always expects an array buffer from the server.
// However, the server might send an error json instead of an array buffer. Therefore, don't use the webworker code.
{
useWebworkerForArrayBuffer: false,
showErrorToast: false,
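The tracing-store helpers added above (`makeMappingEditable`, `getEditableMapping`, `getEditableAgglomerateSkeleton`) suggest the request sequence behind the proofreading workflow from #6195. The snippet below is only a sketch based on the signatures in this diff — the import path follows the repository's usual module aliases, and the ids and call order are placeholder assumptions, not the actual saga code:

```ts
// Sketch: wiring together the tracing-store helpers added in admin_rest_api.ts.
import {
  makeMappingEditable,
  getEditableMapping,
  getEditableAgglomerateSkeleton,
} from "admin/admin_rest_api";

async function startProofreading(tracingStoreUrl: string, volumeTracingId: string) {
  // 1. Turn the currently pinned agglomerate mapping into an editable mapping (POST request).
  const editableMapping = await makeMappingEditable(tracingStoreUrl, volumeTracingId);

  // 2. The editable mapping state can be re-fetched later, e.g. after a reload.
  const mappingInfo = await getEditableMapping(tracingStoreUrl, volumeTracingId);

  // 3. Load the skeleton of a single agglomerate as an ArrayBuffer for proofreading.
  const exampleAgglomerateId = 42; // placeholder id
  const skeletonBuffer = await getEditableAgglomerateSkeleton(
    tracingStoreUrl,
    volumeTracingId,
    exampleAgglomerateId,
  );

  return { editableMapping, mappingInfo, skeletonBuffer };
}
```
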
56 changes: 39 additions & 17 deletions frontend/javascripts/libs/format_utils.ts
@@ -3,6 +3,7 @@ import { presetPalettes } from "@ant-design/colors";
import type { Vector3, Vector6 } from "oxalis/constants";
import { Unicode } from "oxalis/constants";
import * as Utils from "libs/utils";
import _ from "lodash";
import type { BoundingBoxObject } from "oxalis/store";
const { ThinSpace, MultiplicationSymbol } = Unicode;
const COLOR_MAP: Array<string> = [
@@ -72,6 +73,24 @@ export function formatScale(scaleArr: Vector3 | null | undefined, roundTo: number
return "";
}
}

export function formatNumberToUnit(number: number, unitMap: Map<number, string>): string {
const closestFactor = findClosestToUnitFactor(number, unitMap);
const unit = unitMap.get(closestFactor);

if (unit == null) {
throw new Error("Couldn't look up appropriate unit.");
}

const valueInUnit = number / closestFactor;

if (valueInUnit !== Math.floor(valueInUnit)) {
return `${valueInUnit.toFixed(1)}${ThinSpace}${unit}`;
}

return `${valueInUnit}${ThinSpace}${unit}`;
}

const nmFactorToUnit = new Map([
[1e-3, "pm"],
[1, "nm"],
@@ -80,28 +99,31 @@ const nmFactorToUnit = new Map([
[1e9, "m"],
[1e12, "km"],
]);
- const sortedNmFactors = Array.from(nmFactorToUnit.keys()).sort((a, b) => a - b);
  export function formatNumberToLength(lengthInNm: number): string {
-   const closestFactor = findClosestLengthUnitFactor(lengthInNm);
-   const unit = nmFactorToUnit.get(closestFactor);
-
-   if (unit == null) {
-     throw new Error("Couldn't look up appropriate length unit.");
-   }
-
-   const lengthInUnit = lengthInNm / closestFactor;
-
-   if (lengthInUnit !== Math.floor(lengthInUnit)) {
-     return `${lengthInUnit.toFixed(1)}${ThinSpace}${unit}`;
-   }
-
-   return `${lengthInUnit}${ThinSpace}${unit}`;
+   return formatNumberToUnit(lengthInNm, nmFactorToUnit);
  }

- export function findClosestLengthUnitFactor(lengthInNm: number): number {
-   let closestFactor = sortedNmFactors[0];
+ const byteFactorToUnit = new Map([
+   [1, "B"],
+   [1e3, "KB"],
+   [1e6, "MB"],
+   [1e9, "GB"],
+   [1e12, "TB"],
+ ]);
+ export function formatCountToDataAmountUnit(count: number): string {
+   return formatNumberToUnit(count, byteFactorToUnit);
+ }
+
+ const getSortedFactors = _.memoize((unitMap: Map<number, string>) =>
+   Array.from(unitMap.keys()).sort((a, b) => a - b),
+ );
+
+ export function findClosestToUnitFactor(number: number, unitMap: Map<number, string>): number {
+   const sortedFactors = getSortedFactors(unitMap);
+   let closestFactor = sortedFactors[0];

-   for (const factor of sortedNmFactors) {
-     if (lengthInNm >= factor) {
+   for (const factor of sortedFactors) {
+     if (number >= factor) {
        closestFactor = factor;
      }
    }
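The refactor above extracts the unit-picking logic into `formatNumberToUnit`, so the existing length formatter and the new byte formatter (used by the download speed indicator from #6269) share one code path. A small usage sketch; the example inputs are made up, only the function names and unit maps come from the diff:

```ts
// Sketch of the new formatting helpers; inputs are illustrative.
import { formatNumberToLength, formatCountToDataAmountUnit } from "libs/format_utils";

// Length formatting keeps its old behavior: 5 nm stays at the "nm" factor,
// 2.5e9 nm picks the 1e9 factor and renders as meters.
formatNumberToLength(5); // "5 nm" (value and unit separated by a thin space)
formatNumberToLength(2.5e9); // "2.5 m"

// The byte formatter reuses the same mechanism with the byteFactorToUnit map:
formatCountToDataAmountUnit(1500); // "1.5 KB"
formatCountToDataAmountUnit(3e9); // "3 GB"
```
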
3 changes: 3 additions & 0 deletions frontend/javascripts/libs/window.ts
@@ -55,6 +55,8 @@ const dummyLocation = {
// @ts-expect-error ts-migrate(2322) FIXME: Type 'Location | { ancestorOrigins: never[]; hash:... Remove this comment to see the full error message
export const location: Location = typeof window === "undefined" ? dummyLocation : window.location;

let performanceCounterForMocking = 0;

const _window =
typeof window === "undefined"
? {
@@ -72,6 +74,7 @@ const _window =
addEventListener,
removeEventListener,
open: (_url: string) => {},
performance: { now: () => ++performanceCounterForMocking },
}
: window;

3 changes: 3 additions & 0 deletions frontend/javascripts/messages.ts
@@ -113,6 +113,9 @@ In order to restore the current window, a reload is necessary.`,
"undo.no_undo":
"There is no action that could be undone. However, if you want to restore an earlier version of this annotation, use the 'Restore Older Version' functionality in the dropdown next to the 'Save' button.",
"undo.no_redo": "There is no action that could be redone.",
"undo.no_undo_during_proofread":
"Undo is not supported during proofreading yet. Please manually revert the last action you took.",
"undo.no_redo_during_proofread": "Redo is not supported during proofreading yet.",
"undo.import_volume_tracing":
"Importing a volume annotation cannot be undone. However, if you want to restore an earlier version of this annotation, use the 'Restore Older Version' functionality in the dropdown next to the 'Save' button.",
"download.wait": "Please wait...",
6 changes: 5 additions & 1 deletion frontend/javascripts/oxalis/api/api_latest.ts
@@ -1528,7 +1528,11 @@ class DataApi {
});
}

-   getRawDataCuboid(layerName: string, topLeft: Vector3, bottomRight: Vector3): Promise<void> {
+   getRawDataCuboid(
+     layerName: string,
+     topLeft: Vector3,
+     bottomRight: Vector3,
+   ): Promise<ArrayBuffer> {
return doWithToken((token) => {
const downloadUrl = this._getDownloadUrlForRawDataCuboid(
layerName,
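Because `getRawDataCuboid` now declares `Promise<ArrayBuffer>` instead of `Promise<void>`, callers of the frontend scripting API get properly typed raw data back. A hedged usage sketch — the `webknossos.apiReady` entry point is the usual way user scripts obtain the `api` object, and the layer name and coordinates are placeholders:

```ts
// Sketch: reading a small cuboid of raw layer data through the frontend API.
declare const webknossos: { apiReady: (version: number) => Promise<any> }; // provided by webKnossos at runtime

webknossos.apiReady(3).then(async (api) => {
  const topLeft: [number, number, number] = [0, 0, 0]; // placeholder coordinates
  const bottomRight: [number, number, number] = [64, 64, 64];

  // With the fixed typing, the promise resolves with an ArrayBuffer.
  const buffer: ArrayBuffer = await api.data.getRawDataCuboid("color", topLeft, bottomRight);
  console.log(`Received ${buffer.byteLength} bytes of raw data`);
});
```
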
5 changes: 4 additions & 1 deletion frontend/javascripts/oxalis/constants.ts
@@ -180,6 +180,7 @@ export enum AnnotationToolEnum {
FILL_CELL = "FILL_CELL",
PICK_CELL = "PICK_CELL",
BOUNDING_BOX = "BOUNDING_BOX",
PROOFREAD = "PROOFREAD",
}
export const VolumeTools: Array<keyof typeof AnnotationToolEnum> = [
AnnotationToolEnum.BRUSH,
@@ -257,7 +258,7 @@ export type ShowContextMenuFunction = (
arg1: number,
arg2: number | null | undefined,
arg3: number | null | undefined,
-   arg4: Vector3,
+   arg4: Vector3 | null | undefined,
arg5: OrthoView,
) => void;
const Constants = {
@@ -294,6 +295,8 @@ const Constants = {
DEFAULT_NODE_RADIUS: 1.0,
RESIZE_THROTTLE_TIME: 50,
MIN_TREE_ID: 1,
// TreeIds > 1024^2 break webKnossos, see https://github.com/scalableminds/webknossos/issues/5009
MAX_TREE_ID: 1048576,
MIN_NODE_ID: 1,
// Maximum of how many buckets will be held in RAM (per layer)
MAXIMUM_BUCKET_COUNT_PER_LAYER: 5000,
@@ -1,4 +1,7 @@
- import { calculateGlobalPos } from "oxalis/model/accessors/view_mode_accessor";
+ import {
+   calculateGlobalPos,
+   calculateMaybeGlobalPos,
+ } from "oxalis/model/accessors/view_mode_accessor";
import _ from "lodash";
import type { OrthoView, Point2, Vector3, BoundingBoxType } from "oxalis/constants";
import Store from "oxalis/store";
@@ -140,7 +143,10 @@ export function getClosestHoveredBoundingBox(
plane: OrthoView,
): [SelectedEdge, SelectedEdge | null | undefined] | null {
const state = Store.getState();
-   const globalPosition = calculateGlobalPos(state, pos, plane);
+   const globalPosition = calculateMaybeGlobalPos(state, pos, plane);

+   if (globalPosition == null) return null;
+
const { userBoundingBoxes } = getSomeTracing(state.tracing);
const indices = Dimension.getIndices(plane);
const planeRatio = getBaseVoxelFactors(state.dataset.dataSource.scale);
