diff --git a/IMPLEMENTATION.md b/IMPLEMENTATION.md index caf4601a..8472c655 100644 --- a/IMPLEMENTATION.md +++ b/IMPLEMENTATION.md @@ -51,6 +51,11 @@ Parts of the current implementation may still change. This page is only a short - These are implementations of the `TilesetSource` and `TilesetTarget` interface (see `./src/tilesetData`), based on 3TZ or 3DTILES - `./src/pipelines`: **Preliminary** classes for modeling "processing pipelines" for tilesets + - The `Pipeline` class describes the pipeline with its input and output, and contains one or more `TilesetStage` objects + - The `TilesetStage` describes an operation that is applied to the tileset as a whole, usually focussing on modifications of the tileset JSON object. It may contain one or more `ContentStage` objects + - The `ContentStage` is an operation that may be applied to tile content (i.e. "files") that are part of the tileset + - Instances of these classes may be created with the `Pipelines`, `TilesetStages`, and `ContentStages` classes, respectively + - A pipeline may be executed by a `PipelineExecutor`. - `./src/spatial`: Basic classes for dealing with tree structures, specifically with quadtrees and octrees @@ -73,10 +78,13 @@ Parts of the current implementation may still change. This page is only a short - `TilesetCombiner`: Used to "inline" external tilesets into a single one - `TilesetMerger`: Used to create one tileset that refers to others as external tilesets - `TilesetUpgrader`: Upgrade a tileset to a newer version (many aspects unspecified here) + - The (abstract) `TilesetProcessor` class and the (concrete) `BasicTilesetProcessor` class offer an infrastructure for generic operations on the tilesets and their content. These classes serve as the basis for the implementation of the pipeline execution functionality. - `./src/tilesets`: Utility functions for tileset operations - `Tiles` for traversing (explicit!) tile hierarchies - `Tilesets` offering convenience functions for `merge/combine/upgrade` + - `Contents` with utility functions related to tile `content` objects + - `Extensions` for handling extensions and extension declarations in tilesets (and glTF objects) - `./src/traversal`: Classes for traversing tilesets - NOTE: The `SubtreeModel`/`SubtreeMetadataModel` interfaces _might_ at some point be moved into `implicitTiling`, but are currently tailored for the use in the traversal classes, and should be considered to be an "implementation detail" here. diff --git a/README.md b/README.md index b1589c0a..228b2cb5 100644 --- a/README.md +++ b/README.md @@ -45,9 +45,43 @@ npx ts-node ./src/main.ts combine -i ./specs/data/combineTilesets/input -o ./spe Merge multiple tilesets into a single one that refers to the input tilesets as external tilesets. ``` -npx ts-node ./src/main.ts merge -i ./specs/data/mergeTilesets/input/TilesetA -i ./specs/data/mergeTilesets/input/sub/TilesetA -o ./specs/data/mergeTilesets/output +npx ts-node ./src/main.ts merge -i ./specs/data/mergeTilesets/TilesetA -i ./specs/data/mergeTilesets/sub/TilesetA -o ./specs/data/mergeTilesets/output ``` +#### upgrade + +Upgrade a tileset to the latest 3D Tiles version. +``` +npx ts-node ./src/main.ts upgrade -i ./specs/data/TilesetOfTilesets/tileset.json -o ./output/upgraded +``` +The exact behavior of the upgrade operation is not yet specified. But when B3DM- and I3DM tile content in the input tileset uses glTF 1.0 assets, then the upgrade step will try to upgrade these assets to glTF 2.0. 
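+
+The upgrade step can also be expressed as a 'tileset stage' in a processing pipeline (see the `Pipeline` section below). As a minimal sketch, assuming the pipeline conventions described in that section and reusing the paths from the example above, the corresponding pipeline JSON could look like this:
+```
+{
+  "input": "./specs/data/TilesetOfTilesets/tileset.json",
+  "output": "./output/upgraded",
+  "tilesetStages": [
+    {
+      "name": "upgrade",
+      "description": "Upgrade the input tileset to the latest version"
+    }
+  ]
+}
+```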
+
+#### convert
+
+(This replaces the `databaseToTileset` and `tilesetToDatabase` commands)
+
+Convert between tilesets and tileset package formats.
+```
+npx ts-node ./src/main.ts convert -i ./specs/data/TilesetOfTilesets/tileset.json -o ./output/TilesetOfTilesets.3tz
+```
+
+The input and output arguments for this command may be
+
+- The name of a directory that contains a `tileset.json` file (or the full path to a tileset JSON file)
+- The name of a `.3tz` file
+- The name of a `.3dtiles` file
+
+The input may also be a `.zip` file that contains a `tileset.json` file.
+
+#### databaseToTileset
+
+Deprecated. This functionality is now offered via the `convert` command.
+
+#### tilesetToDatabase
+
+Deprecated. This functionality is now offered via the `convert` command.
+
+
 ### Command line tools for tile content
 
@@ -116,7 +150,7 @@ npx ts-node ./src/main.ts optimizeB3dm -i ./specs/data/Textured/batchedTextured.
 This example optimizes the b3dm and compresses the meshes using Draco, with a high compression level.
 
-### optimizeI3dm
+#### optimizeI3dm
 
 Optimize a i3dm using [gltf-pipeline](https://github.com/CesiumGS/gltf-pipeline/blob/main/README.md).
 ```
@@ -125,18 +159,79 @@ npx ts-node ./src/main.ts optimizeI3dm -i ./specs/data/instancedWithBatchTableBi
 See [optimizeB3dm](#optimizeb3dm) for further examples.
 
-### upgrade
+### Pipeline
 
-Upgrade a tileset to the latest 3D Tiles version.
+Execute a sequence of operations that are described in a JSON file.
+
+> **Note:** The pipeline execution feature is preliminary. Many aspects of the pipeline definition, including the JSON representation and the exact set of operations that are supported as parts of pipelines, may change in future releases.
+
+The basic structure of a pipeline JSON file is summarized here:
+
+- A pipeline has an `input` and `output`, which are the names of a tileset directory or package
+- A pipeline has an array of 'tileset stages'
+- A tileset stage has a `name` and a `description`
+- A tileset stage has an array of 'content stages'
+- A content stage has a `name` and a `description`
+- A content stage can carry information about the content types that it is applied to
+
+A simple example pipeline may therefore look like this:
 ```
-npx ts-node ./src/main.ts upgrade -i ./specs/data/TilesetOfTilesets/tileset.json -o ./output/upgraded
+{
+  "input": "./specs/data/TilesetOfTilesetsWithUris",
+  "output": "./output/TilesetOfTilesetsWithUris.3tz",
+  "tilesetStages": [
+    {
+      "name": "_b3dmToGlb",
+      "description": "Convert B3DM to GLB",
+      "contentStages": [
+        {
+          "name": "b3dmToGlb",
+          "description": "Convert each B3DM content into GLB"
+        }
+      ]
+    }
+  ]
+}
 ```
-The exact behavior of the upgrade operation is not yet specified. But when B3DM- and I3DM tile content in the input tileset uses glTF 1.0 assets, then the upgrade step will try to upgrade these assets to glTF 2.0.
+
+The `name` of a tileset stage or content stage can refer to a predefined set of operations that can be executed. If a `name` is not one of the known operations, it should start with an underscore (`_`).
+
+The `description` of a tileset stage or content stage is intended as a human-readable summary, to be shown as log output.
+
+The predefined operations largely correspond to the command-line functionality.
+
+The known tileset stages are:
+
+- `upgrade`: Upgrade the input tileset to the latest version. Details about what that means are omitted here.
+- `combine`: Combine all external tilesets of the input tileset, to create a single tileset + +The known content stages are: + +- Compression: + - `gzip`: Apply GZIP compression to all files (with optional filters) + - `ungzip`: Uncompress all files that are compressed with GZIP +- Conversion: + - `glbToB3dm`: Convert all GLB tile contents into B3DM + - `glbToI3dm`: Convert all GLB tile contents into I3DM (with the GLB being the only instance) + - `b3dmToGlb`: Convert all B3DM tile contents into GLB (assuming that the B3DM is only a wrapper around GLB) + - `i3dmToGlb`: Convert all I3DM tile contents into GLB (assuming that the I3DM is only a wrapper around GLB) + - `separateGltf`: Convert all GLB tile contents into `.gltf` files with external resources +- Optimization: + + These operations receive an `options` object, which is an untyped object carrying the options that are passed to `gltf-pipeline` for the optimization. + - `optimizeGlb`: Optimize GLB tile content, using `gltf-pipeline` + - `optimizeB3dm`: Optimize the GLB payload of a B3DM tile content, using `gltf-pipeline` + - `optimizeI3dm`: Optimize the GLB payload of a I3DM tile content, using `gltf-pipeline` + +An example of a pipeline that combines a sequence of multiple operations is shown in [`examplePipeline.json`](./specs/data/pipelines/examplePipeline.json). + --- -**Draft** demos for the library usage: +## Demos + +The `demos` folder contains some examples of how the functionality of the tools may be used as a library. This is intended as a preview. The functionality is not yet exposed as a public API. ### General tool functions diff --git a/demos/BinaryMetadataDemos.ts b/demos/BinaryMetadataDemos.ts new file mode 100644 index 00000000..adac650e --- /dev/null +++ b/demos/BinaryMetadataDemos.ts @@ -0,0 +1,207 @@ +import { ClassProperty } from "../src/structure/Metadata/ClassProperty"; +import { BinaryPropertyTables } from "../src/metadata/binary/BinaryPropertyTables"; +import { BinaryPropertyTableModel } from "../src/metadata/binary/BinaryPropertyTableModel"; + +/** + * A test for the `BinaryPropertyTableModel` class. + * + * It creates a (binary) property table that contains a single + * property with the given structure and the given values. + * From that, it creates a `BinaryPropertyTable`, and goes + * through all its rows (as `MetadataEntityModel` instances), + * and prints the values that this entity model has for + * the given property. 
+ * + * (These should be the same as the given input property values) + * + * @param name - A name for log messages + * @param classProperty - The `ClassProperty` + * @param propertyValues - The property values + */ +function runPropertyTableModelTest( + name: string, + classProperty: ClassProperty, + propertyValues: any +) { + const count = propertyValues.length; + const arrayOffsetType = "UINT32"; + const stringOffsetType = "UINT32"; + const binaryPropertyTable = + BinaryPropertyTables.createBinaryPropertyTableFromProperty( + "testProperty", + classProperty, + propertyValues, + arrayOffsetType, + stringOffsetType, + undefined + ); + const propertyTableModel = new BinaryPropertyTableModel(binaryPropertyTable); + console.log("For " + name); + console.log(" Original values: " + JSON.stringify(propertyValues)); + for (let i = 0; i < count; i++) { + const entity0 = propertyTableModel.getMetadataEntityModel(i); + const value0 = entity0.getPropertyValue("testProperty"); + console.log(` Value from MetadataEntity ${i}: ` + JSON.stringify(value0)); + } +} + +/** + * Calls `runPropertyTableModelTest` for various class property types + */ +function runPropertyTableModelTests() { + const example_variable_length_FLOAT32_SCALAR_array = { + type: "SCALAR", + componentType: "FLOAT32", + array: true, + }; + const example_fixed_length_FLOAT32_SCALAR_array = { + type: "SCALAR", + componentType: "FLOAT32", + array: true, + count: 5, + }; + const example_STRING = { + type: "STRING", + }; + const example_variable_length_STRING_array = { + type: "STRING", + array: true, + }; + const example_fixed_length_STRING_array = { + type: "STRING", + array: true, + count: 5, + }; + const example_BOOLEAN = { + type: "BOOLEAN", + }; + const example_variable_length_BOOLEAN_array = { + type: "BOOLEAN", + array: true, + }; + const example_fixed_length_BOOLEAN_array = { + type: "BOOLEAN", + array: true, + count: 5, + }; + + const example_variable_length_UINT32_VEC2_array = { + type: "VEC2", + componentType: "UINT32", + array: true, + }; + const example_fixed_length_UINT32_VEC2_array = { + type: "VEC2", + componentType: "UINT32", + array: true, + count: 5, + }; + + const example_variable_length_FLOAT32_SCALAR_array_values = [ + [-1.0, -0.5, 0.0, 0.5, 1.0], + [-1.0, 0.0, 1.0], + ]; + const example_fixed_length_FLOAT32_SCALAR_array_values = [ + [-1.0, -0.5, 0.0, 0.5, 1.0], + [1.0, 2.0, 3.0, 4.0, 5.0], + ]; + const example_STRING_values = ["This is a test", "This is another test"]; + const example_variable_length_STRING_array_values = [ + ["This", "is", "a", "test"], + ["Another", "test"], + ]; + const example_fixed_length_STRING_array_values = [ + ["zero", "one", "two", "three", "four"], + ["A", "B", "C", "D", "E"], + ]; + const example_BOOLEAN_values = [true, false]; + const example_variable_length_BOOLEAN_array_values = [ + [true, false, true, false], + [false, true, false], + ]; + const example_fixed_length_BOOLEAN_array_values = [ + [true, false, true, false, true], + [false, true, false, true, false], + ]; + + const example_variable_length_UINT32_VEC2_array_values = [ + [ + [0, 1], + [2, 3], + [4, 5], + ], + [ + [6, 7], + [8, 9], + ], + ]; + const example_fixed_length_UINT32_VEC2_array_values = [ + [ + [0, 1], + [2, 3], + [4, 5], + [6, 7], + [8, 9], + ], + [ + [10, 11], + [12, 13], + [14, 15], + [16, 17], + [18, 19], + ], + ]; + + runPropertyTableModelTest( + "example_fixed_length_STRING_array", + example_fixed_length_STRING_array, + example_fixed_length_STRING_array_values + ); + runPropertyTableModelTest( + 
"example_variable_length_BOOLEAN_array", + example_variable_length_BOOLEAN_array, + example_variable_length_BOOLEAN_array_values + ); + runPropertyTableModelTest( + "example_fixed_length_UINT32_VEC2_array", + example_fixed_length_UINT32_VEC2_array, + example_fixed_length_UINT32_VEC2_array_values + ); + runPropertyTableModelTest( + "example_variable_length_STRING_array", + example_variable_length_STRING_array, + example_variable_length_STRING_array_values + ); + runPropertyTableModelTest( + "example_fixed_length_FLOAT32_SCALAR_array", + example_fixed_length_FLOAT32_SCALAR_array, + example_fixed_length_FLOAT32_SCALAR_array_values + ); + runPropertyTableModelTest( + "example_STRING", + example_STRING, + example_STRING_values + ); + runPropertyTableModelTest( + "example_variable_length_FLOAT32_SCALAR_array", + example_variable_length_FLOAT32_SCALAR_array, + example_variable_length_FLOAT32_SCALAR_array_values + ); + runPropertyTableModelTest( + "example_BOOLEAN", + example_BOOLEAN, + example_BOOLEAN_values + ); + runPropertyTableModelTest( + "example_fixed_length_BOOLEAN_array", + example_fixed_length_BOOLEAN_array, + example_fixed_length_BOOLEAN_array_values + ); + runPropertyTableModelTest( + "example_variable_length_UINT32_VEC2_array", + example_variable_length_UINT32_VEC2_array, + example_variable_length_UINT32_VEC2_array_values + ); +} + +runPropertyTableModelTests(); diff --git a/demos/MetadataDemos.ts b/demos/MetadataDemos.ts new file mode 100644 index 00000000..0e730157 --- /dev/null +++ b/demos/MetadataDemos.ts @@ -0,0 +1,156 @@ +import { defaultValue } from "../src/base/defaultValue"; +import { readJsonUnchecked } from "./readJsonUnchecked"; + +import { MetadataClass } from "../src/structure/Metadata/MetadataClass"; +import { Tileset } from "../src/structure/Tileset"; + +import { MetadataEntityModels } from "../src/metadata/MetadataEntityModels"; + +/** + * Test the `MetadataEntityModels` class for the type + * "exampleScalarInt32" + */ +function testExampleScalarInt32() { + console.log("exampleScalarInt32:"); + const metadataClass: MetadataClass = { + properties: { + testProperty: { + type: "SCALAR", + componentType: "INT32", + }, + }, + }; + const entityJson = { + testProperty: 1234, + }; + const entity = MetadataEntityModels.createFromClass( + metadataClass, + entityJson + ); + + const value = entity.getPropertyValue("testProperty"); + console.log(" Property value: " + value); +} + +/** + * Test the `MetadataEntityModels` class for the type + * "exampleArrayInt16WithDefault" + */ +function testExampleArrayInt16WithDefault() { + console.log("exampleArrayInt16WithDefault:"); + const metadataClass: MetadataClass = { + properties: { + testProperty: { + array: true, + type: "SCALAR", + componentType: "INT16", + required: false, + noData: [], + default: [1, 1, 1], + }, + }, + }; + const entityJson = { + testProperty: undefined, + }; + const entity = MetadataEntityModels.createFromClass( + metadataClass, + entityJson + ); + const value = entity.getPropertyValue("testProperty"); + console.log(" Property value: " + value); +} + +/** + * Test the `MetadataEntityModels` class for the type + * "exampleVec3Uint16Normalized" + */ +function testExampleVec3Uint16Normalized() { + console.log("exampleVec3Uint16Normalized:"); + const metadataClass: MetadataClass = { + properties: { + testProperty: { + type: "VEC3", + componentType: "UINT16", + normalized: true, + }, + }, + }; + const entityJson = { + testProperty: [0, 32767, 65535], + }; + const entity = MetadataEntityModels.createFromClass( + 
metadataClass, + entityJson + ); + const value = entity.getPropertyValue("testProperty"); + console.log(" Property value: " + value); +} + +/** + * Creates a string for a (metadata property) value. + * The exact format is not specified. This is only + * intended for the demo. But the string will be + * exhaustive. + * + * @param value - The value + * @returns The string + */ +function createValueString(value: any) { + let result = ""; + if (Array.isArray(value)) { + result += "["; + for (let i = 0; i < value.length; i++) { + if (i > 0) { + result += ", "; + } + result += createValueString(value[i]); + } + result += "]"; + return result; + } + if (typeof value === "string") { + return '"' + value + '"'; + } + return value.toString(); +} + +/** + * Prints all properties of the `TilesetWithFullMetadata` sample + * data, as they are obtained via a `MetadataEntityModel`. + */ +async function testTilesetWithFullMetadata() { + // Note: This is making some assumptions about + // the input, and only intended for the basic + // demo of the `MetadataEntityModels` class + const tileset: Tileset = await readJsonUnchecked( + "./specs/data/TilesetWithFullMetadata/tileset.json" + ); + const metadataSchema = tileset.schema; + const metadataEntity = tileset.metadata; + if (!metadataSchema || !metadataEntity) { + console.log("Test input was invalid"); + return; + } + const entity = MetadataEntityModels.create(metadataSchema, metadataEntity); + + console.log("Metadata property values:"); + const metadataClasses = defaultValue(metadataSchema.classes, {}); + const metadataClass = metadataClasses["exampleClass"]; + const properties = metadataClass.properties ?? {}; + for (const propertyName of Object.keys(properties)) { + const nameString = propertyName.padStart(60); + const value = entity.getPropertyValue(propertyName); + const valueString = createValueString(value); + console.log(` Property value of ${nameString}: ${valueString}`); + } +} + +function runDemos() { + testExampleScalarInt32(); + testExampleArrayInt16WithDefault(); + testExampleVec3Uint16Normalized(); + testTilesetWithFullMetadata(); +} + +runDemos(); diff --git a/demos/PackageConversion.ts b/demos/PackageConversion.ts index a55ae399..7775484d 100644 --- a/demos/PackageConversion.ts +++ b/demos/PackageConversion.ts @@ -7,7 +7,7 @@ import minimist from "minimist"; import { TilesetTargets } from "../src/tilesetData/TilesetTargets"; import { TilesetSources } from "../src/tilesetData/TilesetSources"; -import { ZipToPackage } from "./ZipToPackage"; +import { ZipToPackage } from "../src/packages/ZipToPackage"; /** * Print the help message showing the command line options @@ -73,7 +73,7 @@ async function tilesetPackageConversion(options: any) { const inputExtension = path.extname(input).toLowerCase(); if (inputExtension === ".zip") { - ZipToPackage.convert(input, output, overwrite); + await ZipToPackage.convert(input, output, overwrite); } else { const tilesetSource = TilesetSources.createAndOpen(input); const tilesetTarget = TilesetTargets.createAndBegin(output, overwrite); diff --git a/demos/PackagesDemo.ts b/demos/PackagesDemo.ts index 5921c49a..6e4314bb 100644 --- a/demos/PackagesDemo.ts +++ b/demos/PackagesDemo.ts @@ -51,7 +51,7 @@ async function readPackageExample(fileName: string) { async function run() { console.log("Running test"); - const directory = "./data/"; + const directory = "./output/"; if (!fs.existsSync(directory)) { fs.mkdirSync(directory, { recursive: true }); } diff --git a/demos/PipelineDrafts.ts b/demos/PipelineDrafts.ts index 
d8d6e4f5..68175735 100644 --- a/demos/PipelineDrafts.ts +++ b/demos/PipelineDrafts.ts @@ -5,8 +5,8 @@ import { Pipelines } from "../src/pipelines/Pipelines"; function createPipelineDraftJson() { const pipelineJson = { - input: "./data/Tileset", - output: "./data/testTileset_final", + input: "./specs/data/TilesetWithUris", + output: "./output/pipelineDrafts", tilesetStages: [ { name: "unzip-and-zip-tiles-only", diff --git a/demos/PipelineExperiments.ts b/demos/PipelineExperiments.ts new file mode 100644 index 00000000..deedda09 --- /dev/null +++ b/demos/PipelineExperiments.ts @@ -0,0 +1,73 @@ +import { PipelineExecutor } from "../src/pipelines/PipelineExecutor"; +import { Pipelines } from "../src/pipelines/Pipelines"; + +function createPipelineExperimentsJson() { + const optimizeGlbOptions = { + dracoOptions: { + compressionLevel: 10, + }, + }; + + const b3dmToGlbJson = { + name: "B3DM to GLB", + description: "Convert B3DM to GLB", + contentStages: [ + { + name: "b3dmToGlb", + description: "Convert each B3DM content into GLB", + }, + ], + }; + + const optimizeGlbJson = { + name: "Optimize GLB", + description: "Optimize GLB", + contentStages: [ + { + name: "optimizeGlb", + description: + "Apply gltf-pipeline to each GLB content, with the given options", + options: optimizeGlbOptions, + }, + ], + }; + + const separateGltfJson = { + name: "Separate glTF", + description: "Separate glTF", + contentStages: [ + { + name: "separateGltf", + description: + "Convert each GLB content into a .gltf file with separate resources", + }, + ], + }; + + const dummyJson = { + name: "Dummy", + description: + "Dummy (to have the final output in a directory before writing it into the package)", + }; + + const pipelineJson = { + input: "./specs/data/TilesetWithUris/tileset.json", + output: "./output/result.3tz", + tilesetStages: [ + b3dmToGlbJson, + optimizeGlbJson, + separateGltfJson, + dummyJson, + ], + }; + return pipelineJson; +} + +async function runPipelineDrafts() { + const pipelineJson = createPipelineExperimentsJson(); + const pipeline = Pipelines.createPipeline(pipelineJson); + const overwrite = true; + await PipelineExecutor.executePipeline(pipeline, overwrite); +} + +runPipelineDrafts(); diff --git a/demos/PipelineExperimentsContentStages.ts b/demos/PipelineExperimentsContentStages.ts new file mode 100644 index 00000000..e194fe03 --- /dev/null +++ b/demos/PipelineExperimentsContentStages.ts @@ -0,0 +1,91 @@ +import { Pipeline } from "../src/pipelines/Pipeline"; +import { TilesetStages } from "../src/pipelines/TilesetStages"; +import { ContentStages } from "../src/pipelines/ContentStages"; +import { PipelineExecutor } from "../src/pipelines/PipelineExecutor"; +import { TilesetStage } from "../src/pipelines/TilesetStage"; + +function createPipeline(tilesetStage: TilesetStage) { + const nameSuffix = tilesetStage.name.replace(/[^\w\s]/gi, ""); + const input = "./specs/data/tilesetProcessing/contentProcessing"; + const output = + "./specs/data/output/tilesetProcessing/contentProcessing/output-" + + nameSuffix; + const pipeline: Pipeline = { + input: input, + output: output, + tilesetStages: [tilesetStage], + }; + return pipeline; +} + +async function example() { + const optimizeGlbOptions = { + dracoOptions: { + compressionLevel: 10, + }, + }; + + const tilesetStageB3dmToGlb = TilesetStages.create( + "_b3dmToGlb", + "Convert B3DM to GLB", + [ContentStages.createB3dmToGlb()] + ); + + const tilesetStageI3dmToGlb = TilesetStages.create( + "_i3dmToGlb", + "Convert I3DM to GLB", + 
[ContentStages.createI3dmToGlb()] + ); + + const tilesetStageGlbToB3dm = TilesetStages.create( + "_glbToB3dm", + "Convert GLB to B3DM", + [ContentStages.createGlbToB3dm()] + ); + + const tilesetStageGlbToI3dm = TilesetStages.create( + "_glbToI3dm", + "Convert GLB to I3DM", + [ContentStages.createGlbToI3dm()] + ); + + const tilesetStageOptimizeB3dm = TilesetStages.create( + "_optimizeB3dm", + "Optimize the GLB part of B3DM", + [ContentStages.createOptimizeB3dm(optimizeGlbOptions)] + ); + + const tilesetStageOptimizeI3dm = TilesetStages.create( + "_optimizeI3dm", + "Optimize the GLB part of I3DM", + [ContentStages.createOptimizeI3dm(optimizeGlbOptions)] + ); + + const overwrite = true; + await PipelineExecutor.executePipeline( + createPipeline(tilesetStageB3dmToGlb), + overwrite + ); + await PipelineExecutor.executePipeline( + createPipeline(tilesetStageI3dmToGlb), + overwrite + ); + await PipelineExecutor.executePipeline( + createPipeline(tilesetStageGlbToB3dm), + overwrite + ); + await PipelineExecutor.executePipeline( + createPipeline(tilesetStageGlbToI3dm), + overwrite + ); + await PipelineExecutor.executePipeline( + createPipeline(tilesetStageOptimizeB3dm), + overwrite + ); + await PipelineExecutor.executePipeline( + createPipeline(tilesetStageOptimizeI3dm), + overwrite + ); +} + +example(); diff --git a/demos/PipelineExperimentsTilesetStages.ts b/demos/PipelineExperimentsTilesetStages.ts new file mode 100644 index 00000000..9e726d0a --- /dev/null +++ b/demos/PipelineExperimentsTilesetStages.ts @@ -0,0 +1,46 @@ +import { Pipeline } from "../src/pipelines/Pipeline"; +import { TilesetStages } from "../src/pipelines/TilesetStages"; +import { ContentStages } from "../src/pipelines/ContentStages"; +import { PipelineExecutor } from "../src/pipelines/PipelineExecutor"; +import { ContentDataTypes } from "../src/contentTypes/ContentDataTypes"; + +async function example() { + const input = "./specs/data/TilesetOfTilesets"; + const output = "./output/result.3tz"; + const overwrite = true; + const optimizeGlbOptions = { + dracoOptions: { + compressionLevel: 10, + }, + }; + + const tilesetStages = [ + TilesetStages.createUpgrade(), + TilesetStages.createCombine(), + TilesetStages.create("_b3dmToGlb", "Convert B3DM to GLB", [ + ContentStages.createB3dmToGlb(), + ]), + TilesetStages.create("_optimizeGlb", "Optimize GLB", [ + ContentStages.createOptimizeGlb(optimizeGlbOptions), + ]), + TilesetStages.create("_separateGltf", "Separate glTF", [ + ContentStages.createSeparateGltf(), + ]), + TilesetStages.create("_gzip", "Gzip (glTF only)", [ + ContentStages.createGzip([ContentDataTypes.CONTENT_TYPE_GLTF]), + ]), + ]; + + const pipeline: Pipeline = { + input: input, + output: output, + tilesetStages: tilesetStages, + }; + + const pipelineJsonString = JSON.stringify(pipeline, null, 2); + console.log("Executing pipeline:\n" + pipelineJsonString); + + await PipelineExecutor.executePipeline(pipeline, overwrite); +} + +example(); diff --git a/demos/SpatialDemos.ts b/demos/SpatialDemos.ts new file mode 100644 index 00000000..5c4e620b --- /dev/null +++ b/demos/SpatialDemos.ts @@ -0,0 +1,75 @@ +import { Quadtrees } from "../src/spatial/Quadtrees"; +import { QuadtreeCoordinates } from "../src/spatial/QuadtreeCoordinates"; +import { OctreeCoordinates } from "../src/spatial/OctreeCoordinates"; + +import { TemplateUris } from "../src/implicitTiling/TemplateUris"; + +/** + * A basic demo of the `QuadtreeCoordinates.children` method + */ +function testQuadtreeChildren() { + const r = new QuadtreeCoordinates(0, 0, 0); + 
console.log("Children of " + r + ":"); + for (const c of r.children()) { + console.log(" " + c + " index " + c.toIndex() + " parent " + c.parent()); + } +} + +/** + * A basic demo of the `QuadtreeCoordinates.descendants` method + */ +function testQuadtreeDescendants() { + const r = new QuadtreeCoordinates(0, 0, 0); + const maxLevelInclusive = 3; + const depthFirst = true; + console.log("Descendants of " + r + " up to " + maxLevelInclusive + ":"); + for (const c of r.descendants(maxLevelInclusive, depthFirst)) { + console.log(" " + c + " index " + c.toIndex() + " parent " + c.parent()); + } +} + +/** + * A basic demo of the `Quadtrees.coordinatesForLevel` method + */ +function testQuadtreeLevel() { + const level = 3; + const coords = Quadtrees.coordinatesForLevel(3); + console.log("Coordinates in level " + level + ":"); + for (const c of coords) { + console.log(" " + c); + } +} + +/** + * A basic demo for the `TemplateUris.substituteQuadtree` method + */ +function testSubstituteQuadtree() { + const uri = "test-{level}-{x}-{y}"; + const c = new QuadtreeCoordinates(3, 4, 5); + const s = TemplateUris.substituteQuadtree(uri, c); + console.log("uri : " + uri); + console.log("coordinates: " + c); + console.log("result : " + s); +} + +/** + * A basic demo for the `TemplateUris.substituteOctree` method + */ +function testSubstituteOctree() { + const uri = "test-{level}-{x}-{y}-{z}"; + const c = new OctreeCoordinates(3, 4, 5, 6); + const s = TemplateUris.substituteOctree(uri, c); + console.log("uri : " + uri); + console.log("coordinates: " + c); + console.log("result : " + s); +} + +async function runDemos() { + testQuadtreeChildren(); + testQuadtreeDescendants(); + testQuadtreeLevel(); + testSubstituteQuadtree(); + testSubstituteOctree(); +} + +runDemos(); diff --git a/demos/SubtreeInfoDemos.ts b/demos/SubtreeInfoDemos.ts new file mode 100644 index 00000000..2116426e --- /dev/null +++ b/demos/SubtreeInfoDemos.ts @@ -0,0 +1,58 @@ +import path from "path"; + +import { readJsonUnchecked } from "./readJsonUnchecked"; + +import { ResourceResolvers } from "../src/io/ResourceResolvers"; + +import { QuadtreeCoordinates } from "../src/spatial/QuadtreeCoordinates"; +import { SubtreeInfos } from "../src/implicitTiling/SubtreeInfos"; + +async function testSubtreeInfo() { + // Create a `SubtreeInfo` for a valid subtree, from + // the specs data directory + const subtreeFilePath = "./specs/data/subtrees/validSubtree.json"; + const implcitTilingFilePath = + "specs/data/subtrees/validSubtreeImplicitTiling.json.input"; + const implicitTiling = await readJsonUnchecked(implcitTilingFilePath); + const subtree = await readJsonUnchecked(subtreeFilePath); + const directory = path.dirname(subtreeFilePath); + const resourceResolver = + ResourceResolvers.createFileResourceResolver(directory); + const subtreeInfo = await SubtreeInfos.createFromJson( + subtree, + implicitTiling, + resourceResolver + ); + if (!subtreeInfo) { + console.log("Could not resolve subtree data"); + return; + } + + // Print the tile availability information, accessing it by index + console.log("Tile availability from indices:"); + const tileAvailabilityInfo = subtreeInfo.tileAvailabilityInfo; + for (let i = 0; i < tileAvailabilityInfo.length; i++) { + const available = tileAvailabilityInfo.isAvailable(i); + console.log(" at index " + i + " available :" + available); + } + + // Print the tile availability information, accessing it with + // the index that is computed from QuadtreeCoordinates + console.log("Tile availability from coordinates:"); + const 
r = new QuadtreeCoordinates(0, 0, 0); + const depthFirst = false; + const maxLevelInclusive = implicitTiling.subtreeLevels - 1; + for (const c of r.descendants(maxLevelInclusive, depthFirst)) { + const index = c.toIndex(); + const available = tileAvailabilityInfo.isAvailable(index); + console.log( + " " + c + " index " + c.toIndex() + " available: " + available + ); + } +} + +async function runDemos() { + await testSubtreeInfo(); +} + +runDemos(); diff --git a/demos/TilesetProcessingDemos.ts b/demos/TilesetProcessingDemos.ts index 3a0e96bc..63bc5a6b 100644 --- a/demos/TilesetProcessingDemos.ts +++ b/demos/TilesetProcessingDemos.ts @@ -53,16 +53,16 @@ async function tilesetProcessingDemos() { ); await combineTilesets( - "./specs/data/combineTilesets/tileset.json", - "./specs/data/output/combineTilesets" + "./specs/data/combineTilesets/nestedExternal/tileset.json", + "./specs/data/output/nestedExternal/combineTilesets" ); await mergeTilesets( [ - "./specs/data/mergeTilesets/TilesetA/tileset.json", - "./specs/data/mergeTilesets/sub/TilesetA/tileset.json", + "./specs/data/mergeTilesets/basicMerge/TilesetA/tileset.json", + "./specs/data/mergeTilesets/basicMerge/sub/TilesetA/tileset.json", ], - "./specs/data/output/mergeTilesets/merged.3tz" + "./specs/data/output/mergeTilesets/basicMerge.3tz" ); } diff --git a/demos/TilesetProcessorExamples.ts b/demos/TilesetProcessorExamples.ts new file mode 100644 index 00000000..c11412a9 --- /dev/null +++ b/demos/TilesetProcessorExamples.ts @@ -0,0 +1,72 @@ +/* eslint-disable @typescript-eslint/no-unused-vars */ +import { Content } from "../src/structure/Content"; +import { Schema } from "../src/structure/Metadata/Schema"; +import { Tile } from "../src/structure/Tile"; +import { Tileset } from "../src/structure/Tileset"; + +import { TilesetEntry } from "../src/tilesetData/TilesetEntry"; + +import { BasicTilesetProcessor } from "../src/tilesetProcessing/BasicTilesetProcessor"; + +import { TraversedTile } from "../src/traversal/TraversedTile"; + +async function example() { + const tilesetSourceName = + "../3d-tiles-samples/1.1/SparseImplicitQuadtree/tileset.json"; + const tilesetTargetName = + "./output/SparseImplicitQuadtree-result/tileset.json"; + const overwrite = true; + + const tilesetProcessor = new BasicTilesetProcessor(); + await tilesetProcessor.begin(tilesetSourceName, tilesetTargetName, overwrite); + + // Apply a callback to each (explicit) `Tile` + await tilesetProcessor.forEachExplicitTile( + async (tile: Tile): Promise => { + console.log("In forEachExplicitTile"); + return; + } + ); + + // Apply a callback to each `TraversedTile` + await tilesetProcessor.forEachTile( + async (traversedTile: TraversedTile): Promise => { + console.log("In forEachTile"); + } + ); + + // Process all entries + await tilesetProcessor.processAllEntries( + async ( + sourceEntry: TilesetEntry, + type: string | undefined + ): Promise => { + console.log("In processAllEntries"); + return sourceEntry; + } + ); + + // Process all entries that are tile content + await tilesetProcessor.processTileContentEntries( + (uri: string) => uri, + async ( + sourceEntry: TilesetEntry, + type: string | undefined + ): Promise => { + console.log("In processTileContentEntries"); + return sourceEntry; + } + ); + + // Apply a callback to the tileset and its schema + await tilesetProcessor.forTileset( + async (tileset: Tileset, schema: Schema | undefined): Promise => { + console.log("In forTileset"); + return; + } + ); + + await tilesetProcessor.end(); +} + +example(); diff --git 
a/demos/TraversalDemo.ts b/demos/TraversalDemo.ts index d64624bd..86d76ce6 100644 --- a/demos/TraversalDemo.ts +++ b/demos/TraversalDemo.ts @@ -4,21 +4,23 @@ import { readJsonUnchecked } from "./readJsonUnchecked"; import { ResourceResolvers } from "../src/io/ResourceResolvers"; import { TilesetTraverser } from "../src/traversal/TilesetTraverser"; +import { TraversedTile } from "../src/traversal/TraversedTile"; async function tilesetTraversalDemo(filePath: string) { + console.log(`Traversing tileset ${filePath}`); + const directory = path.dirname(filePath); const resourceResolver = ResourceResolvers.createFileResourceResolver(directory); const tileset = await readJsonUnchecked(filePath); - // Note: External schemas are not considered here - const schema = tileset.schema; - const depthFirst = false; - console.log("Traversing tileset"); - await TilesetTraverser.traverse( + + const tilesetTraverser = new TilesetTraverser(directory, resourceResolver, { + depthFirst: false, + traverseExternalTilesets: true, + }); + await tilesetTraverser.traverse( tileset, - schema, - resourceResolver, - async (traversedTile) => { + async (traversedTile: TraversedTile) => { const contentUris = traversedTile.getFinalContents().map((c) => c.uri); const geometricError = traversedTile.asFinalTile().geometricError; console.log( @@ -28,16 +30,24 @@ async function tilesetTraversalDemo(filePath: string) { `geometricError ${geometricError}` ); return true; - }, - depthFirst + } ); console.log("Traversing tileset DONE"); } -async function runDemo() { - const tilesetFileName = - "../3d-tiles-samples/1.1/SparseImplicitQuadtree/tileset.json"; +async function runBasicDemo() { + const tilesetFileName = "./specs/data/TilesetWithUris/tileset.json"; await tilesetTraversalDemo(tilesetFileName); } +async function runExternalDemo() { + const tilesetFileName = "./specs/data/TilesetOfTilesetsWithUris/tileset.json"; + await tilesetTraversalDemo(tilesetFileName); +} + +async function runDemo() { + await runBasicDemo(); + await runExternalDemo(); +} + runDemo(); diff --git a/demos/TraversalStatsDemo.ts b/demos/TraversalStatsDemo.ts index fad50e03..7979c904 100644 --- a/demos/TraversalStatsDemo.ts +++ b/demos/TraversalStatsDemo.ts @@ -1,4 +1,3 @@ -import fs from "fs"; import path from "path"; import { readJsonUnchecked } from "./readJsonUnchecked"; @@ -7,40 +6,110 @@ import { ResourceResolvers } from "../src/io/ResourceResolvers"; import { TilesetTraverser } from "../src/traversal/TilesetTraverser"; import { TraversedTile } from "../src/traversal/TraversedTile"; +import { BufferedContentData } from "../src/contentTypes/BufferedContentData"; +import { ContentDataTypeChecks } from "../src/contentTypes/ContentDataTypeChecks"; +import { ContentDataTypes } from "../src/contentTypes/ContentDataTypes"; // A small demo that traverses a tileset, passes all // traversed tiles to a "StatsCollector" (defined below), // and creates a short JSON summary of some statistics. 
async function tilesetTraversalDemo(filePath: string) { + const statsCollector = new StatsCollector(); + + // Create a check that determines whether content + // data should count as "tile content" + const isTileFileContent = ContentDataTypeChecks.createIncludedCheck( + ContentDataTypes.CONTENT_TYPE_GLB, + ContentDataTypes.CONTENT_TYPE_B3DM, + ContentDataTypes.CONTENT_TYPE_I3DM, + ContentDataTypes.CONTENT_TYPE_CMPT, + ContentDataTypes.CONTENT_TYPE_PNTS, + ContentDataTypes.CONTENT_TYPE_GEOM, + ContentDataTypes.CONTENT_TYPE_VCTR, + ContentDataTypes.CONTENT_TYPE_GEOJSON, + ContentDataTypes.CONTENT_TYPE_GLTF + ); + + // A `TraversalCallback` that will be passed to the + // tileset traverser, and store information about the + // traversed tiles in the `StatsCollector` + const statsTraversalCallback = async (traversedTile: TraversedTile) => { + { + const indent = " ".repeat(traversedTile.level); + const contentUris = traversedTile.getFinalContents().map((c) => c.uri); + const geometricError = traversedTile.asFinalTile().geometricError; + const message = + indent + + "Level " + + traversedTile.level + + ", geometricError " + + geometricError + + ", contents " + + contentUris; + console.log(message); + } + + statsCollector.increment("totalNumberOfTiles"); + const subtreeUri = traversedTile.getSubtreeUri(); + if (subtreeUri !== undefined) { + statsCollector.increment("totalNumberOfSubtrees"); + } + if (!traversedTile.isImplicitTilesetRoot()) { + // Obtain all content URIs, resolve the associated data, + // and store the size in the "tileFileSize" summary if + // the data is one of the known tile content types + const contentUris = traversedTile.getFinalContents().map((c) => c.uri); + const tileResourceResolver = traversedTile.getResourceResolver(); + for (const contentUri of contentUris) { + const data = await tileResourceResolver.resolveData(contentUri); + if (!data) { + statsCollector.increment("unresolvableContents"); + } else { + const contentData = new BufferedContentData(contentUri, data); + const isTileFile = await isTileFileContent(contentData); + if (isTileFile) { + statsCollector.acceptEntry("tileFileSize", data.length); + statsCollector.acceptEntry( + "tileFileSize_" + traversedTile.level, + data.length + ); + } + } + } + } + + // Store the geometric error in the "geometricError" summary + const finalTile = traversedTile.asFinalTile(); + const geometricError = finalTile.geometricError; + statsCollector.acceptEntry("geometricError", geometricError); + statsCollector.acceptEntry( + "geometricError_" + traversedTile.level, + geometricError + ); + return true; + }; + // Read the tileset from the input path const directory = path.dirname(filePath); const resourceResolver = ResourceResolvers.createFileResourceResolver(directory); const tileset = await readJsonUnchecked(filePath); - // Note: External schemas are not considered here - const schema = tileset.schema; - // Traverse the tileset, and pass each tile to - // the StatsCollector + // Create the TilesetTraverser and traverse the tileset, + // passing each tile to the callback that stores the + // information in the StatsCollector console.log("Traversing tileset"); - const tilesetStatsCollector = new TilesetStatsCollector(); - const depthFirst = false; - await TilesetTraverser.traverse( - tileset, - schema, - resourceResolver, - async (traversedTile) => { - tilesetStatsCollector.accept(traversedTile); - return true; - }, - depthFirst - ); + const tilesetTraverser = new TilesetTraverser(directory, resourceResolver, { + depthFirst: false, + 
traverseExternalTilesets: true, + }); + await tilesetTraverser.traverse(tileset, statsTraversalCallback); console.log("Traversing tileset DONE"); // Print the statistics summary to the console console.log("Stats:"); - const json = tilesetStatsCollector.createJson(); + const json = statsCollector.createJson(); const jsonString = JSON.stringify(json, null, 2); console.log(jsonString); } @@ -174,43 +243,9 @@ class Summary { } } -// A specialization of the `StatsColleror` that collects -// information about the tiles that are traversed with -// a TilesetTraverser -class TilesetStatsCollector extends StatsCollector { - // Accept the given tile during traversal, and collect - // statistical information - accept(traversedTile: TraversedTile) { - this.increment("totalNumberOfTiles"); - - // NOTE: This is a means of checking whether a tile - // is the root of an implicit tileset. This may be - // refactored at some point. - if (traversedTile.getImplicitTiling()) { - this.increment("totalNumberOfSubtres"); - } else { - // Obtain all content URIs, resolve them, and obtain - // the sizes of the corresponding files, storing them - // in the "tileFileSize" summary - const contentUris = traversedTile.getFinalContents().map((c) => c.uri); - for (const contentUri of contentUris) { - const resolvedContentUri = traversedTile.resolveUri(contentUri); - const stats = fs.statSync(resolvedContentUri); - const tileFileSizeInBytes = stats.size; - this.acceptEntry("tileFileSize", tileFileSizeInBytes); - } - } - - // Store the geometric error in the "geometricError" summary - const finalTile = traversedTile.asFinalTile(); - const geometricError = finalTile.geometricError; - this.acceptEntry("geometricError", geometricError); - } -} - async function runDemo() { const tilesetFileName = - "../3d-tiles-samples/1.1/SparseImplicitQuadtree/tileset.json"; + "./specs/data/tilesetProcessing/implicitProcessing/tileset.json"; await tilesetTraversalDemo(tilesetFileName); } diff --git a/etc/3d-tiles-tools.api.md b/etc/3d-tiles-tools.api.md index b0acc469..07f4d317 100644 --- a/etc/3d-tiles-tools.api.md +++ b/etc/3d-tiles-tools.api.md @@ -203,7 +203,8 @@ export class Buffers { static getBufferPadded(buffer: Buffer, byteOffset?: number): Buffer; static getJson(buffer: Buffer): any; static getJsonBufferPadded(json: any, byteOffset?: number): Buffer; - static getMagic(buffer: Buffer, byteOffset?: number): string; + static getMagicBytes(buffer: Buffer, byteOffset: number, byteLength: number): Buffer; + static getMagicString(buffer: Buffer, byteOffset?: number): string; static getUnicodeBOMDescription(buffer: Buffer): string | undefined; static gunzip(inputBuffer: Buffer): Buffer; static gzip(inputBuffer: Buffer): Buffer; @@ -292,7 +293,7 @@ export interface ContentData { exists(): Promise; get extension(): string; getData(): Promise; - getMagic(): Promise; + getMagic(): Promise; getParsedObject(): Promise; get uri(): string; } @@ -302,6 +303,7 @@ export interface ContentData { // @internal export class ContentDataTypeRegistry { static findContentDataType(contentData: ContentData): Promise; + static findType(uri: string, data: Buffer): Promise; } // Warning: (ae-internal-missing-underscore) The name "DefaultMetadataEntityModel" should be prefixed with an underscore because the declaration is marked as @internal @@ -353,22 +355,19 @@ export class ExplicitTraversedTile implements TraversedTile { constructor(tile: Tile, path: string, level: number, parent: TraversedTile | undefined, schema: Schema | undefined, resourceResolver: 
ResourceResolver); asFinalTile(): Tile; asRawTile(): Tile; + static createRoot(root: Tile, schema: Schema | undefined, resourceResolver: ResourceResolver): TraversedTile; getChildren(): Promise; - // (undocumented) getFinalContents(): Content[]; - // (undocumented) getImplicitTiling(): TileImplicitTiling | undefined; - // (undocumented) getMetadata(): MetadataEntity | undefined; getParent(): TraversedTile | undefined; getRawContents(): Content[]; - // (undocumented) + getResourceResolver(): ResourceResolver; getSubtreeUri(): string | undefined; + isImplicitTilesetRoot(): boolean; get level(): number; get path(): string; // (undocumented) - resolveUri(uri: string): string; - // (undocumented) toString: () => string; } @@ -387,7 +386,6 @@ export class FileResourceResolver implements ResourceResolver { derive(uri: string): ResourceResolver; resolveData(uri: string): Promise; resolveDataPartial(uri: string, maxBytes: number): Promise; - resolveUri(uri: string): string; } // Warning: (ae-internal-missing-underscore) The name "Group" should be prefixed with an underscore because the declaration is marked as @internal @@ -426,23 +424,17 @@ export class ImplicitTraversedTile implements TraversedTile { asFinalTile(): Tile; asRawTile(): Tile; getChildren(): Promise; - // (undocumented) getFinalContents(): Content[]; getGlobalCoordinate(): TreeCoordinates; - // (undocumented) - getImplicitTiling(): TileImplicitTiling | undefined; getLocalCoordinate(): TreeCoordinates; - // (undocumented) - getMetadata(): MetadataEntity | undefined; getParent(): TraversedTile | undefined; getRawContents(): Content[]; - // (undocumented) + getResourceResolver(): ResourceResolver; getSubtreeUri(): string | undefined; + isImplicitTilesetRoot(): boolean; get level(): number; get path(): string; // (undocumented) - resolveUri(uri: string): string; - // (undocumented) toString: () => string; } @@ -480,7 +472,7 @@ export class Iterables { next(): IteratorResult; }; static map(iterable: IterableIterator, mapper: (element: S) => T): IterableIterator; - static overFiles(directory: string | PathLike, recurse?: boolean): IterableIterator; + static overFiles(directory: string | PathLike, recurse: boolean): IterableIterator; } // Warning: (ae-internal-missing-underscore) The name "LazyContentData" should be prefixed with an underscore because the declaration is marked as @internal @@ -491,7 +483,7 @@ export class LazyContentData implements ContentData { exists(): Promise; get extension(): string; getData(): Promise; - getMagic(): Promise; + getMagic(): Promise; getParsedObject(): Promise; get uri(): string; } @@ -761,7 +753,6 @@ export interface ResourceResolver { derive(uri: string): ResourceResolver; resolveData(uri: string): Promise; resolveDataPartial(uri: string, maxBytes: number): Promise; - resolveUri(uri: string): string; } // Warning: (ae-internal-missing-underscore) The name "ResourceResolvers" should be prefixed with an underscore because the declaration is marked as @internal @@ -1082,11 +1073,10 @@ export class TilesetSourceFs implements TilesetSource { // // @internal export class TilesetSourceResourceResolver implements ResourceResolver { - constructor(basePath: string, tilesetSourceFileName: string, tilesetSource: TilesetSource); + constructor(basePath: string, tilesetSource: TilesetSource); derive(uri: string): ResourceResolver; resolveData(uri: string): Promise; resolveDataPartial(uri: string, maxBytes: number): Promise; - resolveUri(uri: string): string; } // Warning: (ae-internal-missing-underscore) The name 
"TilesetSources" should be prefixed with an underscore because the declaration is marked as @internal @@ -1150,7 +1140,10 @@ export class TilesetTargets { // // @internal export class TilesetTraverser { - static traverse(tileset: Tileset, schema: Schema | undefined, resourceResolver: ResourceResolver, traversalCallback: TraversalCallback, depthFirst: boolean): Promise; + constructor(baseUri: string, resourceResolver: ResourceResolver, options?: TraversalOptions); + traverse(tileset: Tileset, traversalCallback: TraversalCallback): Promise; + traverseWithSchema(tileset: Tileset, schema: Schema | undefined, traversalCallback: TraversalCallback): Promise; + traverseWithSchemaAt(tile: Tile, schema: Schema | undefined, traversalCallback: TraversalCallback): Promise; } // Warning: (ae-internal-missing-underscore) The name "TraversalCallback" should be prefixed with an underscore because the declaration is marked as @internal @@ -1160,6 +1153,12 @@ export interface TraversalCallback { (traversedTile: TraversedTile): Promise; } +// @public +export type TraversalOptions = { + depthFirst?: boolean; + traverseExternalTilesets?: boolean; +}; + // Warning: (ae-internal-missing-underscore) The name "TraversedTile" should be prefixed with an underscore because the declaration is marked as @internal // // @internal @@ -1167,20 +1166,14 @@ export interface TraversedTile { asFinalTile(): Tile; asRawTile(): Tile; getChildren(): Promise; - // (undocumented) getFinalContents(): Content[]; - // (undocumented) - getImplicitTiling(): TileImplicitTiling | undefined; - // (undocumented) - getMetadata(): MetadataEntity | undefined; getParent(): TraversedTile | undefined; getRawContents(): Content[]; - // (undocumented) + getResourceResolver(): ResourceResolver; getSubtreeUri(): string | undefined; + isImplicitTilesetRoot(): boolean; get level(): number; get path(): string; - // (undocumented) - resolveUri(uri: string): string; } // Warning: (ae-internal-missing-underscore) The name "TreeCoordinates" should be prefixed with an underscore because the declaration is marked as @internal @@ -1204,7 +1197,6 @@ export class UnzippingResourceResolver implements ResourceResolver { derive(uri: string): ResourceResolver; resolveData(uri: string): Promise; resolveDataPartial(uri: string, maxBytes: number): Promise; - resolveUri(uri: string): string; } // Warning: (ae-internal-missing-underscore) The name "Uris" should be prefixed with an underscore because the declaration is marked as @internal diff --git a/specs/BasicTilesetProcessorSpec.ts b/specs/BasicTilesetProcessorSpec.ts new file mode 100644 index 00000000..0ad257e1 --- /dev/null +++ b/specs/BasicTilesetProcessorSpec.ts @@ -0,0 +1,202 @@ +/* eslint-disable @typescript-eslint/no-unused-vars */ +import fs from "fs"; + +import { SpecHelpers } from "./SpecHelpers"; +import { SpecProcessor } from "./SpecEntryProcessor"; + +import { Paths } from "../src/base/Paths"; + +import { BasicTilesetProcessor } from "../src/tilesetProcessing/BasicTilesetProcessor"; + +import { Tiles } from "../src/tilesets/Tiles"; + +import { Tile } from "../src/structure/Tile"; + +import { TraversedTile } from "../src/traversal/TraversedTile"; + +const basicInput = "./specs/data/tilesetProcessing/basicProcessing"; +const basicOutput = "./specs/data/output/tilesetProcessing/basicProcessing"; +const quiet = true; +const overwrite = true; + +/** + * Tests that verify that the `forEach...` and `process...` methods + * of the BasicTilesetProcessor visit and process the correct + * elements on explicit 
tilesets + */ +describe("BasicTilesetProcessor on explicit input", function () { + afterEach(function () { + SpecHelpers.forceDeleteDirectory(basicOutput); + }); + + it("forEachExplicitTile covers all explicit tiles", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput, overwrite); + + const actualContentUris: string[][] = []; + await tilesetProcessor.forEachExplicitTile(async (tile: Tile) => { + const contentUris = Tiles.getContentUris(tile); + actualContentUris.push(contentUris); + }); + await tilesetProcessor.end(); + + // NOTE: The order is actually not specified. + // This should be sorted lexographically for + // the comparison... + const expectedContentUris = [ + ["tileA.b3dm"], + ["tileB.b3dm", "sub/tileB.b3dm"], + ["tileC.b3dm"], + ["tileA.b3dm"], + ]; + expect(actualContentUris).toEqual(expectedContentUris); + }); + + it("forEachTile covers all tiles", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput, overwrite); + + const actualContentUris: string[][] = []; + await tilesetProcessor.forEachTile(async (traversedTile: TraversedTile) => { + const contentUris = traversedTile.getFinalContents().map((c) => c.uri); + actualContentUris.push(contentUris); + }); + await tilesetProcessor.end(); + + // NOTE: The order is actually not specified. + // This should be sorted lexographically for + // the comparison... + const expectedContentUris = [ + ["tileA.b3dm"], + ["tileB.b3dm", "sub/tileB.b3dm"], + ["tileC.b3dm"], + ["tileA.b3dm"], + ]; + expect(actualContentUris).toEqual(expectedContentUris); + }); + + it("processAllEntries processes all entries exactly once", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput, overwrite); + const specProcessor = new SpecProcessor(); + await tilesetProcessor.processAllEntries(specProcessor.processEntry); + await tilesetProcessor.end(); + + // Expect ALL files to have been processed + // (except for 'tileset.json') + const expectedProcessedKeys = [ + "README.md", + "sub/tileB.b3dm", + "tileA.b3dm", + "tileB.b3dm", + "tileC.b3dm", + ]; + const actualProcessedKeys = specProcessor.processedKeys; + actualProcessedKeys.sort(); + expect(actualProcessedKeys).toEqual(expectedProcessedKeys); + + // Expect the names of ALL files to have been modified + // (except for 'tileset.json') + const expectedOutputFiles = [ + "PROCESSED_README.md", + "PROCESSED_sub/tileB.b3dm", + "PROCESSED_tileA.b3dm", + "PROCESSED_tileB.b3dm", + "PROCESSED_tileC.b3dm", + "tileset.json", + ]; + const actualOutputFiles = SpecHelpers.collectRelativeFileNames(basicOutput); + actualOutputFiles.sort(); + expect(actualOutputFiles).toEqual(expectedOutputFiles); + }); + + it("processTileContentEntries processes the tile content entries", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput, overwrite); + const specProcessor = new SpecProcessor(); + await tilesetProcessor.processTileContentEntries( + specProcessor.processUri, + specProcessor.processEntry + ); + await tilesetProcessor.end(); + + // Expect the content files to have been processed + const expectedProcessedKeys = [ + "sub/tileB.b3dm", + "tileA.b3dm", + "tileB.b3dm", + "tileC.b3dm", + ]; + const actualProcessedKeys = specProcessor.processedKeys; + actualProcessedKeys.sort(); + 
expect(actualProcessedKeys).toEqual(expectedProcessedKeys); + + // Expect the names of content files to have been modified + const expectedOutputFiles = [ + "PROCESSED_sub/tileB.b3dm", + "PROCESSED_tileA.b3dm", + "PROCESSED_tileB.b3dm", + "PROCESSED_tileC.b3dm", + "README.md", + "tileset.json", + ]; + const actualOutputFiles = SpecHelpers.collectRelativeFileNames(basicOutput); + actualOutputFiles.sort(); + expect(actualOutputFiles).toEqual(expectedOutputFiles); + }); + + it("processTileContentEntries updates the content URIs", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput, overwrite); + const specProcessor = new SpecProcessor(); + await tilesetProcessor.processTileContentEntries( + specProcessor.processUri, + specProcessor.processEntry + ); + await tilesetProcessor.end(); + + // Ensure that the 'tileset.json' contains the + // proper content URIs for the processed output + const tilesetJsonBuffer = fs.readFileSync( + Paths.join(basicOutput, "tileset.json") + ); + const tileset = JSON.parse(tilesetJsonBuffer.toString()); + const actualContentUris = await SpecHelpers.collectExplicitContentUris( + tileset.root + ); + actualContentUris.sort(); + + const expectedContentUris = [ + "PROCESSED_sub/tileB.b3dm", + "PROCESSED_tileA.b3dm", + "PROCESSED_tileA.b3dm", + "PROCESSED_tileB.b3dm", + "PROCESSED_tileC.b3dm", + ]; + expect(actualContentUris).toEqual(expectedContentUris); + }); + + it("processAllEntries only processes unprocessed entries", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput, overwrite); + + // First, process all content entries + const contentsSpecProcessor = new SpecProcessor(); + await tilesetProcessor.processTileContentEntries( + contentsSpecProcessor.processUri, + contentsSpecProcessor.processEntry + ); + + // Now, process all remaining entries + const specProcessor = new SpecProcessor(); + await tilesetProcessor.processAllEntries(specProcessor.processEntry); + await tilesetProcessor.end(); + + // Expect only the non-content entries to have been processed + // in processAllEntries + const expectedProcessedKeys = ["README.md"]; + const actualProcessedKeys = specProcessor.processedKeys; + actualProcessedKeys.sort(); + expect(actualProcessedKeys).toEqual(expectedProcessedKeys); + }); +}); diff --git a/specs/ContentDataTypesSpec.ts b/specs/ContentDataTypesSpec.ts index 0aa97435..e58fdc3a 100644 --- a/specs/ContentDataTypesSpec.ts +++ b/specs/ContentDataTypesSpec.ts @@ -52,6 +52,34 @@ describe("ContentDataTypeRegistry.findContentDataType", function () { expect(type).toEqual(ContentDataTypes.CONTENT_TYPE_VCTR); }); + it("detects SUBT", async function () { + const contentUri = "specs/data/contentTypes/content.subtree"; + const c = BufferedContentData.create(contentUri); + const type = await ContentDataTypeRegistry.findContentDataType(c); + expect(type).toEqual(ContentDataTypes.CONTENT_TYPE_SUBT); + }); + + it("detects PNG", async function () { + const contentUri = "specs/data/contentTypes/content.png"; + const c = BufferedContentData.create(contentUri); + const type = await ContentDataTypeRegistry.findContentDataType(c); + expect(type).toEqual(ContentDataTypes.CONTENT_TYPE_PNG); + }); + + it("detects JPEG", async function () { + const contentUri = "specs/data/contentTypes/content.jpg"; + const c = BufferedContentData.create(contentUri); + const type = await ContentDataTypeRegistry.findContentDataType(c); + 
expect(type).toEqual(ContentDataTypes.CONTENT_TYPE_JPEG); + }); + + it("detects GIF", async function () { + const contentUri = "specs/data/contentTypes/content.gif"; + const c = BufferedContentData.create(contentUri); + const type = await ContentDataTypeRegistry.findContentDataType(c); + expect(type).toEqual(ContentDataTypes.CONTENT_TYPE_GIF); + }); + it("detects GEOJSON", async function () { const contentUri = "specs/data/contentTypes/content.geojson"; const c = BufferedContentData.create(contentUri); diff --git a/specs/GltfUtilitiesSpec.ts b/specs/GltfUtilitiesSpec.ts new file mode 100644 index 00000000..3ffcfccc --- /dev/null +++ b/specs/GltfUtilitiesSpec.ts @@ -0,0 +1,96 @@ +import GltfPipeline from "gltf-pipeline"; + +import { GltfUtilities } from "../src/contentProcessing/GtlfUtilities"; + +// A glTF that uses CESIUM_RTC. +// It defines two scenes +// - scene 0 with nodes 0 and 1 +// - scene 1 with nodes 2 and 3 +const inputGltfWithCesiumRtc: any = { + asset: { + version: "2.0", + }, + extensionsUsed: ["CESIUM_RTC"], + extensionsRequired: ["CESIUM_RTC"], + extensions: { + CESIUM_RTC: { + center: [123.456, 234.567, 345.678], + }, + }, + scene: 0, + scenes: [ + { + nodes: [0, 1], + }, + { + nodes: [2, 3], + }, + ], + nodes: [ + { + name: "node0", + }, + { + name: "node1", + }, + { + name: "node2", + }, + { + name: "node3", + }, + { + name: "node4", + }, + { + name: "node5", + }, + ], +}; + +describe("GltfUtilities", function () { + it("replaceCesiumRtcExtension replaces the CESIUM_RTC extension", async function () { + const inputGltf = inputGltfWithCesiumRtc; + const rtcTranslation = inputGltf.extensions["CESIUM_RTC"].center; + const options = { + keepUnusedElements: true, + }; + + // Create a GLB from the input glTF + const glbResults = await GltfPipeline.gltfToGlb(inputGltf, options); + const inputGlb = glbResults.glb; + + // Remove the RTC extension + const outputGlb = await GltfUtilities.replaceCesiumRtcExtension(inputGlb); + + // Create a glTF from the resulting GLB + const gltfResults = await GltfPipeline.glbToGltf(outputGlb, options); + const outputGltf = gltfResults.gltf; + + // There are 10 nodes, namely the 6 existing ones, plus 4 new roots + expect(outputGltf.nodes.length).toBe(10); + + // The former roots of scene 0 (nodes 0 and 1) have + // been re-parented to nodes 6 and 7 + expect(outputGltf.scenes[0].nodes).toEqual([6, 7]); + expect(outputGltf.nodes[6].children).toEqual([0]); + expect(outputGltf.nodes[7].children).toEqual([1]); + + // The former roots of scene 0 (nodes 2 and 3) have + // been re-parented to nodes 6 and 7 + expect(outputGltf.scenes[1].nodes).toEqual([8, 9]); + expect(outputGltf.nodes[8].children).toEqual([2]); + expect(outputGltf.nodes[9].children).toEqual([3]); + + // All new nodes have the RTC center as their translation + expect(outputGltf.nodes[8].translation).toEqual(rtcTranslation); + expect(outputGltf.nodes[9].translation).toEqual(rtcTranslation); + expect(outputGltf.nodes[6].translation).toEqual(rtcTranslation); + expect(outputGltf.nodes[7].translation).toEqual(rtcTranslation); + + // The extensions object and declarations have been removed + expect(outputGltf.extensions).toBeUndefined(); + expect(outputGltf.extensionsUsed).toBeUndefined(); + expect(outputGltf.extensionsRequired).toBeUndefined(); + }); +}); diff --git a/specs/ImplicitTilesetProcessorSpec.ts b/specs/ImplicitTilesetProcessorSpec.ts new file mode 100644 index 00000000..b8f82238 --- /dev/null +++ b/specs/ImplicitTilesetProcessorSpec.ts @@ -0,0 +1,138 @@ +/* eslint-disable 
@typescript-eslint/no-unused-vars */ +import fs from "fs"; + +import { SpecHelpers } from "./SpecHelpers"; +import { SpecProcessor } from "./SpecEntryProcessor"; + +import { BasicTilesetProcessor } from "../src/tilesetProcessing/BasicTilesetProcessor"; + +import { Tiles } from "../src/tilesets/Tiles"; + +import { Tile } from "../src/structure/Tile"; + +import { TraversedTile } from "../src/traversal/TraversedTile"; + +import { TilesetSources } from "../src/tilesetData/TilesetSources"; + +const implicitInput = "./specs/data/tilesetProcessing/implicitProcessing"; +const implicitOutput = + "./specs/data/output/tilesetProcessing/implicitProcessing"; +const quiet = true; +const overwrite = true; + +/** + * Tests that verify that the `forEach...` and `process...` methods + * of the BasicTilesetProcessor visit and process the correct + * elements on implicit tilesets + */ +describe("BasicTilesetProcessor on implicit input", function () { + afterEach(function () { + SpecHelpers.forceDeleteDirectory(implicitOutput); + }); + + it("forEachExplicitTile covers all explicit tiles", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(implicitInput, implicitOutput, overwrite); + + // There is only one explicit tile in the 'implicitProcessing' data + const actualContentUris: string[][] = []; + await tilesetProcessor.forEachExplicitTile(async (tile: Tile) => { + const contentUris = Tiles.getContentUris(tile); + actualContentUris.push(contentUris); + }); + await tilesetProcessor.end(); + + // NOTE: The order is actually not specified. + // This should be sorted lexographically for + // the comparison... + const expectedContentUris = [["content/content_{level}__{x}_{y}.glb"]]; + expect(actualContentUris).toEqual(expectedContentUris); + }); + + it("forEachTile covers all tiles", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(implicitInput, implicitOutput, overwrite); + + const actualContentUris: string[][] = []; + await tilesetProcessor.forEachTile(async (traversedTile: TraversedTile) => { + const contentUris = traversedTile.getFinalContents().map((c) => c.uri); + actualContentUris.push(contentUris); + }); + await tilesetProcessor.end(); + + // Just check the number of content URIs from visited tiles: + // - 1 for the implicit tiling root (with template URI as content URI) + // - 1 for the root + // - 4 for the tiles at level 1 + // - 4 for the tiles at level 2 + // - 12 for the tiles at level 3 + expect(actualContentUris.length).toEqual(22); + }); + + it("processAllEntries processes all entries exactly once", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(implicitInput, implicitOutput, overwrite); + const specProcessor = new SpecProcessor(); + await tilesetProcessor.processAllEntries(specProcessor.processEntry); + await tilesetProcessor.end(); + + const actualProcessedKeys = specProcessor.processedKeys; + const actualOutputFiles = + SpecHelpers.collectRelativeFileNames(implicitOutput); + + // Just check the number of processed entries: It should be the same + // as the number of output files, minus 1 for the 'tileset.json' + expect(actualProcessedKeys.length).toEqual(actualOutputFiles.length - 1); + }); + + it("processTileContentEntries processes the tile content entries", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(implicitInput, implicitOutput, 
overwrite); + const specProcessor = new SpecProcessor(); + await tilesetProcessor.processTileContentEntries( + specProcessor.processUri, + specProcessor.processEntry + ); + await tilesetProcessor.end(); + + const actualProcessedKeys = specProcessor.processedKeys; + + // Just check the number of processed entries: It should be the same + // as the number of tiles in the input + // - 1 for the root + // - 4 for the tiles at level 1 + // - 4 for the tiles at level 2 + // - 12 for the tiles at level 3 + expect(actualProcessedKeys.length).toEqual(21); + }); + + it("processTileContentEntries updates the content URIs", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(implicitInput, implicitOutput, overwrite); + const specProcessor = new SpecProcessor(); + await tilesetProcessor.processTileContentEntries( + specProcessor.processUri, + specProcessor.processEntry + ); + await tilesetProcessor.end(); + + // Collect all content URIs from the output tileset + const outputTilesetSource = TilesetSources.createAndOpen(implicitOutput); + const outputTileset = SpecHelpers.parseTileset(outputTilesetSource); + const actualContentUris = await SpecHelpers.collectContentUris( + outputTileset, + outputTilesetSource + ); + outputTilesetSource.close(); + + // Ensure that all content URIs have been updated + for (const contentUri of actualContentUris) { + expect(contentUri.startsWith("PROCESSED")).toBeTrue(); + } + + // Ensure that the template URI was updated + const templateUri = outputTileset.root.content?.uri; + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + expect(templateUri!.startsWith("PROCESSED")).toBeTrue(); + }); +}); diff --git a/specs/ImplicitTilingsSpec.ts b/specs/ImplicitTilingsSpec.ts new file mode 100644 index 00000000..48f01dce --- /dev/null +++ b/specs/ImplicitTilingsSpec.ts @@ -0,0 +1,309 @@ +import { ImplicitTilings } from "../src/implicitTiling/ImplicitTilings"; + +import { OctreeCoordinates } from "../src/spatial/OctreeCoordinates"; +import { QuadtreeCoordinates } from "../src/spatial/QuadtreeCoordinates"; + +import { SpecHelpers } from "./SpecHelpers"; + +function createQuadtreeImplicitTiling(subtreeLevels: number) { + const implicitTiling = { + subdivisionScheme: "QUADTREE", + subtreeLevels: subtreeLevels, + availableLevels: subtreeLevels * 2, + subtrees: { + uri: "SPEC_SUBTREES_URI", + }, + }; + return implicitTiling; +} + +function createOctreeImplicitTiling(subtreeLevels: number) { + const implicitTiling = { + subdivisionScheme: "OCTREE", + subtreeLevels: subtreeLevels, + availableLevels: subtreeLevels * 2, + subtrees: { + uri: "SPEC_SUBTREES_URI", + }, + }; + return implicitTiling; +} + +describe("ImplicitTilings", function () { + it("creates a proper iterator for QUADTREE with createSubtreeCoordinatesIterator", function () { + const implicitTiling = createQuadtreeImplicitTiling(3); + const iterator = + ImplicitTilings.createSubtreeCoordinatesIterator(implicitTiling); + const actualCoordinates = [...iterator].map((c) => c.toArray()); + + SpecHelpers.sortLexicographically(actualCoordinates); + const expectedCoordinates = [ + [0, 0, 0], + [1, 0, 0], + [1, 1, 0], + [1, 0, 1], + [1, 1, 1], + [2, 0, 0], + [2, 1, 0], + [2, 0, 1], + [2, 1, 1], + [2, 2, 0], + [2, 3, 0], + [2, 2, 1], + [2, 3, 1], + [2, 0, 2], + [2, 1, 2], + [2, 0, 3], + [2, 1, 3], + [2, 2, 2], + [2, 3, 2], + [2, 2, 3], + [2, 3, 3], + ]; + SpecHelpers.sortLexicographically(expectedCoordinates); + 
expect(actualCoordinates).toEqual(expectedCoordinates); + }); + + it("creates a proper iterator for OCTREE with createSubtreeCoordinatesIterator", function () { + const implicitTiling = createOctreeImplicitTiling(3); + const iterator = + ImplicitTilings.createSubtreeCoordinatesIterator(implicitTiling); + const actualCoordinates = [...iterator].map((c) => c.toArray()); + + SpecHelpers.sortLexicographically(actualCoordinates); + const expectedCoordinates = [ + [0, 0, 0, 0], + [1, 0, 0, 0], + [1, 1, 0, 0], + [1, 0, 1, 0], + [1, 1, 1, 0], + [1, 0, 0, 1], + [1, 1, 0, 1], + [1, 0, 1, 1], + [1, 1, 1, 1], + [2, 0, 0, 0], + [2, 1, 0, 0], + [2, 0, 1, 0], + [2, 1, 1, 0], + [2, 0, 0, 1], + [2, 1, 0, 1], + [2, 0, 1, 1], + [2, 1, 1, 1], + [2, 2, 0, 0], + [2, 3, 0, 0], + [2, 2, 1, 0], + [2, 3, 1, 0], + [2, 2, 0, 1], + [2, 3, 0, 1], + [2, 2, 1, 1], + [2, 3, 1, 1], + [2, 0, 2, 0], + [2, 1, 2, 0], + [2, 0, 3, 0], + [2, 1, 3, 0], + [2, 0, 2, 1], + [2, 1, 2, 1], + [2, 0, 3, 1], + [2, 1, 3, 1], + [2, 2, 2, 0], + [2, 3, 2, 0], + [2, 2, 3, 0], + [2, 3, 3, 0], + [2, 2, 2, 1], + [2, 3, 2, 1], + [2, 2, 3, 1], + [2, 3, 3, 1], + [2, 0, 0, 2], + [2, 1, 0, 2], + [2, 0, 1, 2], + [2, 1, 1, 2], + [2, 0, 0, 3], + [2, 1, 0, 3], + [2, 0, 1, 3], + [2, 1, 1, 3], + [2, 2, 0, 2], + [2, 3, 0, 2], + [2, 2, 1, 2], + [2, 3, 1, 2], + [2, 2, 0, 3], + [2, 3, 0, 3], + [2, 2, 1, 3], + [2, 3, 1, 3], + [2, 0, 2, 2], + [2, 1, 2, 2], + [2, 0, 3, 2], + [2, 1, 3, 2], + [2, 0, 2, 3], + [2, 1, 2, 3], + [2, 0, 3, 3], + [2, 1, 3, 3], + [2, 2, 2, 2], + [2, 3, 2, 2], + [2, 2, 3, 2], + [2, 3, 3, 2], + [2, 2, 2, 3], + [2, 3, 2, 3], + [2, 2, 3, 3], + [2, 3, 3, 3], + ]; + SpecHelpers.sortLexicographically(expectedCoordinates); + expect(actualCoordinates).toEqual(expectedCoordinates); + }); + + it("throws an error for non-positive subtree levels in createSubtreeCoordinatesIterator", function () { + const implicitTiling = createQuadtreeImplicitTiling(0); + expect(function () { + ImplicitTilings.createSubtreeCoordinatesIterator(implicitTiling); + }).toThrowError(); + }); + + it("computes the right number of nodes for QUADTREE with computeNumberOfNodesPerSubtree", function () { + const implicitTiling1 = createQuadtreeImplicitTiling(1); + const implicitTiling2 = createQuadtreeImplicitTiling(2); + const implicitTiling4 = createQuadtreeImplicitTiling(4); + const implicitTiling8 = createQuadtreeImplicitTiling(8); + + const actualNumberOfNodes1 = + ImplicitTilings.computeNumberOfNodesPerSubtree(implicitTiling1); + const actualNumberOfNodes2 = + ImplicitTilings.computeNumberOfNodesPerSubtree(implicitTiling2); + const actualNumberOfNodes4 = + ImplicitTilings.computeNumberOfNodesPerSubtree(implicitTiling4); + const actualNumberOfNodes8 = + ImplicitTilings.computeNumberOfNodesPerSubtree(implicitTiling8); + + const expectedNumberOfNodes1 = 1; + const expectedNumberOfNodes2 = 5; + const expectedNumberOfNodes4 = 85; + const expectedNumberOfNodes8 = 21845; + + expect(actualNumberOfNodes1).toEqual(expectedNumberOfNodes1); + expect(actualNumberOfNodes2).toEqual(expectedNumberOfNodes2); + expect(actualNumberOfNodes4).toEqual(expectedNumberOfNodes4); + expect(actualNumberOfNodes8).toEqual(expectedNumberOfNodes8); + }); + it("computes the right number of nodes for OCTREE with computeNumberOfNodesPerSubtree", function () { + const implicitTiling1 = createOctreeImplicitTiling(1); + const implicitTiling2 = createOctreeImplicitTiling(2); + const implicitTiling4 = createOctreeImplicitTiling(4); + const implicitTiling8 = createOctreeImplicitTiling(8); + + const 
actualNumberOfNodes1 = + ImplicitTilings.computeNumberOfNodesPerSubtree(implicitTiling1); + const actualNumberOfNodes2 = + ImplicitTilings.computeNumberOfNodesPerSubtree(implicitTiling2); + const actualNumberOfNodes4 = + ImplicitTilings.computeNumberOfNodesPerSubtree(implicitTiling4); + const actualNumberOfNodes8 = + ImplicitTilings.computeNumberOfNodesPerSubtree(implicitTiling8); + + const expectedNumberOfNodes1 = 1; + const expectedNumberOfNodes2 = 9; + const expectedNumberOfNodes4 = 585; + const expectedNumberOfNodes8 = 2396745; + + expect(actualNumberOfNodes1).toEqual(expectedNumberOfNodes1); + expect(actualNumberOfNodes2).toEqual(expectedNumberOfNodes2); + expect(actualNumberOfNodes4).toEqual(expectedNumberOfNodes4); + expect(actualNumberOfNodes8).toEqual(expectedNumberOfNodes8); + }); + + it("throws an error for non-positive subtree levels in computeNumberOfNodesPerSubtree", function () { + const implicitTiling = createQuadtreeImplicitTiling(0); + expect(function () { + ImplicitTilings.computeNumberOfNodesPerSubtree(implicitTiling); + }).toThrowError(); + }); + + it("computes the right number of nodes for QUADTREE with computeNumberOfNodesInLevel", function () { + const implicitTiling = createQuadtreeImplicitTiling(8); + + const actualNumberOfNodes0 = ImplicitTilings.computeNumberOfNodesInLevel( + implicitTiling, + 0 + ); + const actualNumberOfNodes1 = ImplicitTilings.computeNumberOfNodesInLevel( + implicitTiling, + 1 + ); + const actualNumberOfNodes2 = ImplicitTilings.computeNumberOfNodesInLevel( + implicitTiling, + 2 + ); + const actualNumberOfNodes4 = ImplicitTilings.computeNumberOfNodesInLevel( + implicitTiling, + 4 + ); + + const expectedNumberOfNodes0 = 1; + const expectedNumberOfNodes1 = 4; + const expectedNumberOfNodes2 = 16; + const expectedNumberOfNodes4 = 256; + + expect(actualNumberOfNodes0).toEqual(expectedNumberOfNodes0); + expect(actualNumberOfNodes1).toEqual(expectedNumberOfNodes1); + expect(actualNumberOfNodes2).toEqual(expectedNumberOfNodes2); + expect(actualNumberOfNodes4).toEqual(expectedNumberOfNodes4); + }); + it("computes the right number of nodes for OCTREE with computeNumberOfNodesInLevel", function () { + const implicitTiling = createOctreeImplicitTiling(8); + + const actualNumberOfNodes0 = ImplicitTilings.computeNumberOfNodesInLevel( + implicitTiling, + 0 + ); + const actualNumberOfNodes1 = ImplicitTilings.computeNumberOfNodesInLevel( + implicitTiling, + 1 + ); + const actualNumberOfNodes2 = ImplicitTilings.computeNumberOfNodesInLevel( + implicitTiling, + 2 + ); + const actualNumberOfNodes4 = ImplicitTilings.computeNumberOfNodesInLevel( + implicitTiling, + 4 + ); + + const expectedNumberOfNodes0 = 1; + const expectedNumberOfNodes1 = 8; + const expectedNumberOfNodes2 = 64; + const expectedNumberOfNodes4 = 4096; + + expect(actualNumberOfNodes0).toEqual(expectedNumberOfNodes0); + expect(actualNumberOfNodes1).toEqual(expectedNumberOfNodes1); + expect(actualNumberOfNodes2).toEqual(expectedNumberOfNodes2); + expect(actualNumberOfNodes4).toEqual(expectedNumberOfNodes4); + }); + + it("throws an error for negative level in computeNumberOfNodesInLevel", function () { + const implicitTiling = createQuadtreeImplicitTiling(3); + expect(function () { + ImplicitTilings.computeNumberOfNodesInLevel(implicitTiling, -1); + }).toThrowError(); + }); + + it("computes the right coordinates for QUADTREE in globalizeCoordinates", function () { + const implicitTiling = createQuadtreeImplicitTiling(3); + const actualCoordinates = ImplicitTilings.globalizeCoordinates( + 
implicitTiling, + new QuadtreeCoordinates(2, 1, 0), + new QuadtreeCoordinates(1, 2, 1) + ); + + const expectedCoordinates = new QuadtreeCoordinates(3, 4, 1); + expect(actualCoordinates.toArray()).toEqual(expectedCoordinates.toArray()); + }); + + it("computes the right coordinates for QUADTREE in globalizeCoordinates", function () { + const implicitTiling = createOctreeImplicitTiling(3); + const actualCoordinates = ImplicitTilings.globalizeCoordinates( + implicitTiling, + new OctreeCoordinates(2, 1, 0, 1), + new OctreeCoordinates(1, 2, 1, 2) + ); + + const expectedCoordinates = new OctreeCoordinates(3, 4, 1, 4); + expect(actualCoordinates.toArray()).toEqual(expectedCoordinates.toArray()); + }); +}); diff --git a/specs/LazyContentDataSpec.ts b/specs/LazyContentDataSpec.ts index 5d45ae29..9406a6d1 100644 --- a/specs/LazyContentDataSpec.ts +++ b/specs/LazyContentDataSpec.ts @@ -1,44 +1,24 @@ -import { DeveloperError } from "../src/base/DeveloperError"; +/* eslint-disable @typescript-eslint/no-unused-vars */ + import { LazyContentData } from "../src/contentTypes/LazyContentData"; import { ResourceResolver } from "../src/io/ResourceResolver"; -class SpecResourceResolver implements ResourceResolver { - private readonly dataMap: { [key: string]: Buffer } = {}; - - putData(uri: string, buffer: Buffer) { - this.dataMap[uri] = buffer; - } +import { TilesetSourceResourceResolver } from "../src/io/TilesetSourceResourceResolver"; +import { TilesetInMemory } from "../src/tilesetData/TilesetInMemory"; - // eslint-disable-next-line @typescript-eslint/no-unused-vars - resolveUri(uri: string): string { - throw new DeveloperError("Not supposed to be called."); - } - async resolveData(uri: string): Promise { - const data = this.dataMap[uri] as Buffer; - if (!data) { - return null; - } - return data; - } - async resolveDataPartial( - uri: string, - maxBytes: number - ): Promise { - const data = this.dataMap[uri] as Buffer; - if (!data) { - return null; - } - return data.subarray(0, maxBytes); - } - // eslint-disable-next-line @typescript-eslint/no-unused-vars - derive(uri: string): ResourceResolver { - throw new DeveloperError("Not supposed to be called."); - } +function createTestResourceResolver(): ResourceResolver { + const tilesetSource = new TilesetInMemory(); + tilesetSource.open(""); + const resourceResolver = new TilesetSourceResourceResolver( + ".", + tilesetSource + ); + return resourceResolver; } describe("LazyContentData", function () { it("does not read data at construction", function () { - const resourceResolver = new SpecResourceResolver(); + const resourceResolver = createTestResourceResolver(); const resolveDataSpy = spyOn( resourceResolver, "resolveData" @@ -60,20 +40,18 @@ describe("LazyContentData", function () { jasmine.any(Number) ); - // eslint-disable-next-line @typescript-eslint/no-unused-vars const magic = contentData.getMagic(); expect(resolveDataSpy).toHaveBeenCalledTimes(0); expect(resolveDataPartialSpy).toHaveBeenCalledTimes(2); expect(resolveDataPartialSpy).toHaveBeenCalledWith("example.glb", 4); - // eslint-disable-next-line @typescript-eslint/no-unused-vars const data = contentData.getData(); expect(resolveDataSpy).toHaveBeenCalledTimes(1); expect(resolveDataPartialSpy).toHaveBeenCalledTimes(2); }); it("reads only a few bytes for getMagic", function () { - const resourceResolver = new SpecResourceResolver(); + const resourceResolver = createTestResourceResolver(); const resolveDataSpy = spyOn( resourceResolver, "resolveData" @@ -87,7 +65,6 @@ describe("LazyContentData", 
function () { expect(resolveDataSpy).toHaveBeenCalledTimes(0); expect(resolveDataPartialSpy).toHaveBeenCalledTimes(0); - // eslint-disable-next-line @typescript-eslint/no-unused-vars const magic = contentData.getMagic(); expect(resolveDataSpy).toHaveBeenCalledTimes(0); expect(resolveDataPartialSpy).toHaveBeenCalledTimes(1); @@ -95,7 +72,7 @@ describe("LazyContentData", function () { }); it("reads the data only once", async function () { - const resourceResolver = new SpecResourceResolver(); + const resourceResolver = createTestResourceResolver(); const resolveDataSpy = spyOn( resourceResolver, "resolveData" @@ -109,13 +86,9 @@ describe("LazyContentData", function () { expect(resolveDataSpy).toHaveBeenCalledTimes(0); expect(resolveDataPartialSpy).toHaveBeenCalledTimes(0); - // eslint-disable-next-line @typescript-eslint/no-unused-vars const data0 = await contentData.getData(); - // eslint-disable-next-line @typescript-eslint/no-unused-vars const object0 = await contentData.getParsedObject(); - // eslint-disable-next-line @typescript-eslint/no-unused-vars const data1 = await contentData.getData(); - // eslint-disable-next-line @typescript-eslint/no-unused-vars const object1 = await contentData.getParsedObject(); expect(resolveDataSpy).toHaveBeenCalledTimes(1); diff --git a/specs/PackageTilesetProcessorSpec.ts b/specs/PackageTilesetProcessorSpec.ts new file mode 100644 index 00000000..c684ef80 --- /dev/null +++ b/specs/PackageTilesetProcessorSpec.ts @@ -0,0 +1,145 @@ +import { SpecHelpers } from "./SpecHelpers"; + +import { BasicTilesetProcessor } from "../src/tilesetProcessing/BasicTilesetProcessor"; + +const basicInput = "./specs/data/tilesetProcessing/basicProcessing"; +const basicInput3tz = "./specs/data/tilesetProcessing/basicProcessing.3tz"; +const basicInput3dtiles = + "./specs/data/tilesetProcessing/basicProcessing.3dtiles"; + +const implicitInput = "./specs/data/tilesetProcessing/implicitProcessing"; +const implicitInput3tz = + "./specs/data/tilesetProcessing/implicitProcessing.3tz"; +const implicitInput3dtiles = + "./specs/data/tilesetProcessing/implicitProcessing.3dtiles"; + +const basicOutput = "./specs/data/output/tilesetProcessing/basicProcessing"; +const basicOutput3tz = + "./specs/data/output/tilesetProcessing/basicProcessing.3tz"; +const basicOutput3dtiles = + "./specs/data/output/tilesetProcessing/basicProcessing.3dtiles"; + +const implicitOutput = + "./specs/data/output/tilesetProcessing/implicitProcessing"; +const implicitOutput3tz = + "./specs/data/output/tilesetProcessing/implicitProcessing.3tz"; +const implicitOutput3dtiles = + "./specs/data/output/tilesetProcessing/implicitProcessing.3dtiles"; + +const quiet = true; +const overwrite = true; + +/** + * Tests that verify that the BasicTilesetProcessor operates properly + * when using packages as input or output + */ +describe("BasicTilesetProcessor on packages", function () { + afterEach(function () { + SpecHelpers.forceDeleteDirectory(basicOutput); + SpecHelpers.forceDeleteDirectory(implicitOutput); + }); + + it("writes basic from directories to 3TZ", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput3tz, overwrite); + await tilesetProcessor.end(); + + const difference = SpecHelpers.computePackageDifference( + basicInput3tz, + basicOutput3tz + ); + expect(difference).toBeUndefined(); + }); + + it("writes basic from 3TZ to directories", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await 
tilesetProcessor.begin(basicInput3tz, basicOutput, overwrite); + await tilesetProcessor.end(); + + const difference = SpecHelpers.computePackageDifference( + basicInput, + basicOutput + ); + expect(difference).toBeUndefined(); + }); + + it("writes basic from directories to 3DTILES", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput3dtiles, overwrite); + await tilesetProcessor.end(); + + const difference = SpecHelpers.computePackageDifference( + basicInput3dtiles, + basicOutput3dtiles + ); + expect(difference).toBeUndefined(); + }); + + it("writes basic from 3DTILES to directories", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput3dtiles, basicOutput, overwrite); + await tilesetProcessor.end(); + + const difference = SpecHelpers.computePackageDifference( + basicInput, + basicOutput + ); + expect(difference).toBeUndefined(); + }); + + it("writes implicit from directories to 3TZ", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(implicitInput, implicitOutput3tz, overwrite); + await tilesetProcessor.end(); + + const difference = SpecHelpers.computePackageDifference( + implicitInput3tz, + implicitOutput3tz + ); + expect(difference).toBeUndefined(); + }); + + it("writes implicit from 3TZ to directories", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(implicitInput3tz, implicitOutput, overwrite); + await tilesetProcessor.end(); + + const difference = SpecHelpers.computePackageDifference( + implicitInput, + implicitOutput + ); + expect(difference).toBeUndefined(); + }); + + it("writes implicit from directories to 3DTILES", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin( + implicitInput, + implicitOutput3dtiles, + overwrite + ); + await tilesetProcessor.end(); + + const difference = SpecHelpers.computePackageDifference( + implicitInput3dtiles, + implicitOutput3dtiles + ); + expect(difference).toBeUndefined(); + }); + + it("writes implicit from 3DTILES to directories", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin( + implicitInput3dtiles, + implicitOutput, + overwrite + ); + await tilesetProcessor.end(); + + const difference = SpecHelpers.computePackageDifference( + implicitInput, + implicitOutput + ); + expect(difference).toBeUndefined(); + }); +}); diff --git a/specs/SpecEntryProcessor.ts b/specs/SpecEntryProcessor.ts new file mode 100644 index 00000000..177c6266 --- /dev/null +++ b/specs/SpecEntryProcessor.ts @@ -0,0 +1,32 @@ +/* eslint-disable @typescript-eslint/no-unused-vars */ + +import { TilesetEntry } from "../src/tilesetData/TilesetEntry"; + +/** + * Utility class for processing tileset entries for the specs. + * + * It offers "dummy" methods for + * - modification of the URIs (file names) + * - content processing (just changing the file name) + * and stores all processed source entry names so that + * the exact set of processed entries may be checked + * in the tests. 
+ */ +export class SpecProcessor { + processedKeys: string[] = []; + + processUri = (uri: string) => { + return "PROCESSED_" + uri; + }; + + processEntry = async ( + sourceEntry: TilesetEntry, + type: string | undefined + ) => { + this.processedKeys.push(sourceEntry.key); + return { + key: this.processUri(sourceEntry.key), + value: sourceEntry.value, + }; + }; +} diff --git a/specs/SpecHelpers.ts b/specs/SpecHelpers.ts new file mode 100644 index 00000000..bff97517 --- /dev/null +++ b/specs/SpecHelpers.ts @@ -0,0 +1,269 @@ +import fs from "fs"; + +import { Iterables } from "../src/base/Iterables"; +import { Paths } from "../src/base/Paths"; +import { DeveloperError } from "../src/base/DeveloperError"; + +import { Tile } from "../src/structure/Tile"; +import { Tileset } from "../src/structure/Tileset"; + +import { Tiles } from "../src/tilesets/Tiles"; + +import { TilesetSourceResourceResolver } from "../src/io/TilesetSourceResourceResolver"; + +import { TilesetTraverser } from "../src/traversal/TilesetTraverser"; + +import { TilesetSource } from "../src/tilesetData/TilesetSource"; +import { TilesetSources } from "../src/tilesetData/TilesetSources"; + +/** + * Utility methods for the specs + */ +export class SpecHelpers { + /** + * Returns the given byte length, padded if necessary to + * be a multiple of 8 + * + * @param byteLength - The byte length + * @returns The padded byte length + */ + static getPaddedByteLength(byteLength: number): number { + const boundary = 8; + const remainder = byteLength % boundary; + const padding = remainder === 0 ? 0 : boundary - remainder; + return byteLength + padding; + } + + /** + * Forcefully deletes the given directory and all its contents + * and subdirectories. Be careful. + * + * @param directory - The directory to delete + */ + static forceDeleteDirectory(directory: string) { + fs.rmSync(directory, { + force: true, + recursive: true, + }); + } + + /** + * Returns an array that contains the names of all files in + * the given directory and its subdirectories, relative to + * the given directory (with `/` as the path separator), + * in unspecified order. + * + * @param directory - The directory + * @returns The relative file names + */ + static collectRelativeFileNames(directory: string): string[] { + const allFiles = Iterables.overFiles(directory, true); + const relativeFiles = Iterables.map(allFiles, (file: string) => + Paths.relativize(directory, file) + ); + return Array.from(relativeFiles); + } + + /** + * Collect all content URIs that appear in the given tile or + * any of its descendants, in unspecified order. + * + * @param startTile - The start tile + * @returns A promise to all content URIs + */ + static async collectExplicitContentUris(startTile: Tile) { + const allContentUris: string[] = []; + await Tiles.traverseExplicit(startTile, async (tiles: Tile[]) => { + const tile = tiles[tiles.length - 1]; + const contentUris = Tiles.getContentUris(tile); + allContentUris.push(...contentUris); + return true; + }); + return allContentUris; + } + + /** + * Collect all content URIs (excluding possible template URIs in + * implicit tiling roots) that appear in the given tileset, in + * unspecified order. 
+ * + * @param tileset - The tileset + * @param tilesetSource - The tileset source + * @returns A promise to all content URIs + */ + static async collectContentUris( + tileset: Tileset, + tilesetSource: TilesetSource + ) { + const resourceResolver = new TilesetSourceResourceResolver( + ".", + tilesetSource + ); + const tilesetTraverser = new TilesetTraverser(".", resourceResolver, { + depthFirst: false, + traverseExternalTilesets: true, + }); + const allContentUris: string[] = []; + await tilesetTraverser.traverse(tileset, async (traversedTile) => { + if (!traversedTile.isImplicitTilesetRoot()) { + const contentUris = traversedTile.getFinalContents().map((c) => c.uri); + allContentUris.push(...contentUris); + } + return true; + }); + return allContentUris; + } + + /** + * Parse the tileset from the 'tileset.json' in the given source + * + * @param tilesetSource - The tileset source + * @returns The tileset + * @throws DeveloperError if the tileset could not be read + */ + static parseTileset(tilesetSource: TilesetSource) { + const tilesetJsonBuffer = tilesetSource.getValue("tileset.json"); + if (!tilesetJsonBuffer) { + throw new DeveloperError("No tileset.json found in input"); + } + try { + const tileset = JSON.parse(tilesetJsonBuffer.toString()) as Tileset; + return tileset; + } catch (e) { + throw new DeveloperError(`${e}`); + } + } + + /** + * Returns whether the specified packages are equal. + * + * This means that they contain the same keys, and the + * keys are mapped to the same values. + * + * @param nameA - The first package name + * @param nameB - The second package name + * @returns A string describing the difference, or `undefined` + * if there is no difference. + */ + static computePackageDifference( + nameA: string, + nameB: string + ): string | undefined { + const tilesetSourceA = TilesetSources.createAndOpen(nameA); + const tilesetSourceB = TilesetSources.createAndOpen(nameB); + const result = SpecHelpers.computePackageDifferenceInternal( + nameA, + tilesetSourceA, + nameB, + tilesetSourceB + ); + tilesetSourceA.close(); + tilesetSourceB.close(); + return result; + } + + /** + * Returns whether the specified packages are equal. + * + * This means that they contain the same keys, and the + * keys are mapped to the same values. + * + * Entries that end in `.json` will be parsed and strigified + * for the comparison (to handle formatting differences), + * whereas other entries will be treated as "binary", and + * their values will be compared byte-wise. + * + * @param nameA - The first package name + * @param tilesetSourceA - The first package + * @param nameB - The second package name + * @param tilesetSourceB - The second package + * @returns A string describing the difference, or `undefined` + * if there is no difference. 
+   */
+  static computePackageDifferenceInternal(
+    nameA: string,
+    tilesetSourceA: TilesetSource,
+    nameB: string,
+    tilesetSourceB: TilesetSource
+  ): string | undefined {
+    const keysA = [...tilesetSourceA.getKeys()];
+    const keysB = [...tilesetSourceB.getKeys()];
+
+    if (keysA.length != keysB.length) {
+      return `There are ${keysA.length} keys in ${nameA} and ${keysB.length} keys in ${nameB}`;
+    }
+    for (let i = 0; i < keysA.length; i++) {
+      if (keysA[i] != keysB[i]) {
+        return `Key ${i} is ${keysA[i]} in ${nameA} and ${keysB[i]} in ${nameB}`;
+      }
+    }
+    for (let i = 0; i < keysA.length; i++) {
+      const valueA = tilesetSourceA.getValue(keysA[i]);
+      const valueB = tilesetSourceB.getValue(keysB[i]);
+      if (valueA && valueB) {
+        if (keysA[i].endsWith(".json")) {
+          const jsonA = JSON.parse(valueA.toString());
+          const jsonB = JSON.parse(valueB.toString());
+          const stringA = JSON.stringify(jsonA);
+          const stringB = JSON.stringify(jsonB);
+          if (stringA !== stringB) {
+            return `Value ${keysA[i]} has different JSON contents in ${nameA} and in ${nameB}`;
+          }
+        } else {
+          if (valueA?.length != valueB?.length) {
+            return `Value ${keysA[i]} has ${valueA?.length} bytes in ${nameA} and ${valueB?.length} bytes in ${nameB}`;
+          }
+          const n = valueA.length;
+          for (let j = 0; j < n; j++) {
+            if (valueA[j] != valueB[j]) {
+              return `Value ${keysA[i]} has ${valueA[j]} at index ${j} in ${nameA} but ${valueB[j]} in ${nameB}`;
+            }
+          }
+        }
+      }
+    }
+  }
+
+  /**
+   * Compares two arrays lexicographically.
+   *
+   * When the arrays have different lengths, then the shorter
+   * one will be "padded" with elements that are smaller than
+   * all other elements in the other array.
+   *
+   * @param a - The first array
+   * @param b - The second array
+   * @returns The result of the comparison
+   */
+  private static compareLexicographically(a: number[], b: number[]) {
+    const n = Math.min(a.length, b.length);
+    for (let i = 0; i < n; i++) {
+      const d = a[i] - b[i];
+      if (d !== 0) {
+        return d;
+      }
+    }
+    if (a.length < b.length) {
+      return -1;
+    }
+    if (a.length > b.length) {
+      return 1;
+    }
+    return 0;
+  }
+
+  /**
+   * Sorts a 2D array lexicographically, in place.
+   *
+   * When two elements have different lengths, then the shorter
+   * one will be "padded" with elements that are smaller than
+   * all other elements in the other array.
+ * + * @param array - The array + * @returns The array + */ + static sortLexicographically(array: number[][]) { + array.sort(SpecHelpers.compareLexicographically); + return array; + } +} diff --git a/specs/TileFormatsSpec.ts b/specs/TileFormatsSpec.ts index bcb2e12f..45017fdb 100644 --- a/specs/TileFormatsSpec.ts +++ b/specs/TileFormatsSpec.ts @@ -2,7 +2,7 @@ import fs from "fs"; import { Buffers } from "../src/base/Buffers"; import { TileFormats } from "../src/tileFormats/TileFormats"; -import { SpecHelpers } from "./legacy/SpecHelpers"; +import { SpecHelpers } from "./SpecHelpers"; describe("TileFormats", function () { it("reads B3DM (deprecated 1) from a buffer", function () { @@ -271,7 +271,7 @@ describe("TileFormats", function () { ]); const cmpt = TileFormats.createCompositeTileDataBuffer(cmptTileData); - const magic = Buffers.getMagic(cmpt); + const magic = Buffers.getMagicString(cmpt); const version = cmpt.readUInt32LE(4); const byteLength = cmpt.readUInt32LE(8); const tilesLength = cmpt.readUInt32LE(12); @@ -288,12 +288,15 @@ describe("TileFormats", function () { expect(byteLength).toBe(expectedByteLength); expect(tilesLength).toBe(2); - const b3dmMagic = Buffers.getMagic(cmpt, headerByteLength); + const b3dmMagic = Buffers.getMagicString(cmpt, headerByteLength); const b3dmByteLength = cmpt.readUInt32LE(headerByteLength + 8); expect(b3dmMagic).toBe("b3dm"); expect(b3dmByteLength % 8 === 0).toBe(true); // b3dm is aligned - const i3dmMagic = Buffers.getMagic(cmpt, headerByteLength + b3dmByteLength); + const i3dmMagic = Buffers.getMagicString( + cmpt, + headerByteLength + b3dmByteLength + ); const i3dmByteLength = cmpt.readUInt32LE( headerByteLength + b3dmByteLength + 8 ); diff --git a/specs/TilesetCombinerSpec.ts b/specs/TilesetCombinerSpec.ts new file mode 100644 index 00000000..82ed7ea9 --- /dev/null +++ b/specs/TilesetCombinerSpec.ts @@ -0,0 +1,56 @@ +import fs from "fs"; + +import { Paths } from "../src/base/Paths"; + +import { Tilesets } from "../src/tilesets/Tilesets"; + +import { SpecHelpers } from "./SpecHelpers"; + +describe("TilesetCombiner", function () { + it("combines external tilesets into a single tileset", async function () { + const tilesetSourceName = "./specs/data/combineTilesets/nestedExternal"; + const tilesetTargetName = + "./specs/data/output/combineTilesets/nestedExternal"; + const overwrite = true; + + await Tilesets.combine(tilesetSourceName, tilesetTargetName, overwrite); + + // Ensure that the output directory contains the expected files: + // All files of the input, except for the external tileset JSON files + const actualRelativeFiles = + SpecHelpers.collectRelativeFileNames(tilesetTargetName); + actualRelativeFiles.sort(); + const expectedRelativeFiles = [ + "README.md", + "sub0/sub01/tileD.b3dm", + "sub0/tileC.b3dm", + "sub1/sub10/tileF.b3dm", + "sub1/tileE.b3dm", + "tileA.b3dm", + "tileB.b3dm", + "tileset.json", + ]; + expect(actualRelativeFiles).toEqual(expectedRelativeFiles); + + // Ensure that the single 'tileset.json' contains the + // proper content URIs for the combined output + const tilesetJsonBuffer = fs.readFileSync( + Paths.join(tilesetTargetName, "tileset.json") + ); + const tileset = JSON.parse(tilesetJsonBuffer.toString()); + const actualContentUris = await SpecHelpers.collectExplicitContentUris( + tileset.root + ); + actualContentUris.sort(); + + const expectedContentUris = [ + "sub0/sub01/tileD.b3dm", + "sub0/tileC.b3dm", + "sub1/sub10/tileF.b3dm", + "sub1/tileE.b3dm", + "tileA.b3dm", + "tileB.b3dm", + ]; + 
expect(actualContentUris).toEqual(expectedContentUris); + }); +}); diff --git a/specs/TilesetMergerSpec.ts b/specs/TilesetMergerSpec.ts new file mode 100644 index 00000000..931a3658 --- /dev/null +++ b/specs/TilesetMergerSpec.ts @@ -0,0 +1,61 @@ +import fs from "fs"; + +import { Paths } from "../src/base/Paths"; + +import { Tilesets } from "../src/tilesets/Tilesets"; + +import { SpecHelpers } from "./SpecHelpers"; + +describe("TilesetMerger", function () { + it("merges tilesets into a single tileset", async function () { + const tilesetSourceNames = [ + "./specs/data/mergeTilesets/basicMerge/TilesetA/tileset.json", + "./specs/data/mergeTilesets/basicMerge/sub/TilesetA/tileset.json", + ]; + const tilesetTargetName = "./specs/data/output/mergeTilesets/basicMerge"; + const overwrite = true; + + await Tilesets.merge(tilesetSourceNames, tilesetTargetName, overwrite); + + // Ensure that the output directory contains the expected files: + // All files of the input, disambiguated for the same base name + // (i.e. "TilesetA" and "TilesetA-0" - this is not specified, + // but has to be assumed here) + const actualRelativeFiles = + SpecHelpers.collectRelativeFileNames(tilesetTargetName); + actualRelativeFiles.sort(); + const expectedRelativeFiles = [ + "TilesetA-0/ll.b3dm", + "TilesetA-0/lr.b3dm", + "TilesetA-0/parent.b3dm", + "TilesetA-0/tileset.json", + "TilesetA-0/ul.b3dm", + "TilesetA-0/ur.b3dm", + "TilesetA/ll.b3dm", + "TilesetA/lr.b3dm", + "TilesetA/parent.b3dm", + "TilesetA/tileset.json", + "TilesetA/ul.b3dm", + "TilesetA/ur.b3dm", + "tileset.json", + ]; + expect(actualRelativeFiles).toEqual(expectedRelativeFiles); + + // Ensure that the single 'tileset.json' contains the + // proper content URIs for the external tilesets: + const tilesetJsonBuffer = fs.readFileSync( + Paths.join(tilesetTargetName, "tileset.json") + ); + const tileset = JSON.parse(tilesetJsonBuffer.toString()); + const actualContentUris = await SpecHelpers.collectExplicitContentUris( + tileset.root + ); + actualContentUris.sort(); + + const expectedContentUris = [ + "TilesetA-0/tileset.json", + "TilesetA/tileset.json", + ]; + expect(actualContentUris).toEqual(expectedContentUris); + }); +}); diff --git a/specs/TilesetProcessorSpec.ts b/specs/TilesetProcessorSpec.ts new file mode 100644 index 00000000..4db24b70 --- /dev/null +++ b/specs/TilesetProcessorSpec.ts @@ -0,0 +1,85 @@ +import { BasicTilesetProcessor } from "../src/tilesetProcessing/BasicTilesetProcessor"; + +import { SpecHelpers } from "./SpecHelpers"; + +const basicInput = "./specs/data/tilesetProcessing/basicProcessing"; +const basicOutput = "./specs/data/output/tilesetProcessing/basicProcessing"; +const quiet = true; +const overwrite = true; + +/** + * Tests for the base functionality of the (abstract) TilesetProcessor + * base class, using its only concrete implementation, namely the + * BasicTilesetProcessor + */ +describe("TilesetProcessor", function () { + afterEach(function () { + SpecHelpers.forceDeleteDirectory(basicOutput); + }); + + it("throws when trying to call 'begin' with invalid path", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await expectAsync( + (async function () { + await tilesetProcessor.begin( + basicInput + "_INVALID", + basicOutput, + overwrite + ); + })() + //^ This () is important to really CALL the anonymous function + // and return a promise. 
+ ).toBeRejectedWithError(); + }); + + it("throws when trying to call 'begin' twice", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput, overwrite); + await expectAsync( + (async function () { + await tilesetProcessor.begin(basicInput, basicOutput, overwrite); + })() + //^ This () is important to really CALL the anonymous function + // and return a promise. + ).toBeRejectedWithError(); + }); + + it("throws when trying to call 'end' without 'begin'", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await expectAsync( + (async function () { + await tilesetProcessor.end(); + })() + //^ This () is important to really CALL the anonymous function + // and return a promise. + ).toBeRejectedWithError(); + }); + + it("throws when trying to call 'end' twice", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput, overwrite); + await tilesetProcessor.end(); + await expectAsync( + (async function () { + await tilesetProcessor.end(); + })() + //^ This () is important to really CALL the anonymous function + // and return a promise. + ).toBeRejectedWithError(); + }); + + it("performs a 'no-op' of just copying the data when when no other functions are called", async function () { + const tilesetProcessor = new BasicTilesetProcessor(quiet); + await tilesetProcessor.begin(basicInput, basicOutput, overwrite); + await tilesetProcessor.end(); + + // Ensure that the output directory contains exactly the + // input files + const relativeInputFiles = SpecHelpers.collectRelativeFileNames(basicInput); + relativeInputFiles.sort(); + const relativeOutputFiles = + SpecHelpers.collectRelativeFileNames(basicOutput); + relativeOutputFiles.sort(); + expect(relativeOutputFiles).toEqual(relativeInputFiles); + }); +}); diff --git a/specs/TilesetSourceSpec.ts b/specs/TilesetSourceSpec.ts index 01ef567c..c73512ea 100644 --- a/specs/TilesetSourceSpec.ts +++ b/specs/TilesetSourceSpec.ts @@ -1,24 +1,39 @@ import { TilesetSource } from "../src/tilesetData/TilesetSource"; import { TilesetSourceFs } from "../src/tilesetData/TilesetSourceFs"; +import { TilesetInMemory } from "../src/tilesetData/TilesetInMemory"; + import { TilesetSource3tz } from "../src/packages/TilesetSource3tz"; import { TilesetSource3dtiles } from "../src/packages/TilesetSource3dtiles"; +async function createTilesetInMemory() { + const tileset = new TilesetInMemory(); + tileset.begin("", true); + tileset.addEntry("tileset.json", Buffer.alloc(0)); + await tileset.end(); + return tileset; +} + // The basic contract that is established by the `TilesetSource` // interface is checked for these implementations: const testCases = [ { description: "TilesetSourceFs", - constructorFunction: TilesetSourceFs, + creationFunction: () => new TilesetSourceFs(), sourceName: "./specs/data/Tileset/", }, { description: "TilesetSource3tz", - constructorFunction: TilesetSource3tz, + creationFunction: () => new TilesetSource3tz(), sourceName: "./specs/data/tileset.3tz", }, { description: "TilesetSource3dtiles", - constructorFunction: TilesetSource3dtiles, + creationFunction: () => new TilesetSource3dtiles(), + sourceName: "./specs/data/tileset.3dtiles", + }, + { + description: "TilesetInMemory", + creationFunction: createTilesetInMemory, sourceName: "./specs/data/tileset.3dtiles", }, ]; @@ -28,8 +43,8 @@ for (const testCase of testCases) { let tilesetSource: TilesetSource; let 
sourceName: string; - beforeEach(function () { - tilesetSource = new testCase.constructorFunction(); + beforeEach(async function () { + tilesetSource = await testCase.creationFunction(); sourceName = testCase.sourceName; }); diff --git a/specs/TilesetTargetSpec.ts b/specs/TilesetTargetSpec.ts index beffca7c..ef478542 100644 --- a/specs/TilesetTargetSpec.ts +++ b/specs/TilesetTargetSpec.ts @@ -1,5 +1,7 @@ import { TilesetTarget } from "../src/tilesetData/TilesetTarget"; import { TilesetTargetFs } from "../src/tilesetData/TilesetTargetFs"; +import { TilesetInMemory } from "../src/tilesetData/TilesetInMemory"; + import { TilesetTarget3tz } from "../src/packages/TilesetTarget3tz"; import { TilesetTarget3dtiles } from "../src/packages/TilesetTarget3dtiles"; @@ -8,19 +10,24 @@ import { TilesetTarget3dtiles } from "../src/packages/TilesetTarget3dtiles"; const testCases = [ { description: "TilesetTargetFs", - constructorFunction: TilesetTargetFs, + creationFunction: () => new TilesetTargetFs(), targetName: "./specs/data/output/Tileset/", }, { description: "TilesetTarget3tz", - constructorFunction: TilesetTarget3tz, + creationFunction: () => new TilesetTarget3tz(), targetName: "./specs/data/output/tileset.3tz", }, { description: "TilesetTarget3dtiles", - constructorFunction: TilesetTarget3dtiles, + creationFunction: () => new TilesetTarget3dtiles(), targetName: "./specs/data/output/tileset.3dtiles", }, + { + description: "TilesetInMemory", + creationFunction: () => new TilesetInMemory(), + targetName: "", + }, ]; for (const testCase of testCases) { @@ -28,8 +35,8 @@ for (const testCase of testCases) { let tilesetTarget: TilesetTarget; let targetName: string; - beforeEach(function () { - tilesetTarget = new testCase.constructorFunction(); + beforeEach(async function () { + tilesetTarget = testCase.creationFunction(); targetName = testCase.targetName; }); diff --git a/specs/TilesetUpgraderSpec.ts b/specs/TilesetUpgraderSpec.ts index 420ba627..91baa252 100644 --- a/specs/TilesetUpgraderSpec.ts +++ b/specs/TilesetUpgraderSpec.ts @@ -20,10 +20,18 @@ const unitBoundingBox = { // - The `refine` value must be given in uppercase // - The `content.url` must be a `content.uri` // (checked for both `content` and `contents`) +// - The `extensionsUsed` and `extensionsRequired` should no +// longer contain `3DTILES_content_gltf` const inputTilesetJsonRaw: unknown = { asset: { version: "0.0", }, + extensionsUsed: [ + "3DTILES_content_gltf", + "EXAMPLE_extension_A", + "EXAMPLE_extension_B", + ], + extensionsRequired: ["3DTILES_content_gltf", "EXAMPLE_extension_A"], geometricError: 2.0, root: { boundingVolume: unitBoundingBox, @@ -85,4 +93,14 @@ describe("TilesetUpgrader", function () { expect(tileset.root.children![0].refine).toBe("ADD"); expect(tileset.root.children![1].refine).toBeUndefined(); }); + + it("removes the unnecessary extension declaration for 3DTILES_content_gltf", async function () { + const tilesetUpgrader = new TilesetUpgrader(quiet); + + const tileset = JSON.parse(inputTilesetJsonString) as Tileset; + await tilesetUpgrader.upgradeTileset(tileset); + + expect(tileset.extensionsUsed).not.toContain("3DTILES_content_gltf"); + expect(tileset.extensionsRequired).not.toContain("3DTILES_content_gltf"); + }); }); diff --git a/specs/data/mergeTilesets/TilesetA/ll.b3dm b/specs/data/TilesetWithUris/ll.b3dm similarity index 100% rename from specs/data/mergeTilesets/TilesetA/ll.b3dm rename to specs/data/TilesetWithUris/ll.b3dm diff --git a/specs/data/mergeTilesets/TilesetA/lr.b3dm 
b/specs/data/TilesetWithUris/lr.b3dm similarity index 100% rename from specs/data/mergeTilesets/TilesetA/lr.b3dm rename to specs/data/TilesetWithUris/lr.b3dm diff --git a/specs/data/mergeTilesets/TilesetA/parent.b3dm b/specs/data/TilesetWithUris/parent.b3dm similarity index 100% rename from specs/data/mergeTilesets/TilesetA/parent.b3dm rename to specs/data/TilesetWithUris/parent.b3dm diff --git a/specs/data/TilesetWithUris/tileset.json b/specs/data/TilesetWithUris/tileset.json new file mode 100644 index 00000000..ef60cd7b --- /dev/null +++ b/specs/data/TilesetWithUris/tileset.json @@ -0,0 +1,118 @@ +{ + "asset": { + "version": "1.0", + "tilesetVersion": "1.2.3" + }, + "properties": { + "id": { + "minimum": 0, + "maximum": 9 + }, + "Longitude": { + "minimum": -1.3197192952275933, + "maximum": -1.319644104024109 + }, + "Latitude": { + "minimum": 0.698848878034009, + "maximum": 0.6989046192460953 + }, + "Height": { + "minimum": 6.161747192963958, + "maximum": 84.83180232718587 + } + }, + "geometricError": 240, + "root": { + "boundingVolume": { + "region": [ + -1.3197209591796106, + 0.6988424218, + -1.3196390408203893, + 0.6989055782, + 0, + 88 + ] + }, + "geometricError": 70, + "refine": "ADD", + "content": { + "uri": "parent.b3dm", + "boundingVolume": { + "region": [ + -1.3197004795898053, + 0.6988582109, + -1.3196595204101946, + 0.6988897891, + 0, + 88 + ] + } + }, + "children": [ + { + "boundingVolume": { + "region": [ + -1.3197209591796106, + 0.6988424218, + -1.31968, + 0.698874, + 0, + 20 + ] + }, + "geometricError": 0, + "content": { + "uri": "ll.b3dm" + } + }, + { + "boundingVolume": { + "region": [ + -1.31968, + 0.6988424218, + -1.3196390408203893, + 0.698874, + 0, + 20 + ] + }, + "geometricError": 0, + "content": { + "uri": "lr.b3dm" + } + }, + { + "boundingVolume": { + "region": [ + -1.31968, + 0.698874, + -1.3196390408203893, + 0.6989055782, + 0, + 20 + ] + }, + "geometricError": 0, + "content": { + "uri": "ur.b3dm" + } + }, + { + "boundingVolume": { + "region": [ + -1.3197209591796106, + 0.698874, + -1.31968, + 0.6989055782, + 0, + 20 + ] + }, + "geometricError": 0, + "content": { + "uri": "ul.b3dm" + } + } + ] + } +} diff --git a/specs/data/mergeTilesets/TilesetA/ul.b3dm b/specs/data/TilesetWithUris/ul.b3dm similarity index 100% rename from specs/data/mergeTilesets/TilesetA/ul.b3dm rename to specs/data/TilesetWithUris/ul.b3dm diff --git a/specs/data/mergeTilesets/TilesetA/ur.b3dm b/specs/data/TilesetWithUris/ur.b3dm similarity index 100% rename from specs/data/mergeTilesets/TilesetA/ur.b3dm rename to specs/data/TilesetWithUris/ur.b3dm diff --git a/specs/data/combineTilesets/README.md b/specs/data/combineTilesets/nestedExternal/README.md similarity index 100% rename from specs/data/combineTilesets/README.md rename to specs/data/combineTilesets/nestedExternal/README.md diff --git a/specs/data/combineTilesets/externalA.json b/specs/data/combineTilesets/nestedExternal/externalA.json similarity index 100% rename from specs/data/combineTilesets/externalA.json rename to specs/data/combineTilesets/nestedExternal/externalA.json diff --git a/specs/data/combineTilesets/sub0/externalB.json b/specs/data/combineTilesets/nestedExternal/sub0/externalB.json similarity index 100% rename from specs/data/combineTilesets/sub0/externalB.json rename to specs/data/combineTilesets/nestedExternal/sub0/externalB.json diff --git a/specs/data/combineTilesets/sub0/sub01/tileD.b3dm b/specs/data/combineTilesets/nestedExternal/sub0/sub01/tileD.b3dm similarity index 100% rename from 
specs/data/combineTilesets/sub0/sub01/tileD.b3dm rename to specs/data/combineTilesets/nestedExternal/sub0/sub01/tileD.b3dm diff --git a/specs/data/combineTilesets/sub0/tileC.b3dm b/specs/data/combineTilesets/nestedExternal/sub0/tileC.b3dm similarity index 100% rename from specs/data/combineTilesets/sub0/tileC.b3dm rename to specs/data/combineTilesets/nestedExternal/sub0/tileC.b3dm diff --git a/specs/data/combineTilesets/sub1/externalC.json b/specs/data/combineTilesets/nestedExternal/sub1/externalC.json similarity index 100% rename from specs/data/combineTilesets/sub1/externalC.json rename to specs/data/combineTilesets/nestedExternal/sub1/externalC.json diff --git a/specs/data/combineTilesets/sub1/sub10/tileF.b3dm b/specs/data/combineTilesets/nestedExternal/sub1/sub10/tileF.b3dm similarity index 100% rename from specs/data/combineTilesets/sub1/sub10/tileF.b3dm rename to specs/data/combineTilesets/nestedExternal/sub1/sub10/tileF.b3dm diff --git a/specs/data/combineTilesets/sub1/tileE.b3dm b/specs/data/combineTilesets/nestedExternal/sub1/tileE.b3dm similarity index 100% rename from specs/data/combineTilesets/sub1/tileE.b3dm rename to specs/data/combineTilesets/nestedExternal/sub1/tileE.b3dm diff --git a/specs/data/combineTilesets/tileA.b3dm b/specs/data/combineTilesets/nestedExternal/tileA.b3dm similarity index 100% rename from specs/data/combineTilesets/tileA.b3dm rename to specs/data/combineTilesets/nestedExternal/tileA.b3dm diff --git a/specs/data/combineTilesets/tileB.b3dm b/specs/data/combineTilesets/nestedExternal/tileB.b3dm similarity index 100% rename from specs/data/combineTilesets/tileB.b3dm rename to specs/data/combineTilesets/nestedExternal/tileB.b3dm diff --git a/specs/data/combineTilesets/tileset.json b/specs/data/combineTilesets/nestedExternal/tileset.json similarity index 100% rename from specs/data/combineTilesets/tileset.json rename to specs/data/combineTilesets/nestedExternal/tileset.json diff --git a/specs/data/contentTypes/README.md b/specs/data/contentTypes/README.md index d3694305..5bdbeaa4 100644 --- a/specs/data/contentTypes/README.md +++ b/specs/data/contentTypes/README.md @@ -4,6 +4,10 @@ Tile content files used in the specs. 
- The `content.3tz` is a 3TZ file with a minimal, valid `tileset.json` - The `content.gltf` is the (embedded) `Triangle` sample model from https://github.com/KhronosGroup/glTF-Sample-Models/blob/8e9a5a6ad1a2790e2333e3eb48a1ee39f9e0e31b/2.0/ - The `content.glb` is the `Box` sample model from https://github.com/KhronosGroup/glTF-Sample-Models/blob/8e9a5a6ad1a2790e2333e3eb48a1ee39f9e0e31b/2.0/ +- The `content.subt` is the subtree file from https://github.com/CesiumGS/3d-tiles-samples/blob/902ea3dca1821a9ef9d23d141f800c68627c452b/1.1/SparseImplicitQuadtree/subtrees/0.0.0.subtree +- The `content.png` is a 1x1 PNG image +- The `content.jpg` is a 1x1 JPEG image +- The `content.gif` is a 1x1 GIF image The other files are taken from https://github.com/CesiumGS/cesium/tree/c0ec95713b6cde5a91eea320795c84408159dcad/Apps/SampleData/Cesium3DTiles diff --git a/specs/data/contentTypes/content.gif b/specs/data/contentTypes/content.gif new file mode 100644 index 00000000..c71e11fb Binary files /dev/null and b/specs/data/contentTypes/content.gif differ diff --git a/specs/data/contentTypes/content.jpg b/specs/data/contentTypes/content.jpg new file mode 100644 index 00000000..dfee859b Binary files /dev/null and b/specs/data/contentTypes/content.jpg differ diff --git a/specs/data/contentTypes/content.png b/specs/data/contentTypes/content.png new file mode 100644 index 00000000..818c71d0 Binary files /dev/null and b/specs/data/contentTypes/content.png differ diff --git a/specs/data/contentTypes/content.subtree b/specs/data/contentTypes/content.subtree new file mode 100644 index 00000000..b9120399 Binary files /dev/null and b/specs/data/contentTypes/content.subtree differ diff --git a/specs/data/mergeTilesets/sub/TilesetA/ll.b3dm b/specs/data/mergeTilesets/basicMerge/TilesetA/ll.b3dm similarity index 100% rename from specs/data/mergeTilesets/sub/TilesetA/ll.b3dm rename to specs/data/mergeTilesets/basicMerge/TilesetA/ll.b3dm diff --git a/specs/data/mergeTilesets/sub/TilesetA/lr.b3dm b/specs/data/mergeTilesets/basicMerge/TilesetA/lr.b3dm similarity index 100% rename from specs/data/mergeTilesets/sub/TilesetA/lr.b3dm rename to specs/data/mergeTilesets/basicMerge/TilesetA/lr.b3dm diff --git a/specs/data/mergeTilesets/sub/TilesetA/parent.b3dm b/specs/data/mergeTilesets/basicMerge/TilesetA/parent.b3dm similarity index 100% rename from specs/data/mergeTilesets/sub/TilesetA/parent.b3dm rename to specs/data/mergeTilesets/basicMerge/TilesetA/parent.b3dm diff --git a/specs/data/mergeTilesets/TilesetA/tileset.json b/specs/data/mergeTilesets/basicMerge/TilesetA/tileset.json similarity index 100% rename from specs/data/mergeTilesets/TilesetA/tileset.json rename to specs/data/mergeTilesets/basicMerge/TilesetA/tileset.json diff --git a/specs/data/mergeTilesets/sub/TilesetA/ul.b3dm b/specs/data/mergeTilesets/basicMerge/TilesetA/ul.b3dm similarity index 100% rename from specs/data/mergeTilesets/sub/TilesetA/ul.b3dm rename to specs/data/mergeTilesets/basicMerge/TilesetA/ul.b3dm diff --git a/specs/data/mergeTilesets/sub/TilesetA/ur.b3dm b/specs/data/mergeTilesets/basicMerge/TilesetA/ur.b3dm similarity index 100% rename from specs/data/mergeTilesets/sub/TilesetA/ur.b3dm rename to specs/data/mergeTilesets/basicMerge/TilesetA/ur.b3dm diff --git a/specs/data/mergeTilesets/basicMerge/sub/TilesetA/ll.b3dm b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/ll.b3dm new file mode 100644 index 00000000..dbd0d214 Binary files /dev/null and b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/ll.b3dm differ diff --git 
a/specs/data/mergeTilesets/basicMerge/sub/TilesetA/lr.b3dm b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/lr.b3dm new file mode 100644 index 00000000..f052e259 Binary files /dev/null and b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/lr.b3dm differ diff --git a/specs/data/mergeTilesets/basicMerge/sub/TilesetA/parent.b3dm b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/parent.b3dm new file mode 100644 index 00000000..81db7f15 Binary files /dev/null and b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/parent.b3dm differ diff --git a/specs/data/mergeTilesets/sub/TilesetA/tileset.json b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/tileset.json similarity index 100% rename from specs/data/mergeTilesets/sub/TilesetA/tileset.json rename to specs/data/mergeTilesets/basicMerge/sub/TilesetA/tileset.json diff --git a/specs/data/mergeTilesets/basicMerge/sub/TilesetA/ul.b3dm b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/ul.b3dm new file mode 100644 index 00000000..a3860471 Binary files /dev/null and b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/ul.b3dm differ diff --git a/specs/data/mergeTilesets/basicMerge/sub/TilesetA/ur.b3dm b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/ur.b3dm new file mode 100644 index 00000000..24a6c3e4 Binary files /dev/null and b/specs/data/mergeTilesets/basicMerge/sub/TilesetA/ur.b3dm differ diff --git a/specs/data/pipelines/examplePipeline.json b/specs/data/pipelines/examplePipeline.json new file mode 100644 index 00000000..2418f382 --- /dev/null +++ b/specs/data/pipelines/examplePipeline.json @@ -0,0 +1,62 @@ +{ + "input": "./specs/data/TilesetOfTilesets", + "output": "./output/result", + "tilesetStages": [ + { + "name": "upgrade", + "description": "Upgrade the input tileset to the latest version" + }, + { + "name": "combine", + "description": "Combine all external tilesets into one" + }, + { + "name": "_b3dmToGlb", + "description": "Convert B3DM to GLB", + "contentStages": [ + { + "name": "b3dmToGlb", + "description": "Convert each B3DM content into GLB" + } + ] + }, + { + "name": "_optimizeGlb", + "description": "Optimize GLB", + "contentStages": [ + { + "name": "optimizeGlb", + "description": "Apply gltf-pipeline to each GLB content, with the given options", + "options": { + "dracoOptions": { + "compressionLevel": 10 + } + } + } + ] + }, + { + "name": "_separateGltf", + "description": "Separate glTF", + "contentStages": [ + { + "name": "separateGltf", + "description": "Convert each GLB content into a .gltf file with separate resources" + } + ] + }, + { + "name": "_gzip", + "description": "Gzip (glTF only)", + "contentStages": [ + { + "name": "gzip", + "description": "Compresses each entry with GZIP", + "includedContentTypes": [ + "CONTENT_TYPE_GLTF" + ] + } + ] + } + ] +} diff --git a/specs/data/pipelines/simplePipeline.json b/specs/data/pipelines/simplePipeline.json new file mode 100644 index 00000000..e8dbaf95 --- /dev/null +++ b/specs/data/pipelines/simplePipeline.json @@ -0,0 +1,16 @@ +{ + "input": "./specs/data/TilesetOfTilesetsWithUris", + "output": "./output/TilesetOfTilesetsWithUris.3tz", + "tilesetStages": [ + { + "name": "B3DM to GLB", + "description": "Convert B3DM to GLB", + "contentStages": [ + { + "name": "b3dmToGlb", + "description": "Convert each B3DM content into GLB" + } + ] + } + ] +} \ No newline at end of file diff --git a/specs/data/subtrees/validBuffer.bin b/specs/data/subtrees/validBuffer.bin new file mode 100644 index 00000000..694af876 Binary files /dev/null and b/specs/data/subtrees/validBuffer.bin differ diff 
--git a/specs/data/subtrees/validSubtree.json b/specs/data/subtrees/validSubtree.json new file mode 100644 index 00000000..eba990ae --- /dev/null +++ b/specs/data/subtrees/validSubtree.json @@ -0,0 +1,10 @@ +{ + "buffers": [{ "uri": "validBuffer.bin", "byteLength": 16 }], + "bufferViews": [ + { "buffer": 0, "byteOffset": 0, "byteLength": 3 }, + { "buffer": 0, "byteOffset": 8, "byteLength": 8 } + ], + "tileAvailability": { "bitstream": 0, "availableCount": 7 }, + "contentAvailability": [{ "availableCount": 0, "constant": 0 }], + "childSubtreeAvailability": { "bitstream": 1, "availableCount": 8 } +} diff --git a/specs/data/subtrees/validSubtreeImplicitTiling.json.input b/specs/data/subtrees/validSubtreeImplicitTiling.json.input new file mode 100644 index 00000000..61869d4d --- /dev/null +++ b/specs/data/subtrees/validSubtreeImplicitTiling.json.input @@ -0,0 +1,8 @@ +{ + "subdivisionScheme": "QUADTREE", + "subtreeLevels": 3, + "availableLevels": 6, + "subtrees": { + "uri": "subtrees/{level}.{x}.{y}.subtree" + } +} diff --git a/specs/data/tilesetProcessing/basicProcessing.3dtiles b/specs/data/tilesetProcessing/basicProcessing.3dtiles new file mode 100644 index 00000000..6d2d1090 Binary files /dev/null and b/specs/data/tilesetProcessing/basicProcessing.3dtiles differ diff --git a/specs/data/tilesetProcessing/basicProcessing.3tz b/specs/data/tilesetProcessing/basicProcessing.3tz new file mode 100644 index 00000000..4963350f Binary files /dev/null and b/specs/data/tilesetProcessing/basicProcessing.3tz differ diff --git a/specs/data/tilesetProcessing/basicProcessing/README.md b/specs/data/tilesetProcessing/basicProcessing/README.md new file mode 100644 index 00000000..305feaca --- /dev/null +++ b/specs/data/tilesetProcessing/basicProcessing/README.md @@ -0,0 +1,19 @@ + +A basic tileset consisting of dummy data, with a structure +suitable for testing various aspects of the tileset +processing. + +The root tile has content and multiple children: one child +contains multiple contents (with the same file name, but +located in different directories), one content appears +twice in different tiles (even though that may not make +sense, it is intended for tests), and there is a README.md +that is unrelated to the tileset itself.
+ +root.content.uri = tileA.b3dm +root.children[0].contents[0].uri = tileB.b3dm +root.children[0].contents[1].uri = sub/tileB.b3dm +root.children[1].content.uri = tileC.b3dm +root.children[2].content.uri = tileA.b3dm + + diff --git a/specs/data/tilesetProcessing/basicProcessing/sub/tileB.b3dm b/specs/data/tilesetProcessing/basicProcessing/sub/tileB.b3dm new file mode 100644 index 00000000..98ed9397 Binary files /dev/null and b/specs/data/tilesetProcessing/basicProcessing/sub/tileB.b3dm differ diff --git a/specs/data/tilesetProcessing/basicProcessing/tileA.b3dm b/specs/data/tilesetProcessing/basicProcessing/tileA.b3dm new file mode 100644 index 00000000..98ed9397 Binary files /dev/null and b/specs/data/tilesetProcessing/basicProcessing/tileA.b3dm differ diff --git a/specs/data/tilesetProcessing/basicProcessing/tileB.b3dm b/specs/data/tilesetProcessing/basicProcessing/tileB.b3dm new file mode 100644 index 00000000..98ed9397 Binary files /dev/null and b/specs/data/tilesetProcessing/basicProcessing/tileB.b3dm differ diff --git a/specs/data/tilesetProcessing/basicProcessing/tileC.b3dm b/specs/data/tilesetProcessing/basicProcessing/tileC.b3dm new file mode 100644 index 00000000..98ed9397 Binary files /dev/null and b/specs/data/tilesetProcessing/basicProcessing/tileC.b3dm differ diff --git a/specs/data/tilesetProcessing/basicProcessing/tileset.json b/specs/data/tilesetProcessing/basicProcessing/tileset.json new file mode 100644 index 00000000..b8bf3b4a --- /dev/null +++ b/specs/data/tilesetProcessing/basicProcessing/tileset.json @@ -0,0 +1,47 @@ +{ + "asset" : { + "version" : "1.1" + }, + "geometricError" : 4.0, + "root" : { + "boundingVolume" : { + "box" : [ 0.5, 0.5, 0.5, 0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.5 ] + }, + "content": { + "uri": "tileA.b3dm" + }, + "geometricError" : 2.0, + "children": [ + { + "boundingVolume" : { + "box" : [ 0.5, 0.5, 0.5, 0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.5 ] + }, + "geometricError" : 1.0, + "contents": [ + { + "uri": "tileB.b3dm" + }, + { + "uri": "sub/tileB.b3dm" + } + ] + }, { + "boundingVolume" : { + "box" : [ 0.5, 0.5, 0.5, 0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.5 ] + }, + "geometricError" : 1.0, + "content": { + "uri": "tileC.b3dm" + } + }, { + "boundingVolume" : { + "box" : [ 0.5, 0.5, 0.5, 0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.5 ] + }, + "geometricError" : 1.0, + "content": { + "uri": "tileA.b3dm" + } + } + ] + } +} \ No newline at end of file diff --git a/specs/data/tilesetProcessing/contentProcessing/README.md b/specs/data/tilesetProcessing/contentProcessing/README.md new file mode 100644 index 00000000..a6d603e3 --- /dev/null +++ b/specs/data/tilesetProcessing/contentProcessing/README.md @@ -0,0 +1,12 @@ + +A dummy tileset for the processing tests. 
+ +It contains B3DM, I3DM and GLB content, and is used for basic +tests of the content processing stages, namely + +- `b3dmToGlb` +- `i3dmToGlb` +- `glbToB3dm` +- `glbToI3dm` +- `optimizeB3dm` +- `optimizeI3dm` diff --git a/specs/data/tilesetProcessing/contentProcessing/b3dmContent.b3dm b/specs/data/tilesetProcessing/contentProcessing/b3dmContent.b3dm new file mode 100644 index 00000000..8cb95895 Binary files /dev/null and b/specs/data/tilesetProcessing/contentProcessing/b3dmContent.b3dm differ diff --git a/specs/data/tilesetProcessing/contentProcessing/glbContent.glb b/specs/data/tilesetProcessing/contentProcessing/glbContent.glb new file mode 100644 index 00000000..95ec886b Binary files /dev/null and b/specs/data/tilesetProcessing/contentProcessing/glbContent.glb differ diff --git a/specs/data/tilesetProcessing/contentProcessing/i3dmContent.i3dm b/specs/data/tilesetProcessing/contentProcessing/i3dmContent.i3dm new file mode 100644 index 00000000..b0f0bef8 Binary files /dev/null and b/specs/data/tilesetProcessing/contentProcessing/i3dmContent.i3dm differ diff --git a/specs/data/tilesetProcessing/contentProcessing/tileset.json b/specs/data/tilesetProcessing/contentProcessing/tileset.json new file mode 100644 index 00000000..a4be4137 --- /dev/null +++ b/specs/data/tilesetProcessing/contentProcessing/tileset.json @@ -0,0 +1,39 @@ +{ + "asset" : { + "version" : "1.1" + }, + "geometricError" : 4.0, + "root" : { + "boundingVolume" : { + "box" : [ 0.5, 0.5, 0.5, 0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.5 ] + }, + "geometricError" : 2.0, + "children": [ + { + "boundingVolume" : { + "box" : [ 0.5, 0.5, 0.5, 0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.5 ] + }, + "geometricError" : 1.0, + "content": { + "uri": "b3dmContent.b3dm" + } + }, { + "boundingVolume" : { + "box" : [ 0.5, 0.5, 0.5, 0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.5 ] + }, + "geometricError" : 1.0, + "content": { + "uri": "i3dmContent.i3dm" + } + }, { + "boundingVolume" : { + "box" : [ 0.5, 0.5, 0.5, 0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.5 ] + }, + "geometricError" : 1.0, + "content": { + "uri": "glbContent.glb" + } + } + ] + } +} \ No newline at end of file diff --git a/specs/data/tilesetProcessing/implicitProcessing.3dtiles b/specs/data/tilesetProcessing/implicitProcessing.3dtiles new file mode 100644 index 00000000..3268a78e Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing.3dtiles differ diff --git a/specs/data/tilesetProcessing/implicitProcessing.3tz b/specs/data/tilesetProcessing/implicitProcessing.3tz new file mode 100644 index 00000000..14b9bcdf Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing.3tz differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/ImplicitProcessing.gif b/specs/data/tilesetProcessing/implicitProcessing/ImplicitProcessing.gif new file mode 100644 index 00000000..720b6c3a Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/ImplicitProcessing.gif differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/README.md b/specs/data/tilesetProcessing/implicitProcessing/README.md new file mode 100644 index 00000000..3275a806 --- /dev/null +++ b/specs/data/tilesetProcessing/implicitProcessing/README.md @@ -0,0 +1,24 @@ + +An example tileset for basic tests of the processing functionality +for implicit tilesets. 
+ +It is an implicit quadtree tileset with 4 levels (2 subtree levels), +with content being available at `level, x, y`: + +- 3, 3, 2 +- 3, 4, 2 +- 3, 2, 3 +- 3, 3, 3 +- 3, 4, 3 +- 3, 5, 3 +- 3, 2, 4 +- 3, 3, 4 +- 3, 4, 4 +- 3, 5, 4 +- 3, 3, 5 +- 3, 4, 5 + +forming a small (low-resolution) "circle" in the center, and +content being available for all ancestor tiles of these +coordinates. + diff --git a/specs/data/tilesetProcessing/implicitProcessing/SandcastleCode.js b/specs/data/tilesetProcessing/implicitProcessing/SandcastleCode.js new file mode 100644 index 00000000..a04ac29b --- /dev/null +++ b/specs/data/tilesetProcessing/implicitProcessing/SandcastleCode.js @@ -0,0 +1,31 @@ +const viewer = new Cesium.Viewer("cesiumContainer"); + +// Create the tileset in the viewer +const tileset = viewer.scene.primitives.add( + new Cesium.Cesium3DTileset({ + url: "http://localhost:8003/tileset.json", + debugShowBoundingVolume: true, + maximumScreenSpaceError: 512, + }) +); + +// Move the tileset to a certain position on the globe, +// and scale it up +const transform = Cesium.Transforms.eastNorthUpToFixedFrame( + Cesium.Cartesian3.fromDegrees(-75.152408, 39.946975, 1) +); +const scale = 15.0; +tileset.modelMatrix = Cesium.Matrix4.multiplyByUniformScale( + transform, + scale, + new Cesium.Matrix4() +); + +// Zoom to the tileset, with a small offset so that +// it is fully visible +const offset = new Cesium.HeadingPitchRange( + 0, + Cesium.Math.toRadians(-90), + 40.0 +); +viewer.zoomTo(tileset, offset); \ No newline at end of file diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_0__0_0.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_0__0_0.glb new file mode 100644 index 00000000..daf4f72b Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_0__0_0.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_1__0_0.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_1__0_0.glb new file mode 100644 index 00000000..0dfb61d5 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_1__0_0.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_1__0_1.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_1__0_1.glb new file mode 100644 index 00000000..0ed74c5b Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_1__0_1.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_1__1_0.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_1__1_0.glb new file mode 100644 index 00000000..ed46cf43 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_1__1_0.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_1__1_1.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_1__1_1.glb new file mode 100644 index 00000000..33674c6a Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_1__1_1.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_2__1_1.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_2__1_1.glb new file mode 100644 index 00000000..4eccfdb3 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_2__1_1.glb differ diff --git
a/specs/data/tilesetProcessing/implicitProcessing/content/content_2__1_2.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_2__1_2.glb new file mode 100644 index 00000000..0d5eeca7 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_2__1_2.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_2__2_1.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_2__2_1.glb new file mode 100644 index 00000000..553217a1 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_2__2_1.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_2__2_2.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_2__2_2.glb new file mode 100644 index 00000000..6b245769 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_2__2_2.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__2_3.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__2_3.glb new file mode 100644 index 00000000..1ba18599 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__2_3.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__2_4.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__2_4.glb new file mode 100644 index 00000000..7d6818dd Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__2_4.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_2.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_2.glb new file mode 100644 index 00000000..6f543033 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_2.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_3.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_3.glb new file mode 100644 index 00000000..69893315 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_3.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_4.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_4.glb new file mode 100644 index 00000000..78457fcb Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_4.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_5.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_5.glb new file mode 100644 index 00000000..0045bc89 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__3_5.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_2.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_2.glb new file mode 100644 index 00000000..707b8c03 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_2.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_3.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_3.glb new file mode 100644 index 00000000..84619a77 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_3.glb differ diff --git 
a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_4.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_4.glb new file mode 100644 index 00000000..3bbe3fd5 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_4.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_5.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_5.glb new file mode 100644 index 00000000..f91ae5e5 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__4_5.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__5_3.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__5_3.glb new file mode 100644 index 00000000..ff5f4ec2 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__5_3.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/content/content_3__5_4.glb b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__5_4.glb new file mode 100644 index 00000000..0f592c0c Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/content/content_3__5_4.glb differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/subtreeInfo.md b/specs/data/tilesetProcessing/implicitProcessing/subtreeInfo.md new file mode 100644 index 00000000..4a62e974 --- /dev/null +++ b/specs/data/tilesetProcessing/implicitProcessing/subtreeInfo.md @@ -0,0 +1,296 @@ +## Coordinates (level=0, (0,0)) + +Subtree JSON: + +``` +{ + "buffers" : [ + { + "byteLength" : 8 + } + ], + "bufferViews" : [ + { + "buffer" : 0, + "byteOffset" : 0, + "byteLength" : 2 + } + ], + "tileAvailability" : { + "availableCount" : 5, + "constant" : 1 + }, + "contentAvailability" : [ + { + "availableCount" : 5, + "constant" : 1 + } + ], + "childSubtreeAvailability" : { + "bitstream" : 0, + "availableCount" : 4 + } +} +``` + +#### Tile Availability: +Constant: 1 + + +#### Content Availability (0 of 1) +Constant: 1 + + +#### Child Subtree Availability: + +| Byte index: | 0|1| +| --- | --- | --- | +| Bytes: | 0x48 | 0x12 | +| Bits [0...7] : | 00010010|01001000| + + + +## Coordinates (level=2, (1,1)) + +Subtree JSON: + +``` +{ + "buffers" : [ + { + "byteLength" : 16 + } + ], + "bufferViews" : [ + { + "buffer" : 0, + "byteOffset" : 0, + "byteLength" : 1 + }, + { + "buffer" : 0, + "byteOffset" : 8, + "byteLength" : 1 + } + ], + "tileAvailability" : { + "bitstream" : 0, + "availableCount" : 4 + }, + "contentAvailability" : [ + { + "bitstream" : 1, + "availableCount" : 4 + } + ], + "childSubtreeAvailability" : { + "availableCount" : 0, + "constant" : 0 + } +} +``` + +#### Tile Availability: + +| Byte index: | 0| +| --- | --- | +| Bytes: | 0x1d | +| Bits [0...7] : | 10111000| + + +#### Content Availability (0 of 1) + +| Byte index: | 0| +| --- | --- | +| Bytes: | 0x1d | +| Bits [0...7] : | 10111000| + + +#### Child Subtree Availability: +Constant: 0 + + + +## Coordinates (level=2, (2,1)) + +Subtree JSON: + +``` +{ + "buffers" : [ + { + "byteLength" : 16 + } + ], + "bufferViews" : [ + { + "buffer" : 0, + "byteOffset" : 0, + "byteLength" : 1 + }, + { + "buffer" : 0, + "byteOffset" : 8, + "byteLength" : 1 + } + ], + "tileAvailability" : { + "bitstream" : 0, + "availableCount" : 4 + }, + "contentAvailability" : [ + { + "bitstream" : 1, + "availableCount" : 4 + } + ], + "childSubtreeAvailability" : { + "availableCount" : 0, + "constant" : 0 + } +} +``` + 
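The "Bits [0...7]" rows in the tables of this file list the bits of each bitstream byte by bit index, with bit 0 being the least significant bit of the byte. A minimal TypeScript sketch (not part of the spec data; the helper name is made up) that reproduces a table row from a byte value:

```
// Print the bits of one availability bitstream byte in the same order
// as the tables in this file: bit index 0 (the least significant bit) first.
function bitRow(byteValue: number): string {
  let bits = "";
  for (let bitIndex = 0; bitIndex < 8; bitIndex++) {
    bits += (byteValue >> bitIndex) & 1;
  }
  return bits;
}

console.log(bitRow(0x48), bitRow(0x12)); // "00010010" "01001000" (root child subtree availability)
console.log(bitRow(0x1d)); // "10111000" (tile availability of the (1,1) subtree at level 2)
```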
+#### Tile Availability: + +| Byte index: | 0| +| --- | --- | +| Bytes: | 0x1b | +| Bits [0...7] : | 11011000| + + +#### Content Availability (0 of 1) + +| Byte index: | 0| +| --- | --- | +| Bytes: | 0x1b | +| Bits [0...7] : | 11011000| + + +#### Child Subtree Availability: +Constant: 0 + + + +## Coordinates (level=2, (1,2)) + +Subtree JSON: + +``` +{ + "buffers" : [ + { + "byteLength" : 16 + } + ], + "bufferViews" : [ + { + "buffer" : 0, + "byteOffset" : 0, + "byteLength" : 1 + }, + { + "buffer" : 0, + "byteOffset" : 8, + "byteLength" : 1 + } + ], + "tileAvailability" : { + "bitstream" : 0, + "availableCount" : 4 + }, + "contentAvailability" : [ + { + "bitstream" : 1, + "availableCount" : 4 + } + ], + "childSubtreeAvailability" : { + "availableCount" : 0, + "constant" : 0 + } +} +``` + +#### Tile Availability: + +| Byte index: | 0| +| --- | --- | +| Bytes: | 0x17 | +| Bits [0...7] : | 11101000| + + +#### Content Availability (0 of 1) + +| Byte index: | 0| +| --- | --- | +| Bytes: | 0x17 | +| Bits [0...7] : | 11101000| + + +#### Child Subtree Availability: +Constant: 0 + + + +## Coordinates (level=2, (2,2)) + +Subtree JSON: + +``` +{ + "buffers" : [ + { + "byteLength" : 16 + } + ], + "bufferViews" : [ + { + "buffer" : 0, + "byteOffset" : 0, + "byteLength" : 1 + }, + { + "buffer" : 0, + "byteOffset" : 8, + "byteLength" : 1 + } + ], + "tileAvailability" : { + "bitstream" : 0, + "availableCount" : 4 + }, + "contentAvailability" : [ + { + "bitstream" : 1, + "availableCount" : 4 + } + ], + "childSubtreeAvailability" : { + "availableCount" : 0, + "constant" : 0 + } +} +``` + +#### Tile Availability: + +| Byte index: | 0| +| --- | --- | +| Bytes: | 0x0f | +| Bits [0...7] : | 11110000| + + +#### Content Availability (0 of 1) + +| Byte index: | 0| +| --- | --- | +| Bytes: | 0x0f | +| Bits [0...7] : | 11110000| + + +#### Child Subtree Availability: +Constant: 0 + + + diff --git a/specs/data/tilesetProcessing/implicitProcessing/subtrees/0.0.0.subtree b/specs/data/tilesetProcessing/implicitProcessing/subtrees/0.0.0.subtree new file mode 100644 index 00000000..fb3f7dd8 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/subtrees/0.0.0.subtree differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.1.1.subtree b/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.1.1.subtree new file mode 100644 index 00000000..f42a55fc Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.1.1.subtree differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.1.2.subtree b/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.1.2.subtree new file mode 100644 index 00000000..3e714c26 Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.1.2.subtree differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.2.1.subtree b/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.2.1.subtree new file mode 100644 index 00000000..cebc9edb Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.2.1.subtree differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.2.2.subtree b/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.2.2.subtree new file mode 100644 index 00000000..33f029ef Binary files /dev/null and b/specs/data/tilesetProcessing/implicitProcessing/subtrees/2.2.2.subtree differ diff --git a/specs/data/tilesetProcessing/implicitProcessing/tileset.json 
b/specs/data/tilesetProcessing/implicitProcessing/tileset.json new file mode 100644 index 00000000..4ae231fa --- /dev/null +++ b/specs/data/tilesetProcessing/implicitProcessing/tileset.json @@ -0,0 +1,24 @@ +{ + "asset" : { + "version" : "1.1" + }, + "geometricError" : 64.0, + "root" : { + "boundingVolume" : { + "box" : [ 0.5, 0.5, 0.00625, 0.5, 0.0, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, 0.00625 ] + }, + "geometricError" : 8.0, + "refine" : "REPLACE", + "content" : { + "uri" : "content/content_{level}__{x}_{y}.glb" + }, + "implicitTiling" : { + "subdivisionScheme" : "QUADTREE", + "subtreeLevels" : 2, + "availableLevels" : 4, + "subtrees" : { + "uri" : "subtrees/{level}.{x}.{y}.subtree" + } + } + } +} \ No newline at end of file diff --git a/specs/data/upgradeTileset/tilesetWithExternalTilesetWithUrls/content.b3dm b/specs/data/upgradeTileset/tilesetWithExternalTilesetWithUrls/content.b3dm new file mode 100644 index 00000000..f052e259 Binary files /dev/null and b/specs/data/upgradeTileset/tilesetWithExternalTilesetWithUrls/content.b3dm differ diff --git a/specs/data/upgradeTileset/tilesetWithExternalTilesetWithUrls/external.json b/specs/data/upgradeTileset/tilesetWithExternalTilesetWithUrls/external.json new file mode 100644 index 00000000..94154bf1 --- /dev/null +++ b/specs/data/upgradeTileset/tilesetWithExternalTilesetWithUrls/external.json @@ -0,0 +1,22 @@ +{ + "asset": { + "version": "1.0" + }, + "geometricError": 70, + "root": { + "boundingVolume": { + "region": [ + -1.31968, + 0.6988424218, + -1.3196390408203893, + 0.698874, + 0, + 20 + ] + }, + "geometricError": 70, + "content": { + "url": "content.b3dm" + } + } +} \ No newline at end of file diff --git a/specs/data/upgradeTileset/tilesetWithExternalTilesetWithUrls/tileset.json b/specs/data/upgradeTileset/tilesetWithExternalTilesetWithUrls/tileset.json new file mode 100644 index 00000000..4cdb9c97 --- /dev/null +++ b/specs/data/upgradeTileset/tilesetWithExternalTilesetWithUrls/tileset.json @@ -0,0 +1,22 @@ +{ + "asset": { + "version": "1.0" + }, + "geometricError": 70, + "root": { + "boundingVolume": { + "region": [ + -1.31968, + 0.6988424218, + -1.3196390408203893, + 0.698874, + 0, + 20 + ] + }, + "geometricError": 70, + "content": { + "url": "external.json" + } + } +} \ No newline at end of file diff --git a/specs/legacy/README.md b/specs/legacy/README.md deleted file mode 100644 index 2b3aefdd..00000000 --- a/specs/legacy/README.md +++ /dev/null @@ -1,5 +0,0 @@ -The tests in this directory are largely "ported" from -https://github.com/CesiumGS/3d-tiles-tools/tree/98b5d1e5369d5b2c2c6860b9c0ba3890e886ba25 - -When the respective functionality is covered with new, dedicated -tests, these may be removed. diff --git a/specs/legacy/SpecHelpers.ts b/specs/legacy/SpecHelpers.ts deleted file mode 100644 index 932caecd..00000000 --- a/specs/legacy/SpecHelpers.ts +++ /dev/null @@ -1,46 +0,0 @@ -import fs from "fs"; -import path from "path"; - -import { Iterables } from "../../src/base/Iterables"; - -export class SpecHelpers { - static getPaddedByteLength(byteLength: number): number { - const boundary = 8; - const remainder = byteLength % boundary; - const padding = remainder === 0 ? 
0 : boundary - remainder; - return byteLength + padding; - } - - static forceDeleteDirectory(p: string) { - fs.rmSync(p, { - force: true, - recursive: true, - }); - } - - static isJson(file: string): boolean { - return path.extname(file) === ".json"; - } - - static getContentUris(string: string): string[] { - const regex = new RegExp('"uri"\\s?:\\s?"([^"]*)"', "g"); - const matches = string.matchAll(regex); - const uris: string[] = []; - for (const match of matches) { - uris.push(match[1]); - } - return uris; - } - - static getNumberOfTilesets(directory: string): number { - const recurse = true; - const files = Iterables.overFiles(directory, recurse); - let numberOfJsonFiles = 0; - for (const file of files) { - if (SpecHelpers.isJson(file)) { - numberOfJsonFiles++; - } - } - return numberOfJsonFiles; - } -} diff --git a/specs/legacy/combineTilesetSpec.ts b/specs/legacy/combineTilesetSpec.ts deleted file mode 100644 index 32383976..00000000 --- a/specs/legacy/combineTilesetSpec.ts +++ /dev/null @@ -1,36 +0,0 @@ -import fs from "fs"; - -import { Tilesets } from "../../src/tilesets/Tilesets"; -import { SpecHelpers } from "./SpecHelpers"; - -const tilesetDirectory = "./specs/data/TilesetOfTilesetsWithUris/"; -const combinedDirectory = - "./specs/data/output/TilesetOfTilesetsWithUris-combined"; -const combinedJson = - "./specs/data/output/TilesetOfTilesetsWithUris-combined/tileset.json"; - -describe("combineTileset", function () { - afterEach(async function () { - //SpecHelpers.forceDeleteDirectory(combinedDirectory); - }); - - it("combines external tilesets into a single tileset", async function () { - const overwrite = true; - await Tilesets.combine(tilesetDirectory, combinedDirectory, overwrite); - - const numberOfTilesets = SpecHelpers.getNumberOfTilesets(combinedDirectory); - - const combinedJsonBuffer = fs.readFileSync(combinedJson); - const combinedJsonString = combinedJsonBuffer.toString(); - const contentUris = SpecHelpers.getContentUris(combinedJsonString); - - expect(numberOfTilesets).toBe(1); - expect(contentUris).toEqual([ - "parent.b3dm", - "tileset3/ll.b3dm", - "lr.b3dm", - "ur.b3dm", - "ul.b3dm", - ]); - }); -}); diff --git a/src/ToolsMain.ts b/src/ToolsMain.ts index d34b75ae..fc1363ed 100644 --- a/src/ToolsMain.ts +++ b/src/ToolsMain.ts @@ -3,10 +3,12 @@ import path from "path"; import { Paths } from "./base/Paths"; import { DeveloperError } from "./base/DeveloperError"; +import { Buffers } from "./base/Buffers"; import { Tilesets } from "./tilesets/Tilesets"; import { TileFormats } from "./tileFormats/TileFormats"; +import { TileDataLayouts } from "./tileFormats/TileDataLayouts"; import { ContentOps } from "./contentProcessing/ContentOps"; import { GltfUtilities } from "./contentProcessing/GtlfUtilities"; @@ -15,8 +17,11 @@ import { ContentDataTypes } from "./contentTypes/ContentDataTypes"; import { PipelineExecutor } from "./pipelines/PipelineExecutor"; import { Pipelines } from "./pipelines/Pipelines"; -import { Buffers } from "./base/Buffers"; -import { TileDataLayouts } from "./tileFormats/TileDataLayouts"; + +import { ZipToPackage } from "./packages/ZipToPackage"; + +import { TilesetSources } from "./tilesetData/TilesetSources"; +import { TilesetTargets } from "./tilesetData/TilesetTargets"; /** * Functions that directly correspond to the command line functionality. 
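The `ToolsMain` hunks that follow replace the database commands with a generic `convert` command and add a `pipeline` command. A minimal sketch of driving the pipeline classes directly, based on the `Pipelines.createPipeline` and `PipelineExecutor.executePipeline` calls used in this diff (the pipeline object and the import paths are illustrative and assume execution from the project root):

```
import { Pipelines } from "./src/pipelines/Pipelines";
import { PipelineExecutor } from "./src/pipelines/PipelineExecutor";

async function runExamplePipeline() {
  // A minimal pipeline object: convert every B3DM content into GLB.
  const pipelineJson = {
    input: "./specs/data/TilesetOfTilesetsWithUris",
    output: "./output/TilesetOfTilesetsWithUris.3tz",
    tilesetStages: [
      {
        name: "_b3dmToGlb",
        description: "Convert B3DM to GLB",
        contentStages: [
          { name: "b3dmToGlb", description: "Convert each B3DM content into GLB" },
        ],
      },
    ],
  };
  // Build the Pipeline object and execute it, overwriting existing output.
  const pipeline = Pipelines.createPipeline(pipelineJson);
  const overwrite = true;
  await PipelineExecutor.executePipeline(pipeline, overwrite);
}
```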
@@ -177,7 +182,7 @@ export class ToolsMain { }; // Read the buffer and its magic header - const magic = Buffers.getMagic(inputBuffer, 0); + const magic = Buffers.getMagicString(inputBuffer, 0); if (magic === "b3dm" || magic === "i3dm" || magic === "pnts") { // Handle the basic legacy tile formats @@ -327,66 +332,26 @@ export class ToolsMain { await PipelineExecutor.executePipeline(pipeline, force); } - private static createTilesetToDatabasePipeline( - input: string, - output: string - ) { - const pipelineJson = { - input: input, - output: output, - tilesetStages: [ - { - name: "tilesetToDatabase", - }, - ], - }; - return pipelineJson; - } - - static async tilesetToDatabase( - input: string, - output: string, - force: boolean - ) { + static async convert(input: string, output: string, force: boolean) { ToolsMain.ensureCanWrite(output, force); + const inputExtension = path.extname(input).toLowerCase(); - const pipelineJson = ToolsMain.createTilesetToDatabasePipeline( - input, - output - ); - const pipeline = Pipelines.createPipeline(pipelineJson); - await PipelineExecutor.executePipeline(pipeline, force); - } - - private static createDatabaseToTilesetPipeline( - input: string, - output: string - ) { - const pipelineJson = { - input: input, - output: output, - tilesetStages: [ - { - name: "databaseToTileset", - }, - ], - }; - return pipelineJson; - } - - static async databaseToTileset( - input: string, - output: string, - force: boolean - ) { - ToolsMain.ensureCanWrite(output, force); - - const pipelineJson = ToolsMain.createDatabaseToTilesetPipeline( - input, - output - ); - const pipeline = Pipelines.createPipeline(pipelineJson); - await PipelineExecutor.executePipeline(pipeline, force); + if (inputExtension === ".zip") { + await ZipToPackage.convert(input, output, force); + } else { + const tilesetSource = TilesetSources.createAndOpen(input); + const tilesetTarget = TilesetTargets.createAndBegin(output, force); + + const keys = tilesetSource.getKeys(); + for (const key of keys) { + const content = tilesetSource.getValue(key); + if (content) { + tilesetTarget.addEntry(key, content); + } + } + tilesetSource.close(); + await tilesetTarget.end(); + } } static async combine(input: string, output: string, force: boolean) { @@ -404,6 +369,13 @@ export class ToolsMain { await Tilesets.merge(inputs, output, force); } + static async pipeline(input: string, force: boolean) { + const pipelineJsonBuffer = fs.readFileSync(input); + const pipelineJson = JSON.parse(pipelineJsonBuffer.toString()); + const pipeline = Pipelines.createPipeline(pipelineJson); + await PipelineExecutor.executePipeline(pipeline, force); + } + /** * Returns whether the specified file can be written. * diff --git a/src/base/Buffers.ts b/src/base/Buffers.ts index a77d45c7..0bda103c 100644 --- a/src/base/Buffers.ts +++ b/src/base/Buffers.ts @@ -52,21 +52,45 @@ export class Buffers { } /** - * Obtains the magic header from the given buffer. + * Obtains the magic header bytes from the given buffer. + * + * This returns up to `byteLength` bytes of the given buffer, + * starting at the given byte offset. If the buffer length + * is too small, then a shorter buffer may be returned. + * + * @param buffer - The buffer + * @param byteOffset - The byte offset, usually 0 + * @param byteLength - The byte length + * @returns The magic header. 
+ */ + static getMagicBytes( + buffer: Buffer, + byteOffset: number, + byteLength: number + ): Buffer { + const start = Math.min(buffer.length, byteOffset); + const end = Math.min(buffer.length, start + byteLength); + return buffer.subarray(start, end); + } + + /** + * Obtains the magic header from the given buffer, interpreted + * as an ASCII string * * This returns up to 4 bytes of the given buffer, starting at - * the given byte offset, converted to a string. If the buffer - * length is too small, then a shorter string may be returned. + * the given byte offset, converted to an ASCII string. If the + * buffer length is too small, then a shorter string may be + * returned. * * @param buffer - The buffer * @param byteOffset - The optional byte offset, defaulting to 0 * @returns The magic header. */ - static getMagic(buffer: Buffer, byteOffset?: number): string { + static getMagicString(buffer: Buffer, byteOffset?: number): string { const magicLength = 4; const start = Math.min(buffer.length, byteOffset ?? 0); const end = Math.min(buffer.length, start + magicLength); - return buffer.toString("utf8", start, end); + return buffer.toString("ascii", start, end); } /** diff --git a/src/base/Iterables.ts b/src/base/Iterables.ts index 5c2e09ac..bd6ddd13 100644 --- a/src/base/Iterables.ts +++ b/src/base/Iterables.ts @@ -15,14 +15,13 @@ export class Iterables { * `recurse` is `true`. * * @param directory - The directory - * @param recurse - [true] Whether the files should + * @param recurse - Whether the files should * be listed recursively * @returns The generator for path strings */ static *overFiles( directory: string | PathLike, - // eslint-disable-next-line @typescript-eslint/no-inferrable-types - recurse: boolean = true + recurse: boolean ): IterableIterator { const fileNames = fs.readdirSync(directory); for (const fileName of fileNames) { diff --git a/src/contentProcessing/ContentOps.ts b/src/contentProcessing/ContentOps.ts index b7e22e67..c00dd7e5 100644 --- a/src/contentProcessing/ContentOps.ts +++ b/src/contentProcessing/ContentOps.ts @@ -2,23 +2,66 @@ import { GltfUtilities } from "./GtlfUtilities"; import { TileFormats } from "../tileFormats/TileFormats"; +/** + * Low-level operations on tile content. + * + * The methods in this class are supposed to represent basic + * operations that receive a buffer with tile content data, + * and return a buffer with tile content data. + * + * They are used for implementing some of the command line + * functionalities (like `b3dmToGlb`), as well as serving + * as building blocks for the tileset content processing + * in pipelines. + */ export class ContentOps { + /** + * Extracts the GLB buffer from the given B3DM buffer. + * + * @param inputBuffer - The input buffer + * @returns The resulting buffer + */ static b3dmToGlbBuffer(inputBuffer: Buffer): Buffer { const inputTileData = TileFormats.readTileData(inputBuffer); const outputBuffer = inputTileData.payload; return outputBuffer; } + /** + * Extracts the GLB buffer from the given I3DM buffer. + * + * @param inputBuffer - The input buffer + * @returns The resulting buffer + */ static i3dmToGlbBuffer(inputBuffer: Buffer): Buffer { const inputTileData = TileFormats.readTileData(inputBuffer); const outputBuffer = inputTileData.payload; return outputBuffer; } + /** + * Extracts all GLB buffers from the given CMPT buffer. + * + * This will recursively resolve all inner tiles. If they + * are B3DM or I3DM tiles, then their GLBs will be added + * to the results array, in unspecified order. 
+ * + * @param inputBuffer - The input buffer + * @returns The resulting buffers + */ static cmptToGlbBuffers(inputBuffer: Buffer): Buffer[] { return TileFormats.extractGlbBuffers(inputBuffer); } + /** + * Creates a B3DM buffer from the given GLB buffer. + * + * This will create a B3DM that contains the minimum required + * default feature table, and the given GLB as its payload. + * + * @param inputBuffer - The input buffer + * @returns The resulting buffer + */ static glbToB3dmBuffer(inputBuffer: Buffer): Buffer { const outputTileData = TileFormats.createDefaultB3dmTileDataFromGlb(inputBuffer); @@ -26,6 +69,15 @@ export class ContentOps { return outputBuffer; } + /** + * Creates a I3DM buffer from the given GLB buffer. + * + * This will create an I3DM that contains the minimum required + * default feature table, and the given GLB as its payload. + * + * @param inputBuffer - The input buffer + * @returns The resulting buffer + */ static glbToI3dmBuffer(inputBuffer: Buffer): Buffer { const outputTileData = TileFormats.createDefaultI3dmTileDataFromGlb(inputBuffer); @@ -33,6 +85,17 @@ export class ContentOps { return outputBuffer; } + /** + * Optimize the GLB that is contained in the given B3DM. + * + * This will optimize the GLB in the given B3DM, using `gltf-pipeline` + * with the given options, and create a new B3DM from the result. + * The result will have the same feature- and batch table data + * as the given input. + * + * @param inputBuffer - The input buffer + * @returns The resulting buffer + */ static async optimizeB3dmBuffer( inputBuffer: Buffer, options: any @@ -51,6 +114,17 @@ export class ContentOps { return outputBuffer; } + /** + * Optimize the GLB that is contained in the given I3DM. + * + * This will optimize the GLB in the given I3DM, using `gltf-pipeline` + * with the given options, and create a new I3DM from the result. + * The result will have the same feature- and batch table data + * as the given input. + * + * @param inputBuffer - The input buffer + * @returns The resulting buffer + */ static async optimizeI3dmBuffer( inputBuffer: Buffer, options: any diff --git a/src/contentProcessing/GltfPipelineLegacy.ts b/src/contentProcessing/GltfPipelineLegacy.ts index 822f6932..00ddda1e 100644 --- a/src/contentProcessing/GltfPipelineLegacy.ts +++ b/src/contentProcessing/GltfPipelineLegacy.ts @@ -1,6 +1,7 @@ import { defaultValue } from "../base/defaultValue"; import { Cartesian3, defined, DeveloperError, Ellipsoid } from "cesium"; +import { Extensions } from "../tilesets/Extensions"; /** * Methods and fragments ported from a legacy version of gltf-pipeline. 
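The `ContentOps` documentation added above describes plain buffer-to-buffer building blocks. A small sketch of chaining them to optimize the GLB payload of a B3DM (file names are made up; the import path assumes execution from the project root):

```
import fs from "fs";
import { ContentOps } from "./src/contentProcessing/ContentOps";

async function optimizeTile() {
  // Optimize the GLB that is embedded in a B3DM, here with Draco compression;
  // the feature- and batch table data of the input are preserved.
  const inputBuffer = fs.readFileSync("./input/tile.b3dm");
  const options = { dracoOptions: { compressionLevel: 7 } };
  const outputBuffer = await ContentOps.optimizeB3dmBuffer(inputBuffer, options);
  fs.writeFileSync("./output/tile.b3dm", outputBuffer);

  // The synchronous operations compose directly, e.g. extracting the GLB payload:
  const glbBuffer = ContentOps.b3dmToGlbBuffer(inputBuffer);
  fs.writeFileSync("./output/tile.glb", glbBuffer);
}
```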
@@ -57,8 +58,7 @@ export class GltfPipelineLegacy { extensions.CESIUM_RTC = { center: positionArray, }; - GltfPipelineLegacy.addExtensionsRequired(gltf, "CESIUM_RTC"); - GltfPipelineLegacy.addExtensionsUsed(gltf, "CESIUM_RTC"); + Extensions.addExtensionRequired(gltf, "CESIUM_RTC"); } private static fixBatchIdSemantic(gltf: any) { @@ -92,40 +92,4 @@ export class GltfPipelineLegacy { } } } - - private static addExtensionsRequired(gltf: any, extension: string) { - let extensionsRequired = gltf.extensionsRequired; - if (!defined(extensionsRequired)) { - extensionsRequired = []; - gltf.extensionsRequired = extensionsRequired; - } - GltfPipelineLegacy.addToArray(extensionsRequired, extension, true); - GltfPipelineLegacy.addExtensionsUsed(gltf, extension); - } - - private static addExtensionsUsed(gltf: any, extension: string) { - let extensionsUsed = gltf.extensionsUsed; - if (!defined(extensionsUsed)) { - extensionsUsed = []; - gltf.extensionsUsed = extensionsUsed; - } - GltfPipelineLegacy.addToArray(extensionsUsed, extension, true); - } - - private static addToArray( - array: string[], - element: string, - checkDuplicates: boolean - ) { - checkDuplicates = defaultValue(checkDuplicates, false); - if (checkDuplicates) { - const index = array.indexOf(element); - if (index > -1) { - return index; - } - } - - array.push(element); - return array.length - 1; - } } diff --git a/src/contentProcessing/GtlfUtilities.ts b/src/contentProcessing/GtlfUtilities.ts index 1ed71bcc..a423cb9d 100644 --- a/src/contentProcessing/GtlfUtilities.ts +++ b/src/contentProcessing/GtlfUtilities.ts @@ -1,6 +1,11 @@ import GltfPipeline from "gltf-pipeline"; + import { Buffers } from "../base/Buffers"; + import { TileFormatError } from "../tileFormats/TileFormatError"; + +import { Extensions } from "../tilesets/Extensions"; + import { GltfPipelineLegacy } from "./GltfPipelineLegacy"; /** @@ -37,7 +42,7 @@ export class GltfUtilities { * @throws TileFormatError If the input does not contain valid GLB data. */ static extractJsonFromGlb(glbBuffer: Buffer): Buffer { - const magic = Buffers.getMagic(glbBuffer); + const magic = Buffers.getMagicString(glbBuffer); if (magic !== "glTF") { throw new TileFormatError( `Expected magic header to be 'gltf', but found ${magic}` @@ -130,4 +135,68 @@ export class GltfUtilities { const result = await GltfPipeline.processGlb(glbBuffer, options); return result.glb; } + + /** + * Given an input buffer containing a binary glTF asset, remove + * its use of the `CESIUM_RTC` extension by inserting new nodes + * (above the former root nodes) that contain the RTC center as + * their translation. + * + * @param glbBuffer The buffer containing the binary glTF. + * @returns A promise that resolves to the resulting binary glTF. + */ + static async replaceCesiumRtcExtension(glbBuffer: Buffer) { + // eslint-disable-next-line @typescript-eslint/no-unused-vars + const customStage = (gltf: any, options: any) => { + GltfUtilities.replaceCesiumRtcExtensionInternal(gltf); + return gltf; + }; + const options = { + customStages: [customStage], + keepUnusedElements: true, + }; + const result = await GltfPipeline.processGlb(glbBuffer, options); + return result.glb; + } + + /** + * Replaces the `CESIUM_RTC` extension in the given glTF object. + * + * This will insert a new parent node above each root node of + * a scene. These new parent nodes will have a `translation` + * that is directly taken from the `CESIUM_RTC` `center`. 
+ * + * The `CESIUM_RTC` extension object and its used/required + * usage declarations will be removed. + * + * @param gltf - The glTF object + */ + private static replaceCesiumRtcExtensionInternal(gltf: any) { + const rtcExtension = gltf.extensions["CESIUM_RTC"]; + if (!rtcExtension) { + return; + } + const rtcTranslation = rtcExtension.center; + const scenes = gltf.scenes; + if (!scenes) { + return; + } + for (const scene of scenes) { + const sceneNodes = scene.nodes; + if (sceneNodes) { + for (let i = 0; i < sceneNodes.length; i++) { + const nodeIndex = sceneNodes[i]; + const newParent = { + translation: rtcTranslation, + children: [nodeIndex], + }; + const newParentIndex = gltf.nodes.length; + gltf.nodes.push(newParent); + sceneNodes[i] = newParentIndex; + } + } + } + Extensions.removeExtensionUsed(gltf, "CESIUM_RTC"); + Extensions.removeExtension(gltf, "CESIUM_RTC"); + } } diff --git a/src/contentTypes/BufferedContentData.ts b/src/contentTypes/BufferedContentData.ts index 2b43c3a3..47713bb3 100644 --- a/src/contentTypes/BufferedContentData.ts +++ b/src/contentTypes/BufferedContentData.ts @@ -50,12 +50,12 @@ export class BufferedContentData implements ContentData { private readonly _extension: string; /** - * The "magic string" from the content data. This is - * a string consisting of the first (up to) 4 bytes - * of the content data, or the empty string if the - * content data could not be resolved. + * The "magic header bytes" from the content data. These + * are the first (up to) 4 bytes of the content data, + * or the empty buffer if the content data could not + * be resolved. */ - private readonly _magic: string; + private readonly _magic: Buffer; /** * The content data, or `null` if the data could not @@ -87,9 +87,10 @@ export class BufferedContentData implements ContentData { this._uri = uri; this._extension = path.extname(uri).toLowerCase(); if (data) { - this._magic = Buffers.getMagic(data); + const magicHeaderLength = 4; + this._magic = Buffers.getMagicBytes(data, 0, magicHeaderLength); } else { - this._magic = ""; + this._magic = Buffer.alloc(0); } this._data = data; this._parsedObject = undefined; @@ -112,7 +113,7 @@ export class BufferedContentData implements ContentData { } /** {@inheritDoc ContentData.magic} */ - async getMagic(): Promise { + async getMagic(): Promise { return this._magic; } diff --git a/src/contentTypes/ContentData.ts b/src/contentTypes/ContentData.ts index 2ec6bb4c..d4afbc7b 100644 --- a/src/contentTypes/ContentData.ts +++ b/src/contentTypes/ContentData.ts @@ -26,13 +26,12 @@ export interface ContentData { exists(): Promise; /** - * Returns a string that consists of the first 4 bytes - * of the buffer data (or fewer, if the buffer contains - * fewer than 4 bytes) + * Returns the first 4 bytes of the buffer data (or fewer, if the + * buffer contains fewer than 4 bytes) * - * @returns The magic string + * @returns The magic bytes */ - getMagic(): Promise; + getMagic(): Promise; /** * Returns the actual content data that was read from the URI. 
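With the `ContentData` change above, `getMagic` now resolves to the raw header bytes (a `Buffer`) instead of a string. A short sketch of how a caller can check both text-based and binary magic headers (assuming an async context and an existing `contentData` instance):

```
const magic: Buffer = await contentData.getMagic();

// Text-based magic headers (e.g. "glTF", "b3dm") can still be compared as ASCII:
const isGlb = magic.toString("ascii").startsWith("glTF");

// Binary magic headers (e.g. PNG) are compared byte by byte:
const pngMagicHeader = [0x89, 0x50, 0x4e, 0x47];
const isPng = pngMagicHeader.every((byte, index) => magic[index] === byte);
```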
diff --git a/src/contentTypes/ContentDataTypeChecks.ts b/src/contentTypes/ContentDataTypeChecks.ts index fa398e3b..79a05be5 100644 --- a/src/contentTypes/ContentDataTypeChecks.ts +++ b/src/contentTypes/ContentDataTypeChecks.ts @@ -19,6 +19,33 @@ export class ContentDataTypeChecks { includedContentDataTypes: string[] | undefined, excludedContentDataTypes: string[] | undefined ): (contentData: ContentData) => Promise { + const typeCheck = ContentDataTypeChecks.createTypeCheck( + includedContentDataTypes, + excludedContentDataTypes + ); + return async (contentData: ContentData) => { + const contentDataType = await ContentDataTypeRegistry.findContentDataType( + contentData + ); + const result = typeCheck(contentDataType); + return result; + }; + } + + /** + * Creates a predicate that checks whether a given string + * (that represents one of the `ContentDataTypes`) is + * contained in the given included types, and NOT + * contained in the given excluded types. + * + * @param includedContentDataTypes - The included types + * @param excludedContentDataTypes - The excluded types + * @returns The predicate + */ + static createTypeCheck( + includedContentDataTypes: string[] | undefined, + excludedContentDataTypes: string[] | undefined + ): (contentDataType: string | undefined) => boolean { const localIncluded: string[] = []; if (includedContentDataTypes) { localIncluded.push(...includedContentDataTypes); @@ -27,10 +54,7 @@ export class ContentDataTypeChecks { if (excludedContentDataTypes) { localExcluded.push(...excludedContentDataTypes); } - return async (contentData: ContentData) => { - const contentDataType = await ContentDataTypeRegistry.findContentDataType( - contentData - ); + return (contentDataType: string | undefined) => { if (!contentDataType) { return false; } diff --git a/src/contentTypes/ContentDataTypeRegistry.ts b/src/contentTypes/ContentDataTypeRegistry.ts index 1e8b4b7a..92235f50 100644 --- a/src/contentTypes/ContentDataTypeRegistry.ts +++ b/src/contentTypes/ContentDataTypeRegistry.ts @@ -3,14 +3,15 @@ import { ContentDataTypes } from "./ContentDataTypes"; import { ContentDataTypeEntry } from "./ContentDataTypeEntry"; import { defined } from "../base/defined"; import { DeveloperError } from "../base/DeveloperError"; +import { BufferedContentData } from "./BufferedContentData"; /** * A class for determining the type of data that a URI points to. * - * The only public methods (for now) are `registerDefaults`, - * which registers all known content data types, and - * `findContentDataType`, which returns the string that - * describes the type of a `ContentData` object. + * The only public methods (for now) are `findType`, which + * determines the type of data that is given as a URI and + * a buffer, and `findContentDataType`, which returns the + * string that describes the type of a `ContentData` object. * * @internal */ @@ -45,38 +46,62 @@ export class ContentDataTypeRegistry { // In the future, there might be a mechanism for 'overriding' a // previously registered type. 
ContentDataTypeRegistry.register( - ContentDataTypeRegistry.byMagic("glTF"), + ContentDataTypeRegistry.byMagicString("glTF"), ContentDataTypes.CONTENT_TYPE_GLB ); ContentDataTypeRegistry.register( - ContentDataTypeRegistry.byMagic("b3dm"), + ContentDataTypeRegistry.byMagicString("b3dm"), ContentDataTypes.CONTENT_TYPE_B3DM ); ContentDataTypeRegistry.register( - ContentDataTypeRegistry.byMagic("i3dm"), + ContentDataTypeRegistry.byMagicString("i3dm"), ContentDataTypes.CONTENT_TYPE_I3DM ); ContentDataTypeRegistry.register( - ContentDataTypeRegistry.byMagic("cmpt"), + ContentDataTypeRegistry.byMagicString("cmpt"), ContentDataTypes.CONTENT_TYPE_CMPT ); ContentDataTypeRegistry.register( - ContentDataTypeRegistry.byMagic("pnts"), + ContentDataTypeRegistry.byMagicString("pnts"), ContentDataTypes.CONTENT_TYPE_PNTS ); ContentDataTypeRegistry.register( - ContentDataTypeRegistry.byMagic("geom"), + ContentDataTypeRegistry.byMagicString("geom"), ContentDataTypes.CONTENT_TYPE_GEOM ); + ContentDataTypeRegistry.register( - ContentDataTypeRegistry.byMagic("vctr"), + ContentDataTypeRegistry.byMagicString("vctr"), ContentDataTypes.CONTENT_TYPE_VCTR ); + + ContentDataTypeRegistry.register( + ContentDataTypeRegistry.byMagicString("subt"), + ContentDataTypes.CONTENT_TYPE_SUBT + ); + + const pngMagicHeader = [0x89, 0x50, 0x4e, 0x47]; + ContentDataTypeRegistry.register( + ContentDataTypeRegistry.byMagicBytes(pngMagicHeader), + ContentDataTypes.CONTENT_TYPE_PNG + ); + + const jpegMagicHeader = [0xff, 0xd8, 0xff]; + ContentDataTypeRegistry.register( + ContentDataTypeRegistry.byMagicBytes(jpegMagicHeader), + ContentDataTypes.CONTENT_TYPE_JPEG + ); + + ContentDataTypeRegistry.register( + ContentDataTypeRegistry.byMagicString("GIF8"), + ContentDataTypes.CONTENT_TYPE_GIF + ); + ContentDataTypeRegistry.register( ContentDataTypeRegistry.byExtension(".geojson"), ContentDataTypes.CONTENT_TYPE_GEOJSON @@ -99,6 +124,29 @@ export class ContentDataTypeRegistry { ContentDataTypeRegistry._registeredDefaults = true; } + /** + * Tries to find the string that describes the type of the given data. + * If the type of the data cannot be determined, then `undefined` is + * returned. + * + * This is a convenience method for `findContentDataType`, for the + * case that the data is already fully available as a buffer. If + * the data should be fetched lazily, then `findContentDataType` + * should be called with a `LazyContentData` object. + * + * @param uri - The URI of the data + * @param data - the actual data + * @returns The string, or `undefined` + */ + static async findType( + uri: string, + data: Buffer + ): Promise { + const contentData = new BufferedContentData(uri, data); + const contentDataType = await this.findContentDataType(contentData); + return contentDataType; + } + /** * Tries to find the string that describes the given content data * type. If the type of the content data cannot be determined, @@ -127,16 +175,42 @@ export class ContentDataTypeRegistry { /** * Creates a predicate that checks whether the magic header of - * a ContentData matches the given magic header string. + * a ContentData (interpreted as ASCII characters) starts with + * the given magic header string. 
* * @param magic - The magic header string * @returns The predicate */ - private static byMagic( + private static byMagicString( magic: string ): (contentData: ContentData) => Promise { - const predicate = async (contentData: ContentData) => - (await contentData.getMagic()) === magic; + const predicate = async (contentData: ContentData) => { + const contentMagic = await contentData.getMagic(); + const contentMagicString = contentMagic.toString("ascii"); + return contentMagicString.startsWith(magic); + }; + return predicate; + } + + /** + * Creates a predicate that checks whether the magic header of + * a ContentData starts with the given bytes + * + * @param magic - The magic bytes + * @returns The predicate + */ + private static byMagicBytes( + magic: number[] + ): (contentData: ContentData) => Promise { + const predicate = async (contentData: ContentData) => { + const contentMagic = await contentData.getMagic(); + for (let i = 0; i < magic.length; i++) { + if (contentMagic[i] != magic[i]) { + return false; + } + } + return true; + }; return predicate; } diff --git a/src/contentTypes/ContentDataTypes.ts b/src/contentTypes/ContentDataTypes.ts index 3ee0d88c..95e07f65 100644 --- a/src/contentTypes/ContentDataTypes.ts +++ b/src/contentTypes/ContentDataTypes.ts @@ -15,6 +15,10 @@ export class ContentDataTypes { static readonly CONTENT_TYPE_PNTS = "CONTENT_TYPE_PNTS"; static readonly CONTENT_TYPE_GEOM = "CONTENT_TYPE_GEOM"; static readonly CONTENT_TYPE_VCTR = "CONTENT_TYPE_VCTR"; + static readonly CONTENT_TYPE_SUBT = "CONTENT_TYPE_SUBT"; + static readonly CONTENT_TYPE_PNG = "CONTENT_TYPE_PNG"; + static readonly CONTENT_TYPE_JPEG = "CONTENT_TYPE_JPEG"; + static readonly CONTENT_TYPE_GIF = "CONTENT_TYPE_GIF"; static readonly CONTENT_TYPE_GEOJSON = "CONTENT_TYPE_GEOJSON"; static readonly CONTENT_TYPE_3TZ = "CONTENT_TYPE_3TZ"; diff --git a/src/contentTypes/LazyContentData.ts b/src/contentTypes/LazyContentData.ts index 7e2da7fd..ea89e697 100644 --- a/src/contentTypes/LazyContentData.ts +++ b/src/contentTypes/LazyContentData.ts @@ -42,12 +42,12 @@ export class LazyContentData implements ContentData { private _exists: boolean | undefined; /** - * The "magic string" from the content data. This is - * a string consisting of the first (up to) 4 bytes - * of the content data, or the empty string if the - * content data could not be resolved. + * The "magic header bytes" from the content data. These + * are the first (up to) 4 bytes of the content data, + * or the empty buffer if the content data could not + * be resolved. 
*/ - private _magic: string | undefined; + private _magic: Buffer | undefined; /** * The content data, or `null` if the data could not @@ -117,19 +117,20 @@ export class LazyContentData implements ContentData { } /** {@inheritDoc ContentData.getMagic} */ - async getMagic(): Promise { + async getMagic(): Promise { if (defined(this._magic)) { return this._magic; } + const magicHeaderLength = 4; const partialData = await this._resourceResolver.resolveDataPartial( this._uri, - 4 + magicHeaderLength ); if (partialData) { - this._magic = Buffers.getMagic(partialData); + this._magic = Buffers.getMagicBytes(partialData, 0, magicHeaderLength); this._exists = true; } else { - this._magic = ""; + this._magic = Buffer.alloc(0); this._exists = false; } return this._magic; diff --git a/src/implicitTiling/ImplicitTilings.ts b/src/implicitTiling/ImplicitTilings.ts index 6f3f7595..1dd52a56 100644 --- a/src/implicitTiling/ImplicitTilings.ts +++ b/src/implicitTiling/ImplicitTilings.ts @@ -38,14 +38,21 @@ export class ImplicitTilings { * @param implicitTiling - The `TileImplicitTiling` object * @returns The generator * @throws ImplicitTilingError if the given object does not - * have a valid `subdivisionScheme`. + * have a valid `subdivisionScheme`, or the number of subtree + * levels is not positive */ static createSubtreeCoordinatesIterator( implicitTiling: TileImplicitTiling ): IterableIterator { + const levels = implicitTiling.subtreeLevels; + if (levels < 1) { + throw new ImplicitTilingError( + `Invalid number of subtree levels: ${levels}` + ); + } const r = this.createRootCoordinates(implicitTiling); const depthFirst = false; - return r.descendants(implicitTiling.subtreeLevels - 1, depthFirst); + return r.descendants(levels - 1, depthFirst); } /** @@ -55,12 +62,18 @@ export class ImplicitTilings { * @param implicitTiling - The `TileImplicitTiling` object * @returns The number of nodes * @throws ImplicitTilingError if the given object does not - * have a valid `subdivisionScheme`. + * have a valid `subdivisionScheme`, or the number of subtree + * levels is not positive */ static computeNumberOfNodesPerSubtree( implicitTiling: TileImplicitTiling ): number { const levels = implicitTiling.subtreeLevels; + if (levels < 1) { + throw new ImplicitTilingError( + `Invalid number of subtree levels: ${levels}` + ); + } if (implicitTiling.subdivisionScheme === "QUADTREE") { return Quadtrees.computeNumberOfNodesForLevels(levels); } diff --git a/src/io/FileResourceResolver.ts b/src/io/FileResourceResolver.ts index 6211ed1d..5f4b473e 100644 --- a/src/io/FileResourceResolver.ts +++ b/src/io/FileResourceResolver.ts @@ -17,8 +17,7 @@ export class FileResourceResolver implements ResourceResolver { this._basePath = basePath; } - /** {@inheritDoc ResourceResolver.resolveUri} */ - resolveUri(uri: string): string { + private resolveUri(uri: string): string { let resolved = path.resolve(this._basePath, decodeURIComponent(uri)); resolved = resolved.replace(/\\/g, "/"); return resolved; diff --git a/src/io/ResourceResolver.ts b/src/io/ResourceResolver.ts index 96817451..26d787c1 100644 --- a/src/io/ResourceResolver.ts +++ b/src/io/ResourceResolver.ts @@ -5,15 +5,6 @@ * @internal */ export interface ResourceResolver { - /** - * Returns the URI that results from resolving the given - * URI against the base URI of this resource resolver. - * - * @param uri - The URI - * @returns The resolved URI - */ - resolveUri(uri: string): string; - /** * Resolve the data from the given URI. 
* diff --git a/src/io/TilesetSourceResourceResolver.ts b/src/io/TilesetSourceResourceResolver.ts index 4aa44548..48924522 100644 --- a/src/io/TilesetSourceResourceResolver.ts +++ b/src/io/TilesetSourceResourceResolver.ts @@ -1,5 +1,3 @@ -import path from "path"; - import { Paths } from "../base/Paths"; import { Uris } from "../base/Uris"; @@ -21,14 +19,24 @@ export class TilesetSourceResourceResolver implements ResourceResolver { this._tilesetSource = tilesetSource; } - /** {@inheritDoc ResourceResolver.resolveUri} */ - resolveUri(uri: string): string { - const resolved = path.resolve(this._basePath, decodeURIComponent(uri)); - return resolved; - } - /** {@inheritDoc ResourceResolver.resolveData} */ async resolveData(uri: string): Promise { + return this.resolveDataInternal(uri); + } + + /** {@inheritDoc ResourceResolver.resolveDataPartial} */ + async resolveDataPartial( + uri: string, + maxBytes: number + ): Promise { + const buffer = await this.resolveDataInternal(uri); + if (!buffer) { + return null; + } + return buffer.subarray(0, maxBytes); + } + + private async resolveDataInternal(uri: string): Promise { if (Uris.isDataUri(uri)) { const data = Buffer.from(uri.split(",")[1], "base64"); return data; @@ -44,15 +52,6 @@ export class TilesetSourceResourceResolver implements ResourceResolver { return value; } - /** {@inheritDoc ResourceResolver.resolveDataPartial} */ - async resolveDataPartial( - uri: string, - // eslint-disable-next-line @typescript-eslint/no-unused-vars - maxBytes: number - ): Promise { - return await this.resolveData(uri); - } - /** {@inheritDoc ResourceResolver.derive} */ derive(uri: string): ResourceResolver { const resolved = Paths.join(this._basePath, decodeURIComponent(uri)); diff --git a/src/io/UnzippingResourceResolver.ts b/src/io/UnzippingResourceResolver.ts index 2676d28b..9b3eb6da 100644 --- a/src/io/UnzippingResourceResolver.ts +++ b/src/io/UnzippingResourceResolver.ts @@ -17,11 +17,6 @@ export class UnzippingResourceResolver implements ResourceResolver { this._delegate = delegate; } - /** {@inheritDoc ResourceResolver.resolveUri} */ - resolveUri(uri: string): string { - return this._delegate.resolveUri(uri); - } - /** {@inheritDoc ResourceResolver.resolveData} */ async resolveData(uri: string): Promise { const delegateData = await this._delegate.resolveData(uri); diff --git a/src/main.ts b/src/main.ts index 5bd5f510..52e97043 100644 --- a/src/main.ts +++ b/src/main.ts @@ -45,6 +45,15 @@ const inputArrayDefinition: any = { demandOption: true, }; +const outputStringDefinition: any = { + alias: "output", + description: "Output path for the command.", + global: true, + normalize: true, + type: "string", + demandOption: true, +}; + /** * Parses the arguments that are intended for the actual 3D Tiles tools * (ignoring the option arguments), and returns the result. 
@@ -54,125 +63,138 @@ const inputArrayDefinition: any = { */ function parseToolArgs(a: string[]) { const args = yargs(a) - .usage("Usage: $0 [options]") - .help("h") - .alias("h", "help") - .options({ - o: { - alias: "output", - description: "Output path for the command.", - global: true, - normalize: true, - type: "string", - demandOption: true, - }, - f: { - alias: "force", - default: false, - description: "Output can be overwritten if it already exists.", - global: true, - type: "boolean", - }, - }) - .command("tilesetToDatabase", "Create a sqlite database for a tileset.", { - i: inputStringDefinition, - }) - .command( - "databaseToTileset", - "Unpack a tileset database to a tileset folder.", - { i: inputStringDefinition } - ) - .command( - "glbToB3dm", - "Repackage the input glb as a b3dm with a basic header.", - { i: inputStringDefinition } - ) - .command( - "glbToI3dm", - "Repackage the input glb as a i3dm with a basic header.", - { i: inputStringDefinition } - ) + .usage("Usage: $0 [options]") + .help("h") + .alias("h", "help") + .options({ + f: { + alias: "force", + default: false, + description: "Output can be overwritten if it already exists.", + global: true, + type: "boolean", + }, + }) + .command("tilesetToDatabase", "Create a sqlite database for a tileset.", { + i: inputStringDefinition, + o: outputStringDefinition, + }) + .command( + "databaseToTileset", + "Unpack a tileset database to a tileset folder.", + { i: inputStringDefinition, o: outputStringDefinition } + ) + .command( + "convert", + "Convert between tilesets and tileset package formats. " + + "The input and output can be paths to tileset JSON files, " + + "'.3tz', or '.3dtiles' files.", + { + i: inputStringDefinition, + o: outputStringDefinition, + } + ) + .command( + "glbToB3dm", + "Repackage the input glb as a b3dm with a basic header.", + { i: inputStringDefinition, o: outputStringDefinition } + ) + .command( + "glbToI3dm", + "Repackage the input glb as a i3dm with a basic header.", + { i: inputStringDefinition, o: outputStringDefinition } + ) .command( "b3dmToGlb", "Extract the binary glTF asset from the input b3dm.", { - i: inputStringDefinition, + i: inputStringDefinition, + o: outputStringDefinition, } ) .command( "i3dmToGlb", "Extract the binary glTF asset from the input i3dm.", { - i: inputStringDefinition, + i: inputStringDefinition, + o: outputStringDefinition, } ) .command( "cmptToGlb", "Extract the binary glTF assets from the input cmpt.", { - i: inputStringDefinition, + i: inputStringDefinition, + o: outputStringDefinition, } ) - .command( - "optimizeB3dm", - "Pass the input b3dm through gltf-pipeline. To pass options to gltf-pipeline, place them after --options. (--options -h for gltf-pipeline help)", - { + .command( + "optimizeB3dm", + "Pass the input b3dm through gltf-pipeline. To pass options to gltf-pipeline, place them after --options. (--options -h for gltf-pipeline help)", + { i: inputStringDefinition, - options: { - description: - "All arguments after this flag will be passed to gltf-pipeline as command line options.", - }, - } - ) - .command( - "optimizeI3dm", - "Pass the input i3dm through gltf-pipeline. To pass options to gltf-pipeline, place them after --options. (--options -h for gltf-pipeline help)", - { + o: outputStringDefinition, + options: { + description: + "All arguments after this flag will be passed to gltf-pipeline as command line options.", + }, + } + ) + .command( + "optimizeI3dm", + "Pass the input i3dm through gltf-pipeline. 
To pass options to gltf-pipeline, place them after --options. (--options -h for gltf-pipeline help)", + { + i: inputStringDefinition, + o: outputStringDefinition, + options: { + description: + "All arguments after this flag will be passed to gltf-pipeline as command line options.", + }, + } + ) + .command("gzip", "Gzips the input tileset directory.", { i: inputStringDefinition, - options: { - description: - "All arguments after this flag will be passed to gltf-pipeline as command line options.", + o: outputStringDefinition, + t: { + alias: "tilesOnly", + default: false, + description: "Only tile content files should be gzipped.", + type: "boolean", }, - } - ) - .command("gzip", "Gzips the input tileset directory.", { - i: inputStringDefinition, - t: { - alias: "tilesOnly", - default: false, - description: "Only tile content files should be gzipped.", - type: "boolean", - }, - }) - .command("ungzip", "Ungzips the input tileset directory.", { - i: inputStringDefinition, - }) - .command( - "combine", - "Combines all external tilesets into a single tileset.json file.", - { i: inputStringDefinition } - ) - .command( - "merge", - "Merge any number of tilesets together into a single tileset.", - { i: inputArrayDefinition } - ) - .command( - "upgrade", - "Upgrades the input tileset to the latest version of the 3D Tiles spec. Embedded glTF models will be upgraded to glTF 2.0.", - { i: inputStringDefinition } - ) - .command( - "analyze", - "Analyze the input file, and write the results to the output directory. " + - "This will accept B3DM, I3DM, PNTS, CMPT, and GLB files (both for glTF " + - "1.0 and for glTF 2.0), and write files into the output directory that " + - "contain the feature table, batch table, layout information, the GLB, " + - "and the JSON of the GLB", - { i: inputStringDefinition } - ) - .demandCommand(1) - .strict(); + }) + .command("ungzip", "Ungzips the input tileset directory.", { + i: inputStringDefinition, + o: outputStringDefinition, + }) + .command( + "combine", + "Combines all external tilesets into a single tileset.json file.", + { i: inputStringDefinition, o: outputStringDefinition } + ) + .command( + "merge", + "Merge any number of tilesets together into a single tileset.", + { i: inputArrayDefinition, o: outputStringDefinition } + ) + .command( + "upgrade", + "Upgrades the input tileset to the latest version of the 3D Tiles spec. Embedded glTF models will be upgraded to glTF 2.0.", + { i: inputStringDefinition, o: outputStringDefinition } + ) + .command("pipeline", "Execute a pipeline that is provided as a JSON file", { + i: inputStringDefinition, + }) + .command( + "analyze", + "Analyze the input file, and write the results to the output directory. " + + "This will accept B3DM, I3DM, PNTS, CMPT, and GLB files (both for glTF " + + "1.0 and for glTF 2.0), and write files into the output directory that " + + "contain the feature table, batch table, layout information, the GLB, " + + "and the JSON of the GLB", + { i: inputStringDefinition, o: outputStringDefinition } + ) + .demandCommand(1) + .strict(); return args.argv as any; } @@ -249,17 +271,27 @@ async function runCommand(command: string, toolArgs: any, optionArgs: any) { } else if (command === "ungzip") { await ToolsMain.ungzip(input, output, force); } else if (command === "tilesetToDatabase") { - await ToolsMain.tilesetToDatabase(input, output, force); + console.log( + `The 'tilesetToDatabase' command is deprecated. 
Use 'convert' instead.`
+    );
+    await ToolsMain.convert(input, output, force);
   } else if (command === "databaseToTileset") {
-    await ToolsMain.databaseToTileset(input, output, force);
+    console.log(
+      `The 'databaseToTileset' command is deprecated. Use 'convert' instead.`
+    );
+    await ToolsMain.convert(input, output, force);
+  } else if (command === "convert") {
+    await ToolsMain.convert(input, output, force);
   } else if (command === "combine") {
     await ToolsMain.combine(input, output, force);
   } else if (command === "upgrade") {
     await ToolsMain.upgrade(input, output, force);
   } else if (command === "merge") {
     await ToolsMain.merge(inputs, output, force);
+  } else if (command === "pipeline") {
+    await ToolsMain.pipeline(input, force);
   } else if (command === "analyze") {
-    await ToolsMain.analyze(input, output, force);
+    ToolsMain.analyze(input, output, force);
   } else {
     throw new DeveloperError(`Invalid command: ${command}`);
   }
diff --git a/demos/ZipToPackage.ts b/src/packages/ZipToPackage.ts
similarity index 95%
rename from demos/ZipToPackage.ts
rename to src/packages/ZipToPackage.ts
index 25c3ff87..63a82837 100644
--- a/demos/ZipToPackage.ts
+++ b/src/packages/ZipToPackage.ts
@@ -1,6 +1,6 @@
 import StreamZip from "node-stream-zip";
-import { TilesetTargets } from "../src/tilesetData/TilesetTargets";
+import { TilesetTargets } from "../tilesetData/TilesetTargets";
 /**
  * Methods for converting ZIP files into 3D Tiles packages.
diff --git a/src/pipelines/ContentStage.ts b/src/pipelines/ContentStage.ts
index b557307c..8ed249d7 100644
--- a/src/pipelines/ContentStage.ts
+++ b/src/pipelines/ContentStage.ts
@@ -1,6 +1,40 @@
 import { Stage } from "./Stage";
-import { TilesetEntry } from "../tilesetData/TilesetEntry";
+/**
+ * An interface that describes an operation that may be
+ * applied to one `TilesetEntry` during the execution
+ * of a pipeline.
+ *
+ * Instances of this are created with the `ContentStages`
+ * class, and contained within a `TilesetStage`.
+ */
 export interface ContentStage extends Stage {
-  condition: ((e: TilesetEntry) => Promise<boolean>) | undefined;
+  /**
+   * An optional array of `ContentDataType` strings that
+   * indicates which content types this stage should be
+   * applied to.
+   *
+   * The stage will be applied to types that are contained
+   * in the `includedContentTypes`, but NOT contained in
+   * the `excludedContentTypes`.
+   */
+  includedContentTypes?: string[];
+
+  /**
+   * An optional array of `ContentDataType` strings that
+   * indicates which content types this stage should NOT be
+   * applied to.
+   *
+   * The stage will be applied to types that are contained
+   * in the `includedContentTypes`, but NOT contained in
+   * the `excludedContentTypes`.
+   */
+  excludedContentTypes?: string[];
+
+  /**
+   * Arbitrary options that may have been given in the
+   * input JSON, and will be passed to implementations
+   * that may support these options (e.g. `gltf-pipeline`).
+ */ + options?: any; } diff --git a/src/pipelines/ContentStageExecutor.ts b/src/pipelines/ContentStageExecutor.ts index b1b07990..36e45342 100644 --- a/src/pipelines/ContentStageExecutor.ts +++ b/src/pipelines/ContentStageExecutor.ts @@ -1,71 +1,572 @@ +import path from "path"; +import GltfPipeline from "gltf-pipeline"; + +import { Paths } from "../base/Paths"; +import { Buffers } from "../base/Buffers"; + +import { ContentDataTypes } from "../contentTypes/ContentDataTypes"; +import { ContentDataTypeChecks } from "../contentTypes/ContentDataTypeChecks"; + import { TilesetEntry } from "../tilesetData/TilesetEntry"; -import { TilesetSource } from "../tilesetData/TilesetSource"; -import { TilesetSources } from "../tilesetData/TilesetSources"; -import { TilesetTarget } from "../tilesetData/TilesetTarget"; -import { TilesetTargets } from "../tilesetData/TilesetTargets"; import { ContentStage } from "./ContentStage"; -import { TilesetEntries } from "./TilesetEntries"; +import { ContentStages } from "./ContentStages"; +import { PipelineError } from "./PipelineError"; + +import { BasicTilesetProcessor } from "../tilesetProcessing/BasicTilesetProcessor"; +import { GltfUtilities } from "../contentProcessing/GtlfUtilities"; +import { ContentOps } from "../contentProcessing/ContentOps"; + +/** + * Methods to execute `ContentStage` objects. + */ export class ContentStageExecutor { + /** + * Execute the given `ContentStage`. + * + * @param contentStage - The `ContentStage` object + * @param tilesetProcessor The `BasicTilesetProcessor` + * @returns A promise that resolves when the process is finished + * @throws PipelineError If one of the processing steps causes + * an error. + */ static async executeContentStage( - tilesetSource: TilesetSource, - tilesetTarget: TilesetTarget, - contentStage: ContentStage + contentStage: ContentStage, + tilesetProcessor: BasicTilesetProcessor ) { - if (contentStage.name === "gzip") { - await ContentStageExecutor.executeGzip( - tilesetSource, - tilesetTarget, - contentStage.condition + try { + await ContentStageExecutor.executeContentStageInternal( + contentStage, + tilesetProcessor ); - } else if (contentStage.name === "ungzip") { - await ContentStageExecutor.executeGunzip(tilesetSource, tilesetTarget); + } catch (e) { + throw new PipelineError(`${e}`); + } + } + + /** + * Execute the given `ContentStage`. + * + * @param contentStage - The `ContentStage` object + * @param tilesetProcessor The `BasicTilesetProcessor` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. 
+ */ + private static async executeContentStageInternal( + contentStage: ContentStage, + tilesetProcessor: BasicTilesetProcessor + ) { + if (contentStage.name === ContentStages.CONTENT_STAGE_GZIP) { + const condition = ContentDataTypeChecks.createTypeCheck( + contentStage.includedContentTypes, + contentStage.excludedContentTypes + ); + await ContentStageExecutor.executeGzip(tilesetProcessor, condition); + } else if (contentStage.name === ContentStages.CONTENT_STAGE_UNGZIP) { + await ContentStageExecutor.executeGunzip(tilesetProcessor); + } else if (contentStage.name === ContentStages.CONTENT_STAGE_GLB_TO_B3DM) { + await ContentStageExecutor.executeGlbToB3dm(tilesetProcessor); + } else if (contentStage.name === ContentStages.CONTENT_STAGE_GLB_TO_I3DM) { + await ContentStageExecutor.executeGlbToI3dm(tilesetProcessor); + } else if (contentStage.name === ContentStages.CONTENT_STAGE_B3DM_TO_GLB) { + await ContentStageExecutor.executeB3dmToGlb(tilesetProcessor); + } else if (contentStage.name === ContentStages.CONTENT_STAGE_I3DM_TO_GLB) { + await ContentStageExecutor.executeI3dmToGlb(tilesetProcessor); + } else if ( + contentStage.name === ContentStages.CONTENT_STAGE_OPTIMIZE_B3DM + ) { + const options = contentStage.options; + await ContentStageExecutor.executeOptimizeB3dm(tilesetProcessor, options); + } else if ( + contentStage.name === ContentStages.CONTENT_STAGE_OPTIMIZE_I3DM + ) { + const options = contentStage.options; + await ContentStageExecutor.executeOptimizeI3dm(tilesetProcessor, options); + } else if (contentStage.name === ContentStages.CONTENT_STAGE_OPTIMIZE_GLB) { + const options = contentStage.options; + await ContentStageExecutor.executeOptimizeGlb(tilesetProcessor, options); + } else if ( + contentStage.name === ContentStages.CONTENT_STAGE_SEPARATE_GLTF + ) { + await ContentStageExecutor.executeSeparateGltf(tilesetProcessor); } else { - // TODO Review and document this - const message = - ` Unknown contentStage name: ${contentStage.name} ` + - `- performing no-op`; + const message = ` Unknown contentStage name: ${contentStage.name}`; console.log(message); - await ContentStageExecutor.executeNoOp(tilesetSource, tilesetTarget); } } + /** + * Performs the 'gzip' content stage with the given processor. + * + * This will process all entries of the source tileset. The + * data of entries that match the given condition will be + * compressed with gzip. Other entries remain unaffected. + * + * @param tilesetProcessor - The `BasicTilesetProcessor` + * @param condition The condition that was created from + * the included- and excluded types that have been defined + * in the `ContentStage` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ private static async executeGzip( - tilesetSource: TilesetSource, - tilesetTarget: TilesetTarget, - condition: ((e: TilesetEntry) => Promise) | undefined + tilesetProcessor: BasicTilesetProcessor, + condition: ((type: string | undefined) => boolean) | undefined ): Promise { - const inputEntries = TilesetSources.getEntries(tilesetSource); - for (const inputEntry of inputEntries) { - let included = true; + // The entry processor receives the source entry, and + // returns a target entry where the `value` is zipped + // if the source entry matches the given condition. 
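+    // For example, a content stage with
+    // `includedContentTypes: ["CONTENT_TYPE_GLB"]` and no excluded types
+    // yields a condition that only matches GLB entries, so only those
+    // entries are compressed and all other entries are copied unchanged.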
+ const entryProcessor = async ( + sourceEntry: TilesetEntry, + type: string | undefined + ) => { + let targetValue = sourceEntry.value; if (condition) { - included = await condition(inputEntry); + const shouldZip = condition(type); + if (shouldZip) { + targetValue = Buffers.gzip(sourceEntry.value); + } } - let outputEntry = inputEntry; - if (included) { - outputEntry = TilesetEntries.gzip(inputEntry); - } - tilesetTarget.addEntry(outputEntry.key, outputEntry.value); - } + const targetEntry = { + key: sourceEntry.key, + value: targetValue, + }; + return targetEntry; + }; + + await tilesetProcessor.processAllEntries(entryProcessor); } + /** + * Performs the 'gunzip' content stage with the given processor. + * + * This will process all entries of the source tileset. The + * data of entries that is compressed with gzip will be + * uncompressed. Other entries remain unaffected. + * + * @param tilesetProcessor - The `BasicTilesetProcessor` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ private static async executeGunzip( - tilesetSource: TilesetSource, - tilesetTarget: TilesetTarget + tilesetProcessor: BasicTilesetProcessor ): Promise { - const inputEntries = TilesetSources.getEntries(tilesetSource); - for (const inputEntry of inputEntries) { - const outputEntry = TilesetEntries.gunzip(inputEntry); - tilesetTarget.addEntry(outputEntry.key, outputEntry.value); - } + // The entry processor receives the source entry, and + // returns a target entry where the `value` is unzipped + // (If the data was not zipped, then `Buffers.gunzip` + // returns an unmodified result) + const entryProcessor = async ( + sourceEntry: TilesetEntry, + // eslint-disable-next-line @typescript-eslint/no-unused-vars + type: string | undefined + ) => { + const targetEntry = { + key: sourceEntry.key, + value: Buffers.gunzip(sourceEntry.value), + }; + return targetEntry; + }; + + await tilesetProcessor.processAllEntries(entryProcessor); + } + + /** + * Performs the 'glbToB3dm' content stage with the given processor. + * + * This will process all tile contents entries of the source tileset + * that have the `CONTENT_TYPE_GLB`. These entries will be replaced + * by entries that contain the B3DM data that was created from the GLB. + * + * If the entries have names that end in `.glb`, then these + * extensions will be changed to `.b3dm`. + * + * @param tilesetProcessor - The `BasicTilesetProcessor` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ + private static async executeGlbToB3dm( + tilesetProcessor: BasicTilesetProcessor + ): Promise { + // Define the rule for updating the key (file name) of + // the entries, as well as possible template URIs of + // implicit tileset roots. + const uriProcessor = (uri: string) => { + if (Paths.hasExtension(uri, ".glb")) { + return Paths.replaceExtension(uri, ".b3dm"); + } + return uri; + }; + + // Define the `TilesetEntryProcessor` that generates an + // entry with B3DM data from an entry with GLB data. 
+ const entryProcessor = async ( + sourceEntry: TilesetEntry, + type: string | undefined + ) => { + if (type !== ContentDataTypes.CONTENT_TYPE_GLB) { + return sourceEntry; + } + const targetEntry = { + key: uriProcessor(sourceEntry.key), + value: ContentOps.glbToB3dmBuffer(sourceEntry.value), + }; + return targetEntry; + }; + await tilesetProcessor.processTileContentEntries( + uriProcessor, + entryProcessor + ); + } + + /** + * Performs the 'glbToI3dm' content stage with the given processor. + * + * This will process all tile contents entries of the source tileset + * that have the `CONTENT_TYPE_GLB`. These entries will be replaced + * by entries that contain the I3DM data that was created from the GLB. + * + * If the entries have names that end in `.glb`, then these + * extensions will be changed to `.i3dm`. + * + * @param tilesetProcessor - The `BasicTilesetProcessor` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ + private static async executeGlbToI3dm( + tilesetProcessor: BasicTilesetProcessor + ): Promise { + // Define the rule for updating the key (file name) of + // the entries, as well as possible template URIs of + // implicit tileset roots. + const uriProcessor = (uri: string) => { + if (Paths.hasExtension(uri, ".glb")) { + return Paths.replaceExtension(uri, ".i3dm"); + } + return uri; + }; + + // Define the `TilesetEntryProcessor` that generates an + // entry with I3DM data from an entry with GLB data. + const entryProcessor = async ( + sourceEntry: TilesetEntry, + type: string | undefined + ) => { + if (type !== ContentDataTypes.CONTENT_TYPE_GLB) { + return sourceEntry; + } + const targetEntry = { + key: uriProcessor(sourceEntry.key), + value: ContentOps.glbToI3dmBuffer(sourceEntry.value), + }; + return targetEntry; + }; + await tilesetProcessor.processTileContentEntries( + uriProcessor, + entryProcessor + ); + } + + /** + * Performs the 'b3dmToGlb' content stage with the given processor. + * + * This will process all tile contents entries of the source tileset + * that have the `CONTENT_TYPE_B3DM`. These entries will be replaced + * by entries that contain the GLB data from the B3DM. + * + * If the entries have names that end in `.b3dm`, then these + * extensions will be changed to `.glb`. + * + * @param tilesetProcessor - The `BasicTilesetProcessor` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ + private static async executeB3dmToGlb( + tilesetProcessor: BasicTilesetProcessor + ): Promise { + // Define the rule for updating the key (file name) of + // the entries, as well as possible template URIs of + // implicit tileset roots. + const uriProcessor = (uri: string) => { + if (Paths.hasExtension(uri, ".b3dm")) { + return Paths.replaceExtension(uri, ".glb"); + } + return uri; + }; + + // Define the `TilesetEntryProcessor` that generates an + // entry with GLB data from an entry with B3DM data. + const entryProcessor = async ( + sourceEntry: TilesetEntry, + type: string | undefined + ) => { + if (type !== ContentDataTypes.CONTENT_TYPE_B3DM) { + return sourceEntry; + } + const targetEntry = { + key: uriProcessor(sourceEntry.key), + value: ContentOps.b3dmToGlbBuffer(sourceEntry.value), + }; + return targetEntry; + }; + await tilesetProcessor.processTileContentEntries( + uriProcessor, + entryProcessor + ); + } + + /** + * Performs the 'i3dmToGlb' content stage with the given processor. 
+ * + * This will process all tile contents entries of the source tileset + * that have the `CONTENT_TYPE_I3DM`. These entries will be replaced + * by entries that contain the GLB data from the I3DM. + * + * If the entries have names that end in `.i3dm`, then these + * extensions will be changed to `.glb`. + * + * @param tilesetProcessor - The `BasicTilesetProcessor` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ + private static async executeI3dmToGlb( + tilesetProcessor: BasicTilesetProcessor + ): Promise { + // Define the rule for updating the key (file name) of + // the entries, as well as possible template URIs of + // implicit tileset roots. + const uriProcessor = (uri: string) => { + if (Paths.hasExtension(uri, ".i3dm")) { + return Paths.replaceExtension(uri, ".glb"); + } + return uri; + }; + + // Define the `TilesetEntryProcessor` that generates an + // entry with GLB data from an entry with I3DM data. + const entryProcessor = async ( + sourceEntry: TilesetEntry, + type: string | undefined + ) => { + if (type !== ContentDataTypes.CONTENT_TYPE_I3DM) { + return sourceEntry; + } + const targetEntry = { + key: uriProcessor(sourceEntry.key), + value: ContentOps.i3dmToGlbBuffer(sourceEntry.value), + }; + return targetEntry; + }; + await tilesetProcessor.processTileContentEntries( + uriProcessor, + entryProcessor + ); + } + + /** + * Performs the 'optimizeB3dm' content stage with the given processor. + * + * This will process all tile contents entries of the source tileset + * that have the `CONTENT_TYPE_B3DM`, and apply the `gltf-pipeline` + * optimization with the given options to their GLB data. + * + * @param tilesetProcessor - The `BasicTilesetProcessor` + * @param options - The options for `gltf-pipeline` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ + private static async executeOptimizeB3dm( + tilesetProcessor: BasicTilesetProcessor, + options: any + ): Promise { + // The entry processor receives the source entry, and + // returns a target entry where the the `value` contains + // GLB data that was optimized with `gltf-pipeline` + // and the given options + const entryProcessor = async ( + sourceEntry: TilesetEntry, + type: string | undefined + ) => { + if (type !== ContentDataTypes.CONTENT_TYPE_B3DM) { + return sourceEntry; + } + const targetValue = await ContentOps.optimizeB3dmBuffer( + sourceEntry.value, + options + ); + const targetEntry = { + key: sourceEntry.key, + value: targetValue, + }; + return targetEntry; + }; + await tilesetProcessor.processTileContentEntries( + (uri: string) => uri, + entryProcessor + ); + } + + /** + * Performs the 'optimizeI3dm' content stage with the given processor. + * + * This will process all tile contents entries of the source tileset + * that have the `CONTENT_TYPE_I3DM`, and apply the `gltf-pipeline` + * optimization with the given options to their GLB data. + * + * @param tilesetProcessor - The `BasicTilesetProcessor` + * @param options - The options for `gltf-pipeline` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. 
+ */ + private static async executeOptimizeI3dm( + tilesetProcessor: BasicTilesetProcessor, + options: any + ): Promise { + // The entry processor receives the source entry, and + // returns a target entry where the the `value` contains + // GLB data that was optimized with `gltf-pipeline` + // and the given options + const entryProcessor = async ( + sourceEntry: TilesetEntry, + type: string | undefined + ) => { + if (type !== ContentDataTypes.CONTENT_TYPE_I3DM) { + return sourceEntry; + } + const targetValue = await ContentOps.optimizeI3dmBuffer( + sourceEntry.value, + options + ); + const targetEntry = { + key: sourceEntry.key, + value: targetValue, + }; + return targetEntry; + }; + await tilesetProcessor.processTileContentEntries( + (uri: string) => uri, + entryProcessor + ); } - private static async executeNoOp( - tilesetSource: TilesetSource, - tilesetTarget: TilesetTarget + /** + * Performs the 'optimizeGlb' content stage with the given processor. + * + * This will process all tile contents entries of the source tileset + * that have the `CONTENT_TYPE_GLB`, and apply the `gltf-pipeline` + * optimization with the given options to them. + * + * @param tilesetProcessor - The `BasicTilesetProcessor` + * @param options - The options for `gltf-pipeline` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ + private static async executeOptimizeGlb( + tilesetProcessor: BasicTilesetProcessor, + options: any ): Promise { - const inputEntries = TilesetSources.getEntries(tilesetSource); - TilesetTargets.putEntries(tilesetTarget, inputEntries); + // The entry processor receives the source entry, and + // returns a target entry where the the `value` contains + // GLB data that was optimized with `gltf-pipeline` + // and the given options + const entryProcessor = async ( + sourceEntry: TilesetEntry, + type: string | undefined + ) => { + if (type !== ContentDataTypes.CONTENT_TYPE_GLB) { + return sourceEntry; + } + const targetValue = await GltfUtilities.optimizeGlb( + sourceEntry.value, + options + ); + const targetEntry = { + key: sourceEntry.key, + value: targetValue, + }; + return targetEntry; + }; + await tilesetProcessor.processTileContentEntries( + (uri: string) => uri, + entryProcessor + ); + } + + /** + * Performs the 'separateGltf' content stage with the given processor. + * + * @param tilesetProcessor - The `BasicTilesetProcessor` + * @returns A promise that resolves when the process is finished + * @throws PipelineError If one of the processing steps causes + * an error. + */ + private static async executeSeparateGltf( + tilesetProcessor: BasicTilesetProcessor + ): Promise { + // Define the rule for updating the key (file name) of + // the entries, as well as possible template URIs of + // implicit tileset roots. + const uriProcessor = (uri: string) => { + if (Paths.hasExtension(uri, ".glb")) { + return Paths.replaceExtension(uri, ".gltf"); + } + return uri; + }; + + // The entry processor receives the source entry, and + // returns a target entry where the the `value` contains + // the glTF data that was generated with `gltf-pipeline`. + // The additional external resources will be passed to + // the tileset processor, to be stored in the target. 
+ const entryProcessor = async ( + sourceEntry: TilesetEntry, + type: string | undefined + ) => { + if (type !== ContentDataTypes.CONTENT_TYPE_GLB) { + return sourceEntry; + } + const dirname = path.dirname(sourceEntry.key); + const prefix = Paths.replaceExtension(path.basename(sourceEntry.key), ""); + const options = { + separate: true, + name: prefix, + }; + const gltfPipelineResults = await GltfPipeline.glbToGltf( + sourceEntry.value, + options + ); + const targetValue = Buffer.from(JSON.stringify(gltfPipelineResults.gltf)); + const targetEntry = { + key: uriProcessor(sourceEntry.key), + value: targetValue, + }; + + const separateResources = gltfPipelineResults.separateResources; + const resourceKeys = Object.keys(separateResources); + for (const resourceKey of resourceKeys) { + const resourceValue = separateResources[resourceKey]; + const resourceTargetEntry = { + key: Paths.join(dirname, resourceKey), + value: resourceValue, + }; + tilesetProcessor.storeTargetEntries(resourceTargetEntry); + tilesetProcessor.markAsProcessed(resourceKey); + } + return targetEntry; + }; + await tilesetProcessor.processTileContentEntries( + uriProcessor, + entryProcessor + ); } } diff --git a/src/pipelines/ContentStages.ts b/src/pipelines/ContentStages.ts index 4d7d46bc..f60d2664 100644 --- a/src/pipelines/ContentStages.ts +++ b/src/pipelines/ContentStages.ts @@ -1,19 +1,216 @@ import { defined } from "../base/defined"; import { DeveloperError } from "../base/DeveloperError"; -import { TilesetEntry } from "../tilesetData/TilesetEntry"; - -import { ContentDataTypeChecks } from "../contentTypes/ContentDataTypeChecks"; -import { BufferedContentData } from "../contentTypes/BufferedContentData"; - import { ContentStage } from "./ContentStage"; +/** + * Methods to create `ContentStage` objects + */ export class ContentStages { + /** + * The `name` that identifies the "gzip" content stage + */ + public static readonly CONTENT_STAGE_GZIP = "gzip"; + + /** + * The `name` that identifies the "ungzip" content stage + */ + public static readonly CONTENT_STAGE_UNGZIP = "ungzip"; + + /** + * The `name` that identifies the "glbToB3dm" content stage + */ + public static readonly CONTENT_STAGE_GLB_TO_B3DM = "glbToB3dm"; + + /** + * The `name` that identifies the "glbToI3dm" content stage + */ + public static readonly CONTENT_STAGE_GLB_TO_I3DM = "glbToI3dm"; + + /** + * The `name` that identifies the "b3dmToGlb" content stage + */ + public static readonly CONTENT_STAGE_B3DM_TO_GLB = "b3dmToGlb"; + + /** + * The `name` that identifies the "i3dmToGlb" content stage + */ + public static readonly CONTENT_STAGE_I3DM_TO_GLB = "i3dmToGlb"; + + /** + * The `name` that identifies the "optimizeB3dm" content stage + */ + public static readonly CONTENT_STAGE_OPTIMIZE_B3DM = "optimizeB3dm"; + + /** + * The `name` that identifies the "optimizeI3dm" content stage + */ + public static readonly CONTENT_STAGE_OPTIMIZE_I3DM = "optimizeI3dm"; + + /** + * The `name` that identifies the "optimizeGlb" content stage + */ + public static readonly CONTENT_STAGE_OPTIMIZE_GLB = "optimizeGlb"; + + /** + * The `name` that identifies the "separateGltf" content stage + */ + public static readonly CONTENT_STAGE_SEPARATE_GLTF = "separateGltf"; + + /** + * Creates a content stage that performs the "gzip" operation + * + * @param - The array of `ContentDataType` strings that the operation + * should be applied to (or `undefined` if it should be applied to + * all data types) + * @returns The content stage + */ + public static createGzip( + 
includedContentTypes: string[] | undefined
+  ): ContentStage {
+    const contentStage: ContentStage = {
+      name: ContentStages.CONTENT_STAGE_GZIP,
+      description: "Compresses each entry with GZIP",
+      includedContentTypes: includedContentTypes,
+    };
+    return contentStage;
+  }
+
+  /**
+   * Creates a content stage that performs the "ungzip" operation
+   *
+   * @returns The content stage
+   */
+  public static createUngzip(): ContentStage {
+    const contentStage: ContentStage = {
+      name: ContentStages.CONTENT_STAGE_UNGZIP,
+      description: "Uncompress each entry that was compressed with GZIP",
+    };
+    return contentStage;
+  }
+
+  /**
+   * Creates a content stage that performs the "glbToB3dm" operation
+   *
+   * @returns The content stage
+   */
+  public static createGlbToB3dm(): ContentStage {
+    const contentStage: ContentStage = {
+      name: ContentStages.CONTENT_STAGE_GLB_TO_B3DM,
+      description: "Convert each GLB into a default B3DM",
+    };
+    return contentStage;
+  }
+
+  /**
+   * Creates a content stage that performs the "glbToI3dm" operation
+   *
+   * @returns The content stage
+   */
+  public static createGlbToI3dm(): ContentStage {
+    const contentStage: ContentStage = {
+      name: ContentStages.CONTENT_STAGE_GLB_TO_I3DM,
+      description: "Convert each GLB into a default I3DM",
+    };
+    return contentStage;
+  }
+
+  /**
+   * Creates a content stage that performs the "b3dmToGlb" operation
+   *
+   * @returns The content stage
+   */
+  public static createB3dmToGlb(): ContentStage {
+    const contentStage: ContentStage = {
+      name: ContentStages.CONTENT_STAGE_B3DM_TO_GLB,
+      description: "Convert each B3DM content into GLB",
+    };
+    return contentStage;
+  }
+
+  /**
+   * Creates a content stage that performs the "i3dmToGlb" operation
+   *
+   * @returns The content stage
+   */
+  public static createI3dmToGlb(): ContentStage {
+    const contentStage: ContentStage = {
+      name: ContentStages.CONTENT_STAGE_I3DM_TO_GLB,
+      description: "Convert each I3DM content into GLB",
+    };
+    return contentStage;
+  }
+
+  /**
+   * Creates a content stage that performs the "optimizeB3dm" operation
+   *
+   * @param options - The options that will be passed to `gltf-pipeline`
+   * @returns The content stage
+   */
+  public static createOptimizeB3dm(options: any): ContentStage {
+    const contentStage: ContentStage = {
+      name: ContentStages.CONTENT_STAGE_OPTIMIZE_B3DM,
+      description:
+        "Apply gltf-pipeline to the GLB part of each B3DM content, with the given options",
+      options: options,
+    };
+    return contentStage;
+  }
+
+  /**
+   * Creates a content stage that performs the "optimizeI3dm" operation
+   *
+   * @param options - The options that will be passed to `gltf-pipeline`
+   * @returns The content stage
+   */
+  public static createOptimizeI3dm(options: any): ContentStage {
+    const contentStage: ContentStage = {
+      name: ContentStages.CONTENT_STAGE_OPTIMIZE_I3DM,
+      description:
+        "Apply gltf-pipeline to the GLB part of each I3DM content, with the given options",
+      options: options,
+    };
+    return contentStage;
+  }
+
+  /**
+   * Creates a content stage that performs the "optimizeGlb" operation
+   *
+   * @param options - The options that will be passed to `gltf-pipeline`
+   * @returns The content stage
+   */
+  public static createOptimizeGlb(options: any): ContentStage {
+    const contentStage: ContentStage = {
+      name: ContentStages.CONTENT_STAGE_OPTIMIZE_GLB,
+      description:
+        "Apply gltf-pipeline to each GLB content, with the given options",
+      options: options,
+    };
+    return contentStage;
+  }
+
+  /**
+   * Creates a content stage that performs the "separateGltf" operation
+   *
+   * @returns The content stage
+   */
+  public static createSeparateGltf(): ContentStage {
+    const contentStage: ContentStage = {
+      name: ContentStages.CONTENT_STAGE_SEPARATE_GLTF,
+      description:
+        "Convert each GLB content into a .gltf
file with separate resources", + }; + return contentStage; + } + + /** + * Creates a `ContentStage` object from the given (untyped) JSON. + * + * @param contentStageJson - The JSON object + * @returns The `ContentStage` object + * @throws DeveloperError When the input was not valid + */ static createContentStage(contentStageJson: any): ContentStage { if (typeof contentStageJson === "string") { const contentStage: ContentStage = { name: contentStageJson, - condition: undefined, }; return contentStage; } @@ -22,28 +219,6 @@ export class ContentStages { if (!defined(contentStage.name)) { throw new DeveloperError("The contentStage JSON does not define a name"); } - contentStage.condition = - ContentStages.createContentStageCondition(contentStageJson); - return contentStage; - } - - static createContentStageCondition( - contentStageJson: any - ): ((e: TilesetEntry) => Promise) | undefined { - const included = contentStageJson.includedContentTypes; - const excluded = contentStageJson.excludedContentTypes; - if (included || excluded) { - const contentDataCheck = ContentDataTypeChecks.createCheck( - included, - excluded - ); - const condition = async (entry: TilesetEntry) => { - const contentData = new BufferedContentData(entry.key, entry.value); - const matches = await contentDataCheck(contentData); - return matches; - }; - return condition; - } - return undefined; + return contentStage; } } diff --git a/src/pipelines/Pipeline.ts b/src/pipelines/Pipeline.ts index e3975b11..47905b40 100644 --- a/src/pipelines/Pipeline.ts +++ b/src/pipelines/Pipeline.ts @@ -1,7 +1,36 @@ import { TilesetStage } from "./TilesetStage"; +/** + * An interface that describes a pipeline of operations + * that should be applied to tileset data. + * + * It consists of the input- and output definition, and a + * set of `TilesetStage` steps that should be executed. + * Instances of this are created with the `Pipelines` + * class, and executed with a `PipelineExecutor`. + */ export interface Pipeline { + /** + * The name of the input tileset. + * + * This may be a path to a tileset JSON file, or a directory + * name (which is then assumed to contain a `tileset.json`), + * or a tileset package, as indicated by the file extension + * being `.3tz` or `.3dtiles`. + */ input: string; + + /** + * The name of the output tileset. + * + * (See `input` for details) + */ output: string; + + /** + * The array of `TilesetStage` objects that will be + * applied to the input data to eventually generate + * the output data. + */ tilesetStages: TilesetStage[]; } diff --git a/src/pipelines/PipelineError.ts b/src/pipelines/PipelineError.ts new file mode 100644 index 00000000..f6f94a6e --- /dev/null +++ b/src/pipelines/PipelineError.ts @@ -0,0 +1,19 @@ +/** + * An error that may be thrown to indicate that a pipeline + * was invalid (for example, due to unknown stage names), + * or one of the stages caused an error during execution. 
+ * + * @internal + */ +export class PipelineError extends Error { + constructor(message: string) { + super(message); + // See https://github.com/Microsoft/TypeScript/wiki/Breaking-Changes + // #extending-built-ins-like-error-array-and-map-may-no-longer-work + Object.setPrototypeOf(this, PipelineError.prototype); + } + + override toString = (): string => { + return `${this.name}: ${this.message}`; + }; +} diff --git a/src/pipelines/PipelineExecutor.ts b/src/pipelines/PipelineExecutor.ts index 3beee044..b2783a45 100644 --- a/src/pipelines/PipelineExecutor.ts +++ b/src/pipelines/PipelineExecutor.ts @@ -1,7 +1,47 @@ +import fs from "fs"; +import os from "os"; +import path from "path"; + import { Pipeline } from "./Pipeline"; import { TilesetStageExecutor } from "./TilesetStageExecutor"; +/** + * Methods to execute `Pipeline` objects. + */ export class PipelineExecutor { + /** + * The directory to store temporary files in. + * + * If this is `undefined`, then a directory in the + * default system temp directory will be used. + */ + private static tempBaseDirectory: string | undefined; + + /** + * Set the directory to store temporary files in. + * + * If this is `undefined`, then a directory in the + * default system temp directory will be used. + * + * This is primarily intended for testing, demos, and + * debugging. + * + * @param directory - The directory + */ + static setTempBaseDirectory(directory: string | undefined) { + PipelineExecutor.tempBaseDirectory = directory; + } + + /** + * Executes the given `Pipeline`. + * + * @param pipeline - The `Pipeline` object + * @param overwrite - Whether outputs should be overwritten if + * they already exist + * @returns A promise that resolves when the process is finished + * @throws PipelineError If one of the processing steps causes + * an error. + */ static async executePipeline(pipeline: Pipeline, overwrite: boolean) { console.log("Executing pipeline"); @@ -10,6 +50,20 @@ export class PipelineExecutor { let currentOverwrite = true; const tilesetStages = pipeline.tilesetStages; + + // Create a temporary directory for the intermediate + // processing steps (if there are more than one) + // TODO: This is not cleaned up at the end... + let tempBasePath = PipelineExecutor.tempBaseDirectory; + if (!tempBasePath) { + if (tilesetStages.length > 1) { + tempBasePath = fs.mkdtempSync( + path.join(os.tmpdir(), "3d-tiles-tools-pipeline-") + ); + } + } + + // Execute each `TilesetStage` for (let t = 0; t < tilesetStages.length; t++) { const tilesetStage = tilesetStages[t]; @@ -22,9 +76,8 @@ export class PipelineExecutor { currentOutput = pipeline.output; currentOverwrite = overwrite; } else { - // TODO Use proper OS-specific temp directory here. - // (And maybe even clean up afterwards...) - currentOutput = "./data/TEMP/tilesetStage-" + t; + const nameSuffix = tilesetStage.name.replace(/[^\w\s]/gi, ""); + currentOutput = `${tempBasePath}/tilesetStage-${t}-${nameSuffix}`; currentOverwrite = true; } diff --git a/src/pipelines/Pipelines.ts b/src/pipelines/Pipelines.ts index feabd492..74d27531 100644 --- a/src/pipelines/Pipelines.ts +++ b/src/pipelines/Pipelines.ts @@ -4,17 +4,33 @@ import { Pipeline } from "./Pipeline"; import { TilesetStage } from "./TilesetStage"; import { TilesetStages } from "./TilesetStages"; +/** + * Methods to create `Pipeline` objects from JSON input. + */ export class Pipelines { + /** + * Creates a `Pipeline` object from the given (untyped) JSON. 
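+   *
+   * A minimal usage sketch (the pipeline file name here is only a
+   * placeholder, and `fs` refers to the Node.js file system module):
+   * ```
+   * const pipelineJson = JSON.parse(fs.readFileSync("pipeline.json", "utf8"));
+   * const pipeline = Pipelines.createPipeline(pipelineJson);
+   * await PipelineExecutor.executePipeline(pipeline, true);
+   * ```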
+ * + * @param pipelineJson - The JSON object + * @returns The `Pipeline` object + * @throws DeveloperError When the input was not valid + */ static createPipeline(pipelineJson: any): Pipeline { - const tilesetStages: TilesetStage[] = []; - if (!pipelineJson.tilesetStages) { - throw new DeveloperError( - "The pipeline JSON does not define tilesetStages" - ); + if (!pipelineJson.input) { + throw new DeveloperError("The pipeline JSON does not define an input"); + } + if (!pipelineJson.output) { + throw new DeveloperError("The pipeline JSON does not define an output"); } - for (const tilesetStageJson of pipelineJson.tilesetStages) { - const tilesetStage = TilesetStages.createTilesetStage(tilesetStageJson); - tilesetStages.push(tilesetStage); + + // The tilesetStages may be undefined, resulting + // in an empty array here + const tilesetStages: TilesetStage[] = []; + if (pipelineJson.tilesetStages) { + for (const tilesetStageJson of pipelineJson.tilesetStages) { + const tilesetStage = TilesetStages.createTilesetStage(tilesetStageJson); + tilesetStages.push(tilesetStage); + } } const pipeline: Pipeline = { input: pipelineJson.input, diff --git a/src/pipelines/Stage.ts b/src/pipelines/Stage.ts index 39463e02..db8a5c8d 100644 --- a/src/pipelines/Stage.ts +++ b/src/pipelines/Stage.ts @@ -1,4 +1,19 @@ +/** + * Common base interface for a stage in a pipeline. + * + * Specializations are `TilesetStage` and `ContentStage`. + */ export interface Stage { + /** + * The name of this stage. + */ name: string; - [option: string]: any; + + /** + * An optional description. + * + * This should be a single-line, human-readable description that + * summarizes what the stage is doing. + */ + description?: string; } diff --git a/src/pipelines/TilesetEntries.ts b/src/pipelines/TilesetEntries.ts deleted file mode 100644 index d7beae81..00000000 --- a/src/pipelines/TilesetEntries.ts +++ /dev/null @@ -1,27 +0,0 @@ -import { Buffers } from "../base/Buffers"; - -import { TilesetEntry } from "../tilesetData/TilesetEntry"; - -export class TilesetEntries { - static gzip(inputEntry: TilesetEntry): TilesetEntry { - const inputKey = inputEntry.key; - const inputValue = inputEntry.value; - const outputKey = inputKey; - const outputValue = Buffers.gzip(inputValue); - return { - key: outputKey, - value: outputValue, - }; - } - - static gunzip(inputEntry: TilesetEntry): TilesetEntry { - const inputKey = inputEntry.key; - const inputValue = inputEntry.value; - const outputKey = inputKey; - const outputValue = Buffers.gunzip(inputValue); - return { - key: outputKey, - value: outputValue, - }; - } -} diff --git a/src/pipelines/TilesetStage.ts b/src/pipelines/TilesetStage.ts index 0107782b..df502696 100644 --- a/src/pipelines/TilesetStage.ts +++ b/src/pipelines/TilesetStage.ts @@ -1,6 +1,17 @@ import { ContentStage } from "./ContentStage"; import { Stage } from "./Stage"; +/** + * Interface for a stage within a `Pipeline` of + * operations that should be applied to tileset data. + * + * Instances of this class are created with `TilesetStages` + * and executed with a `TilesetStageExecutor`. + */ export interface TilesetStage extends Stage { - contentStages: ContentStage[]; + /** + * The `ContentStage` steps representing the sequence of + * operations that should be applied to content. 
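+   *
+   * This may be omitted for tileset stages that are self-contained
+   * (for example, `upgrade` or `combine`), which operate on the
+   * tileset as a whole and do not process individual content entries.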
+ */ + contentStages?: ContentStage[]; } diff --git a/src/pipelines/TilesetStageExecutor.ts b/src/pipelines/TilesetStageExecutor.ts index ef6ab372..755a204b 100644 --- a/src/pipelines/TilesetStageExecutor.ts +++ b/src/pipelines/TilesetStageExecutor.ts @@ -1,79 +1,136 @@ -import { TilesetSource } from "../tilesetData/TilesetSource"; -import { TilesetSources } from "../tilesetData/TilesetSources"; -import { TilesetTarget } from "../tilesetData/TilesetTarget"; -import { TilesetTargets } from "../tilesetData/TilesetTargets"; - import { TilesetStage } from "./TilesetStage"; import { ContentStageExecutor } from "./ContentStageExecutor"; +import { PipelineError } from "./PipelineError"; +import { TilesetStages } from "./TilesetStages"; + +import { BasicTilesetProcessor } from "../tilesetProcessing/BasicTilesetProcessor"; +import { TilesetUpgrader } from "../tilesetProcessing/TilesetUpgrader"; +import { TilesetCombiner } from "../tilesetProcessing/TilesetCombiner"; +import { ContentDataTypeChecks } from "../contentTypes/ContentDataTypeChecks"; +import { ContentDataTypes } from "../contentTypes/ContentDataTypes"; + +/** + * Methods to execute `TilesetStage` objects. + */ export class TilesetStageExecutor { - static async executeTilesetStageInternal( - tilesetSource: TilesetSource, - tilesetTarget: TilesetTarget, - tilesetStage: TilesetStage + /** + * Executes the given `TilesetStage`. + * + * @param tilesetStage - The `TilesetStage` object + * @param currentInput - The current input name, or a temporary + * name for intermediate steps (see `Pipeline.input` for details) + * @param currentOutput - The current output name, or a temporary + * name for intermediate steps (see `Pipeline.input` for details) + * @param overwrite - Whether outputs should be overwritten if + * they already exist + * @returns A promise that resolves when the process is finished + * @throws PipelineError If one of the processing steps causes + * an error. + */ + static async executeTilesetStage( + tilesetStage: TilesetStage, + currentInput: string, + currentOutput: string, + overwrite: boolean ) { - const contentStages = tilesetStage.contentStages; + console.log(` Executing tilesetStage : ${tilesetStage.name}`); + console.log(` currentInput: ${currentInput}`); + console.log(` currentOutput: ${currentOutput}`); - if (contentStages.length === 0) { - // TODO This should probably not cause a message. - // A TilesetStage might be something like `databaseToTileset` - // that is self-contained and "atomic", and does not have - // any content stages. - const message = ` No contentStages - performing no-op`; - console.log(message); - const inputEntries = TilesetSources.getEntries(tilesetSource); - TilesetTargets.putEntries(tilesetTarget, inputEntries); - return; + try { + await TilesetStageExecutor.executeTilesetStageInternal( + tilesetStage, + currentInput, + currentOutput, + overwrite + ); + } catch (e) { + throw new PipelineError(`${e}`); } + } - for (let c = 0; c < contentStages.length; c++) { - const contentStage = contentStages[c]; - - const message = - ` Executing contentStage ${c} of ` + - `${contentStages.length}: ${contentStage.name}`; - console.log(message); - - await ContentStageExecutor.executeContentStage( - tilesetSource, - tilesetTarget, - contentStage + /** + * Implementation for `executeTilesetStage`. + * + * For details about the arguments, see `executeTilesetStage`. 
+ * + * @param tilesetStage - The `TilesetStage` object + * @param currentInput - The current input name + * @param currentOutput - The current output name + * @param overwrite - Whether outputs should be overwritten + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ + private static async executeTilesetStageInternal( + tilesetStage: TilesetStage, + currentInput: string, + currentOutput: string, + overwrite: boolean + ) { + if (tilesetStage.name === TilesetStages.TILESET_STAGE_UPGRADE) { + const quiet = false; + const tilesetUpgrader = new TilesetUpgrader(quiet); + await tilesetUpgrader.upgrade(currentInput, currentOutput, overwrite); + } else if (tilesetStage.name === TilesetStages.TILESET_STAGE_COMBINE) { + const externalTilesetDetector = ContentDataTypeChecks.createIncludedCheck( + ContentDataTypes.CONTENT_TYPE_TILESET + ); + const tilesetCombiner = new TilesetCombiner(externalTilesetDetector); + await tilesetCombiner.combine(currentInput, currentOutput, overwrite); + } else { + await TilesetStageExecutor.executeTilesetContentStages( + tilesetStage, + currentInput, + currentOutput, + overwrite ); } } - static async executeTilesetStage( + /** + * Execute all `ContentStage` objects in the given `TilesetStage`. + * + * For details about the arguments, see `executeTilesetStage`. + * + * @param tilesetStage - The `TilesetStage` object + * @param currentInput - The current input name + * @param currentOutput - The current output name + * @param overwrite - Whether outputs should be overwritten + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ + private static async executeTilesetContentStages( tilesetStage: TilesetStage, currentInput: string, currentOutput: string, overwrite: boolean ) { - console.log(` Executing tilesetStage : ${tilesetStage.name}`); - console.log(` currentInput: ${currentInput}`); - console.log(` currentOutput: ${currentOutput}`); - - let tilesetSource; - let tilesetTarget; try { - tilesetSource = TilesetSources.createAndOpen(currentInput); - tilesetTarget = TilesetTargets.createAndBegin(currentOutput, overwrite); + const tilesetProcessor = new BasicTilesetProcessor(); + await tilesetProcessor.begin(currentInput, currentOutput, overwrite); - await TilesetStageExecutor.executeTilesetStageInternal( - tilesetSource, - tilesetTarget, - tilesetStage - ); + const contentStages = tilesetStage.contentStages; + if (contentStages) { + for (let c = 0; c < contentStages.length; c++) { + const contentStage = contentStages[c]; - tilesetSource.close(); - await tilesetTarget.end(); - } catch (error) { - if (tilesetSource) { - tilesetSource.close(); - } - if (tilesetTarget) { - await tilesetTarget.end(); + const message = + ` Executing contentStage ${c} of ` + + `${contentStages.length}: ${contentStage.name}`; + console.log(message); + + await ContentStageExecutor.executeContentStage( + contentStage, + tilesetProcessor + ); + } } - throw error; + await tilesetProcessor.end(); + } catch (e) { + throw new PipelineError(`${e}`); } } } diff --git a/src/pipelines/TilesetStages.ts b/src/pipelines/TilesetStages.ts index 011e02f9..c84ce07c 100644 --- a/src/pipelines/TilesetStages.ts +++ b/src/pipelines/TilesetStages.ts @@ -5,20 +5,90 @@ import { ContentStage } from "./ContentStage"; import { TilesetStage } from "./TilesetStage"; import { ContentStages } from "./ContentStages"; +/** + * Methods to create `TilesetStage` 
objects. + */ export class TilesetStages { + /** + * The `name` that identifies the "upgrade" tileset stage + */ + public static readonly TILESET_STAGE_UPGRADE = "upgrade"; + + /** + * The `name` that identifies the "combine" tileset stage + */ + public static readonly TILESET_STAGE_COMBINE = "combine"; + + /** + * Creates a tileset stage that performs the "upgrade" operation + * + * @returns The tileset stage + */ + public static createUpgrade(): TilesetStage { + const tilesetStage: TilesetStage = { + name: TilesetStages.TILESET_STAGE_UPGRADE, + description: "Upgrade the input tileset to the latest version", + }; + return tilesetStage; + } + + /** + * Creates a tileset stage that performs the "combine" operation + * + * @returns The tileset stage + */ + public static createCombine(): TilesetStage { + const tilesetStage: TilesetStage = { + name: TilesetStages.TILESET_STAGE_COMBINE, + description: "Combine all external tilesets into one", + }; + return tilesetStage; + } + + /** + * Creates a tileset stage from the given parameters. + * + * @param name - The `name` of the tileset stage + * @param description - The `description` of the tileset stage + * @param contentStages - The content stages + * @returns The tileset stage + */ + public static create( + name: string, + description: string, + contentStages: ContentStage[] + ): TilesetStage { + const tilesetStage: TilesetStage = { + name: name, + description: description, + contentStages: contentStages, + }; + return tilesetStage; + } + + /** + * Creates a `TilesetStage` object from the given (untyped) JSON. + * + * @param tilesetStageJson - The JSON object + * @returns The `TilesetStage` object + * @throws DeveloperError When the input was not valid + */ static createTilesetStage(tilesetStageJson: any): TilesetStage { - const contentStages: ContentStage[] = []; if (typeof tilesetStageJson === "string") { const tilesetStage: TilesetStage = { name: tilesetStageJson, - contentStages: contentStages, + contentStages: [], }; return tilesetStage; } + if (!defined(tilesetStageJson.name)) { throw new DeveloperError("The tilesetStage JSON does not define a name"); } + + let contentStages: ContentStage[] | undefined = undefined; if (tilesetStageJson.contentStages) { + contentStages = []; for (const contentStageJson of tilesetStageJson.contentStages) { const contentStage = ContentStages.createContentStage(contentStageJson); contentStages.push(contentStage); diff --git a/src/tileFormats/TileDataLayouts.ts b/src/tileFormats/TileDataLayouts.ts index 53548f9e..d80a3d3c 100644 --- a/src/tileFormats/TileDataLayouts.ts +++ b/src/tileFormats/TileDataLayouts.ts @@ -45,7 +45,7 @@ export interface TileDataLayout { export class TileDataLayouts { static create(buffer: Buffer): TileDataLayout { // Basic checks for magic number, length and version - const magic = Buffers.getMagic(buffer); + const magic = Buffers.getMagicString(buffer); if (magic !== "b3dm" && magic !== "pnts" && magic !== "i3dm") { throw new TileFormatError( `Expected magic "b3dm", "i3dm", or "pnts", but found "${magic}"` diff --git a/src/tileFormats/TileFormats.ts b/src/tileFormats/TileFormats.ts index 27350e7a..d2678521 100644 --- a/src/tileFormats/TileFormats.ts +++ b/src/tileFormats/TileFormats.ts @@ -2,7 +2,8 @@ import { Buffers } from "../base/Buffers"; import { CompositeTileData } from "./CompositeTileData"; import { TileData } from "./TileData"; -import { TileDataLayout, TileDataLayouts } from "./TileDataLayouts"; +import { TileDataLayout } from "./TileDataLayouts"; +import { TileDataLayouts } 
from "./TileDataLayouts"; import { TileFormatError } from "./TileFormatError"; /** @@ -20,7 +21,7 @@ export class TileFormats { * @returns Whether the given buffer contains composite tile data */ static isComposite(buffer: Buffer): boolean { - const magic = Buffers.getMagic(buffer); + const magic = Buffers.getMagicString(buffer); return magic === "cmpt"; } @@ -38,7 +39,7 @@ export class TileFormats { */ static readCompositeTileData(buffer: Buffer): CompositeTileData { // Basic checks for magic number, length and version - const magic = Buffers.getMagic(buffer); + const magic = Buffers.getMagicString(buffer); if (magic !== "cmpt") { throw new TileFormatError(`Expected magic "cmpt", but found "${magic}"`); } @@ -128,7 +129,7 @@ export class TileFormats { * @internal */ static extractTileData(buffer: Buffer, tileDataLayout: TileDataLayout) { - const magic = Buffers.getMagic(buffer); + const magic = Buffers.getMagicString(buffer); const version = buffer.readUInt32LE(4); // The `gltfFormat` is only stored for I3DM diff --git a/src/tilesetData/TilesetInMemory.ts b/src/tilesetData/TilesetInMemory.ts new file mode 100644 index 00000000..75f50a97 --- /dev/null +++ b/src/tilesetData/TilesetInMemory.ts @@ -0,0 +1,106 @@ +import { TilesetSource } from "./TilesetSource"; +import { TilesetError } from "./TilesetError"; +import { TilesetTarget } from "./TilesetTarget"; + +/** + * Implementation of a TilesetSource and TilesetTarget that + * stores the data in memory. + * + * This is mainly intended for tests and debugging. + * + * @internal + */ +export class TilesetInMemory implements TilesetSource, TilesetTarget { + /** + * The mapping from keys to the actual data + */ + private readonly dataMap: { [key: string]: Buffer } = {}; + + /** + * Whether this source has already been opened + */ + private sourceIsOpen: boolean; + + /** + * Whether this target has already been opened + */ + private targetIsOpen: boolean; + + /** + * The overwrite flag for the target + */ + private overwrite: boolean; + + /** + * Default constructor + */ + constructor() { + this.sourceIsOpen = false; + this.targetIsOpen = false; + this.overwrite = false; + } + + /** {@inheritDoc TilesetSource.open} */ + // eslint-disable-next-line @typescript-eslint/no-unused-vars + open(fullInputName: string) { + if (this.sourceIsOpen) { + throw new TilesetError("Source already opened"); + } + this.sourceIsOpen = true; + } + + /** {@inheritDoc TilesetSource.getKeys} */ + getKeys() { + if (!this.sourceIsOpen) { + throw new TilesetError("Source is not opened. Call 'open' first."); + } + return Object.keys(this.dataMap).values(); + } + + /** {@inheritDoc TilesetSource.getValue} */ + getValue(key: string): Buffer | undefined { + if (!this.sourceIsOpen) { + throw new TilesetError("Source is not opened. Call 'open' first."); + } + return this.dataMap[key]; + } + + /** {@inheritDoc TilesetSource.close} */ + close() { + if (!this.sourceIsOpen) { + throw new TilesetError("Source is not opened. Call 'open' first."); + } + this.sourceIsOpen = false; + } + + /** {@inheritDoc TilesetTarget.begin} */ + begin(fullOutputName: string, overwrite: boolean) { + if (this.targetIsOpen) { + throw new TilesetError("Target already opened"); + } + this.targetIsOpen = true; + this.overwrite = overwrite; + } + + /** {@inheritDoc TilesetTarget.addEntry} */ + addEntry(key: string, content: Buffer) { + if (!this.targetIsOpen) { + throw new TilesetError("Target is not opened. 
Call 'begin' first."); + } + if (this.dataMap[key]) { + if (!this.overwrite) { + throw new TilesetError("Entry already exists: " + key); + } + } + this.dataMap[key] = content; + } + + /** {@inheritDoc TilesetTarget.end} */ + async end() { + if (!this.targetIsOpen) { + throw new TilesetError("Target is not opened. Call 'begin' first."); + } + this.targetIsOpen = false; + this.overwrite = false; + } +} diff --git a/src/tilesetProcessing/BasicTilesetProcessor.ts b/src/tilesetProcessing/BasicTilesetProcessor.ts new file mode 100644 index 00000000..b6c780dc --- /dev/null +++ b/src/tilesetProcessing/BasicTilesetProcessor.ts @@ -0,0 +1,340 @@ +import { Tile } from "../structure/Tile"; +import { Tileset } from "../structure/Tileset"; +import { Content } from "../structure/Content"; +import { Schema } from "../structure/Metadata/Schema"; + +import { TilesetSourceResourceResolver } from "../io/TilesetSourceResourceResolver"; + +import { Tiles } from "../tilesets/Tiles"; + +import { TilesetProcessor } from "./TilesetProcessor"; +import { TilesetEntryProcessor } from "./TilesetEntryProcessor"; + +import { TraversedTile } from "../traversal/TraversedTile"; +import { TilesetTraverser } from "../traversal/TilesetTraverser"; + +/** + * Implementation of a `TilesetProcessor` that offers methods for + * common operations on tileset data. + * + * The operations are applied by callbacks on certain elements + * of the tileset data: + * + * - All tiles (as `TraversedTile` instances) + * - Explicit tiles (as `Tile` instances) + * - Unspecified entries (files) in the tileset (as `TilesetEntry` objects) + * - The tileset (and its schema) itself + */ +export class BasicTilesetProcessor extends TilesetProcessor { + /** + * Creates a new instance + * + * @param quiet - Whether log messages should be omitted + */ + constructor(quiet?: boolean) { + super(quiet); + } + + /** + * Apply the given callback to all `TraversedTile` instances + * that result from traversing the tileset. + * + * @param callback - The callback + * @returns A promise that resolves when the process is finished + * @throws DeveloperError If `begin` was not called yet + * @throws TilesetError When an error is thrown during processing + */ + async forEachTile( + callback: (traversedTile: TraversedTile) => Promise + ): Promise { + const context = this.getContext(); + const tileset = context.tileset; + await this.forEachTileAt(tileset.root, callback); + } + + /** + * Apply the given callback to all `TraversedTile` instances + * that result from traversing the tile hierarchy, starting + * at the given tile. + * + * The given tile is assumed to be an explicit tile in the + * current tileset. 
+ * + * @param tile The tile where to start the traversal + * @param callback - The callback + * @returns A promise that resolves when the process is finished + * @throws DeveloperError If `begin` was not called yet + * @throws TilesetError When an error is thrown during processing + */ + private async forEachTileAt( + tile: Tile, + callback: (traversedTile: TraversedTile) => Promise + ): Promise { + const context = this.getContext(); + const tilesetSource = context.tilesetSource; + const schema = context.schema; + + // Create the resource resolver that will be used for + // resolving ".subtree" files of implicit tilesets + // during the traversal + const resourceResolver = new TilesetSourceResourceResolver( + ".", + tilesetSource + ); + const tilesetTraverser = new TilesetTraverser(".", resourceResolver, { + depthFirst: false, + traverseExternalTilesets: true, + }); + await tilesetTraverser.traverseWithSchemaAt( + tile, + schema, + async (traversedTile) => { + await callback(traversedTile); + return true; + } + ); + } + + /** + * Apply the given callback to each tile that appears as `Tile` + * object in the tileset JSON + * + * @param callback - The callback + * @returns A promise that resolves when the process is finished + * @throws DeveloperError If `begin` was not called yet + * @throws TilesetError When an error is thrown during processing + */ + async forEachExplicitTile( + callback: (tile: Tile) => Promise + ): Promise { + const context = this.getContext(); + const tileset = context.tileset; + const root = tileset.root; + await Tiles.traverseExplicit(root, async (tilePath: Tile[]) => { + const tile = tilePath[tilePath.length - 1]; + await callback(tile); + return true; + }); + } + + /** + * Apply the given callback to the `Tileset` and the metadata + * schema. + * + * @param callback - The callback + * @returns A promise that resolves when the process is finished + * @throws DeveloperError If `begin` was not called yet + * @throws TilesetError When an error is thrown during processing + */ + async forTileset( + callback: (tileset: Tileset, schema: Schema | undefined) => Promise + ) { + const context = this.getContext(); + const tileset = context.tileset; + const schema = context.schema; + await callback(tileset, schema); + } + + /** + * Applies the given entry processor to each `TilesetEntry` that + * has not yet been processed + * + * @param entryProcessor - The callback + * @returns A promise that resolves when the process is finished + * @throws DeveloperError If `begin` was not called yet + * @throws TilesetError When the input could not be processed + */ + async processAllEntries(entryProcessor: TilesetEntryProcessor) { + const context = this.getContext(); + const tilesetSource = context.tilesetSource; + const tilesetSourceJsonFileName = context.tilesetSourceJsonFileName; + const sourceKeys = tilesetSource.getKeys(); + for (const sourceKey of sourceKeys) { + if (sourceKey !== tilesetSourceJsonFileName) { + await this.processEntry(sourceKey, entryProcessor); + } + } + } + + /** + * Process all entries that are tile content. + * + * This will process all tile content entries of the source tileset + * with the given `TilesetEntryProcessor`. The given `uriProcessor` + * will be used for updating the `key` (file name) of the entries, + * as well as possible template URIs at the roots of implicit + * tilesets. 
+ * + * @param uriProcessor - The processor that updates keys and URIs + * @param entryProcessor - The `TilesetEntryProcessor` + * @returns A promise that resolves when the process is finished + * @throws Error If one of the processing steps causes + * an error. + */ + async processTileContentEntries( + uriProcessor: (uri: string) => string, + entryProcessor: TilesetEntryProcessor + ): Promise { + // Traverse the (explicit) tiles of the input tileset + await this.forEachExplicitTile(async (tile: Tile): Promise => { + // When the tile is not an implicit tiling root, + // then just update the entries that correspond + // to the tile contents. + if (!tile.implicitTiling) { + await this.processExplicitTileContentEntries(tile, entryProcessor); + } else { + // For implicit tiling roots, traverse the implicit tile hierarchy + // that starts at this tile, and process each entry that corresponds + // to the content of one of the implicit tiles. + await this.forEachTileAt(tile, async (traversedTile: TraversedTile) => { + await this.processTraversedTileContentEntries( + traversedTile, + entryProcessor + ); + }); + + // After the traversal, update the content URIs of the + // implicit tiling root (which are template URIs) + const contents = Tiles.getContents(tile); + for (const content of contents) { + content.uri = uriProcessor(content.uri); + } + } + }); + } + + /** + * Process all entries that correspond to content of the given tile. + * + * The `tile.content.uri` or `tile.contents[i].uri` of the given tile + * will be updated to reflect possible changes of the keys (file + * names) that are performed by the `entryProcessor`. + * + * @param tile - The tile + * @param entryProcessor The `TilesetEntryProcessor` + * @returns A promise that resolves when the process is finished + */ + private async processExplicitTileContentEntries( + tile: Tile, + entryProcessor: TilesetEntryProcessor + ): Promise { + const contents = BasicTilesetProcessor.getTileContents(tile); + const targetContentUris = await this.processContentEntries( + contents, + entryProcessor + ); + BasicTilesetProcessor.updateTileContent(tile, targetContentUris); + } + + /** + * Process all entries that correspond to content of the given traversed tile. + * + * This determines the entries in the tileset source that represent + * content of the given tile, calls `processEntries` on them, + * and stores the resulting entries. + * + * @param traversedTile - The traversed tile + * @param entryProcessor The `TilesetEntryProcessor` + * @returns A promise that resolves when the process is finished + */ + private async processTraversedTileContentEntries( + traversedTile: TraversedTile, + entryProcessor: TilesetEntryProcessor + ): Promise { + if (traversedTile.isImplicitTilesetRoot()) { + return; + } + const contents = traversedTile.getFinalContents(); + await this.processContentEntries(contents, entryProcessor); + } + + /** + * Process all entries that correspond to the given contents. 
+   *
+   * @param contents - The contents
+   * @param entryProcessor - The `TilesetEntryProcessor`
+   * @returns A promise that resolves when the process is finished,
+   * containing the new names that the entries have after processing
+   */
+  private async processContentEntries(
+    contents: Content[],
+    entryProcessor: TilesetEntryProcessor
+  ): Promise<string[]> {
+    const targetContentUris: string[] = [];
+    for (const content of contents) {
+      const sourceContentUri = content.uri;
+      let targetContentUri;
+      if (this.isProcessed(sourceContentUri)) {
+        targetContentUri = this.getTargetKey(sourceContentUri);
+      } else {
+        await this.processEntry(sourceContentUri, entryProcessor);
+        targetContentUri = this.getTargetKey(sourceContentUri);
+      }
+      if (targetContentUri) {
+        targetContentUris.push(targetContentUri);
+      }
+    }
+    return targetContentUris;
+  }
+
+  /**
+   * Returns an array with all contents of the given tile.
+   *
+   * @param tile - The tile
+   * @returns The contents
+   */
+  private static getTileContents(tile: Tile): Content[] {
+    if (tile.content) {
+      return [tile.content];
+    }
+    if (tile.contents) {
+      return tile.contents;
+    }
+    return [];
+  }
+
+  /**
+   * Update the content of the given tile to reflect the given URIs.
+   *
+   * When the given array is empty, then the `content` and `contents`
+   * of the given tile will be deleted.
+   *
+   * When there is one element, then the `content` of the given tile will
+   * receive this element as the content `uri`.
+   *
+   * When there are multiple elements, the tile will receive `contents`
+   * where each content `uri` is one element of the array.
+   *
+   * @param tile - The tile
+   * @param contentUris - The content URIs
+   */
+  private static updateTileContent(tile: Tile, contentUris: string[]) {
+    if (contentUris.length === 0) {
+      delete tile.content;
+      delete tile.contents;
+      return;
+    }
+    if (contentUris.length === 1) {
+      const contentUri = contentUris[0];
+      if (tile.content) {
+        tile.content.uri = contentUri;
+      } else {
+        const content = {
+          uri: contentUri,
+        };
+        tile.content = content;
+        delete tile.contents;
+      }
+      return;
+    }
+
+    const newContents: Content[] = [];
+    for (const contentUri of contentUris) {
+      const content = {
+        uri: contentUri,
+      };
+      newContents.push(content);
+    }
+    tile.contents = newContents;
+    delete tile.content;
+  }
+}
diff --git a/src/tilesetProcessing/TilesetCombiner.ts b/src/tilesetProcessing/TilesetCombiner.ts
index fc6ab4db..10306bf8 100644
--- a/src/tilesetProcessing/TilesetCombiner.ts
+++ b/src/tilesetProcessing/TilesetCombiner.ts
@@ -87,13 +87,17 @@ export class TilesetCombiner {
     this.tilesetSource = tilesetSource;
     this.tilesetTarget = tilesetTarget;
 
-    const tilesetJsonFileName =
+    const tilesetSourceJsonFileName =
       Tilesets.determineTilesetJsonFileName(tilesetSourceName);
 
+    const tilesetTargetJsonFileName =
+      Tilesets.determineTilesetJsonFileName(tilesetTargetName);
+
     await this.combineInternal(
       tilesetSource,
-      tilesetJsonFileName,
-      tilesetTarget
+      tilesetSourceJsonFileName,
+      tilesetTarget,
+      tilesetTargetJsonFileName
     );
 
     tilesetSource.close();
@@ -111,9 +115,11 @@ export class TilesetCombiner {
    * source and target.
    *
    * @param tilesetSource The tileset source
-   * @param tilesetJsonFileName The name of the top-level tileset in
+   * @param tilesetSourceJsonFileName The name of the top-level tileset in
    * the given source (usually `tileset.json`).
    * @param tilesetTarget The tileset target
+   * @param tilesetTargetJsonFileName The name of the top-level tileset in
+   * the given target (usually `tileset.json`).
* @returns A promise that resolves when the process is finished. * @throws TilesetError When the input tileset file can not be * found @@ -122,12 +128,13 @@ export class TilesetCombiner { */ private async combineInternal( tilesetSource: TilesetSource, - tilesetJsonFileName: string, - tilesetTarget: TilesetTarget + tilesetSourceJsonFileName: string, + tilesetTarget: TilesetTarget, + tilesetTargetJsonFileName: string ): Promise { - const tilesetJsonBuffer = tilesetSource.getValue(tilesetJsonFileName); + const tilesetJsonBuffer = tilesetSource.getValue(tilesetSourceJsonFileName); if (!tilesetJsonBuffer) { - const message = `No ${tilesetJsonFileName} found in input`; + const message = `No ${tilesetSourceJsonFileName} found in input`; throw new TilesetError(message); } const tileset = JSON.parse(tilesetJsonBuffer.toString()) as Tileset; @@ -139,7 +146,10 @@ export class TilesetCombiner { const combinedTilesetJsonString = JSON.stringify(tileset, null, 2); const combinedTilesetJsonBuffer = Buffer.from(combinedTilesetJsonString); - tilesetTarget.addEntry("tileset.json", combinedTilesetJsonBuffer); + tilesetTarget.addEntry( + tilesetTargetJsonFileName, + combinedTilesetJsonBuffer + ); } /** diff --git a/src/tilesetProcessing/TilesetEntryProcessor.ts b/src/tilesetProcessing/TilesetEntryProcessor.ts new file mode 100644 index 00000000..6d4a8024 --- /dev/null +++ b/src/tilesetProcessing/TilesetEntryProcessor.ts @@ -0,0 +1,29 @@ +import { TilesetEntry } from "../tilesetData/TilesetEntry"; + +/** + * A function that can process one `TilesetEntry` that is part + * of a tileset dataset. + * + * This is used as the type for the functions that process one + * entry in a `TilesetProcessor`. + * + * It receives the source entry, which may represent a content + * of an (explicit) tile, a content of any (possibly implicit) + * tile, or just one entry of the tileset source (i.e. a "file" + * that is not a tile content). + * + * It returns the "processed" entry that is supposed to put into + * the tileset target (which may be `undefined`, causing the + * corresponding entry to be omitted in the target) + * + * @param sourceEntry - The source entry + * @param type - The type of the entry data (see `ContentDataTypes`), + * or `undefined` if the type could not be determined. + * @returns A promise that resolves when the process is finished, + * containing the resulting entry + * @throws TilesetError When the input could not be processed + */ +export type TilesetEntryProcessor = ( + sourceEntry: TilesetEntry, + type: string | undefined +) => Promise; diff --git a/src/tilesetProcessing/TilesetMerger.ts b/src/tilesetProcessing/TilesetMerger.ts index 36ac0329..5c8776f3 100644 --- a/src/tilesetProcessing/TilesetMerger.ts +++ b/src/tilesetProcessing/TilesetMerger.ts @@ -39,7 +39,7 @@ export class TilesetMerger { * If the inputs are directories or files that do not end with ".json", * then these names will default to "tileset.json" */ - private tilesetJsonFileNames: string[]; + private tilesetSourceJsonFileNames: string[]; /** * Identifiers for the external tilesets. These will usually @@ -54,12 +54,18 @@ export class TilesetMerger { */ private tilesetTarget: TilesetTarget | undefined; + /** + * The name of the tileset JSON file in the target. 
+ * (Usually `tileset.json`) + */ + private tilesetTargetJsonFileName: string | undefined; + /** * Creates a new instance */ constructor() { this.tilesetSources = []; - this.tilesetJsonFileNames = []; + this.tilesetSourceJsonFileNames = []; this.tilesetSourceIdentifiers = []; } @@ -84,7 +90,7 @@ export class TilesetMerger { // Create the sources and target for (const tilesetSourceName of tilesetSourceNames) { // Determine the name of the file that contains the tileset JSON data - const tilesetJsonFileName = + const tilesetSourceJsonFileName = Tilesets.determineTilesetJsonFileName(tilesetSourceName); // Determine an "identifier" for the tileset source @@ -104,10 +110,12 @@ export class TilesetMerger { const tilesetSource = TilesetSources.createAndOpen(tilesetSourceName); this.tilesetSources.push(tilesetSource); - this.tilesetJsonFileNames.push(tilesetJsonFileName); + this.tilesetSourceJsonFileNames.push(tilesetSourceJsonFileName); this.tilesetSourceIdentifiers.push(tilesetSourceIdentifier); } + this.tilesetTargetJsonFileName = + Tilesets.determineTilesetJsonFileName(tilesetTargetName); this.tilesetTarget = TilesetTargets.createAndBegin( tilesetTargetName, overwrite @@ -125,13 +133,18 @@ export class TilesetMerger { this.tilesetSources.length = 0; this.tilesetSourceIdentifiers.length = 0; this.tilesetTarget = undefined; + this.tilesetTargetJsonFileName = undefined; } /** * Internal method for `merge` */ private mergeInternal() { - if (this.tilesetSources.length == 0 || !this.tilesetTarget) { + if ( + this.tilesetSources.length == 0 || + !this.tilesetTarget || + !this.tilesetTargetJsonFileName + ) { throw new DeveloperError("The sources and target must be defined"); } @@ -140,10 +153,12 @@ export class TilesetMerger { const length = this.tilesetSources.length; for (let i = 0; i < length; ++i) { const tilesetSource = this.tilesetSources[i]; - const tilesetJsonFileName = this.tilesetJsonFileNames[i]; - const tilesetJsonBuffer = tilesetSource.getValue(tilesetJsonFileName); + const tilesetSourceJsonFileName = this.tilesetSourceJsonFileNames[i]; + const tilesetJsonBuffer = tilesetSource.getValue( + tilesetSourceJsonFileName + ); if (!tilesetJsonBuffer) { - const message = `No ${tilesetJsonFileName} found in input`; + const message = `No ${tilesetSourceJsonFileName} found in input`; throw new TilesetError(message); } const tileset = JSON.parse(tilesetJsonBuffer.toString()) as Tileset; @@ -156,7 +171,7 @@ export class TilesetMerger { const children = TilesetMerger.getChildren( tilesets, this.tilesetSourceIdentifiers, - this.tilesetJsonFileNames + this.tilesetSourceJsonFileNames ); const mergedTileset = { asset: { @@ -176,7 +191,10 @@ export class TilesetMerger { // Write the merged tileset into the target const mergedTilesetJson = JSON.stringify(mergedTileset, null, 2); const mergedTilesetBuffer = Buffer.from(mergedTilesetJson); - this.tilesetTarget.addEntry("tileset.json", mergedTilesetBuffer); + this.tilesetTarget.addEntry( + this.tilesetTargetJsonFileName, + mergedTilesetBuffer + ); // Copy the resources from the sources to the target this.copyResources(); diff --git a/src/tilesetProcessing/TilesetProcessor.ts b/src/tilesetProcessing/TilesetProcessor.ts new file mode 100644 index 00000000..99dd2e07 --- /dev/null +++ b/src/tilesetProcessing/TilesetProcessor.ts @@ -0,0 +1,536 @@ +import { Buffers } from "../base/Buffers"; +import { DeveloperError } from "../base/DeveloperError"; + +import { ContentDataTypeRegistry } from "../contentTypes/ContentDataTypeRegistry"; + +import { Tileset } from 
"../structure/Tileset"; +import { Schema } from "../structure/Metadata/Schema"; + +import { TilesetError } from "../tilesetData/TilesetError"; +import { TilesetTargets } from "../tilesetData/TilesetTargets"; +import { TilesetSources } from "../tilesetData/TilesetSources"; +import { TilesetEntry } from "../tilesetData/TilesetEntry"; + +import { Tilesets } from "../tilesets/Tilesets"; + +import { TilesetEntryProcessor } from "./TilesetEntryProcessor"; +import { TilesetProcessorContext } from "./TilesetProcessorContext"; +import { TilesetSource } from "../tilesetData/TilesetSource"; +import { TilesetTarget } from "../tilesetData/TilesetTarget"; + +/** + * A base class for classes that can process tilesets. + * + * This class offers the infrastructure for opening a `TilesetSource` + * and a `TilesetTarget`, parsing the `Tileset` object from the + * source, and performing operations on `Tileset` and the + * `TilesetEntry` objects. + * + * Subclasses will offer predefined sets of operations that can + * be performed on the `Tileset` and the `TilesetEntry` objects. + */ +export abstract class TilesetProcessor { + /** + * A function that will receive log messages + */ + private readonly logCallback: (message: any) => void; + + /** + * The context that was created in `begin` + */ + private context: TilesetProcessorContext | undefined; + + /** + * Creates a new instance + * + * @param quiet - Whether log messages should be omitted + */ + constructor(quiet?: boolean) { + if (quiet !== true) { + this.logCallback = (message: any) => console.log(message); + } else { + // eslint-disable-next-line @typescript-eslint/no-unused-vars, @typescript-eslint/no-empty-function + this.logCallback = (message: any) => {}; + } + } + + /** + * Internal method to just call the log callback + * + * @param message - The message + */ + protected log(message: any): void { + this.logCallback(message); + } + + /** + * Returns the `TilesetProcessorContext` that contains all + * elements that are required for processing the tileset + * + * @returns The `TilesetProcessorContext` + * @throws DeveloperError If `begin` was not called yet + */ + protected getContext(): TilesetProcessorContext { + if (!this.context) { + throw new DeveloperError( + "The processor was not initialized. Call 'begin' first." + ); + } + return this.context; + } + + /** + * Prepare processing the given tileset source and writing + * the results into the given tileset target. + * + * @param tilesetSourceName - The tileset source name + * @param tilesetTargetName - The tileset target name + * @param overwrite Whether the target should be overwritten if + * it already exists + * @returns A promise that resolves when this processor has been + * initialized + * @throws TilesetError When the input could not be opened, + * or when the output already exists and `overwrite` was `false`. 
+ */ + async begin( + tilesetSourceName: string, + tilesetTargetName: string, + overwrite: boolean + ): Promise { + if (this.context) { + throw new TilesetError("Processing has already begun"); + } + let tilesetSource; + let tilesetTarget; + try { + tilesetSource = TilesetSources.createAndOpen(tilesetSourceName); + tilesetTarget = TilesetTargets.createAndBegin( + tilesetTargetName, + overwrite + ); + + const tilesetSourceJsonFileName = + Tilesets.determineTilesetJsonFileName(tilesetSourceName); + + const tilesetTargetJsonFileName = + Tilesets.determineTilesetJsonFileName(tilesetTargetName); + + // Obtain the tileset object from the tileset JSON file + const parsedTileset = TilesetProcessor.parseSourceValue( + tilesetSource, + tilesetSourceJsonFileName + ); + + // Resolve the schema, either from the `tileset.schema` + // or the `tileset.schemaUri` + const schema = TilesetProcessor.resolveSchema( + tilesetSource, + parsedTileset.result + ); + + // If nothing has thrown up to this point, then + // a `TilesetProcessorContext` with a valid + // state can be created: + this.context = { + tilesetSource: tilesetSource, + tilesetSourceJsonFileName: tilesetSourceJsonFileName, + tileset: parsedTileset.result, + tilesetJsonWasZipped: parsedTileset.wasZipped, + schema: schema, + tilesetTarget: tilesetTarget, + tilesetTargetJsonFileName: tilesetTargetJsonFileName, + processedKeys: {}, + targetKeys: {}, + }; + } catch (error) { + if (tilesetSource) { + try { + tilesetSource.close(); + } catch (e) { + // Error already about to be re-thrown + } + } + if (tilesetTarget) { + try { + await tilesetTarget.end(); + } catch (e) { + // Error already about to be re-thrown + } + } + delete this.context; + throw error; + } + } + + /** + * Finish processing the source tileset and write all entries + * that have not been processed yet into the target. + * + * @returns A promise that resolves when the operation finished + * @throws TilesetError When there was an error while processing + * or storing the entries. 
+ */ + async end() { + const context = this.getContext(); + const tilesetSource = context.tilesetSource; + const tilesetTarget = context.tilesetTarget; + const tilesetSourceJsonFileName = context.tilesetSourceJsonFileName; + + // Perform a no-op on all entries that have not yet + // been marked as processed + const sourceKeys = tilesetSource.getKeys(); + for (const sourceKey of sourceKeys) { + if (sourceKey !== tilesetSourceJsonFileName) { + await this.processEntry( + sourceKey, + // eslint-disable-next-line @typescript-eslint/no-unused-vars + async (sourceEntry: TilesetEntry, type: string | undefined) => { + return sourceEntry; + } + ); + } + } + + const entries = TilesetSources.getEntries(tilesetSource); + for (const entry of entries) { + const key = entry.key; + // The tileset JSON file will be added explicitly below + if (!this.isProcessed(key) && key !== tilesetSourceJsonFileName) { + this.markAsProcessed(key); + this.storeTargetEntries(entry); + } + } + + const tilesetTargetJsonFileName = context.tilesetTargetJsonFileName; + const tileset = context.tileset; + const tilesetJsonWasZipped = context.tilesetJsonWasZipped; + + // Store the resulting tileset as JSON + TilesetProcessor.storeTargetValue( + tilesetTarget, + tilesetTargetJsonFileName, + tilesetJsonWasZipped, + tileset + ); + + // Clean up by closing the source and the target + delete this.context; + try { + tilesetSource.close(); + } catch (error) { + try { + await tilesetTarget.end(); + } catch (e) { + // Error already about to be re-thrown + } + throw error; + } + await tilesetTarget.end(); + } + + /** + * Process the specified entry. + * + * If the entry with the specified key was already processed, + * then this method does nothing. + * + * Otherwise, the specified entry will be looked up in the tileset + * source, and passed to the given entry processor, together with + * its type information. + * + * The resulting target entry (if any) will be stored in the + * tileset target, and both the source and the target will + * be marked as 'processed' + * + * @param sourceKey - The key (file name) of the entry + * @param entryProcessor - The `TilesetEntryProcessor` that will + * be called to process the actual entry. + * @returns A promise that resolves when the process is finished + * @throws DeveloperError When the source or target is not opened + * @throws TilesetError When the input could not be processed + */ + protected async processEntry( + sourceKey: string, + entryProcessor: TilesetEntryProcessor + ): Promise { + if (this.isProcessed(sourceKey)) { + return; + } + const sourceEntry = await this.fetchSourceEntry(sourceKey); + if (!sourceEntry) { + this.markAsProcessed(sourceKey); + const message = `No ${sourceKey} found in input`; + //throw new TilesetError(message); + console.warn(message); + return; + } + const targetEntry = await this.processEntryInternal( + sourceEntry, + entryProcessor + ); + + this.markAsProcessed(sourceEntry.key); + if (targetEntry) { + this.putTargetKey(sourceEntry.key, targetEntry.key); + this.markAsProcessed(targetEntry.key); + this.storeTargetEntries(targetEntry); + } + } + + /** + * Process the given source entry, and return the processed result. + * + * This will determine the content type of the given entry, and pass + * it together with its type information to the `entryProcessor`. + * + * This will *not* store the returned target entry in the tileset + * target. To do so, `storeTargetEntries` has to be called with + * the result. 
+   *
+   * @param sourceEntry - The source entry
+   * @param entryProcessor - The `TilesetEntryProcessor`
+   * @returns The target entry
+   */
+  private async processEntryInternal(
+    sourceEntry: TilesetEntry,
+    entryProcessor: TilesetEntryProcessor
+  ): Promise<TilesetEntry | undefined> {
+    const type = await ContentDataTypeRegistry.findType(
+      sourceEntry.key,
+      sourceEntry.value
+    );
+
+    this.log(`Processing source: ${sourceEntry.key} with type ${type}`);
+
+    const targetEntry = await entryProcessor(sourceEntry, type);
+
+    this.log(`  to target: ${targetEntry?.key}`);
+
+    return targetEntry;
+  }
+
+  /**
+   * Fetch the entry for the specified key from the current tileset
+   * source. If there is no entry for the given key, then `undefined`
+   * is returned.
+   *
+   * @param key - The key (file name)
+   * @returns The object containing the entry and its type
+   */
+  private async fetchSourceEntry(
+    key: string
+  ): Promise<TilesetEntry | undefined> {
+    const context = this.getContext();
+    const tilesetSource = context.tilesetSource;
+
+    const sourceKey = key;
+    const sourceValue = tilesetSource.getValue(sourceKey);
+    if (!sourceValue) {
+      console.warn("No input found for " + sourceKey);
+      return undefined;
+    }
+    const sourceEntry: TilesetEntry = {
+      key: sourceKey,
+      value: sourceValue,
+    };
+    return sourceEntry;
+  }
+
+  /**
+   * Store the given entries in the current target
+   *
+   * @param targetEntries - The target entries
+   */
+  storeTargetEntries(...targetEntries: TilesetEntry[]) {
+    const context = this.getContext();
+    const tilesetTarget = context.tilesetTarget;
+    for (const targetEntry of targetEntries) {
+      tilesetTarget.addEntry(targetEntry.key, targetEntry.value);
+    }
+  }
+
+  /**
+   * Mark a certain entry (file) as already having been processed,
+   * so that it will no longer be considered in subsequent steps.
+   *
+   * @param key - The key (file name)
+   */
+  markAsProcessed(key: string) {
+    const context = this.getContext();
+    context.processedKeys[key] = true;
+  }
+
+  /**
+   * Returns whether the entry with the given key (file name) was
+   * already processed.
+   *
+   * @param key - The key (file name)
+   * @returns Whether the entry was already processed
+   */
+  isProcessed(key: string): boolean {
+    const context = this.getContext();
+    return context.processedKeys[key] === true;
+  }
+
+  /**
+   * Stores the new key (file name) that the entry with the
+   * given key received during processing.
+   *
+   * @param sourceKey - The key (file name) in the source
+   * @param targetKey - The key (file name) in the target
+   */
+  protected putTargetKey(sourceKey: string, targetKey: string) {
+    const context = this.getContext();
+    context.targetKeys[sourceKey] = targetKey;
+  }
+
+  /**
+   * Returns the new key (file name) that the entry with the
+   * given key received during processing.
+   *
+   * When this is `undefined`, then this may either mean that
+   * the entry was removed during processing, or that it has
+   * not been processed yet. The latter can be checked with
+   * `isProcessed`.
+   *
+   * @param sourceKey - The key (file name)
+   * @returns The target key, or `undefined`
+   */
+  protected getTargetKey(sourceKey: string): string | undefined {
+    const context = this.getContext();
+    return context.targetKeys[sourceKey];
+  }
+
+  /**
+   * Parses the JSON from the value with the given key (file name),
+   * and returns the parsed result, AND information of whether the
+   * input was zipped.
+ * + * This is mainly a convenience function to emulate the behavior of the + * "legacy" tools in terms of handling the tileset JSON: When writing + * the tileset JSON data to the target, then it should zip that JSON + * data if and only if it was zipped in the input. + * + * See `storeTargetValue` for the counterpart of this method. + * + * In the future, there might be mechanisms for a more fine-grained + * control over whether certain files should be zipped or not... + * + * @param tilesetSource - The `TilesetSource` + * @param key - The key (file name) + * @returns A structure containing the `wasZipped` information, and + * the parsed result + * @throws TilesetError If the source is not opened, the specified + * entry cannot be found, or the entry data could not be unzipped, + * or its contents could not be parsed as JSON. + */ + private static parseSourceValue( + tilesetSource: TilesetSource, + key: string + ): { wasZipped: boolean; result: T } { + let value = TilesetProcessor.getSourceValue(tilesetSource, key); + let wasZipped = false; + if (Buffers.isGzipped(value)) { + wasZipped = true; + try { + value = Buffers.gunzip(value); + } catch (e) { + const message = `Could not unzip ${key}: ${e}`; + throw new TilesetError(message); + } + } + try { + const result = JSON.parse(value.toString()) as T; + return { + wasZipped: wasZipped, + result: result, + }; + } catch (e) { + const message = `Could not parse ${key}: ${e}`; + throw new TilesetError(message); + } + } + + /** + * Convert the given object into a JSON string, put it into a buffer, + * zip it (based on the `doZip` flag), and put the result into the + * tileset target. + * + * This is only intended for the "legacy" handling of the tileset + * JSON data, and is the counterpart of `parseSourceValue`. See + * `parseSourceValue` for details. + * + * @param tilesetTarget - The `TilesetTarget` + * @param key - The key (file name) + * @param doZip - Whether the output should be zipped + * @param object - The object for which the JSON should be stored + * @throws DeveloperError When the target is not opened + */ + private static storeTargetValue( + tilesetTarget: TilesetTarget, + key: string, + doZip: boolean, + object: object + ) { + const jsonString = JSON.stringify(object, null, 2); + let jsonBuffer = Buffer.from(jsonString); + if (doZip) { + jsonBuffer = Buffers.gzip(jsonBuffer); + } + tilesetTarget.addEntry(key, jsonBuffer); + } + + /** + * Obtains the value for the given key from the current tileset source, + * throwing an error if the source is not opened, or when the + * given key cannot be found. + * + * @param tilesetSource - The `TilesetSource` + * @param key - The key (file name) + * @returns The value (file contents) + * @throws DeveloperError When the source is not opened + * @throws TilesetError When the given key cannot be found + */ + private static getSourceValue( + tilesetSource: TilesetSource, + key: string + ): Buffer { + const buffer = tilesetSource.getValue(key); + if (!buffer) { + const message = `No ${key} found in input`; + throw new TilesetError(message); + } + return buffer; + } + + /** + * Resolve the `Schema` for the given tileset. + * + * This is either the `tileset.schema`, or the schema that is + * obtained from the `tileset.schemaUri`, or `undefined` if + * neither of them are present. 
+ * + * @param tilesetSource - The `TilesetSource` + * @param tileset - The tileset + * @returns The `Schema`, or `undefined` if there is none + * @throws DeveloperError If the source is not opened + * @throws TilesetError If the schema from the `schemaUri` + * could not be resolved or parsed. + */ + private static resolveSchema( + tilesetSource: TilesetSource, + tileset: Tileset + ): Schema | undefined { + if (tileset.schema) { + return tileset.schema; + } + if (tileset.schemaUri) { + const parsedSchema = TilesetProcessor.parseSourceValue( + tilesetSource, + tileset.schemaUri + ); + return parsedSchema.result; + } + return undefined; + } +} diff --git a/src/tilesetProcessing/TilesetProcessorContext.ts b/src/tilesetProcessing/TilesetProcessorContext.ts new file mode 100644 index 00000000..baee1e53 --- /dev/null +++ b/src/tilesetProcessing/TilesetProcessorContext.ts @@ -0,0 +1,66 @@ +import { Schema } from "../structure/Metadata/Schema"; +import { Tileset } from "../structure/Tileset"; + +import { TilesetSource } from "../tilesetData/TilesetSource"; +import { TilesetTarget } from "../tilesetData/TilesetTarget"; + +/** + * A class summarizing the data that a `TilesetProcessor` is operating on. + * + * This is initialized during the `TilesetProcessor.begin` call, if all + * the source- and target information could be resolved, and is supposed + * to represent a consistent, properly initialized state to work on. + */ +export interface TilesetProcessorContext { + /** + * The tileset source for the input + */ + tilesetSource: TilesetSource; + + /** + * The name of the file that contains the tileset JSON + * data in the source (usually `tileset.json`) + */ + tilesetSourceJsonFileName: string; + + /** + * The tileset that was parsed from the input + */ + tileset: Tileset; + + /** + * Whether the tileset JSON was zipped (a legacy feature, + * see `TilesetProcessor.parseSourceValue` for details) + */ + tilesetJsonWasZipped: boolean; + + /** + * The optional metadata schema associated with the tileset + */ + schema: Schema | undefined; + + /** + * The tileset target for the output. + */ + tilesetTarget: TilesetTarget; + + /** + * The name of the file that contains the tileset JSON + * data in the target (usually `tileset.json`) + */ + tilesetTargetJsonFileName: string; + + /** + * The set of keys (file names) that have already been processed. + * + * This includes the original keys, as well as new keys that + * have been assigned to entries while they have been processed. + */ + processedKeys: { [key: string]: boolean }; + + /** + * A mapping from source keys (file names) to the target names + * that they received during processing. 
+ */ + targetKeys: { [key: string]: string }; +} diff --git a/src/tilesetProcessing/TilesetUpgrader.ts b/src/tilesetProcessing/TilesetUpgrader.ts index eb7c1309..1d190fae 100644 --- a/src/tilesetProcessing/TilesetUpgrader.ts +++ b/src/tilesetProcessing/TilesetUpgrader.ts @@ -3,6 +3,7 @@ import { DeveloperError } from "../base/DeveloperError"; import { BufferedContentData } from "../contentTypes/BufferedContentData"; import { ContentDataTypeRegistry } from "../contentTypes/ContentDataTypeRegistry"; +import { ContentDataTypes } from "../contentTypes/ContentDataTypes"; import { Tile } from "../structure/Tile"; import { Tileset } from "../structure/Tileset"; @@ -16,6 +17,7 @@ import { TilesetSources } from "../tilesetData/TilesetSources"; import { Tiles } from "../tilesets/Tiles"; import { Tilesets } from "../tilesets/Tilesets"; +import { Extensions } from "../tilesets/Extensions"; import { TileFormats } from "../tileFormats/TileFormats"; @@ -35,6 +37,8 @@ type UpgradeOptions = { upgradeContentUrlToUri: boolean; upgradeB3dmGltf1ToGltf2: boolean; upgradeI3dmGltf1ToGltf2: boolean; + upgradeExternalTilesets: boolean; + upgradeExtensionDeclarations: boolean; }; /** @@ -83,6 +87,8 @@ export class TilesetUpgrader { upgradeContentUrlToUri: true, upgradeB3dmGltf1ToGltf2: true, upgradeI3dmGltf1ToGltf2: true, + upgradeExternalTilesets: true, + upgradeExtensionDeclarations: true, }; } @@ -118,9 +124,12 @@ export class TilesetUpgrader { const tilesetTargetJsonFileName = Tilesets.determineTilesetJsonFileName(tilesetTargetName); - await this.upgradeInternal( - tilesetSourceJsonFileName, - tilesetTargetJsonFileName + const upgradedTilesetJsonBuffer = await this.upgradeInternal( + tilesetSourceJsonFileName + ); + this.tilesetTarget.addEntry( + tilesetTargetJsonFileName, + upgradedTilesetJsonBuffer ); await this.upgradeResources(tilesetSourceJsonFileName); @@ -134,19 +143,17 @@ export class TilesetUpgrader { /** * Internal method for the actual upgrade. * - * It justo obtains the tileset JSON data from the source, passes - * it to `upgradeTileset`, and writes the result under the given - * name into the target. + * It just obtains the tileset JSON data from the source, passes + * it to `upgradeTileset`, and returns the buffer containing the + * JSON data of the upgraded result. 
* * @param tilesetSourceJsonFileName - The name of the tileset JSON in the source - * @param tilesetTargetJsonFileName - The name of the tileset JSON in the target * @returns A promise that resolves when the process is finished * @throws TilesetError When the input could not be processed */ private async upgradeInternal( - tilesetSourceJsonFileName: string, - tilesetTargetJsonFileName: string - ): Promise { + tilesetSourceJsonFileName: string + ): Promise { if (!this.tilesetSource || !this.tilesetTarget) { throw new DeveloperError("The source and target must be defined"); } @@ -177,10 +184,7 @@ export class TilesetUpgrader { if (tilesetJsonBufferWasZipped) { resultTilesetJsonBuffer = Buffers.gzip(resultTilesetJsonBuffer); } - this.tilesetTarget.addEntry( - tilesetTargetJsonFileName, - resultTilesetJsonBuffer - ); + return resultTilesetJsonBuffer; } /** @@ -191,7 +195,7 @@ export class TilesetUpgrader { async upgradeTileset(tileset: Tileset): Promise { if (this.upgradeOptions.upgradeAssetVersionNumber) { this.logCallback(`Upgrading asset version number`); - await this.upgradeAssetVersionNumber(tileset); + this.upgradeAssetVersionNumber(tileset); } if (this.upgradeOptions.upgradeRefineCase) { this.logCallback(`Upgrading refine to be in uppercase`); @@ -201,6 +205,10 @@ export class TilesetUpgrader { this.logCallback(`Upgrading content.url to content.uri`); await this.upgradeEachContentUrlToUri(tileset); } + if (this.upgradeOptions.upgradeExtensionDeclarations) { + this.logCallback(`Upgrading extension declarations`); + Extensions.removeExtensionUsed(tileset, "3DTILES_content_gltf"); + } } /** @@ -223,8 +231,8 @@ export class TilesetUpgrader { * * This will examine each `tile.content` in the explicit representation * of the tile hierarchy in the given tileset. If any content does not - * define a `uri`, but a (legacy) `url` property, then a warning is - * printed and the `url` is renamed to `uri`. + * define a `uri`, but a (legacy) `url` property, then the `url` is + * renamed to `uri`. 
* * @param tileset - The tileset */ @@ -336,20 +344,29 @@ export class TilesetUpgrader { value: Buffer, type: string | undefined ): Promise { - if (type === "CONTENT_TYPE_B3DM") { + if (type === ContentDataTypes.CONTENT_TYPE_B3DM) { if (this.upgradeOptions.upgradeB3dmGltf1ToGltf2) { this.logCallback(` Upgrading GLB in ${key}`); value = await TilesetUpgrader.upgradeB3dmGltf1ToGltf2(value); } else { this.logCallback(` Not upgrading GLB in ${key} (disabled via option)`); } - } else if (type === "CONTENT_TYPE_I3DM") { + } else if (type === ContentDataTypes.CONTENT_TYPE_I3DM) { if (this.upgradeOptions.upgradeI3dmGltf1ToGltf2) { this.logCallback(` Upgrading GLB in ${key}`); value = await TilesetUpgrader.upgradeI3dmGltf1ToGltf2(value); } else { this.logCallback(` Not upgrading GLB in ${key} (disabled via option)`); } + } else if (type == ContentDataTypes.CONTENT_TYPE_TILESET) { + if (this.upgradeOptions.upgradeExternalTilesets) { + this.logCallback(` Upgrading external tileset in ${key}`); + value = await this.upgradeInternal(key); + } else { + this.logCallback( + ` Not upgrading external tileset in ${key} (disabled via option)` + ); + } } else { this.logCallback(` No upgrade operation to perform for ${key}`); } diff --git a/src/tilesets/Extensions.ts b/src/tilesets/Extensions.ts new file mode 100644 index 00000000..1a94dd02 --- /dev/null +++ b/src/tilesets/Extensions.ts @@ -0,0 +1,154 @@ +/** + * A type for objects that can contain extensions + */ +type Extended = { extensions?: { [key: string]: any } }; + +/** + * A type for objects that can contain extension declarations + */ +type Extensible = { extensionsUsed?: any; extensionsRequired?: any }; + +/** + * Utility methods for handling extensions + */ +export class Extensions { + /** + * Returns whether the given object contains the given extension. + * + * That is, whether the `object.extensions` contains a key + * that is the given extension name. + * + * @param extended - The object that may contain the extension + * @param extension The extension (i.e. its name as a string) + * @returns Whether the object contains the extension + */ + static containsExtension(extended: Extended, extension: string) { + if (!extended.extensions) { + return false; + } + return Object.keys(extended.extensions).includes(extension); + } + + /** + * Remove the specified extension from the `extensions` dictionary + * of the given object, deleting the `extensions` if they become + * empty. + * + * @param extended - The extended object + * @param extension The extension (i.e. its name as a string) + */ + static removeExtension(extended: Extended, extension: string) { + if (!extended.extensions) { + return; + } + delete extended.extensions[extension]; + if (Object.keys(extended.extensions).length === 0) { + delete extended.extensions; + } + } + + /** + * Add the given extension to the `extensionsUsed` of the given object. + * + * The extension will be added if it was not yet contained in the + * array, creating the array of necessary. + * + * @param extensible - The object + * @param extension - The extension name + */ + static addExtensionUsed(extensible: Extensible, extension: string) { + Extensions.addUniqueTo(extensible, "extensionsUsed", extension); + } + + /** + * Remove the given extension from the `extensionsUsed` of the given object. + * + * The array will be deleted if it becomes empty, and the + * extension will also be removed from `extensionsRequired`. 
+   *
+   * @param extensible - The object
+   * @param extension - The extension name
+   */
+  static removeExtensionUsed(extensible: Extensible, extension: string) {
+    Extensions.removeFrom(extensible, "extensionsUsed", extension);
+    Extensions.removeExtensionRequired(extensible, extension);
+  }
+
+  /**
+   * Add the given extension to the `extensionsRequired` of the given object.
+   *
+   * The extension will be added if it was not yet contained in the
+   * array, creating the array if necessary. This will also add
+   * the extension to `extensionsUsed`.
+   *
+   * @param extensible - The object
+   * @param extension - The extension name
+   */
+  static addExtensionRequired(extensible: Extensible, extension: string) {
+    Extensions.addUniqueTo(extensible, "extensionsRequired", extension);
+    Extensions.addExtensionUsed(extensible, extension);
+  }
+
+  /**
+   * Remove the given extension from the `extensionsRequired` of the given object.
+   *
+   * The array will be deleted if it becomes empty.
+   *
+   * This will *not* remove the extension from the `extensionsUsed`!
+   *
+   * @param extensible - The object
+   * @param extension - The extension name
+   */
+  static removeExtensionRequired(extensible: Extensible, extension: string) {
+    Extensions.removeFrom(extensible, "extensionsRequired", extension);
+  }
+
+  /**
+   * Adds the given element to the specified array if it was not
+   * contained yet, creating a new array if it did not exist yet.
+   *
+   * @param object - The object containing the array
+   * @param key - The name of the array property
+   * @param element - The element
+   */
+  private static addUniqueTo<T>(
+    object: { [key: string]: T[] | undefined },
+    key: string,
+    element: T
+  ) {
+    let array = object[key];
+    if (!array) {
+      array = [];
+      object[key] = array;
+    }
+    if (!array.includes(element)) {
+      array.push(element);
+    }
+  }
+
+  /**
+   * Remove the given element from the specified array. If the array
+   * becomes empty, it is deleted.
+   *
+   * @param object - The object containing the array
+   * @param key - The name of the array property
+   * @param element - The element
+   */
+  private static removeFrom<T>(
+    object: { [key: string]: T[] | undefined },
+    key: string,
+    element: T
+  ) {
+    const array = object[key];
+    if (!array) {
+      return;
+    }
+    const index = array.indexOf(element);
+    if (index !== -1) {
+      array.splice(index, 1);
+    }
+    if (array.length === 0) {
+      delete object[key];
+    }
+  }
+}
diff --git a/src/tilesets/Tiles.ts b/src/tilesets/Tiles.ts
index d242c124..662ff8eb 100644
--- a/src/tilesets/Tiles.ts
+++ b/src/tilesets/Tiles.ts
@@ -1,3 +1,4 @@
+import { Content } from "../structure/Content";
 import { Tile } from "../structure/Tile";
 import { Contents } from "./Contents";
 import { TileTraversalCallback } from "./TileTraversalCallback";
@@ -6,6 +7,27 @@ import { TileTraversalCallback } from "./TileTraversalCallback";
  * Utility methods related to tiles, given as `Tile` objects.
  */
 export class Tiles {
+  /**
+   * Obtains the contents from the given tile.
+   *
+   * This will either return a single-element array, when the tile
+   * defines `tile.content`, or a multi-element array, when the tile
+   * defines `tile.contents`, or an empty array, when the tile does
+   * not have contents.
+   *
+   * @param tile - The `Tile`
+   * @returns The contents
+   */
+  static getContents(tile: Tile): Content[] {
+    if (tile.content) {
+      return [tile.content];
+    }
+    if (tile.contents) {
+      return tile.contents;
+    }
+    return [];
+  }
+
   /**
    * Obtains the content URIs from the given tile.
* @@ -20,19 +42,13 @@ export class Tiles { * @returns The content URIs */ static getContentUris(tile: Tile): string[] { + const contents = Tiles.getContents(tile); const contentUris = []; - if (tile.content) { - const uri = Contents.getUri(tile.content); + for (const content of contents) { + const uri = Contents.getUri(content); if (uri) { contentUris.push(uri); } - } else if (tile.contents) { - for (const content of tile.contents) { - const uri = Contents.getUri(content); - if (uri) { - contentUris.push(uri); - } - } } return contentUris; } diff --git a/src/traversal/ExplicitTraversedTile.ts b/src/traversal/ExplicitTraversedTile.ts index 5d014ab7..28745522 100644 --- a/src/traversal/ExplicitTraversedTile.ts +++ b/src/traversal/ExplicitTraversedTile.ts @@ -57,6 +57,44 @@ export class ExplicitTraversedTile implements TraversedTile { */ private readonly _resourceResolver; + /** + * Convenience function to create the root tile for a tile + * traversal. + * + * @param root - The root tile from the tileset + * @param schema - The optional metadata schema + * @param resourceResolver - The `ResourceResolver` for + * external references (like subtree files) + * @returns The root `TraversedTile` + */ + static createRoot( + root: Tile, + schema: Schema | undefined, + resourceResolver: ResourceResolver + ): TraversedTile { + const traversedRoot = new ExplicitTraversedTile( + root, + "/root", + 0, + undefined, + schema, + resourceResolver + ); + return traversedRoot; + } + + /** + * Creates a new instance + * + * @param tile - The `Tile` from the tileset JSON + * @param path - A JSON-path-like string describing this tile + * @param level - The level, referring to the root of the + * traversal, starting at 0 + * @param parent - The optional parent tile + * @param schema - The optional metadata schema + * @param resourceResolver - The `ResourceResolver` for + * external references (like subtree files) + */ constructor( tile: Tile, path: string, @@ -73,6 +111,28 @@ export class ExplicitTraversedTile implements TraversedTile { this._resourceResolver = resourceResolver; } + /** + * Returns the `metadata` from the input JSON that defines the + * `MetadataEntity` that is associated with this tile, or + * `undefined` if the input did not contain a metadata entity. + * + * @returns The `MetadataEntity` object, or `undefined` + */ + getMetadata(): MetadataEntity | undefined { + return this._tile.metadata; + } + + /** + * Returns the `implicitTiling` from the input JSON that defines the + * `TileImplicitTiling` that is associated with this tile, or + * `undefined` if this tile does not define an implicit tiling. 
+ * + * @returns The `TileImplicitTiling` object + */ + getImplicitTiling(): TileImplicitTiling | undefined { + return this._tile.implicitTiling; + } + /** {@inheritDoc TraversedTile.asRawTile} */ asRawTile(): Tile { return this._tile; @@ -198,6 +258,16 @@ export class ExplicitTraversedTile implements TraversedTile { return finalContents; } + /** {@inheritDoc TraversedTile.getResourceResolver} */ + getResourceResolver(): ResourceResolver { + return this._resourceResolver; + } + + /** {@inheritDoc TraversedTile.isImplicitTilesetRoot} */ + isImplicitTilesetRoot(): boolean { + return this._tile.implicitTiling !== undefined; + } + /** {@inheritDoc TraversedTile.getSubtreeUri} */ getSubtreeUri(): string | undefined { const implicitTiling = this._tile.implicitTiling; @@ -214,21 +284,6 @@ export class ExplicitTraversedTile implements TraversedTile { return subtreeUri; } - /** {@inheritDoc TraversedTile.getImplicitTiling} */ - getImplicitTiling(): TileImplicitTiling | undefined { - return this._tile.implicitTiling; - } - - /** {@inheritDoc TraversedTile.getMetadata} */ - getMetadata(): MetadataEntity | undefined { - return this._tile.metadata; - } - - /** {@inheritDoc TraversedTile.resolveUri} */ - resolveUri(uri: string): string { - return this._resourceResolver.resolveUri(uri); - } - // TODO For debugging toString = (): string => { return `ExplicitTraversedTile, level ${this.level}, path ${this.path}`; diff --git a/src/traversal/ImplicitTraversedTile.ts b/src/traversal/ImplicitTraversedTile.ts index 32bbd144..496eddfc 100644 --- a/src/traversal/ImplicitTraversedTile.ts +++ b/src/traversal/ImplicitTraversedTile.ts @@ -8,7 +8,6 @@ import { SubtreeModels } from "./SubtreeModels"; import { Tile } from "../structure/Tile"; import { Content } from "../structure/Content"; -import { MetadataEntity } from "../structure/MetadataEntity"; import { TileImplicitTiling } from "../structure/TileImplicitTiling"; import { TreeCoordinates } from "../spatial/TreeCoordinates"; @@ -330,10 +329,9 @@ export class ImplicitTraversedTile implements TraversedTile { const contents = []; const subtreeInfo = this._subtreeModel.subtreeInfo; const contentAvailabilityInfos = subtreeInfo.contentAvailabilityInfos; + const tileIndex = this._localCoordinate.toIndex(); for (const contentAvailabilityInfo of contentAvailabilityInfos) { - const available = contentAvailabilityInfo.isAvailable( - this._localCoordinate.toIndex() - ); + const available = contentAvailabilityInfo.isAvailable(tileIndex); if (available) { // TODO The existence of the root content URI should // have been validated. 
So this could also throw @@ -365,9 +363,9 @@ export class ImplicitTraversedTile implements TraversedTile { if (!subtreeMetadataModel) { return contents; } + const tileIndex = this._localCoordinate.toIndex(); for (let i = 0; i < contents.length; i++) { const content = contents[i]; - const tileIndex = this._localCoordinate.toIndex(); MetadataSemanticOverrides.applyImplicitContentMetadataSemanticOverrides( content, i, @@ -378,6 +376,16 @@ export class ImplicitTraversedTile implements TraversedTile { return contents; } + /** {@inheritDoc TraversedTile.getResourceResolver} */ + getResourceResolver(): ResourceResolver { + return this._resourceResolver; + } + + /** {@inheritDoc TraversedTile.isImplicitTilesetRoot} */ + isImplicitTilesetRoot(): boolean { + return false; + } + /** {@inheritDoc TraversedTile.getSubtreeUri} */ getSubtreeUri(): string | undefined { const localCoordinate = this._localCoordinate; @@ -394,24 +402,6 @@ export class ImplicitTraversedTile implements TraversedTile { return undefined; } - /** {@inheritDoc TraversedTile.getImplicitTiling} */ - getImplicitTiling(): TileImplicitTiling | undefined { - const localCoordinate = this._localCoordinate; - if (localCoordinate.level === 0) { - return this._implicitTiling; - } - } - - /** {@inheritDoc TraversedTile.getMetadata} */ - getMetadata(): MetadataEntity | undefined { - return undefined; - } - - /** {@inheritDoc TraversedTile.resolveUri} */ - resolveUri(uri: string): string { - return this._resourceResolver.resolveUri(uri); - } - // TODO For debugging toString = (): string => { return ( diff --git a/src/traversal/MetadataSemanticOverrides.ts b/src/traversal/MetadataSemanticOverrides.ts index fde547fb..b64e75cd 100644 --- a/src/traversal/MetadataSemanticOverrides.ts +++ b/src/traversal/MetadataSemanticOverrides.ts @@ -21,7 +21,7 @@ export class MetadataSemanticOverrides { // TODO There are far too few error checks (e.g. for invalid // indices) here. This COULD be delegated to the assumption // that the input is "valid" (as determined by the validator), - // but the error handling here should still be improvev. + // but the error handling here should still be improved. 
/** * Perform the overrides of the properties of the given tile that @@ -195,7 +195,7 @@ export class MetadataSemanticOverrides { const semanticGeometricError = metadataEntityModel.getPropertyValueBySemantic("TILE_GEOMETRIC_ERROR"); if (defined(semanticGeometricError)) { - tile.geometricError = semanticGeometricError as number; + tile.geometricError = semanticGeometricError; } const semanticRefine = @@ -255,13 +255,13 @@ export class MetadataSemanticOverrides { const semanticUri = metadataEntityModel.getPropertyValueBySemantic("CONTENT_URI"); if (defined(semanticUri)) { - content.uri = semanticUri as string; + content.uri = semanticUri; } const semanticGroupId = metadataEntityModel.getPropertyValueBySemantic("CONTENT_GROUP_ID"); if (defined(semanticGroupId)) { - content.group = semanticGroupId as number; + content.group = semanticGroupId; } } } diff --git a/src/traversal/SubtreeModels.ts b/src/traversal/SubtreeModels.ts index 8c90170e..f291cf97 100644 --- a/src/traversal/SubtreeModels.ts +++ b/src/traversal/SubtreeModels.ts @@ -112,7 +112,7 @@ export class SubtreeModels { // For SUBT (binary subtree data), create the SubtreeModel // from the whole buffer - const isSubt = Buffers.getMagic(subtreeData) === "subt"; + const isSubt = Buffers.getMagicString(subtreeData) === "subt"; if (isSubt) { const binarySubtreeData = await BinarySubtreeDataResolver.resolveFromBuffer( diff --git a/src/traversal/TilesetTraverser.ts b/src/traversal/TilesetTraverser.ts index fb3ac562..ad21ffa4 100644 --- a/src/traversal/TilesetTraverser.ts +++ b/src/traversal/TilesetTraverser.ts @@ -6,7 +6,28 @@ import { Schema } from "../structure/Metadata/Schema"; import { TraversedTile } from "./TraversedTile"; import { ExplicitTraversedTile } from "./ExplicitTraversedTile"; import { TraversalCallback } from "./TraversalCallback"; +import { TilesetTraversers } from "./TilesetTraversers"; + import { DeveloperError } from "../base/DeveloperError"; +import { Tile } from "../structure/Tile"; + +/** + * A collection of configuration options for the traversal. + * + * @internal + */ +export type TraversalOptions = { + /** + * Whether the traversal should be depth-first (in contrast + * to the default breadth-first order) + */ + depthFirst?: boolean; + + /** + * Whether external tilesets should be traversed + */ + traverseExternalTilesets?: boolean; +}; /** * A class that can traverse the tiles of a tileset. @@ -14,6 +35,53 @@ import { DeveloperError } from "../base/DeveloperError"; * @internal */ export class TilesetTraverser { + /** + * The base URI against which content URIs are resolved + * when they refer to 3D Tiles Packages. + * (The current implementations of 3D Tiles Package based + * `TilesetSource`, specifically `TilesetSource3tz`, + * require an absolute URI) + */ + private readonly baseUri: string; + + /** + * The `ResourceResolver` that is used to resolve resources like + * external metadata schema files, subtree files for implicit + * tilesets, or external tilesets. + */ + private readonly resourceResolver: ResourceResolver; + + /** + * The `TraversalOptions` + */ + private readonly options: TraversalOptions; + + /** + * Creates a new instance. + * + * NOTE: The exact set of traversal options is not yet specified. + * + * @param baseUri - The URI against which content URI are resolved + * in order to obtain an absolute URI. 
This is only used for traversing
+   * package (3TZ or 3DTILES) content
+   * @param resourceResolver - The `ResourceResolver` that is used to
+   * resolve resources like external metadata schema files, subtree
+   * files for implicit tilesets, or external tilesets.
+   * @param options - Options for the traversal process.
+   */
+  constructor(
+    baseUri: string,
+    resourceResolver: ResourceResolver,
+    options?: TraversalOptions
+  ) {
+    this.baseUri = baseUri;
+    this.resourceResolver = resourceResolver;
+    this.options = {
+      depthFirst: options?.depthFirst === true,
+      traverseExternalTilesets: options?.traverseExternalTilesets === true,
+    };
+  }
+
   /**
    * Traverses the tiles in the given tileset.
    *
@@ -22,37 +90,84 @@ export class TilesetTraverser {
    * as `TraversedTile` instances.
    *
    * @param tileset - The `Tileset`
+   * @param traversalCallback - The `TraversalCallback`
+   * @returns A Promise that resolves when the traversal finished
+   */
+  async traverse(
+    tileset: Tileset,
+    traversalCallback: TraversalCallback
+  ): Promise<void> {
+    const schema = await TilesetTraversers.resolveSchema(
+      tileset,
+      this.resourceResolver
+    );
+    return this.traverseWithSchema(tileset, schema, traversalCallback);
+  }
+
+  /**
+   * Traverses the tiles in the given tileset.
+   *
+   * This is only the implementation of `traverse`, with the
+   * option to pass in a `Schema` object that already has
+   * been resolved.
+   *
+   * @param tileset - The `Tileset`
    * @param schema - The schema from the `tileset.schema` or the
-   * `tileset.schemaUri`. If this is defined, then it is assumed
-   * to be a valid schema definition.
-   * @param resourceResolver - The `ResourceResolver` that is used to
-   * resolve resources for implicit tilesets (subtree files)
+   * `tileset.schemaUri`, or `undefined` if the tileset does
+   * not have an associated schema.
    * @param traversalCallback - The `TraversalCallback`
-   * @param depthFirst - Whether the traversal should be depth-first
    * @returns A Promise that resolves when the traversal finished
    */
-  static async traverse(
+  async traverseWithSchema(
     tileset: Tileset,
     schema: Schema | undefined,
-    resourceResolver: ResourceResolver,
-    traversalCallback: TraversalCallback,
-    depthFirst: boolean
+    traversalCallback: TraversalCallback
   ): Promise<void> {
-    const root = tileset.root;
-    if (!root) {
-      return;
-    }
+    await this.traverseInternal(tileset.root, schema, traversalCallback);
+  }
+
+  /**
+   * Traverses the hierarchy of tiles, starting at the
+   * given tile.
+   *
+   * @param tile - The `Tile` where the traversal should start
+   * @param schema - The schema from the `tileset.schema` or the
+   * `tileset.schemaUri`, or `undefined` if the tileset does
+   * not have an associated schema.
+   * @param traversalCallback - The `TraversalCallback`
+   * @returns A Promise that resolves when the traversal finished
+   */
+  async traverseWithSchemaAt(
+    tile: Tile,
+    schema: Schema | undefined,
+    traversalCallback: TraversalCallback
+  ): Promise<void> {
+    await this.traverseInternal(tile, schema, traversalCallback);
+  }
+
+  /**
+   * Actual implementation of the traversal.
+   *
+   * @param tile - The `Tile` where the traversal should start
+   * @param schema - The optional metadata schema
+   * @param traversalCallback - The `TraversalCallback`
+   * @returns A Promise that resolves when the traversal finished
+   */
+  private async traverseInternal(
+    tile: Tile,
+    schema: Schema | undefined,
+    traversalCallback: TraversalCallback
+  ) {
+    const depthFirst = this.options.depthFirst;
+
     const stack: TraversedTile[] = [];
-    const traversedRoot = new ExplicitTraversedTile(
-      root,
-      "/root",
-      0,
-      undefined,
+    const traversalRoot = ExplicitTraversedTile.createRoot(
+      tile,
       schema,
-      resourceResolver
+      this.resourceResolver
     );
-    stack.push(traversedRoot);
+    stack.push(traversalRoot);
 
     while (stack.length > 0) {
       const traversedTile = depthFirst ? stack.pop() : stack.shift();
@@ -63,13 +178,44 @@ export class TilesetTraverser {
       const traverseChildren = await traversalCallback(traversedTile);
 
       if (traverseChildren) {
-        const children = await traversedTile.getChildren();
-        const length = children.length;
-        for (let i = 0; i < length; i++) {
-          const traversedChild = children[i];
-          stack.push(traversedChild);
-        }
+        const children = await this.createChildren(traversedTile);
+        stack.push(...children);
       }
     }
   }
+
+  /**
+   * Create the children for the traversal for the given tile.
+   *
+   * If the given `TraversedTile` has children, then they will
+   * be returned.
+   * Otherwise, if `options.traverseExternalTilesets` was set,
+   * then this will be the roots of external tilesets.
+   * Otherwise, it will be the empty array.
+   *
+   * @param traversedTile - The `TraversedTile`
+   * @returns The children
+   */
+  private async createChildren(
+    traversedTile: TraversedTile
+  ): Promise<TraversedTile[]> {
+    const traverseExternalTilesets = this.options.traverseExternalTilesets;
+    const children = await traversedTile.getChildren();
+    const length = children.length;
+
+    if (length !== 0) {
+      return children;
+    }
+    if (traverseExternalTilesets) {
+      // When there are no children, but external tilesets should
+      // be traversed, determine the roots of external tilesets
+      // and put them on the traversal stack
+      const externalRoots = await TilesetTraversers.createExternalTilesetRoots(
+        this.baseUri,
+        traversedTile
+      );
+      return externalRoots;
+    }
+    return [];
+  }
 }
diff --git a/src/traversal/TilesetTraversers.ts b/src/traversal/TilesetTraversers.ts
new file mode 100644
index 00000000..e1d9e207
--- /dev/null
+++ b/src/traversal/TilesetTraversers.ts
@@ -0,0 +1,223 @@
+import path from "path";
+
+import { ResourceResolver } from "../io/ResourceResolver";
+import { TilesetSourceResourceResolver } from "../io/TilesetSourceResourceResolver";
+
+import { Tileset } from "../structure/Tileset";
+import { Schema } from "../structure/Metadata/Schema";
+
+import { TraversedTile } from "./TraversedTile";
+import { ExplicitTraversedTile } from "./ExplicitTraversedTile";
+
+import { DataError } from "../base/DataError";
+
+import { LazyContentData } from "../contentTypes/LazyContentData";
+import { ContentDataTypeRegistry } from "../contentTypes/ContentDataTypeRegistry";
+import { ContentDataTypes } from "../contentTypes/ContentDataTypes";
+
+import { TilesetSource3tz } from "../packages/TilesetSource3tz";
+import { Paths } from "../base/Paths";
+
+/**
+ * Internal utility methods for tileset traversal, used for
+ * the `TilesetTraverser` implementation.
+ *
+ * @internal
+ */
+export class TilesetTraversers {
+  /**
+   * Create the nodes that are the roots of external tilesets
+   * that are referred to by the given traversed tile.
+   *
+   * If the given tile does not have any contents or none of
+   * them refers to a tileset, then an empty array is returned.
+   *
+   * @param baseUri - The URI against which content URIs are resolved
+   * in order to obtain an absolute URI. This is only used for the case
+   * of package (3TZ or 3DTILES) content, to create a `TilesetSource`
+   * from the absolute URI.
+   * @param traversedTile - The `TraversedTile`
+   * @returns The external tileset roots
+   * @throws DataError If one of the external tilesets or
+   * its associated files could not be resolved.
+   */
+  static async createExternalTilesetRoots(
+    baseUri: string,
+    traversedTile: TraversedTile
+  ): Promise<TraversedTile[]> {
+    if (traversedTile.isImplicitTilesetRoot()) {
+      return [];
+    }
+    const contents = traversedTile.getRawContents();
+    if (contents.length === 0) {
+      return [];
+    }
+
+    const resourceResolver = traversedTile.getResourceResolver();
+
+    const externalRoots: TraversedTile[] = [];
+    for (const content of contents) {
+      const contentUri = content.uri;
+
+      // Try to obtain an external tileset from the content
+      const externalTilesetContext =
+        await TilesetTraversers.resolveExternalTilesetContext(
+          baseUri,
+          contentUri,
+          resourceResolver
+        );
+
+      if (externalTilesetContext) {
+        const externalTileset = externalTilesetContext.tileset;
+        const externalResourceResolver =
+          externalTilesetContext.resourceResolver;
+
+        // If an external tileset was found, resolve its schema,
+        // and create an explicit traversed tile for its root.
+        const externalSchema = await TilesetTraversers.resolveSchema(
+          externalTileset,
+          externalResourceResolver
+        );
+        const externalRoot = new ExplicitTraversedTile(
+          externalTileset.root,
+          traversedTile.path + `/[external:${contentUri}]/root`,
+          traversedTile.level + 1,
+          traversedTile,
+          externalSchema,
+          externalResourceResolver
+        );
+        externalRoots.push(externalRoot);
+      }
+    }
+    return externalRoots;
+  }
+
+  /**
+   * Fetch the information that is required for creating the root
+   * nodes of external tilesets from the given URI.
+   *
+   * If the given URI does not refer to an external tileset,
+   * then `undefined` is returned.
+   *
+   * Otherwise, it will return the parsed `Tileset` object,
+   * and the `ResourceResolver` that can be used to resolve
+   * resources from this tileset.
+   *
+   * @param baseUri - The URI against which the given URI is resolved
+   * in order to obtain an absolute URI. This is only used for the case
+   * of package (3TZ or 3DTILES) content, to create a `TilesetSource`
+   * from the absolute URI.
+   * @param uri - The URI
+   * @param resourceResolver - The `ResourceResolver`
+   * @returns The tileset and its resource resolver, or `undefined`
+   * @throws DataError If an external tileset could not be
+   * resolved or parsed.
+ */ + private static async resolveExternalTilesetContext( + baseUri: string, + uri: string, + resourceResolver: ResourceResolver + ): Promise< + { tileset: Tileset; resourceResolver: ResourceResolver } | undefined + > { + const contentData = new LazyContentData(uri, resourceResolver); + const contentDataType = await ContentDataTypeRegistry.findContentDataType( + contentData + ); + const isTileset = contentDataType === ContentDataTypes.CONTENT_TYPE_TILESET; + + // For external tileset JSON files, just return the parsed + // tileset and a resource resolver that resolves against + // the base directory of the tileset JSON file + if (isTileset) { + const externalTileset = await contentData.getParsedObject(); + const basePath = path.dirname(uri); + const externalResourceResolver = resourceResolver.derive(basePath); + const result = { + tileset: externalTileset, + resourceResolver: externalResourceResolver, + }; + return result; + } + + // For tileset packages, create a `TilesetSource`, extract + // the `Tileset` object from its `tileset.json` file, + // and return the `Tileset` and a resource resolver that + // resolves against the tileset source. + const isPackage = contentDataType === ContentDataTypes.CONTENT_TYPE_3TZ; + if (isPackage) { + const absoluteUri = Paths.join(baseUri, uri); + const externalTilesetSource = new TilesetSource3tz(); + const tilesetJsonFileName = "tileset.json"; + // XXX TODO There is no matching 'close' call for this! + try { + externalTilesetSource.open(absoluteUri); + } catch (e) { + console.warn( + `Could not open external tileset from ${absoluteUri} - ignoring` + ); + return undefined; + } + let externalTileset; + const tilesetJsonData = + externalTilesetSource.getValue(tilesetJsonFileName); + if (!tilesetJsonData) { + throw new DataError(`Could not resolve ${tilesetJsonFileName}`); + } + try { + externalTileset = JSON.parse(tilesetJsonData.toString("utf-8")); + } catch (e) { + throw new DataError( + `Could not parse tileset from ${tilesetJsonFileName}` + ); + } + const externalResourceResolver = new TilesetSourceResourceResolver( + ".", + externalTilesetSource + ); + const result = { + tileset: externalTileset, + resourceResolver: externalResourceResolver, + }; + return result; + } + return undefined; + } + + /** + * Resolve the `Schema` for the given tileset. + * + * This is either the `tileset.schema`, or the schema that is + * obtained from the `tileset.schemaUri`, or `undefined` if + * neither of them are present. + * + * @param tileset - The tileset + * @param resourceResolver - The `ResourceResolver` for loading + * the schema from the `schemaUri` if necessary + * @returns The `Schema`, or `undefined` if there is none + * @throws DataError If the schema from the `schemaUri` + * could not be resolved or parsed. 
+   */
+  static async resolveSchema(
+    tileset: Tileset,
+    resourceResolver: ResourceResolver
+  ): Promise<Schema | undefined> {
+    if (tileset.schema) {
+      return tileset.schema;
+    }
+    if (tileset.schemaUri) {
+      const uri = tileset.schemaUri;
+      const schemaData = await resourceResolver.resolveData(uri);
+      if (!schemaData) {
+        throw new DataError(`Could not resolve ${uri}`);
+      }
+      try {
+        const schema = JSON.parse(schemaData.toString("utf-8"));
+        return schema;
+      } catch (e) {
+        throw new DataError(`Could not parse schema from ${uri}`);
+      }
+    }
+    return undefined;
+  }
+}
diff --git a/src/traversal/TraversedTile.ts b/src/traversal/TraversedTile.ts
index f9a34591..c5abf608 100644
--- a/src/traversal/TraversedTile.ts
+++ b/src/traversal/TraversedTile.ts
@@ -1,7 +1,6 @@
 import { Tile } from "../structure/Tile";
 import { Content } from "../structure/Content";
-import { MetadataEntity } from "../structure/MetadataEntity";
-import { TileImplicitTiling } from "../structure/TileImplicitTiling";
+import { ResourceResolver } from "../io/ResourceResolver";
 
 /**
  * An interface that summarizes context information for
@@ -15,15 +14,13 @@ export interface TraversedTile {
    * of the tile. This is just a plain data structure corresponding
    * to the tile.
    *
-   * The returned object reflects the "raw" state of the tile that
+   * The returned object reflects the "raw" state of the object that
    * is either contained in the tileset JSON, or derived from the
-   * subdivision rules of implicit tiles.
-   *
-   * Specifically: This is the state BEFORE any semantic-based
-   * overrides have been applied. When there is metadata
-   * associated with the tile, and this metadata has semantics
-   * that override certain tile properties, then these overrides
-   * are NOT reflected in the returned tile.
+   * subdivision rules of implicit tiles. Specifically: This is the
+   * state BEFORE any semantic-based overrides have been applied.
+   * When there is metadata associated with the object, and this
+   * metadata has semantics that override certain properties, then
+   * these overrides are NOT reflected in the returned object.
    *
    * In order to obtain a tile where the semantic-based overrides
    * are applied, `asFinalTile` can be used.
@@ -43,7 +40,7 @@ export interface TraversedTile {
   /**
    * Returns a `Tile` object that contains the "JSON"-representation
    * of the tile. This is just a plain data structure corresponding
-   * the tile.
+   * to the tile.
    *
    * In contrast to `asRawTile`, this method returns a `Tile` object
    * where semantic-based overrides have already been applied. When
@@ -110,20 +107,68 @@ export interface TraversedTile {
    * single `tile.content` object), or an array that resembles
    * the `tile.contents` array.
    *
+   * Note that the returned content objects may contain
+   * template URIs for tiles that are roots of implicit
+   * tilesets. Use `isImplicitTilesetRoot` to detect
+   * whether this tile is the root of an implicit tileset,
+   * in which case the content URIs are template URIs.
+   *
+   * The returned content objects reflect the state BEFORE
+   * any semantic-based overrides have been applied.
+   * See `asRawTile` for details about the semantic-based
+   * overrides.
+   *
    * @returns The contents
    */
   getRawContents(): Content[];
 
-  // TODO Document or improve this - the same difference as between
-  // asRawTile and asFinalTile
+  /**
+   * Returns the `Content` objects of the tile.
+   *
+   * The returned objects correspond to the ones returned by
+   * `getRawContents`, but in a state where semantic-based
+   * overrides have been applied.
+ * + * See `asRawTile` and `asFinalTile` for details about the + * semantic-based overrides. + * + * @returns The contents + */ getFinalContents(): Content[]; - // TODO Some information has to be exposed here solely - // for the validation. This should preferably not be - // visible in this interface. The traversal might be - // refactored to hide this information here. + /** + * Returns the `ResourceResolver` that can be used for + * resolving the data that appears as `content.uri` in + * this tile. + * + * @returns The `ResourceResolver` + */ + getResourceResolver(): ResourceResolver; + + /** + * Returns whether this tile is the root of an implicit tileset. + * + * This is `true` for tiles that appear in the explicit + * tile hierarchy of a tileset JSON, and which have a + * `tile.implicitTiling` property. + * + * For these tiles, the `content.uri` properties do not define + * actual URIs, but *template* URIs. + * + * @returns Whether this is an implicit tileset root + */ + isImplicitTilesetRoot(): boolean; + + /** + * Returns the URI of the subtree file for this tile, or + * `undefined` if this is not the root of a subtree. + * + * If this tile is the root of a subtree in an implicit tileset, then + * the returned URI will contain the actual subtree URI that was + * created by substituting the coordinates of this tile into the + * `implicitTiling.subtrees.uri` template URI. + * + * @returns The subtree URI, or `undefined` + */ getSubtreeUri(): string | undefined; - getImplicitTiling(): TileImplicitTiling | undefined; - getMetadata(): MetadataEntity | undefined; - resolveUri(uri: string): string; }
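
To illustrate how the reworked traversal API from the diff above fits together, here is a minimal usage sketch. It reads the tileset JSON with plain `fs`/`JSON.parse` and assumes that a file-based `ResourceResolver` can be created via `ResourceResolvers.createFileResourceResolver`; that factory name and the import paths are assumptions for illustration and are not part of this change.

```
import fs from "fs";

import { Tileset } from "./src/structure/Tileset";
import { ResourceResolvers } from "./src/io/ResourceResolvers";
import { TilesetTraverser } from "./src/traversal/TilesetTraverser";
import { TraversedTile } from "./src/traversal/TraversedTile";

// Traverse a tileset (including external tilesets) and print the
// level and the final content URIs of each traversed tile
async function printContentUris(tilesetDirectory: string): Promise<void> {
  const tilesetJson = fs.readFileSync(tilesetDirectory + "/tileset.json", "utf-8");
  const tileset: Tileset = JSON.parse(tilesetJson);

  // Resolves subtree files, metadata schemas, and external tilesets
  // relative to the tileset directory (the factory name is an assumption)
  const resourceResolver =
    ResourceResolvers.createFileResourceResolver(tilesetDirectory);

  const tilesetTraverser = new TilesetTraverser(tilesetDirectory, resourceResolver, {
    depthFirst: false,
    traverseExternalTilesets: true,
  });
  await tilesetTraverser.traverse(tileset, async (traversedTile: TraversedTile) => {
    // Content URIs of implicit tileset roots are template URIs, so skip them
    if (!traversedTile.isImplicitTilesetRoot()) {
      for (const content of traversedTile.getFinalContents()) {
        console.log(`Level ${traversedTile.level}: ${content.uri}`);
      }
    }
    // Returning 'true' continues the traversal with the children of this tile
    return true;
  });
}
```

With `traverseExternalTilesets` enabled, a tile whose content refers to an external tileset JSON file or a 3TZ package has no children of its own, so `createChildren` substitutes the root of the external tileset (as created by `TilesetTraversers.createExternalTilesetRoots`) and the traversal continues seamlessly into the external hierarchy.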