API Cleanup
Ib Green committed Apr 7, 2019
1 parent a16ba9c commit 334878f
Showing 66 changed files with 370 additions and 342 deletions.
10 changes: 5 additions & 5 deletions docs/api-reference/3d-tile-loaders/tile-3d-loader.md
@@ -15,18 +15,18 @@ Parses a [3D tile](https://github.com/AnalyticalGraphicsInc/3d-tiles). glTF file
## Usage

```js
import {loadFile} from '@loaders.gl/core';
import {load} from '@loaders.gl/core';
import {Tile3DLoader} from '@loaders.gl/3d-tiles';
const gltf = await loadFile(url, Tile3DLoader);
const gltf = await load(url, Tile3DLoader);
```

To decompress tiles containing Draco compressed glTF models or Draco compressed point clouds:

```js
import {loadFile} from '@loaders.gl/core';
import {load} from '@loaders.gl/core';
import {Tile3DLoader} from '@loaders.gl/3d-tiles';
import {DracoDecoder} from '@loaders.gl/draco';
const gltf = loadFile(url, Tile3DLoader, {DracoDecoder, decompress: true});
const gltf = await load(url, Tile3DLoader, {DracoDecoder, decompress: true});
```

## Options
@@ -48,7 +48,7 @@ const gltf = loadFile(url, Tile3DLoader, {DracoDecoder, decompress: true});

Returns a JSON object with "embedded" binary data in the form of typed javascript arrays.

When parsed asynchronously (not using `loadSync` or `parseSync`):
When parsed asynchronously (i.e. not using `parseSync`):

- linked binary resources will be loaded and resolved (if url is available).
- base64 encoded binary data inside the JSON payload will be decoded
@@ -1,10 +1,10 @@
# encodeFile
# encode

> Needs update
## Functions

### encodeFile(fileData : ArrayBuffer | String, writer : Object | Array [, options : Object [, url : String]]) : Promise<Any>
### encode(fileData : ArrayBuffer | String, writer : Object | Array [, options : Object [, url : String]]) : Promise<Any>

Encodes data asynchronously using the provided writer.

@@ -15,7 +15,7 @@ Encodes data asynchronously using the provided writer.

- `options.log`=`console` Any object with methods `log`, `info`, `warn` and `error`. By default set to `console`. Setting log to `null` will turn off logging.

### encodeFileSync(fileData : ArrayBuffer | String, writer : Object | Array, [, options : Object [, url : String]]) : any
### encodeSync(fileData : ArrayBuffer | String, writer : Object | Array [, options : Object [, url : String]]) : any

Encodes data synchronously using the provided writer, if possible. If not, returns `null`, in which case asynchronous encoding is required.
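
A minimal usage sketch (assuming a `gltf` scenegraph object is already in scope; `GLTFWriter` is just an illustrative writer, used the same way in the glTF writer docs below):

```js
import {encode, encodeSync} from '@loaders.gl/core';
import {GLTFWriter} from '@loaders.gl/gltf';

// Asynchronous encoding works with any writer
const arrayBuffer = await encode(gltf, GLTFWriter);

// Synchronous encoding, when the writer supports it (otherwise returns null)
const arrayBufferSync = encodeSync(gltf, GLTFWriter);
```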

8 changes: 4 additions & 4 deletions docs/api-reference/core/fetch-file.md
@@ -7,21 +7,21 @@ Small optional file reading utilities that work consistently across browser (bot
Use the `fetchFile` function as follows:

```js
import {fetchFile, parseFile} from '@loaders.gl/core';
import {fetchFile, parse} from '@loaders.gl/core';
import {OBJLoader} from '@loaders.gl/obj';

data = await parseFile(fetchFile(url), OBJLoader);
data = await parse(fetchFile(url), OBJLoader);
// Application code here
...
```

Note that if you don't care about Node.js compatibility, you can just use the browser's built-in `fetch` directly.

```js
import {parseFile} from '@loaders.gl/core';
import {parse} from '@loaders.gl/core';
import {OBJLoader} from '@loaders.gl/obj';

data = await parseFile(fetch(url), OBJLoader);
data = await parse(fetch(url), OBJLoader);
// Application code here
...
```
32 changes: 0 additions & 32 deletions docs/api-reference/core/load-file.md

This file was deleted.

26 changes: 26 additions & 0 deletions docs/api-reference/core/load.md
@@ -0,0 +1,26 @@
# load

The `load` function can be used with any _loader object_. It takes a `url` and one or more _loader objects_, checks what type of data the selected loader prefers to work on (e.g. text, JSON, binary, stream, ...), loads the data in the appropriate way, and passes it to the loader.

### load(url : String | File, loaders : Object | Object[] [, options : Object]) : Promise<Any>

### load(url : String | File [, options : Object]) : Promise<Any>

The `load` function is used to load and parse data with a specific _loader object_. An array of loader objects can be provided, in which case `load` will attempt to autodetect which loader is appropriate for the file.

The `loaders` parameter can also be omitted, in which case any _loader objects_ previously registered with [`registerLoaders`](docs/api-reference/core/register-loaders) will be used.
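
A minimal sketch of both calling conventions (the `OBJLoader` is just an illustrative loader; any imported or registered loader works the same way):

```js
import {load, registerLoaders} from '@loaders.gl/core';
import {OBJLoader} from '@loaders.gl/obj';

// Explicit loader (or array of loaders)
const data = await load(url, OBJLoader);

// With the loaders parameter omitted, previously registered loaders are used
registerLoaders(OBJLoader);
const sameData = await load(url);
```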

- `url` - Can be a string, either a data url or a request url, or in Node.js, a file name, or in the browser, a File object.
- `data` - loaded data, either in binary or text format.
- `loaders` - can be a single loader or an array of loaders. If omitted, will use the list of registered loaders (see `registerLoaders`).
- `options` - optional, contains both options for the read process and options for the loader (see documentation of the specific loader).
- `options.dataType`=`arraybuffer` - By default reads as binary. Set to 'text' to read as text.

Returns:

- Return value depends on the _loader object_ category

Notes:

- Any path prefix set by `setPathPrefix` will be prepended to relative urls (sketched below).
- `load` takes a `url` and a loader object, checks what type of data that loader prefers to work on (e.g. text, binary, stream, ...), loads the data in the appropriate way, and passes it to the loader.
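
To illustrate the path-prefix note above, a hedged sketch (assuming `setPathPrefix` is exported from `@loaders.gl/core`):

```js
import {setPathPrefix, load} from '@loaders.gl/core';
import {OBJLoader} from '@loaders.gl/obj';

setPathPrefix('https://example.com/assets/');

// The relative url below is resolved against the prefix set above
const data = await load('model.obj', OBJLoader);
```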
@@ -1,37 +1,37 @@
# parseFile
# parse

This function parses already loaded data. As a special case, it can also load (and then parse) data from `fetch` or `fetchFile` response object).
This function parses already loaded data. As a special case, it can also load (and then parse) data from a `fetch` or `fetchFile` response object.

## Usage

The return value from `fetch` or `fetchFile` is a `Promise` that resolves to the fetch response object and can be passed directly to the non-sync parser functions:

```js
import {fetchFile, parseFile} from '@loaders.gl/core';
import {fetchFile, parse} from '@loaders.gl/core';
import {OBJLoader} from '@loaders.gl/obj';

data = await parseFile(fetchFile(url), OBJLoader);
data = await parse(fetchFile(url), OBJLoader);
// Application code here
...
```

Batched (streaming) parsing is supported by some loaders:

```js
import {fetchFile, parseFileInBatches} from '@loaders.gl/core';
import {fetchFile, parseInBatches} from '@loaders.gl/core';
import {CSVLoader} from '@loaders.gl/csv';

const batchIterator = await parseFileInBatches(fetchFile(url), CSVLoader);
const batchIterator = await parseInBatches(fetchFile(url), CSVLoader);
for await (const batch of batchIterator) {
console.log(batch.length);
}
```

## Functions

### parseFileInBatches(data : any, loaders : Object | Object\[] [, options : Object [, url : String]]) : AsyncIterator
### parseInBatches(data : any, loaders : Object | Object\[] [, options : Object [, url : String]]) : AsyncIterator

### parseFileInBatches(data : any [, options : Object [, url : String]]) : AsyncIterator
### parseInBatches(data : any [, options : Object [, url : String]]) : AsyncIterator

> Batched loading is not supported by all _loader objects_
@@ -57,9 +57,9 @@ Returns:

- Returns an async iterator that yields batches of data. The exact format for the batches depends on the _loader object_ category.

### parseFile(data : ArrayBuffer | String, loaders : Object | Object\[] [, options : Object [, url : String]]) : Promise<Any>
### parse(data : ArrayBuffer | String, loaders : Object | Object\[] [, options : Object [, url : String]]) : Promise<Any>

### parseFile(data : ArrayBuffer | String, [, options : Object [, url : String]]) : Promise<Any>
### parse(data : ArrayBuffer | String [, options : Object [, url : String]]) : Promise<Any>

Parses data asynchronously using the provided loader.
Used to parse data with a selected _loader object_. An array of `loaders` can be provided, in which case an attempt will be made to autodetect which loader is appropriate for the file (using url extension and header matching).
@@ -84,9 +84,9 @@ Returns:

- Return value depends on the _loader object_ category

### parseFileSync(fileData : ArrayBuffer | String, loaders : Object | Object\[], [, options : Object [, url : String]]) : any
### parseSync(fileData : ArrayBuffer | String, loaders : Object | Object\[] [, options : Object [, url : String]]) : any

### parseFileSync(fileData : ArrayBuffer | String, [, options : Object [, url : String]]) : any
### parseSync(fileData : ArrayBuffer | String [, options : Object [, url : String]]) : any

> Synchronous parsing is not supported by all _loader objects_
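
A hedged sketch of synchronous parsing (assuming the chosen loader supports it and that the data has already been loaded, e.g. via `fetchFile`):

```js
import {fetchFile, parseSync} from '@loaders.gl/core';
import {OBJLoader} from '@loaders.gl/obj';

const response = await fetchFile(url);
const text = await response.text();

// Synchronous parsing of already-loaded data, when the loader supports it
const data = parseSync(text, OBJLoader);
```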
8 changes: 4 additions & 4 deletions docs/api-reference/core/register-loaders.md
@@ -2,7 +2,7 @@

The loader registry allows applications to cherry-pick which loaders to include in their application bundle by importing just the loaders they need and registering them during initialization.

Applications can then make all those imported loaders available (via format autodetection) to all subsequent `parseFile` and `loadFile` calls, without those calls having to specify which loaders to use.
Applications can then make all those imported loaders available (via format autodetection) to all subsequent `parse` and `load` calls, without those calls having to specify which loaders to use.

## Usage

@@ -18,16 +18,16 @@ registerLoaders(CSVLoader);
Some other file that needs to load CSV:

```js
import {loadFile} from '@loaders.gl/core';
import {load} from '@loaders.gl/core';

// The pre-registered CSVLoader gets auto selected based on file extension...
const data = await loadFile('data.csv');
const data = await load('data.csv');
```

## Functions

### registerLoaders(loaders : Object | Object[])

Registers one or more _loader objects_ to a global _loader object registry_, these loaders will be used if no loader object is supplied to `parseFile` and `loadFile`.
Registers one or more _loader objects_ to a global _loader object registry_; these loaders will be used if no loader object is supplied to `parse` and `load`.

- `loaders` - can be a single loader or an array of loaders. The specified loaders will be added to any previously registered loaders.
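
The array form can be sketched as follows (the loader import paths are illustrative):

```js
import {registerLoaders} from '@loaders.gl/core';
import {CSVLoader} from '@loaders.gl/csv';
import {OBJLoader} from '@loaders.gl/obj';

// Both loaders become candidates for autodetection in subsequent parse/load calls
registerLoaders([CSVLoader, OBJLoader]);
```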
33 changes: 0 additions & 33 deletions docs/api-reference/core/save-file.md

This file was deleted.

33 changes: 33 additions & 0 deletions docs/api-reference/core/save.md
@@ -0,0 +1,33 @@
# save

> Needs update
The `save` and `saveSync` functions can be used with any writer. `save` takes a `url` and a writer object, checks what type of data that writer prefers to work on (e.g. text, JSON, binary, stream, ...), encodes the data with the writer, and saves the result in the appropriate way.

## Functions

### save(url : String | File, writer : Object [, options : Object]) : Promise<ArrayBuffer | String>

The `save` function can be used with any writer.

`save` takes a `url` and a writer object, checks what type of data that writer prefers to work on (e.g. text, JSON, binary, stream, ...), encodes the data with the writer, and saves the result in the appropriate way.

- `url` - Can be a string, either a data url or a request url, or in Node.js, a file name, or in the browser, a File object.
- `data` - data to be saved, either in binary or text format.
- `writer` - can be a single writer or an array of writers.
- `options` - optional, contains both options for the read process and options for the writer (see documentation of the specific writer).
- `options.dataType`=`arraybuffer` - By default writes as binary. Set to 'text' to write as text.

Returns:

- Return value depends on the category

Notes:

- Any path prefix set by `setPathPrefix` will be prepended to relative urls.

### saveSync(url : String [, options : Object]) : ArrayBuffer | String

Similar to `save` except encodes and saves data synchronously.

Note that for `saveSync` to work, the `url` needs to be saveable synchronously _and_ the writer used must support synchronous encoding. Synchronous saving only works on data URLs or files in Node.js. In many cases, the asynchronous `save` is more appropriate.
2 changes: 1 addition & 1 deletion docs/api-reference/geojson-loaders/kml-loader.md
@@ -11,7 +11,7 @@ References:

```js
import {KMLLoader} from '@loaders.gl/kml';
import {loadFileSync} from '@loaders.gl/core';
import {load} from '@loaders.gl/core';
```
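
A minimal, hedged sketch of invoking the loader (assuming a KML `url` and that `KMLLoader` follows the same `load` convention as the other loaders in these docs):

```js
import {load} from '@loaders.gl/core';
import {KMLLoader} from '@loaders.gl/kml';

const data = await load(url, KMLLoader);
```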

## Structure of Loaded Data
4 changes: 2 additions & 2 deletions docs/api-reference/gltf-loaders/glb-parser.md
@@ -12,13 +12,13 @@ References:

```js
import {GLBParser} from '@loaders.gl/gltf';
import {loadFileSync} from '@loaders.gl/core';
import {load} from '@loaders.gl/core';

// Create a parser
const glbParser = new GLBParser();

// Load and parse a file
const GLB_BINARY = loadFileSync(...);
const GLB_BINARY = await load(...);
glbParser.parse(GLB_BINARY);

// Get the complete GLB JSON structure
8 changes: 4 additions & 4 deletions docs/api-reference/gltf-loaders/gltf-loader.md
@@ -15,18 +15,18 @@ Parses a glTF file into a hierarchical scenegraph description that can be used t
## Usage

```js
import {loadFile} from '@loaders.gl/core';
import {load} from '@loaders.gl/core';
import {GLTFLoader} from '@loaders.gl/gltf';
const gltf = await loadFile(url, GLTFLoader);
const gltf = await load(url, GLTFLoader);
```

To decompress Draco compressed meshes:

```js
import {loadFile} from '@loaders.gl/core';
import {load} from '@loaders.gl/core';
import {GLTFLoader} from '@loaders.gl/gltf';
import {DracoLoader} from '@loaders.gl/draco';
const gltf = loadFile(url, GLTFLoader, {DracoLoader, decompress: true});
const gltf = await load(url, GLTFLoader, {DracoLoader, decompress: true});
```

## Options
4 changes: 2 additions & 2 deletions docs/api-reference/gltf-loaders/gltf-parser.md
@@ -17,13 +17,13 @@ References:

```js
import {GLTFParser} from '@loaders.gl/gltf';
import {loadFileSync} from '@loaders.gl/core';
import {load} from '@loaders.gl/core';

// Create a parser
const gltfParser = new GLTFParser();

// Load and parse a file
const GLTF_BINARY = loadFileSync(...);
const GLTF_BINARY = await load(...);
gltfParser.parseSync(GLTF_BINARY);

// Get the complete glTF JSON structure
4 changes: 2 additions & 2 deletions docs/api-reference/gltf-loaders/gltf-writer.md
@@ -16,9 +16,9 @@ The `GLTFWriter` is a writer for glTF scenegraphs.

```js
import {GLTFWriter} from '@loaders.gl/gltf';
import {encodeFileSync} from '@loaders.gl/core';
import {encodeSync} from '@loaders.gl/core';

const arrayBuffer = encodeFileSync(gltf, GLTFWriter);
const arrayBuffer = encodeSync(gltf, GLTFWriter);
```

## Options