This repository has been archived by the owner on Apr 29, 2020. It is now read-only.

feat: convert to async/await (#21)
* feat: convert to async/await

* chore: readme update

* chore: fix linting

* feat: convert internals to be async/await

* test: increase test coverage

* chore: remove unnecessary await

* feat: add export depth and recursive exports

* chore: address PR comments

* chore: update ipld formats

* chore: PR comments

* chore: standardise error codes
achingbrain committed May 17, 2019
1 parent bf8bad2 commit 7119a09
Showing 26 changed files with 1,636 additions and 2,306 deletions.
7 changes: 7 additions & 0 deletions .aegir.js
@@ -0,0 +1,7 @@
'use strict'

module.exports = {
  karma: {
    browserNoActivityTimeout: 1000 * 1000
  }
}
297 changes: 170 additions & 127 deletions README.md
@@ -19,13 +19,22 @@

## Table of Contents

- [ipfs-unixfs-exporter](#ipfs-unixfs-exporter)
  - [Lead Maintainer](#lead-maintainer)
  - [Table of Contents](#table-of-contents)
  - [Install](#install)
  - [Usage](#usage)
    - [Example](#example)
    - [API](#api)
    - [`exporter(cid, ipld)`](#exportercid-ipld)
      - [UnixFS V1 entries](#unixfs-v1-entries)
      - [Raw entries](#raw-entries)
      - [CBOR entries](#cbor-entries)
      - [`entry.content({ offset, length })`](#entrycontent-offset-length)
    - [`exporter.path(cid, ipld)`](#exporterpathcid-ipld)
    - [`exporter.recursive(cid, ipld)`](#exporterrecursivecid-ipld)
  - [Contribute](#contribute)
  - [License](#license)

## Install

@@ -38,29 +47,41 @@
### Example

```js
// import a file and export it again
const importer = require('ipfs-unixfs-importer')
const exporter = require('ipfs-unixfs-exporter')

const files = []

for await (const file of importer([{
  path: '/foo/bar.txt',
  content: Buffer.from([0, 1, 2, 3])
}], ipld)) {
  files.push(file)
}

console.info(files[0].cid) // Qmbaz

const entry = await exporter(files[0].cid, ipld)

console.info(entry.cid) // Qmqux
console.info(entry.path) // Qmbaz/foo/bar.txt
console.info(entry.name) // bar.txt
console.info(entry.unixfs.fileSize()) // 4

// stream content from unixfs node
const bytes = []

for await (const buf of entry.content({
  offset: 0, // optional offset
  length: 4 // optional length
})) {
  bytes.push(buf)
}

const content = Buffer.concat(bytes)

console.info(content) // 0, 1, 2, 3
```
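The consumption pattern above — draining an async iterator of `Buffer` chunks into a single `Buffer` — works with any async iterable, so it can be sketched without the exporter itself; the `chunks` generator below is only a stand-in for `entry.content()`:

```javascript
// Stand-in for entry.content(): any async generator of Buffer chunks
async function * chunks () {
  yield Buffer.from([0, 1])
  yield Buffer.from([2, 3])
}

// Drain an async iterable of Buffers into one Buffer
async function collect (iterator) {
  const bufs = []

  for await (const buf of iterator) {
    bufs.push(buf)
  }

  return Buffer.concat(bufs)
}

collect(chunks())
  .then(content => {
    console.info(content) // <Buffer 00 01 02 03>
  })
```

The same `collect` helper works unchanged against a real `entry.content()` call.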

#### API
@@ -69,124 +90,146 @@
```js
const exporter = require('ipfs-unixfs-exporter')
```

### `exporter(cid, ipld)`

Uses the given [js-ipld instance][] to fetch an IPFS node by its CID.

Returns a Promise which resolves to an `entry`.

#### UnixFS V1 entries

Entries with a `dag-pb` codec `CID` return UnixFS V1 entries:

```javascript
{
  name: 'foo.txt',
  path: 'Qmbar/foo.txt',
  cid: CID, // see https://github.com/multiformats/js-cid
  node: DAGNode, // see https://github.com/ipld/js-ipld-dag-pb
  content: function, // returns an async iterator
  unixfs: UnixFS // see https://github.com/ipfs/js-ipfs-unixfs
}
```
If the entry is a file, `entry.content()` returns an async iterator that yields one or more buffers containing the file content:

```javascript
if (entry.unixfs.type === 'file') {
  for await (const chunk of entry.content()) {
    // chunk is a Buffer
  }
}
```

If the entry is a directory or HAMT shard, `entry.content()` returns further `entry` objects:

```javascript
if (entry.unixfs.type.includes('directory')) { // can be 'directory' or 'hamt-sharded-directory'
  for await (const entry of dir.content()) {
    console.info(entry.name)
  }
}
```
#### Raw entries

Entries with a `raw` codec `CID` return raw entries:

```javascript
{
  name: 'foo.txt',
  path: 'Qmbar/foo.txt',
  cid: CID, // see https://github.com/multiformats/js-cid
  node: Buffer, // see https://nodejs.org/api/buffer.html
  content: function, // returns an async iterator
}
```

`entry.content()` returns an async iterator that yields a buffer containing the node content:

```javascript
for await (const chunk of entry.content()) {
  // chunk is a Buffer
}
```

Unless you pass an options object containing `offset` and `length` keys as an argument to `entry.content()`, `chunk` will be equal to `entry.node`.
#### CBOR entries

Entries with a `dag-cbor` codec `CID` return JavaScript object entries:

```javascript
{
  name: 'foo.txt',
  path: 'Qmbar/foo.txt',
  cid: CID, // see https://github.com/multiformats/js-cid
  node: Object, // see https://github.com/ipld/js-ipld-dag-cbor
}
```
There is no `content` function for a `CBOR` node.
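Since only file and `raw` entries expose `content()`, code that handles arbitrary CIDs may want to guard before streaming — a minimal sketch, assuming the `entry` shapes described above:

```javascript
// Minimal guard: dag-cbor entries have no content() function,
// so check before attempting to stream bytes
function hasContent (entry) {
  return typeof entry.content === 'function'
}
```

For example, `hasContent` returns `true` for a file entry and `false` for a CBOR entry.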
#### `entry.content({ offset, length })`
When `entry` is a file or a `raw` node, `offset` and/or `length` arguments can be passed to `entry.content()` to return slices of data:
```javascript
const bufs = []

for await (const chunk of entry.content({
  offset: 0,
  length: 5
})) {
  bufs.push(chunk)
}

// `data` contains the first 5 bytes of the file
const data = Buffer.concat(bufs)
```
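The slicing behaviour can be illustrated without the exporter: given any async iterable of chunks, `offset` and `length` select a byte range that may span chunk boundaries. A simplified sketch, not the library's actual implementation:

```javascript
// Simplified sketch of offset/length slicing across chunk boundaries —
// illustrative only, not the exporter's internal code
async function * slice (source, offset, length) {
  let position = 0
  let remaining = length

  for await (const chunk of source) {
    if (remaining <= 0) {
      return
    }

    // how far into this chunk the requested range starts
    const start = Math.max(offset - position, 0)
    position += chunk.length

    if (start >= chunk.length) {
      continue // range starts in a later chunk
    }

    const out = chunk.slice(start, start + remaining)
    remaining -= out.length

    yield out
  }
}
```

Slicing chunks `[0..3]`, `[4..7]`, `[8, 9]` with `offset: 2, length: 5` yields bytes `2` through `6`, crossing the first chunk boundary.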
If `entry` is a directory or hamt shard, passing `offset` and/or `length` to `entry.content()` will limit the number of files returned from the directory.
```javascript
const entries = []

for await (const entry of dir.content({
  offset: 0,
  length: 5
})) {
  entries.push(entry)
}

// `entries` contains the first 5 files/directories in the directory
```
### `exporter.path(cid, ipld)`
`exporter.path` will return an async iterator that yields entries for all segments in a path:
```javascript
const entries = []

for await (const entry of exporter.path('Qmfoo/foo/bar/baz.txt', ipld)) {
  entries.push(entry)
}

// entries contains 4x `entry` objects
```
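The number of entries corresponds to the number of path segments: one for the root CID plus one per path component. A hypothetical helper, shown only to illustrate the segment count:

```javascript
// Hypothetical helper: counts the segments exporter.path yields entries for
function pathSegments (path) {
  return path.split('/').filter(Boolean)
}

console.info(pathSegments('Qmfoo/foo/bar/baz.txt'))
// [ 'Qmfoo', 'foo', 'bar', 'baz.txt' ] — one entry per segment
```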
### `exporter.recursive(cid, ipld)`
`exporter.recursive` will return an async iterator that yields all entries beneath a given CID or IPFS path, as well as the containing directory.
```javascript
const entries = []

for await (const child of exporter.recursive('Qmfoo/foo/bar', ipld)) {
  entries.push(child)
}

// entries contains all children of the `Qmfoo/foo/bar` directory and their children
```
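The traversal order — a directory first, then all of its descendants — can be sketched with a plain nested object standing in for a DAG. This is a simplified illustration under that assumption, not the library's implementation:

```javascript
// Simplified sketch: yield a node, then recurse into its children,
// mirroring the order exporter.recursive walks a directory tree
async function * walk (name, node) {
  yield { name, type: node.children ? 'directory' : 'file' }

  for (const [childName, child] of Object.entries(node.children || {})) {
    yield * walk(childName, child)
  }
}
```

Walking `{ children: { foo: { children: { 'bar.txt': {} } } } }` as `root` yields `root`, then `foo`, then `bar.txt`.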
[dag API]: https://github.com/ipfs/interface-ipfs-core/blob/master/SPEC/DAG.md
