Commit 69915da

feat: allow specify hash algorithm for large files (#184)

alanshaw authored and daviddias committed Sep 8, 2017
1 parent: 33c9d1c

Showing 5 changed files with 40 additions and 5 deletions.
README.md (2 additions & 1 deletion)

@@ -115,7 +115,7 @@ The `import` object is a duplex pull stream that takes objects of the form:
 }
 ```

-`import` will outoyt file info objects as files get stored in IPFS. When stats on a node are emitted they are guaranteed to have been written.
+`import` will output file info objects as files get stored in IPFS. When stats on a node are emitted they are guaranteed to have been written.

 `dag` is an instance of the [`IPLD Resolver`](https://github.com/ipld/js-ipld-resolver) or the [`js-ipfs` `dag api`](https://github.com/ipfs/interface-ipfs-core/tree/master/API/dag)

@@ -140,6 +140,7 @@ The input's file paths and directory structure will be preserved in the [`dag-pb`](https://github.com/ipld/js-ipld-dag-pb) created nodes.
 - bits (positive integer, defaults to `8`): the number of bits at each bucket of the HAMT
 - `progress` (function): a function that will be called with the byte length of chunks as a file is added to ipfs.
 - `onlyHash` (boolean, defaults to false): Only chunk and hash - do not write to disk
+- `hashAlg` (string): multihash hashing algorithm to use

 ### Exporter
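A minimal usage sketch for the new `hashAlg` option, assuming the `Importer` duplex pull stream and IPLD Resolver instance this README describes elsewhere (the resolver setup and file content here are placeholders):

```js
// Hypothetical usage sketch: selecting the multihash algorithm at import time.
// `ipldResolver` is assumed to be an already-constructed IPLD Resolver.
const pull = require('pull-stream')
const Importer = require('ipfs-unixfs-engine').Importer

pull(
  pull.values([{
    path: 'hello.txt',
    content: Buffer.from('hello world')
  }]),
  new Importer(ipldResolver, { hashAlg: 'sha2-512' }),
  pull.collect((err, files) => {
    if (err) throw err
    // Emitted file info multihashes are now computed with sha2-512
    console.log(files)
  })
)
```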
src/builder/reduce.js (1 addition & 1 deletion)

@@ -32,7 +32,7 @@ module.exports = function (file, ipldResolver, options) {
   })

   waterfall([
-    (cb) => DAGNode.create(f.marshal(), links, cb),
+    (cb) => DAGNode.create(f.marshal(), links, options.hashAlg, cb),
     (node, cb) => {
       if (options.onlyHash) return cb(null, node)
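The same single-argument change recurs in `src/importer/dir-flat.js` and `src/importer/dir-sharded.js` below. As a standalone sketch of the call being modified, assuming the `ipld-dag-pb` signature this commit relies on, `DAGNode.create(data, links, hashAlg, callback)`, where the algorithm defaults to `'sha2-256'` when omitted:

```js
// Sketch under the assumed ipld-dag-pb API: an explicit hash algorithm
// replaces the implicit sha2-256 default.
const dagPB = require('ipld-dag-pb')

dagPB.DAGNode.create(Buffer.from('some data'), [], 'sha3-512', (err, node) => {
  if (err) throw err
  // node.multihash is the sha3-512 multihash of the serialized node
  console.log(node.multihash)
})
```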
src/importer/dir-flat.js (3 additions & 2 deletions)

@@ -56,12 +56,13 @@ class DirFlat extends Dir {
     })

     const dir = new UnixFS('directory')
+    const options = this._options

     waterfall(
       [
-        (callback) => DAGNode.create(dir.marshal(), links, callback),
+        (callback) => DAGNode.create(dir.marshal(), links, options.hashAlg, callback),
         (node, callback) => {
-          if (this._options.onlyHash) return callback(null, node)
+          if (options.onlyHash) return callback(null, node)

           ipldResolver.put(
             node,
src/importer/dir-sharded.js (1 addition & 1 deletion)

@@ -144,7 +144,7 @@ function flush (options, bucket, path, ipldResolver, source, callback) {
   dir.hashType = options.hashFn.code
   waterfall(
     [
-      (callback) => DAGNode.create(dir.marshal(), links, callback),
+      (callback) => DAGNode.create(dir.marshal(), links, options.hashAlg, callback),
       (node, callback) => {
         if (options.onlyHash) return callback(null, node)
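Note the two hash settings in play here: `options.hashFn` (whose multihash code is recorded as `dir.hashType`) appears to govern how the HAMT buckets directory entries, while the new `options.hashAlg` selects the multihash algorithm for the serialized node itself.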
test/test-builder.js (33 additions & 0 deletions)

@@ -57,5 +57,38 @@ module.exports = (repo) => {
         )
       }, done)
     })
+
+    it('allows multihash hash algorithm to be specified for big file', (done) => {
+      eachSeries(Object.keys(mh.names), (hashAlg, cb) => {
+        const options = { hashAlg, strategy: 'flat' }
+        const content = String(Math.random() + Date.now())
+        const inputFile = {
+          path: content + '.txt',
+          // Bigger than maxChunkSize
+          content: Buffer.alloc(262144 + 5).fill(1)
+        }
+
+        const onCollected = (err, nodes) => {
+          if (err) return cb(err)
+
+          const node = nodes[0]
+
+          try {
+            expect(node).to.exist()
+            expect(mh.decode(node.multihash).name).to.equal(hashAlg)
+          } catch (err) {
+            return cb(err)
+          }
+
+          cb()
+        }
+
+        pull(
+          pull.values([Object.assign({}, inputFile)]),
+          createBuilder(FixedSizeChunker, ipldResolver, options),
+          pull.collect(onCollected)
+        )
+      }, done)
+    })
   })
 }
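The test's assertion works because multihashes are self-describing: the algorithm identifier travels inside the digest. A small illustration of the round-trip it relies on, assuming `mh` is the `multihashes` module the test imports:

```js
// Round-trip sketch, assuming the multihashes module (aliased `mh` above).
const mh = require('multihashes')

// Wrap a stand-in 32-byte buffer (not a real hash) with the sha2-256 header...
const digest = Buffer.alloc(32, 1)
const encoded = mh.encode(digest, 'sha2-256')

// ...then decode to recover the algorithm name, as the test asserts.
console.log(mh.decode(encoded).name) // 'sha2-256'
```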
