Create a hash checksum over a folder or a file.
The hashes are propagated upwards: the hash returned for a folder is generated over the hashes of all its children.
The hashes are generated with the sha1 algorithm and returned in base64 encoding by default.
Each file returns a name and a hash; each folder additionally returns an array of children (file or folder elements).
First, install folder-hash with npm install --save folder-hash or yarn add folder-hash.
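With the default options, a minimal call looks like this (a sketch that only uses the hashElement export described above; it hashes the current working directory with sha1/base64):

const { hashElement } = require('folder-hash');

// Hash the current folder with the default options (sha1, base64).
hashElement('.')
  .then(hash => console.log(hash.toString()))
  .catch(error => console.error('hashing failed:', error));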
To see differences to the last version of this package, I would create hashes over all .js and .json files, but ignore everything inside folders starting with a dot, as well as the folders node_modules and test_coverage. The structure of the options object is documented below.
This example is also stored in ./examples/readme-example1.js.
const { hashElement } = require('folder-hash');

const options = {
  folders: { exclude: ['.*', 'node_modules', 'test_coverage'] },
  files: { include: ['*.js', '*.json'] },
};

console.log('Creating a hash over the current folder:');
hashElement('.', options)
  .then(hash => {
    console.log(hash.toString());
  })
  .catch(error => {
    return console.error('hashing failed:', error);
  });
The returned information looks, for example, like this:
Creating a hash over the current folder:
{ name: '.', hash: 'YZOrKDx9LCLd8X39PoFTflXGpRU=',
  children: [
    { name: 'examples', hash: 'aG8wg8np5SGddTnw1ex74PC9EnM=',
      children: [
        { name: 'readme-example1.js', hash: 'Xlw8S2iomJWbxOJmmDBnKcauyQ8=' }
        { name: 'readme-with-callbacks.js', hash: 'ybvTHLCQBvWHeKZtGYZK7+6VPUw=' }
        { name: 'readme-with-promises.js', hash: '43i9tE0kSFyJYd9J2O0nkKC+tmI=' }
        { name: 'sample.js', hash: 'PRTD9nsZw3l73O/w5B2FH2qniFk=' }
      ]}
    { name: 'index.js', hash: 'kQQWXdgKuGfBf7ND3rxjThTLVNA=' }
    { name: 'package.json', hash: 'w7F0S11l6VefDknvmIy8jmKx+Ng=' }
    { name: 'test', hash: 'H5x0JDoV7dEGxI65e8IsencDZ1A=',
      children: [
        { name: 'parameters.js', hash: '3gCEobqzHGzQiHmCDe5yX8weq7M=' }
        { name: 'test.js', hash: 'kg7p8lbaVf1CPtWLAIvkHkdu1oo=' }
      ]}
  ]}
The returned structure can be traversed, e.g. to create incremental backups.
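As an illustrative sketch (collectHashes is not part of the folder-hash API, just a helper made up here), the result tree can be flattened into a path-to-hash map and diffed against the map from a previous run:

const path = require('path');
const { hashElement } = require('folder-hash');

// Flatten the returned tree into a Map from relative path to hash.
// Diffing this map against the map from a previous run shows which
// entries changed since then.
function collectHashes(node, parent = '', map = new Map()) {
  const fullPath = path.join(parent, node.name);
  map.set(fullPath, node.hash);
  for (const child of node.children || []) {
    collectHashes(child, fullPath, map);
  }
  return map;
}

hashElement('.', { folders: { exclude: ['.*', 'node_modules'] } })
  .then(result => {
    const current = collectHashes(result);
    // An unchanged hash for a folder means its whole subtree is unchanged
    // and could be skipped by an incremental backup.
    console.log(current);
  })
  .catch(error => console.error('hashing failed:', error));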
It is also possible to match only the full path and not the basename. Be aware that *nix and Windows handle paths differently, so use caution. The same configuration could look like this:
const options = {
  folders: {
    exclude: ['.*', '**.*', '**node_modules', '**test_coverage'],
    matchBasename: false,
    matchPath: true,
  },
  files: {
    //include: ['**.js', '**.json' ], // Windows
    include: ['*.js', '**/*.js', '*.json', '**/*.json'], // *nix
    matchBasename: false,
    matchPath: true,
  },
};
The hashElement function takes the following parameters:

Name | Type | Attributes | Description |
---|---|---|---|
name | string | | element name or an element's path |
dir | string | <optional> | directory that contains the element (generated from name if omitted) |
options | Object | <optional> | Options object (see below) |
callback | fn | <optional> | Error-first callback function |
These are the default options:

{
  algo: 'sha1', // see crypto.getHashes() for options in your node.js REPL
  encoding: 'base64', // 'base64', 'base64url', 'hex' or 'binary'
  files: {
    exclude: [],
    include: [],
    matchBasename: true,
    matchPath: false,
    ignoreBasename: false,
    ignoreRootName: false
  },
  folders: {
    exclude: [],
    include: [],
    matchBasename: true,
    matchPath: false,
    ignoreRootName: false
  },
  symbolicLinks: {
    include: true,
    ignoreBasename: false,
    ignoreTargetPath: true,
    ignoreTargetContent: false,
    ignoreTargetContentAfterError: false,
  }
}
The options object supports the following properties:

Name | Type | Attributes | Default | Description |
---|---|---|---|---|
algo | string | <optional> | 'sha1' | checksum algorithm, see options in crypto.getHashes() |
encoding | string | <optional> | 'base64' | encoding of the resulting hash. One of 'base64', 'base64url', 'hex' or 'binary' |
files | Object | <optional> | | Rules object (see below) |
folders | Object | <optional> | | Rules object (see below) |
symbolicLinks | Object | <optional> | | Symlink options (see below) |
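For example, to switch the algorithm and encoding (a sketch; it assumes 'sha256' is listed by crypto.getHashes() in your Node.js build):

const { hashElement } = require('folder-hash');

// Use sha256 and hex instead of the sha1/base64 defaults.
hashElement('.', { algo: 'sha256', encoding: 'hex' })
  .then(hash => console.log(hash.toString()))
  .catch(error => console.error('hashing failed:', error));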
The files and folders rules objects support the following properties:

Name | Type | Attributes | Default | Description |
---|---|---|---|---|
exclude | Array.<string> \| Function | <optional> | [] | Array of optional exclude glob patterns, see minimatch doc. Can also be a function which returns true if the passed file is excluded. |
include | Array.<string> \| Function | <optional> | [] | Array of optional include glob patterns, see minimatch doc. Can also be a function which returns true if the passed file is included. |
matchBasename | bool | <optional> | true | Match the glob patterns to the file/folder name |
matchPath | bool | <optional> | false | Match the glob patterns to the file/folder path |
ignoreBasename | bool | <optional> | false | Set to true to calculate the hash without the basename element |
ignoreRootName | bool | <optional> | false | Set to true to calculate the hash without the basename of the root (first) element |
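Since exclude and include may also be functions, a predicate can replace the glob patterns. A sketch (the .log rule is made up for illustration; which string the predicate receives depends on the matchBasename/matchPath settings):

const { hashElement } = require('folder-hash');

// Use a predicate instead of glob patterns. With matchBasename enabled
// and matchPath disabled, only basenames are matched here.
const options = {
  files: {
    exclude: name => name.endsWith('.log') || name.startsWith('.'),
    matchBasename: true,
    matchPath: false,
  },
};

hashElement('.', options)
  .then(hash => console.log(hash.toString()))
  .catch(error => console.error('hashing failed:', error));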
Configure how symbolic links should be hashed.
To understand how the options can be combined to create a specific behavior, look into test/symbolic-links.js.
Name | Type | Default | Description |
---|---|---|---|
include | bool | true | If false, symbolic links are not handled at all. A folder with three symbolic links inside will have no children entries. |
ignoreBasename | bool | false | Set to true to calculate the hash without the basename element |
ignoreTargetPath | bool | true | If false, the resolved link target is added to the hash (uses fs.readlink) |
ignoreTargetContent | bool | false | If true, will only assess the basename and target path (as configured in the other options) |
ignoreTargetContentAfterError | bool | false | If true, will ignore all errors while trying to hash symbolic links and only assess the basename and target path (as configured in other options). E.g. a missing target (ENOENT) or access permissions (EPERM). |
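For example, a sketch of a configuration that includes symbolic links but hashes them by name only, ignoring both the target path and the target content (the option names are taken from the defaults above):

const { hashElement } = require('folder-hash');

// Include symbolic links, but hash them by name only: neither the
// resolved target path nor the target's content contributes to the hash.
const options = {
  symbolicLinks: {
    include: true,
    ignoreTargetPath: true,
    ignoreTargetContent: true,
  },
};

hashElement('.', options)
  .then(hash => console.log(hash.toString()))
  .catch(error => console.error('hashing failed:', error));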
After installing it globally via
$ npm install -g folder-hash
you can use it like this:
# local folder, using a config file
$ folder-hash -c config.json .
# local folder
$ folder-hash
# global folder
$ folder-hash /user/bin
It also accepts an optional JSON configuration file via the -c or --config flag, which should contain the same configuration as when using the JavaScript API.
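A hypothetical config.json mirroring the options from the first example could look like this:

{
  "folders": { "exclude": [".*", "node_modules", "test_coverage"] },
  "files": { "include": ["*.js", "*.json"] }
}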
You can also use a local version of folder-hash like this:
$ npx folder-hash --help
Use folder-hash on the CLI like this:
folder-hash [--config <json-file>] <file-or-folder>
See the file ./examples/readme-with-promises.js:
const path = require('path');
const { hashElement } = require('folder-hash');

// pass element name and folder path separately
hashElement('test', path.join(__dirname, '..'))
  .then(hash => {
    console.log('Result for folder "../test":', hash.toString(), '\n');
  })
  .catch(error => {
    return console.error('hashing failed:', error);
  });

// pass element path directly
hashElement(__dirname)
  .then(hash => {
    console.log(`Result for folder "${__dirname}":`);
    console.log(hash.toString(), '\n');
  })
  .catch(error => {
    return console.error('hashing failed:', error);
  });

// pass options (example: exclude dotFolders)
const options = { encoding: 'hex', folders: { exclude: ['.*'] } };
hashElement(__dirname, options)
  .then(hash => {
    console.log('Result for folder "' + __dirname + '" (with options):');
    console.log(hash.toString(), '\n');
  })
  .catch(error => {
    return console.error('hashing failed:', error);
  });
See the file ./examples/readme-with-callbacks.js:
const path = require('path');
const { hashElement } = require('folder-hash');

// pass element name and folder path separately
hashElement('test', path.join(__dirname, '..'), (error, hash) => {
  if (error) {
    return console.error('hashing failed:', error);
  } else {
    console.log('Result for folder "../test":', hash.toString(), '\n');
  }
});

// pass element path directly
hashElement(__dirname, (error, hash) => {
  if (error) {
    return console.error('hashing failed:', error);
  } else {
    console.log('Result for folder "' + __dirname + '":');
    console.log(hash.toString(), '\n');
  }
});

// pass options (example: exclude dotFiles)
const options = { algo: 'md5', files: { exclude: ['.*'], matchBasename: true } };
hashElement(__dirname, options, (error, hash) => {
  if (error) {
    return console.error('hashing failed:', error);
  } else {
    console.log('Result for folder "' + __dirname + '":');
    console.log(hash.toString());
  }
});
The behavior is documented and verified in the unit tests. Execute npm test or mocha test, and have a look at the test subfolder.
You can also check the CircleCI report.
For files, the hashes are the same if:
- A file is checked again
- Two files have the same name and content (but exist in different folders)
The hashes are different if:
- A file was renamed or its content was changed
- Two files have the same name but different content
- Two files have the same content but different names
For folders, content means the folder's children: both the files and the subfolders with their children (see the sketch after these lists).
The hashes are the same if:
- A folder is checked again
- Two folders have the same name and content (but have different parent folders)
The hashes are different if:
- A file somewhere in the directory structure was renamed or its content was changed
- Two folders have the same name but different content
- Two folders have the same content but different names
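A sketch illustrating the folder rules above (the folders ./a/src and ./b/src are hypothetical and assumed to hold identical contents):

const { hashElement } = require('folder-hash');

// ./a/src and ./b/src share the folder name 'src'. With identical contents
// the two hashes are equal; any rename or content change inside either
// tree makes them differ.
Promise.all([hashElement('./a/src'), hashElement('./b/src')])
  .then(([first, second]) => {
    console.log('identical trees:', first.hash === second.hash);
  })
  .catch(error => console.error('hashing failed:', error));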
MIT, see LICENSE.txt