
Refactor #57

Merged
merged 20 commits

4 participants

@fb55

As you might have seen, I did some refactoring in the refactor branch. It mainly removes unused code, improves and abstracts a few things, and introduces two new public methods:

  • analyzer.path runs require-analyzer's analysis on a path, no matter what kind of file is behind it.
  • analyzer.findNextDir does what .findModulesDir used to do: return the next directory for a path.

.findModulesDir now actually searches for a directory containing either a node_modules dir or a package.json file.

findit is no longer a dependency, as .path (which is an abstraction of .dir) can do the job.
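
For illustration, here is a minimal usage sketch of the refactored API. It assumes the package exposes lib/require-analyzer.js as its main export; the target paths and logged output are made up, but the callback shapes follow the code in the diff below.

```js
var analyzer = require('require-analyzer');

//
// analyzer.path stats the target and dispatches to .dir or .file,
// so callers no longer need to care what kind of path they pass in.
//
analyzer.path({ target: './lib/example.js' }, function (err, deps) {
  if (err) return console.error(err);
  console.log('dependencies:', deps);
});

//
// analyzer.findNextDir walks up from a path until it reaches a directory.
//
analyzer.findNextDir('./lib/example.js', function (err, dir) {
  if (err) return console.error(err);
  console.log('next directory:', dir); // e.g. './lib'
});

//
// analyzer.findModulesDir keeps walking up until it finds a directory
// that contains a node_modules folder or a package.json file.
//
analyzer.findModulesDir('./lib/example.js', function (err, root) {
  if (err) return console.error(err);
  console.log('module root:', root);
});
```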

Please have a look at the code; it's possible that I've missed something. (All tests pass, at least.)

fb55 added some commits
@fb55 fb55 use console.log instead of util.puts
also fixed some linting errors (semicolons & equals)
c802fea
@fb55 fb55 added an analyzer.path method that calls the appropriate function for a path

as it's mainly the code of .analyze, use it there
5d32a73
@fb55 fb55 use .path inside .dir (replaces findit) e432496
@fb55 fb55 removed findit from dependencies b12e72d
@fb55 fb55 some minor changes
• use the `in` operator more frequently (it's much nicer than `typeof foo !== "undefined"` and more readable)
• removed some garbage
• added some comments
f74cfd3
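
(To illustrate the first bullet of the commit above, not code from this PR: besides being shorter, the `in` operator distinguishes a key that is missing from one explicitly set to undefined.)

```js
var obj = { foo: undefined };

// old style checks the value, so an explicit undefined looks like a missing key
console.log(typeof obj.foo !== 'undefined'); // false, although 'foo' exists

// the `in` operator checks for the key itself
console.log('foo' in obj); // true
console.log('bar' in obj); // false
```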
@fb55 fb55 removed two unnecessary functions
in L54, err would always have been falsy (or the function would have returned earlier)
e1aa6b0
@fb55 fb55 [fix] still accessed property before checking it with `in` f08e5fc
@fb55 fb55 [fix] passed filenames instead of paths to .path 55faa29
@fb55 fb55 forward errors from prerunners in runAnalyze a7c35d1
@fb55 fb55 use Function#bind when it's appropriate, use single quotes d9c45eb
@fb55 fb55 Revert "forward errors from prerunners in runAnalyze"
This reverts commit a7c35d1.
48a34e0
@fb55 fb55 always return an object in npmAnalyze
also moved the creation of the find-dependencies path to the
surrounding scope
7ede462
@fb55 fb55 [fix] move upwards in findModulesDir
this is the functionality that is described in the comments, but wasn't
implemented (yet)
b3e2bb9
@fb55 fb55 removed senseless check in analyzer.dir
a folder should never contain a file that has the folder's path as its name
49b2d44
@fb55 fb55 added analyzer.findNextDir function
the last change broke most tests, so .findNextDir is what's actually
wanted for .npmAnalyze
16f22f9
@fb55 fb55 set 1 as the maxDepth of read-installed
we don't use more, so don't do anything fancy
eeebe3a
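
(Roughly what that amounts to: a sketch based on the read-installed call visible in the diff below, with a hypothetical project path.)

```js
var readInstalled = require('read-installed');

// maxDepth 1: read the target package and its direct node_modules entries,
// which is all npmAnalyze inspects (same call shape as in the diff below)
readInstalled('/path/to/project', 1, function (err, result) {
  if (err) throw err;
  console.log(Object.keys(result.dependencies || {}));
});
```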
@fb55 fb55 [fix] call callback when a node_modules folder is present 873a641
@fb55 fb55 simplified mergeDependencies & .package
• removed scanning scripts as they weren't implemented properly (they
just looped a couple of times without doing anything)
• simplified signature of mergeDependencies
• return errors in .package
• also updated some comments
e169c9e
@fb55 fb55 removed some garbage in .package 4968886
@fb55 fb55 [fix] call .findNextDir recursively 2352c31
@indexzero
Owner

@mmalecki @AvianFlu We should look at this in the next week or two to evaluate it. No rush though.

@dscape dscape commented on the diff
lib/require-analyzer.js
((43 lines not shown))
if (!options || !options.target) {
//
// If there are no `options` and no `options.target` property
// respond with the appropriate error.
//
callback(new Error('options and options.target are required'));
- return emitter;
@dscape Collaborator
dscape added a note

Needs review. Is this depended upon anywhere?

Guessing not, as in the error case we don't return any EventEmitter.

@dscape
Collaborator

LGTM. Seems like great work by @fb55

Tests pass locally

@yawnt yawnt merged commit 6ece1ff
@fb55 fb55 deleted the branch
bin/require-analyzer
@@ -2,7 +2,6 @@
var fs = require('fs'),
path = require('path'),
- util = require('util'),
colors = require('colors'),
winston = require('winston'),
argv = require('optimist').argv,
@@ -24,7 +23,7 @@ var help = [
].join('\n');
if (argv.h || argv.help) {
- return util.puts(help);
+ return console.log(help);
}
//
@@ -71,16 +70,16 @@ function listDependencies (pkg, msgs) {
list = list.replace(/\{\s/, '{ \n')
.replace(/\}/, '\n}')
.replace('\033[90m', ' \033[90m')
- .replace(/, /ig, ',\n ')
+ .replace(/, /ig, ',\n ');
}
else {
list = list.replace(/\n\s{4}/ig, '\n ');
}
- winston.info(msgs.success)
+ winston.info(msgs.success);
list.split('\n').forEach(function (line) {
winston.data(line);
- })
+ });
}
var dir = process.cwd(),
@@ -96,7 +95,7 @@ pkgFile = path.join(dir, argv.f || argv.file || 'package.json');
winston.info('require-analyzer starting in ' + dir.magenta);
fs.readFile(pkgFile, function (err, data) {
if (err) {
- if (err.errno === 34 && err.code == 'ENOENT') {
+ if (err.errno === 34 && err.code === 'ENOENT') {
data = "{}";
}
else {
@@ -146,7 +145,7 @@ fs.readFile(pkgFile, function (err, data) {
winston.error('Error analyzing dependencies'.red);
err.message.split('\n').forEach(function (line) {
winston.error(line);
- })
+ });
return;
}
});
@@ -180,8 +179,8 @@ fs.readFile(pkgFile, function (err, data) {
// file in the target `dir`.
//
if (argv.safe) {
- winston.info('did not update package.json')
- return
+ winston.info('did not update package.json');
+ return;
}
if (Object.keys(newpkg.dependencies).length > 0) {
winston.info('Updating ' + pkgFile.magenta);
@@ -194,7 +193,7 @@ fs.readFile(pkgFile, function (err, data) {
}
winston.info('require-analyzer updated package.json dependencies');
- })
+ });
}
});
});
lib/require-analyzer.js
@@ -14,8 +14,7 @@ var util = require('util'),
readInstalled = require('read-installed'),
detective = require('detective'),
resolve = require('resolve'),
- semver = require('semver'),
- findit = require('findit');
+ semver = require('semver');
var analyzer = exports;
@@ -23,8 +22,7 @@ var analyzer = exports;
// ### function analyze (options, callback)
// #### @options {Object} Options to analyze against
// #### @callback {function} Continuation to respond to when complete.
-// Calls the appropriate `require-analyzer` method based on the result
-// from `fs.stats()` on `options.target`. When dependencies are returned,
+// Calls `path`. When dependencies are returned,
// `npmAnalyze()` is called with `options` and the resulting object is
// returned to `callback`. Also returns an event emitter which outputs
// data at various stages of completion with events:
@@ -35,14 +33,51 @@ var analyzer = exports;
//
analyzer.analyze = function (options, callback) {
var emitter = new events.EventEmitter();
+
+ //
+ // let path determine what to do
+ //
+ analyzer.path(options, function (err, deps) {
+ if (err) {
+ emitter.emit('childError', err);
+ return callback(err);
+ }
+
+ // Emit the `dependencies` event for streaming results.
+ emitter.emit('dependencies', deps);
+
+ if (options.npm === false || !deps || deps.length === 0) {
+ return callback(null, deps);
+ }
+
+ var npmEmitter = analyzer.npmAnalyze(deps, options, callback);
+
+ //
+ // Re-emit the `search` and `reduce` events from the `npmEmitter`
+ // for streaming results.
+ //
+ ['search', 'reduce'].forEach(function (ev) {
+ npmEmitter.on(ev, emitter.emit.bind(emitter, ev));
+ });
+ });
+
+ return emitter;
+};
+//
+// ### function path (options, callback)
+// #### @options {Object} Options to analyze against
+// #### @callback {function} Continuation to respond to when complete.
+// Calls the appropriate `require-analyzer` method based on the result
+// from `fs.stats()` on `options.target`.
+//
+analyzer.path = function(options, callback){
if (!options || !options.target) {
//
// If there are no `options` and no `options.target` property
// respond with the appropriate error.
//
callback(new Error('options and options.target are required'));
- return emitter;
}
//
@@ -53,50 +88,19 @@ analyzer.analyze = function (options, callback) {
if (err) {
return callback(err);
}
-
- var analyzeFn, rootDir;
- if (stats.isDirectory()) {
- analyzeFn = analyzer.dir;
+ else if (stats.isDirectory()) {
+ analyzer.dir(options, callback);
}
else if (stats.isFile()) {
- analyzeFn = analyzer.file;
+ if('fileFilter' in options && !options.fileFilter(options.target)) return;
+ analyzer.file(options, callback);
}
else {
- return callback(new Error(options.target + ' is not a file or a directory.'));
+ err = new Error(options.target + ' is not a file or a directory.');
+ err.code = 'UNSUPPORTED_TYPE';
+ callback(err);
}
-
- analyzeFn.call(null, options, function (err, deps) {
- if (err) {
- emitter.emit('childError', err);
- return callback(err);
- }
-
- // Emit the `dependencies` event for streaming results.
- emitter.emit('dependencies', deps);
-
- if (options.npm === false || !deps || deps.length === 0) {
- return callback(null, deps);
- }
-
- var npmEmitter = analyzer.npmAnalyze(deps, options, function (nerr, reduced, suspect) {
- return callback(err || nerr, reduced, suspect);
- });
-
- //
- // Re-emit the `search` and `reduce` events from the `npmEmitter`
- // for streaming results.
- //
- ['search', 'reduce'].forEach(function (ev) {
- npmEmitter.on(ev, function () {
- var args = Array.prototype.slice.call(arguments);
- args.unshift(ev);
- emitter.emit.apply(emitter, args);
- });
- });
- });
});
-
- return emitter;
};
//
@@ -104,7 +108,7 @@ analyzer.analyze = function (options, callback) {
// #### @deps {Array} List of dependencies to analyze.
// #### @options {Object} Set of options to analyze with.
// #### @callback {function} Continuation to respond to when complete.
-// Analyzes the list of dependencies using `npm`, consumes the options:
+// Analyzes the list of dependencies using `read-installed`, consumes the options:
//
// options.reduce: Will remove deps consumed by sibling deps
//
@@ -116,30 +120,39 @@ analyzer.npmAnalyze = function (deps, options, callback) {
return callback();
}
- analyzer.findModulesDir(options.target, function (err, root) {
+ analyzer.findNextDir(options.target, function (err, root) {
if (err) {
return callback(err);
}
//
- // Analyze dependencies by searching for all installed locally via npm.
+ // Analyze dependencies by searching for all installed locally via read-installed.
// Then see if it depends on any other dependencies that are in the
// list so those dependencies may be removed (only if `options.reduce` is set).
//
- readInstalled(root, function (err, result) {
+ readInstalled(root, 1, function (err, result) {
if (err) {
return callback(err);
}
else if (!result || !result.dependencies || Object.keys(result.dependencies).length === 0) {
- return callback(null, deps);
+ // When no dependencies were found, return what we got
+ if(Array.isArray(deps)){
+ return callback(null, deps.reduce(function(obj, prop){
+ obj[prop] = "*";
+ return obj;
+ }, {}));
+ }
+ else {
+ return callback(null, deps);
+ }
}
Object.keys(result.dependencies).forEach(function (pkg) {
if (result.devDependencies && pkg in result.devDependencies) return;
if (result.bundleDependencies && pkg in result.bundleDependencies) return;
if (!Array.isArray(deps)) {
- if (deps[pkg] === '*' || typeof deps[pkg] === 'undefined') {
- pkgs[pkg] = result.dependencies[pkg]
+ if (deps[pkg] === '*' || !(pkg in deps) ) {
+ pkgs[pkg] = pkg in result.dependencies
? result.dependencies[pkg]['version']
: deps[pkg];
}
@@ -158,13 +171,13 @@ analyzer.npmAnalyze = function (deps, options, callback) {
return callback(null, pkgs);
}
- var reduced = analyzer.merge({}, pkgs),
+ var reduced = analyzer.clone(pkgs),
suspect = {};
Object.keys(deps).forEach(function (dep) {
- if (pkgs[dep] && pkgs[dep].dependencies) {
+ if (dep in pkgs && pkgs[dep].dependencies) {
Object.keys(pkgs[dep].dependencies).forEach(function (cdep) {
- if (reduced[cdep]) {
+ if (cdep in reduced) {
suspect[cdep] = pkgs[cdep];
delete reduced[cdep];
}
@@ -181,6 +194,14 @@ analyzer.npmAnalyze = function (deps, options, callback) {
return emitter;
};
+function filterFiles(file){
+ //
+ // If the file is not `.js` or `.coffee` do no analyze it
+ //
+ var ext = path.extname(file);
+ return ext === '.js' || ext === '.coffee';
+}
+
//
// ### function package (dir, callback)
// #### @dir {string} Parent directory to analyze
@@ -189,11 +210,13 @@ analyzer.npmAnalyze = function (deps, options, callback) {
// running `analyzer.package()` if it exists. Otherwise attempts to run
// `analyzer.file()` on all files in the source tree.
//
-analyzer.dir = function (options, callback) {
+analyzer.dir = function (options, callback) {
+ var target = path.resolve(__dirname, options.target);
+
//
// Read the target directory
//
- fs.readdir(options.target, function (err, files) {
+ fs.readdir(target, function (err, files) {
if (err) {
return callback(err);
}
@@ -202,72 +225,54 @@ analyzer.dir = function (options, callback) {
// If there is a package.json in the directory
// then analyze the require(s) based on `package.main`
//
- if ((options.target && files.indexOf(options.target) !== -1)
- || files.indexOf('package.json') !== -1) {
+ if (files.indexOf('package.json') !== -1) {
return analyzer.package(options, callback);
}
-
+
+ var remaining = files.length,
+ packages = {};
+
//
// Otherwise find all files in the directory tree
// and attempt to run `analyzer.file()` on each of them
// in parallel.
//
- var files = [],
- done = [],
- packages = {},
- traversed = false,
- target = path.resolve(__dirname, options.target),
- finder = findit.find(target);
-
- function onRequired() {
+ files.forEach(function(file){
//
- // Respond to the `callback` if all files have been traversed
- // and all files have been executed via `analyzer.file()`
+ // skip all files from 'node_modules' directories
//
- if (traversed && files.length === done.length) {
- callback(null, Object.keys(packages));
- }
- }
-
- finder.on('file', function (file) {
- //
- // skip all files from "node_modules" directories
- // because the checked direcotry might already be
- // in a node_modules directory, only the relative path
- // is checked
- //
- var relativePath = path.relative(target, file);
- if (relativePath.indexOf("node_modules") >= 0) {
- return;
- }
-
+ if(file === 'node_modules') return remaining--;
+
//
- // If the file is not `.js` or `.coffee` do no analyze it
+ // call analyzer.path and currate all dependencies
//
- var ext = path.extname(file),
- clone = analyzer.merge({}, options);
-
- if (ext !== '.js' && ext !== '.coffee') {
- return;
- }
-
- files.push(file);
-
- clone.target = file;
- analyzer.file(clone, function (err, deps) {
- deps.forEach(function (dep) {
+ analyzer.path({
+ __proto__: options,
+ target: path.join(target, file),
+ fileFilter: filterFiles
+ }, function(err, deps){
+ if(err && err.code !== 'UNSUPPORTED_TYPE'){
+ //
+ // skip symlinks & friends
+ // but forward real errors
+ //
+ remaining = -1; //ensures that callback won't be called again
+ callback(err);
+ return;
+ }
+
+ deps.forEach(function(dep){
packages[dep] = true;
});
-
- done.push(file);
- onRequired();
+
+ //
+ // when all files are analyzed, call the callback
+ //
+ if(!--remaining){
+ callback(null, Object.keys(packages));
+ }
});
});
-
- finder.on('end', function () {
- traversed = true;
- onRequired();
- });
});
};
@@ -279,10 +284,6 @@ analyzer.dir = function (options, callback) {
// the require statements in the script located at `package.main`
//
analyzer.package = function (options, callback) {
- var deps = {},
- pkgDeps = {},
- devDeps = {},
- bdlDeps = {};
//
// Attempt to read the package.json in the current directory
//
@@ -293,9 +294,6 @@ analyzer.package = function (options, callback) {
// Attempt to read the package.json data.
//
pkg = JSON.parse(pkg.toString());
- pkgDeps = pkg.dependencies;
- devDeps = pkg.devDependencies;
- bdlDeps = pkg.bundleDependencies;
}
catch (e) {
return callback(e);
@@ -309,7 +307,6 @@ analyzer.package = function (options, callback) {
//
// Analyze the require(s) based on:
// - the `main` property of the package.json
- // - the scripts in the options that relate to package.json
// - the default file if no package.json exists
//
var todo = 0,
@@ -318,7 +315,8 @@ analyzer.package = function (options, callback) {
function dequeue(err) {
todo--;
if (todo === 0) {
- mergeDependencies(err, _deps, pkgDeps, devDeps, bdlDeps, callback);
+ if(err) callback(err);
+ else mergeDependencies(_deps, pkg, callback);
}
}
@@ -327,86 +325,49 @@ analyzer.package = function (options, callback) {
analyzer.file(options, function (err, deps) {
_deps = _deps.concat(deps.filter(function (d) {
- return _deps.indexOf(d) === -1 && d !== pkg.name;
+ return d !== pkg.name && _deps.indexOf(d) === -1;
}));
dequeue(err);
});
}
- var scripts = options.hasOwnProperty('scripts') ? options.scripts : ["test","prestart"];
-
- scripts = scripts.map(function (item) {
- return pkg.scripts && pkg.scripts[item];
- }).filter(function (item) {
- return !!item;
- });
-
- if (scripts) {
- scripts.forEach(function analyzeScript(script) {
- if (!script) {
- return;
- }
-
- var newoptions = analyzer.clone(options);
- try {
- newoptions.target = require.resolve(path.join(newoptions.target, path.normalize(pkg.main || '/')));
- }
- catch (e) {
- todo = 1;
- deps = null;
- dequeue(e);
- }
-
- processOptions(newoptions);
- });
- }
-
var newoptions = analyzer.clone(options);
function setMain(files, pkg, newoptions, callback) {
- var file = null;
-
function nextFile() {
- file = files.shift();
- if (typeof file === 'undefined') {
+ if (!files.length) {
return callback(pkg, newoptions);
}
- checkFile(file);
- }
-
- function checkFile(file) {
- exists(file, fileExists)
- }
- function fileExists(exists) {
- if (exists) {
- pkg.main = file;
- return callback(pkg, newoptions);
- }
- nextFile();
+ var file = files.shift();
+
+ exists(file, function(exists){
+ if (exists) {
+ pkg.main = file;
+ callback(pkg, newoptions);
+ }
+ else nextFile();
+ });
}
nextFile();
- return;
}
function setTarget(pkg, newoptions) {
- try {
- newoptions.target = require.resolve(path.join(newoptions.target, path.normalize(pkg.main || '/')));
- }
- catch (e) {
+ var newPath = path.join(newoptions.target, pkg.main ? path.normalize(pkg.main) : '/'),
+ newTarget = analyzer.resolve(newPath);
+
+ if (newTarget === false) {
todo = 1;
- deps = null;
- dequeue(e);
+ dequeue(new Error('Couldn\'t resolve path ' + newPath));
}
return processOptions(newoptions);
}
// add logic to default to app.js or server.js for main if main is not present.
- if (typeof pkg.main === 'undefined' || pkg.main === '') {
- var files = ["app.js", "server.js", "index.js"]
- setMain(files, pkg, newoptions, setTarget);
+ if ( !('main' in pkg) || pkg.main === '') {
+ setMain(['app.js', 'server.js', 'index.js'], pkg, newoptions, setTarget);
}
else {
setTarget(pkg, newoptions);
@@ -466,6 +427,8 @@ function analyzeFile (options, callback) {
});
}
+var findDepsPath = path.join(__dirname, '..', 'bin', 'find-dependencies');
+
function spawnWorker (options, callback) {
//
// Spawn the `find-dependencies` bin helper to ensure that we are able to
@@ -473,16 +436,16 @@ function spawnWorker (options, callback) {
//
var packages = options.packages,
errs = options.errors,
- deps = fork(path.join(__dirname, '..', 'bin', 'find-dependencies'), [options.target], {silent: true});
+ deps = fork(findDepsPath, [options.target], {silent: true});
deps.send(options.target);
- deps.on("message", function(data){
+ deps.on('message', function(data){
switch(data.type){
- case "load":
+ case 'load':
packages[data.msg] = true;
break;
- case "error":
+ case 'error':
errs.push(data.msg);
}
});
@@ -506,7 +469,6 @@ function spawnWorker (options, callback) {
// Remove the timeout now that we have exited.
//
clearTimeout(timeoutId);
-
callback();
});
}
@@ -521,8 +483,9 @@ function spawnWorker (options, callback) {
//
analyzer.file = function(options, callback){
- options.packages = options.packages || {};
- options.errors = options.errors || [];
+ if(!options.packages) options.packages = {};
+ if(!options.errors) options.errors = [];
+
analyzeFile(options, function(err){
if(options.errors.length > 0){
callback(options.errors); //TODO call with real error object
@@ -557,47 +520,66 @@ analyzer.file = function(options, callback){
};
//
-// ### function findModulesDir (target)
-// #### @target {string} The directory (or file) to search up from
-// Searches up from the specified `target` until it finds a directory which contains
-// a folder called `node_modules`
+// ### function findNextDir (target)
+// #### @target {string} The path to search up from
+// Searches up from the specified `target` until it finds a directory
//
-analyzer.findModulesDir = function (target, callback) {
+analyzer.findNextDir = function(target, callback) {
fs.stat(target, function (err, stats) {
if (err) {
- return callback(err);
+ callback(err);
}
- if (stats.isDirectory()) {
- return fs.readdir(target, function (err, files) {
- if (err) {
- return callback(err);
- }
-
- if (files.indexOf('node_modules') !== -1 || files.indexOf('package.json') !== -1) {
- return callback(null, target);
- }
- else {
- return callback(null, target);
- }
- });
+ else if (stats.isDirectory()) {
+ callback(null, target);
}
else if (stats.isFile()) {
- return analyzer.findModulesDir(path.dirname(target), callback);
+ analyzer.findNextDir(path.dirname(target), callback);
+ }
+ else {
+ callback(new Error(target + ' is not a file or a directory.'));
}
});
};
//
+// ### function findModulesDir (target)
+// #### @target {string} The directory (or file) to search up from
+// Searches up from the specified `target` until it finds a directory which contains
+// a folder called `node_modules`
+//
+analyzer.findModulesDir = function (target, callback) {
+ analyzer.findNextDir(target, function(err, dir){
+ fs.readdir(target, function (err, files) {
+ if (err) {
+ callback(err);
+ }
+ else if (files.indexOf('node_modules') !== -1 || files.indexOf('package.json') !== -1) {
+ //TODO ensure it's actually a directory/file
+ callback(null, target);
+ }
+ else if (target === (target = path.dirname(target))){
+ callback(new Error('Couldn\'t find a node_modules directory.'));
+ }
+ else {
+ analyzer.findModulesDir(target, callback);
+ }
+ });
+ });
+};
+
+//
// ### function (target [arg1, arg2, ...])
// #### @target {Object} Object to merge into
// Merges all properties in `arg1 ... argn`
// into the `target` object.
//
+// TODO remove this as it isn't used anymore
+//
analyzer.merge = function (target) {
var objs = Array.prototype.slice.call(arguments, 1);
objs.forEach(function (o) {
Object.keys(o).forEach(function (attr) {
- if (! o.__lookupGetter__(attr)) {
+ if ( !('get' in Object.getOwnPropertyDescriptor(o, attr)) ) {
target[attr] = o[attr];
}
});
@@ -643,7 +625,7 @@ analyzer.extractVersions = function (dependencies) {
parse = semver.expressions.parse.exec(raw.trim()),
version = parse ? parse.slice(1) : raw,
build = version ? version[3] || version[4] : null;
- if (!/^[v\d]+/.test(raw)) {
+ if (!/^[v\d]/.test(raw)) {
all[pkg] = raw;
}
else if (typeof version === 'string') {
@@ -671,17 +653,15 @@ analyzer.extractVersions = function (dependencies) {
// updated: { /* Union of updated / current with new versions */ }
// }
//
+var cleanVersion = /\<|\>|\=|\s/ig;
+
analyzer.updates = function (current, updated) {
var updates = {
- added: {},
+ added: !current && updated || {},
updated: {}
};
- if (!current) {
- updates.updated = updated || {};
- return updates;
- }
- else if (!updated) {
+ if (!current || !updated) {
return updates;
}
@@ -689,7 +669,7 @@ analyzer.updates = function (current, updated) {
// Get the list of all added dependencies
//
Object.keys(updated).filter(function (key) {
- return !current[key];
+ return !(key in current);
}).forEach(function (key) {
updates.added[key] = updated[key];
});
@@ -698,17 +678,17 @@ analyzer.updates = function (current, updated) {
// Get the list of all dependencies that have been updated
//
Object.keys(updated).filter(function (key) {
- if (!current[key]) {
+ if ( !(key in current) ) {
return false;
}
- var left = updated[key].replace(/\<|\>|\=|\s/ig, ''),
- right = current[key].replace(/\<|\>|\=|\s/ig, '');
+ var left = updated[key].replace(cleanVersion, ''),
+ right = current[key].replace(cleanVersion, '');
return semver.gt(left, right);
}).forEach(function (key) {
updates.updated[key] = updated[key];
- })
+ });
return updates;
};
@@ -719,16 +699,23 @@ analyzer.updates = function (current, updated) {
// Check if `module` is a native module (like `net` or `tty`).
//
// TODO use the resolve module for this
+// (faster & doesn't depend on the node version)
//
analyzer.isNative = function (module) {
try {
- return require.resolve(module) == module;
+ return require.resolve(module) === module;
}
catch (err) {
return false;
}
};
+//
+// ### function resolve (file, base)
+// #### @file {string} filename
+// #### @base {string} the root from which the file should be searched
+// Check if `module` is a native module (like `net` or `tty`).
+//
analyzer.resolve = function(file, base){
try {
return resolve.sync(file, {
@@ -740,17 +727,25 @@ analyzer.resolve = function(file, base){
}
};
-function mergeDependencies(err, deps, pkgDeps, devDeps, bndlDeps, callback) {
+function mergeDependencies(deps, pkg, callback) {
+ var pkgDeps = pkg.dependencies;
+
function removeDevDeps(deps) {
var obj = analyzer.clone(deps), dep;
- for (dep in devDeps) {
- if (typeof obj[dep] !== 'undefined') {
- delete obj[dep];
+
+ if('devDependencies' in pkg){
+ for (dep in pkg.devDependencies) {
+ if (dep in obj) {
+ delete obj[dep];
+ }
}
}
- for (dep in bndlDeps) {
- if (typeof obj[dep] !== 'undefined') {
- delete obj[dep];
+
+ if('bundleDependencies' in pkg){
+ for (dep in pkg.bundleDependencies) {
+ if (dep in obj) {
+ delete obj[dep];
+ }
}
}
return obj;
@@ -758,9 +753,6 @@ function mergeDependencies(err, deps, pkgDeps, devDeps, bndlDeps, callback) {
var merged = {};
- if (err) {
- return callback(err);
- }
if (!Array.isArray(deps)) {
if (typeof deps === 'undefined' ||
Object.keys(deps).length === 0) {
@@ -781,7 +773,7 @@ function mergeDependencies(err, deps, pkgDeps, devDeps, bndlDeps, callback) {
});
Object.keys(pkgDeps).forEach(function (d) {
- if (typeof merged[d] === 'undefined') {
+ if ( !(d in merged) ) {
merged[d] = pkgDeps[d];
}
});
package.json
@@ -1,7 +1,7 @@
{
"name": "require-analyzer",
"description": "Determine dependencies for a given node.js file, directory tree, or module in code or on the command line",
- "version": "0.4.1-1",
+ "version": "0.4.1",
"author": "Nodejitsu Inc. <info@nodejitsu.com>",
"maintainers": [
"indexzero <charlie@nodejitsu.com>",
@@ -13,7 +13,6 @@
},
"dependencies": {
"colors": "0.x.x",
- "findit": "0.0.x",
"optimist": "0.3.x",
"semver": "1.0.x",
"winston": "0.6.x",
test/require-analyzer-test.js
@@ -36,7 +36,6 @@ var rawPackages = {
var libDeps = {
'colors': '0.x.x',
- 'findit': '0.0.x',
'read-installed': '0.0.x',
'resolve': '0.2.x',
'optimist': '0.3.x',
@@ -56,7 +55,6 @@ var libPackages = [
'semver',
'slide',
'which',
- 'findit',
'seq',
'hashish',
'traverse',
@@ -67,8 +65,7 @@ var depsFromFile = [
'read-installed',
'detective',
'resolve',
- 'semver',
- 'findit'
+ 'semver'
];
var nativeSubjects = {};