When loading a saved Docker image from a file, the `readFile` operation is performed synchronously, blocking the Node.js event loop for the duration of the read. A Docker image export file can be quite large, so other tasks would starve in the meantime, since they would get no CPU time. An asynchronous file read should be preferred.

Is there a reason for reading these files synchronously?
The related snippet:
```js
if (options.file) {
  if (typeof options.file === 'string') {
    data = fs.readFileSync(path.resolve(options.file));
  } else {
    data = options.file;
  }
  optionsf.headers['Content-Type'] = 'application/tar';
} else if (opts && options.method === 'POST') {
  data = JSON.stringify(opts._body || opts);
  optionsf.headers['Content-Type'] = 'application/json';
}
```