Google Cloud Storage - Error during upload: gcs-resumable-upload.json renaming operation not permitted #1154
There's probably an issue with writing to the file so many times. The easiest thing to do is to disable resumable uploads. In your code, set `resumable: false` in the write stream options.
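A minimal sketch of that suggestion, assuming `gcsFile` is a google-cloud File object and the upload is buffered in memory, as in the thread (`uploadBuffer` is a hypothetical helper name):

```javascript
// Sketch only: disabling resumable uploads for a single write stream.
// With resumable uploads on (the default), gcs-resumable-upload tracks state
// in a shared configstore file, which is what collides on Windows.
function uploadBuffer(gcsFile, buffer) {
  return new Promise(function (resolve, reject) {
    var stream = gcsFile.createWriteStream({
      resumable: false  // skip the shared gcs-resumable-upload.json state file
    });
    stream.on('error', reject);
    stream.on('finish', resolve);
    stream.end(buffer);
  });
}
```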
I added that in, but still run into the same problem. I am still unable to upload an array of files at a time. But it helped the single file upload a bit. Before, I would run into this error after 2 requests in a row; now I am able to make 3-5 requests in a row with single file upload.
The config file that is written to is updated periodically throughout a file upload. The same config file will be written to for multiple files, so the clash is likely happening because there are too many simultaneous uploads going on. Can you show the code that uploads an array of files? All we have to do is find the call that needs the option switched off.
In lib/file.js:

```javascript
// Express middleware that will handle an array of files. req.files is an array
// of files received from the multer.fields([{name, maxCount}]) middleware.
// This function should handle the upload process of the files asynchronously.
function sendFilesToGCS(req, res, next) {
  if (!req.files) { return next(); }
  // req.files is passed in as an object (String -> Array) where
  // fieldname is the key, and the value is an array of files
  var reqFiles = Object.keys(req.files);
  var finishCounter = 0;
  reqFiles.forEach(function(key) {
    // Limitation: each field in req.files can only have one file associated to it
    var file = req.files[key][0];
    var gcsName = Date.now() + file.originalname;
    var gcsFile = bucket.file(gcsName);
    var stream = gcsFile.createWriteStream({ resumeable: false });
    console.log('Starting writable stream for ' + file.originalname);
    stream.on('error', function(err) {
      file.cloudStorageError = err;
      console.log('Error in writing ' + file.originalname);
      res.status(500).send(err);
    });
    stream.on('finish', function() {
      file.cloudStorageObject = gcsName;
      file.cloudStoragePublicUrl = getPublicUrl(gcsName);
      console.log('Finish writing ' + file.originalname);
      finishCounter++;
      if (finishCounter == reqFiles.length) {
        next();
      }
    });
    stream.end(file.buffer);
  });
}
```
In the route:

```javascript
router.post('/',
  file.multer.fields([
    { name: 'brochure', maxCount: 1 },
    { name: 'application', maxCount: 1 },
    { name: 'others', maxCount: 1 },
    { name: 'featureEN', maxCount: 1 },
    { name: 'featureTC', maxCount: 1 },
    { name: 'exclusionsEN', maxCount: 1 },
    { name: 'exclusionsTC', maxCount: 1 },
    { name: 'precautionsEN', maxCount: 1 },
    { name: 'precautionsTC', maxCount: 1 },
    { name: 'contentEN', maxCount: 1 },
    { name: 'contentTC', maxCount: 1 },
    { name: 'warningMessageEN', maxCount: 1 },
    { name: 'warningMessageTC', maxCount: 1 },
    { name: 'statementFileName', maxCount: 1 },
    { name: 'privacyFileName', maxCount: 1 }
  ]),
  file.sendFilesToGCS,
  function(req, res, next) {
    var data = req.body;
    // multer checks to see if there are files being passed in as part of the request.
    // If so, this function will loop through req.files, which is an object (String -> Array),
    // then append a new field to data, with its name set to the same key as in req.files
    // and its value set to the file's cloudStoragePublicUrl.
    if (req.files) {
      Object.keys(req.files).forEach(function(key) {
        if (req.files[key][0]) {
          data[key] = req.files[key][0].cloudStoragePublicUrl;
          console.log(data[key]);
        }
      });
    }
    models.Insurance.create({
      companyId: data.companyId,
      planNameEN: data.planNameEN.trim(),
      planNameTC: data.planNameTC.trim(),
      classEN: data.classEN.trim(),
      classTC: data.classTC.trim(),
      medicalCoverage: data.medicalCoverage.trim(),
      accidentalCoverage: data.accidentalCoverage.trim(),
      luggageCoverage: data.luggageCoverage.trim(),
      redirectLink: data.redirectLink.trim(),
      brokerEmail: data.brokerEmail.trim(),
      paypalSupport: data.paypalSupport.trim(),
      brochureFileName: data.brochure,
      applicationFormFileName: data.application,
      othersFileName: data.others,
      featureEN: data.featureEN,
      featureTC: data.featureTC,
      exclusionsEN: data.exclusionsEN,
      exclusionsTC: data.exclusionsTC,
      precautionsEN: data.precautionsEN,
      precautionsTC: data.precautionsTC,
      contentEN: data.contentEN,
      contentTC: data.contentTC,
      warningMessageEN: data.warningMessageEN,
      warningMessageTC: data.warningMessageTC,
      statementFileName: data.statementFileName,
      privacyFileName: data.privacyFileName
    }).then(function(insurance) {
      console.log('Successfully POST new insurance at: ' + new Date());
      return res.status(200).send(insurance.get({ plain: true }));
    }).catch(function(err) {
      return res.status(500).send(err);
    });
  }
);
```
```diff
- var stream = gcsFile.createWriteStream({ resumeable: false });
+ var stream = gcsFile.createWriteStream({ resumable: false });
```
FML. Thanks so much!
Haha, no problem. Glad we found the 🐛! |
Now I am getting a `{ [Error: ETIMEDOUT] code: 'ETIMEDOUT', connection: false }` error while trying to upload a file greater than 4 MB after resumable is set to false. Other files get uploaded successfully. Setting server.timeout in bin/www doesn't have any effect either. Can you please help?
I can't try to reproduce now, but I think a lot of this revolves around the number of files being uploaded at once.
In my tests so far, I'm not able to reproduce. Can you boil down a test case to the simplest possible components?
The server could be exhausted from trying to handle multiple simultaneous uploads. Is it possible to pull back on the number of files being uploaded at once?
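A minimal way to pull back on concurrency, with no library assumptions, is a small promise pool that keeps at most `limit` uploads in flight at a time (a sketch; `runLimited` is a hypothetical helper, not part of gcloud-node):

```javascript
// Sketch: run async task factories with at most `limit` in flight at a time.
// `tasks` is an array of functions that each return a promise.
function runLimited(tasks, limit) {
  var results = [];
  var index = 0;
  function worker() {
    if (index >= tasks.length) { return Promise.resolve(); }
    var i = index++;                 // claim the next unstarted task
    return tasks[i]().then(function (value) {
      results[i] = value;           // keep results in input order
      return worker();              // pick up the next unclaimed task
    });
  }
  var workers = [];
  for (var w = 0; w < Math.min(limit, tasks.length); w++) {
    workers.push(worker());
  }
  return Promise.all(workers).then(function () { return results; });
}
```

For example, `runLimited(reqFiles.map(makeUploadTask), 2)` would keep at most two GCS streams open at once (`makeUploadTask` being a hypothetical wrapper around the per-file upload).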
It may be a Windows-specific thing; I found this after more research on a similar issue. In terms of the timeout issue, I am able to upload files smaller than ~2.5 MB one by one. But as soon as I try to upload more than one file at a time, I start running into that issue with files larger than ~1500 KB. However, it runs perfectly fine if all files are below that limit. I have some files up to 8-10 MB that I need to upload for this application. Could you please help me confirm whether the "operation not permitted" issue is a Windows-specific thing, and whether the timeout issue happens only after I set `resumable = false`? This is the code I am using now: `function sendFilesToGCS(req, res, next) {`
I think you should post this over on StackOverflow. As a general question, maybe @jgeewax or @thobrla can help: any best practices for uploading multiple files at once?
If gcloud-node is using a single file to track the state of concurrent resumable uploads, it will be bound by lock contention for that file. Some possible options to get around this problem include:
The third option makes the most sense to me. |
Thanks! That will help for the first problem regarding writing to the same file multiple times. However, when resumable uploads are turned off, @JamesWangSaiBong is getting another error about connections timing out: #1154 (comment)
Does this sound like normal behavior from the Storage API? Should we advise to simply throttle multiple concurrent uploads? |
No, that is not normal behavior, and the connection timeouts are probably not a service-level issue. The next step there is to investigate whether the issue is specific to the gcloud-node client or this specific user. |
About the `resumable` option: https://cloud.google.com/nodejs/docs/reference/storage/2.3.x/Bucket#upload. Motivation: due to potential race conditions that parallel calls to this task can cause (e.g. build pipelines optimised to deploy assets in parallel and save some time), disabling `resumable` is a way to mitigate those, as we can see in this [issue report](googleapis/google-cloud-node#1154 (comment)).
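Per that reference, the same flag can also be passed to `Bucket#upload` directly. A hedged sketch (`uploadNonResumable` is a hypothetical wrapper; the bucket handle and file path are placeholders):

```javascript
// Sketch: passing the documented `resumable: false` option to Bucket#upload.
// `bucket` is assumed to be a @google-cloud/storage Bucket handle.
function uploadNonResumable(bucket, localPath, callback) {
  bucket.upload(localPath, { resumable: false }, callback);
}
```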
I'm simply trying to follow this tutorial on how to upload files to GCS with Node and Express, but the following error keeps causing my app to crash. Usually, I am able to upload one file without a problem on the first run, but I get this error after running a few requests in a row, even with different files. When I try to upload, say, 5 files at a time, the error crashes my app even on the first run. I see the process is trying to rename a file in the .config folder. Is that normal behavior? If so, is there a work-around?
Windows: v10.0.10586
Node: v4.3.1
Express: v4.13.1
```
C:\Users\James Wang\gi-cms-backend\node_modules\configstore\index.js:69
        throw err;
        ^

Error: EPERM: operation not permitted, rename 'C:\Users\James Wang\.config\configstore\gcs-resumable-upload.json.2873606827' -> 'C:\Users\James Wang\.config\configstore\gcs-resumable-upload.json'
    at Error (native)
    at Object.fs.renameSync (fs.js:681:18)
    at Function.writeFileSync [as sync] (C:\Users\James Wang\gi-cms-backend\node_modules\write-file-atomic\index.js:39:8)
    at Object.create.all.set (C:\Users\James Wang\gi-cms-backend\node_modules\configstore\index.js:62:21)
    at Object.Configstore.set (C:\Users\James Wang\gi-cms-backend\node_modules\configstore\index.js:93:11)
    at Upload.set (C:\Users\James Wang\gi-cms-backend\node_modules\gcs-resumable-upload\index.js:264:20)
    at C:\Users\James Wang\gi-cms-backend\node_modules\gcs-resumable-upload\index.js:60:14
    at C:\Users\James Wang\gi-cms-backend\node_modules\gcs-resumable-upload\index.js:103:5
    at Request._callback (C:\Users\James Wang\gi-cms-backend\node_modules\gcs-resumable-upload\index.js:230:7)
    at Request.self.callback (C:\Users\James Wang\gi-cms-backend\node_modules\request\request.js:199:22)
    at emitTwo (events.js:87:13)
    at Request.emit (events.js:172:7)
    at Request.<anonymous> (C:\Users\James Wang\gi-cms-backend\node_modules\request\request.js:1036:10)
    at emitOne (events.js:82:20)
    at Request.emit (events.js:169:7)
    at IncomingMessage.<anonymous> (C:\Users\James Wang\gi-cms-backend\node_modules\request\request.js:963:12)

[nodemon] app crashed - waiting for file changes before starting...
```