AWS S3 plugin also works with Google Cloud Storage (direct uploading) #460
Not that I'm aware of, at least! We're not working on it at Transloadit. Would you like to implement a plugin for that, perhaps? :D
Hi! We are working on docs on how to make plugins, and since Uppy is flexible like that, it should be fairly easy to implement. If you'd like to give it a try, we are here to help.
Ok guys, we've implemented the URL signing for Google. It seems the AwsS3 plugin works perfectly with Google's resumable uploads. Maybe rename the plugin, or add this to the documentation?
Thank you! Added this to the todo list, we'll think about naming 👌 Could you send some of your usage/signing code our way? If there's something we could use in the docs, for example. Thanks!
I don't think we need to rename it, since Google Cloud Storage probably just copied the API from S3 to make it easier to switch between the two. Adding an example to the docs seems great, though. We could do one for DigitalOcean's new object storage product too, which also mimics S3's API: https://www.digitalocean.com/products/spaces/
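For illustration, here is a minimal sketch of the kind of signing described above, assuming GCS interoperability (HMAC) credentials and the plain aws-sdk v2 client; the bucket name and the `GCS_INTEROP_KEY`/`GCS_INTEROP_SECRET` env variable names are placeholders, not anything Uppy defines:

```js
// Sketch: presigning a PUT URL for GCS through its S3-compatible
// ("interoperability") XML API, using the AWS SDK v2 client.
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
  // Swap in e.g. https://nyc3.digitaloceanspaces.com for DigitalOcean Spaces.
  endpoint: 'https://storage.googleapis.com',
  accessKeyId: process.env.GCS_INTEROP_KEY, // hypothetical env var names
  secretAccessKey: process.env.GCS_INTEROP_SECRET,
  signatureVersion: 'v2', // GCS interoperability historically accepted v2 signatures
});

// Returns a URL that an AwsS3 getUploadParameters() handler could hand
// back as { method: 'PUT', url } for a direct browser upload.
function signUpload(filename, contentType) {
  return s3.getSignedUrl('putObject', {
    Bucket: 'my-bucket', // placeholder
    Key: filename,
    ContentType: contentType,
    Expires: 60 * 60, // URL validity in seconds
  });
}
```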
@ogtfaber, any news on how you did this with Google?
@ogtfaber I have a problem with the signature:
@johnunclesam Did you find any solution to this signature problem?
I've been banging my head against the wall for about a week trying to make Uppy and uppy-server work with Google Cloud Storage, and it won't. The only way I've managed it is to do a native POST upload with my own custom signing server. Even this breaks with the AwsS3 plugin, because GCS returns the wrong content type. I'm really trying not to have to write a whole bunch of custom code to enable large uploads on the front end. My platform is all based on GCS, so if anyone has gotten uppy-server to work via interoperability with Google Cloud Storage, please share the steps. Note that I've already done the CORS and interoperability steps when messing around with FineUploader.

Here's what I'm seeing in my most recent attempts with the AwsS3 plugin + uppy-server + GCS interoperability:

Request URL: https://storage.googleapis.com/_[BUCKET_NAME]_

Request body:

Response (from GCS):

It doesn't look like GCS has any idea about the AWS-style POST form. Here's what my env config looks like for uppy-server:

Here's how Uppy is instantiated in React code:

Not trying to do anything over-the-top crazy here, just trying to get the basics working with GCS. What am I missing? ~j
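For anyone landing here later: modern Companion (the successor to uppy-server) documents an S3 endpoint override for exactly this kind of setup; I'm not certain the uppy-server of that era had an equivalent, which may be part of why these attempts failed. A sketch of the relevant environment variables, with placeholder values, and with the exact `COMPANION_AWS_*` names taken from the Companion docs as best I recall them:

```
# Sketch: pointing Companion's S3 provider at GCS's S3-compatible endpoint.
# The key/secret are GCS interoperability HMAC keys, not a service account.
COMPANION_AWS_KEY=GOOG1EXAMPLEKEY
COMPANION_AWS_SECRET=example-secret
COMPANION_AWS_BUCKET=my-bucket
COMPANION_AWS_REGION=us-east-1   # largely ignored by GCS, but required by the SDK
COMPANION_AWS_ENDPOINT=https://storage.googleapis.com
```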
@jimyaghi did you ever get this working?
No, I didn't manage it unfortunately, and I switched to another library, which also gave me trouble, though I think I got that one working. It's been a while, so I can't remember what the other library was. It looks like support for GCS in upload libraries relies very much on its ability to emulate S3, S3 being the more popular API.
Got this to work! I needed to add the following CORS configuration to my GCS bucket:

```json
[
  {
    "origin": ["https://localhost:5000"],
    "method": ["GET", "PUT"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3000
  },
  {
    "origin": ["*"],
    "method": ["GET"],
    "maxAgeSeconds": 3000
  }
]
```

I'm currently signing my own URLs with a "custom" companion (below), but I will try this again with the AWS companion and see if it works...

**uppy-companion-google.js**

```js
import { Storage } from '@google-cloud/storage';

// Check for required env variables
if (!process.env.GOOGLE_APPLICATION_CREDENTIALS) {
  throw new Error(
    'Missing Google Cloud credentials, please set the GOOGLE_APPLICATION_CREDENTIALS environment variable to your credentials.json location'
  );
}
if (!process.env.COMPANION_GOOGLE_BUCKET) {
  throw new Error(
    'Missing bucket, please set the COMPANION_GOOGLE_BUCKET environment variable'
  );
}

// Create new storage client
const storage = new Storage();

// Express middleware to return a signed url.
// Note: `options` is a plain object (not a rest parameter) so that
// spreading it into the config below merges keys rather than array indices.
const getSignedUrl = (bucket, options = {}) => ({ body }, res) => {
  // Get bucket reference from env variable
  const myBucket = storage.bucket(
    bucket || process.env.COMPANION_GOOGLE_BUCKET
  );
  // Get file reference
  const file = myBucket.file(body.filename);
  // Merge config with defaults
  const config = {
    action: 'write',
    contentType: body.contentType,
    expires: Date.now() + 1000 * 60 * 60, // 1 hour from now
    ...options,
  };
  //-
  // Generate a URL to allow write permissions. This means anyone with this
  // URL can send a PUT request with new data that will overwrite the file.
  //-
  file.getSignedUrl(config).then(function (data) {
    res.json({
      method: 'put',
      url: data[0],
      fields: {},
      headers: { 'content-type': body.contentType },
    });
  });
};

export { getSignedUrl };
```

**express app**

```js
const { getSignedUrl } = require('./uppy-companion-google');
// ...
app.use('/getSignedUrl', cors(), bodyParser.json(), getSignedUrl());
// ...
```

**react app**

```js
// ...
this.uppy.use(AwsS3, {
  limit: 1,
  timeout: 1000 * 60 * 60,
  getUploadParameters(file) {
    // Send a request to our signing endpoint.
    return fetch(process.env.REACT_APP_GRAPHQL_ENDPOINT + '/getSignedUrl', {
      method: 'post',
      // Send and receive JSON.
      headers: {
        accept: 'application/json',
        'content-type': 'application/json',
      },
      body: JSON.stringify({
        filename: file.name,
        contentType: file.type,
      }),
    }).then(response => response.json());
  },
});
// ...
```
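As an aside: if you'd rather apply that CORS configuration from code instead of via `gsutil cors set cors.json gs://my-bucket`, the Node client has a bucket-level setter. A minimal sketch, assuming the same `COMPANION_GOOGLE_BUCKET` env variable as above (`setCorsConfiguration` exists in recent @google-cloud/storage releases; treat its availability in older versions as an assumption):

```js
// Sketch: applying the CORS rules above programmatically with the
// same @google-cloud/storage client used by uppy-companion-google.js.
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();

storage
  .bucket(process.env.COMPANION_GOOGLE_BUCKET)
  .setCorsConfiguration([
    {
      origin: ['https://localhost:5000'],
      method: ['GET', 'PUT'],
      responseHeader: ['Content-Type'],
      maxAgeSeconds: 3000,
    },
    { origin: ['*'], method: ['GET'], maxAgeSeconds: 3000 },
  ])
  .then(() => console.log('CORS configuration applied'))
  .catch(console.error);
```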
Thank you @danielmahon
@danielmahon we are on GCP. Do resumable uploads to GCS also work? And we also need to upload the data via a proxy (https_proxy/http_proxy); does that work?
@DanielMohan Can you please confirm whether you were referring to GCS multipart upload, rather than uploading to GCS in one shot? If it's GCS multipart upload, I will go ahead and give it a try, as I have a requirement to upload files up to 10 GB in size through the browser directly to GCS.
@rajivchodisetti I haven't personally used it with resumable uploads yet, so I could be wrong, but I don't think it would be a problem, as the Google Cloud client supports it; you would just need to make sure to set up Uppy properly as well. You could probably also use the Tus version. I am currently using the AWS plugin for image/media uploads to Google, and the Tus plugin for video uploads to Vimeo.
Since it's been reported to work, I'll close this issue. Feel free to re-open, however!
Thanks mate @danielmahon
Public names like `AwsS3` need to be designed for users (and by "users" I mean the developers who code with Uppy). Will the user know anything about these technical details when they read the name for the very first time? Probably not. Technical details like this are therefore nothing more than distractions and red herrings. From the user's point of view, if something has a name that communicates "this is for thing A", then the user can and should assume that it is not for thing B, even if this assumption turns out to be wrong. Really, if it works for both A and B, then the name should communicate "this is for both thing A and thing B". With this said, the name should be changed. When it comes to design problems, we should think like designers, which requires empathy and emotional intelligence. We should definitely not be thinking like engineers, even if that's what we all are.
Is there any reference on how to do a multipart upload to GCS?
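I'm not aware of the S3-style multipart API being available on GCS's interoperability endpoint at the time of this thread; GCS's closest native analog is its resumable upload protocol. A rough sketch of opening a resumable session against the XML API, assuming a URL that was signed with the `x-goog-resumable: start` header included (the helper name and parameters are illustrative, not from any Uppy API):

```js
// Sketch: starting a GCS resumable upload session via the XML API.
// GCS answers the POST with a session URI in the Location header;
// subsequent PUT requests to that URI upload the file in chunks.
// (For browser use, CORS must expose the Location response header.)
async function startResumableSession(signedUrl, contentType) {
  const res = await fetch(signedUrl, {
    method: 'POST',
    headers: {
      'x-goog-resumable': 'start',
      'content-type': contentType,
    },
  });
  if (!res.ok) throw new Error(`Failed to start session: ${res.status}`);
  return res.headers.get('location'); // the resumable session URI
}
```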