
Target for Azure Storage #134

Closed · ghost opened this issue Jan 8, 2015 · 4 comments
ghost commented Jan 8, 2015

Hello, we have our projects hosted on Azure. We are planning to build some hybrid Node.js utilities: Node.js with C++ add-ons and .NET via edge.

Is it possible to use Azure Storage as the host for binaries built by node-pre-gyp? If so, please share the steps if someone has successfully configured it.

It would be much appreciated if Azure Storage were added as an alternative service provider to AWS, with the how-to reflected in the docs.

springmeyer (Contributor) commented

Yes, this should be doable now without any changes to node-pre-gyp. The install command that runs when users install your binaries uses https and does not depend on AWS S3; that is the only critical piece. So just configure the host parameter to point to an HTTP-accessible location hosted on Azure and your users will be able to get your binaries.
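
For reference, a minimal sketch of what the binary stanza in package.json might look like with an Azure host. The account, container, and module names here are placeholders, not a tested setup:

{
  "name": "my-native-module",
  "version": "1.0.0",
  "binary": {
    "module_name": "my_native_module",
    "module_path": "./lib/binding/{node_abi}-{platform}-{arch}",
    "host": "https://myaccount.blob.core.windows.net/mybinaries",
    "remote_path": "./{name}/v{version}/",
    "package_name": "{node_abi}-{platform}-{arch}.tar.gz"
  }
}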

As for publishing the binaries: it should be as simple as running node-pre-gyp package locally and then manually pushing the tarball to your remote location. With a little bit of effort you can automate that yourself, because it is easy to interact with node-pre-gyp programmatically and write your own publish script. You can run node-pre-gyp reveal to get all the info you'd need to script uploading your binaries to a custom location on Azure. A workflow could look like this:

# build and package binary module
./node_modules/.bin/node-pre-gyp build package
# then validate the package
./node_modules/.bin/node-pre-gyp testpackage
# next grab the path to the tarball package
LOCAL_BINARY=$(./node_modules/.bin/node-pre-gyp reveal staged_tarball --silent)
REMOTE_BINARY=$(./node_modules/.bin/node-pre-gyp reveal hosted_tarball --silent)
# ... then use whatever azure upload tool you want to move the LOCAL_BINARY to the REMOTE_BINARY
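
As one concrete (untested) sketch of that last upload step using the Azure CLI, where the storage account "myaccount" and container "mybinaries" are hypothetical and must match the host configured in package.json:

# derive the blob name by stripping the container URL prefix from the
# remote location node-pre-gyp computed (account/container names are placeholders)
BLOB_NAME=${REMOTE_BINARY#https://myaccount.blob.core.windows.net/mybinaries/}
# upload the staged tarball so the install command can later fetch it over https
az storage blob upload \
  --account-name myaccount \
  --container-name mybinaries \
  --name "$BLOB_NAME" \
  --file "$LOCAL_BINARY"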

Let me know if I can make anything clearer.

ghost (Author) commented Jan 21, 2015

Thanks for the information, @springmeyer. We will try to follow your instructions.
Perhaps it is the Azure team's responsibility to add documentation for their customers. To that end, I have opened an issue in their documentation repo: MicrosoftDocs/azure-docs#2432.

Or maybe our company should buy an Amazon S3 bucket, since that is what is trending.

ghost closed this as completed on Jan 21, 2015

springmeyer (Contributor) commented

@jasonwilliams200OK - yes, I highly recommend using Amazon S3. You are likely looking at less than a few bucks a month to host a node-pre-gyp binary on S3.

ghost (Author) commented Jan 21, 2015

@springmeyer, I agree. Actually, it's not about the price; there is no real difference between AWS S3 and Azure Storage there (Azure is even a bit cheaper). It's about the services and community support. Ruby and Node.js packages are mostly Amazon-first, and since we are experimenting with these runtimes to gauge our options for the future, I think it's better to be a hipster (S3) than to wait for the other party (Azure) to grow large in a given community.
