
Is littles3 use possible #47

Closed
digitalkram opened this issue Jun 30, 2014 · 7 comments

@digitalkram

Hi there,

Is it possible to use littles3 (https://code.google.com/p/littles3/) instead of AWS S3, or are there any major obstacles that make this impossible?

Thanks and Cheers,
Thilo

@tf
Member

tf commented Jul 1, 2014

There are two components to consider:

  • Pageflow uses Paperclip to store its files. Any storage backend supported by Paperclip should theoretically work with Pageflow. There would not even be a need for a fake AWS endpoint; simply using the filesystem adapter could do it (see the sketch after this list).
  • And there is Zencoder, which needs to fetch its input files from somewhere and put its output files somewhere. While Zencoder seems to allow inputs other than S3, Pageflow at the moment is not capable of submitting Zencoder jobs with files coming from locations other than S3. To use littles3, there would have to be a way to tell Zencoder about the alternative AWS API endpoint (s3:// URLs start with the bucket name). Looking at the Zencoder docs, I do not see such an option though.
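For the Paperclip part, a minimal sketch of what switching to filesystem storage could look like, assuming you override Paperclip's defaults in an initializer (the file name and path pattern below are illustrative, not Pageflow's actual settings):

    # config/initializers/paperclip_defaults.rb -- hypothetical file name
    #
    # Switch Paperclip's default storage from :s3 to :filesystem so that
    # attachments end up under public/system instead of in a bucket.
    Paperclip::Attachment.default_options.merge!(
      storage: :filesystem,
      path: ':rails_root/public/system/:class/:attachment/:id_partition/:style/:filename',
      url: '/system/:class/:attachment/:id_partition/:style/:filename'
    )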

@digitalkram
Author

Hi,

Thanks for your reply. I already suspected that the Zencoder part would be the deal breaker when I wrote the question.

Are there any plans to make Pageflow support (any) other protocol that is offered by Zencoder, SFTP for example? That would enable people to use storage that charges only for storage, not for traffic (unless hosted on your own), which makes the cost much more predictable, since you never know how popular a story will become, especially if you are not doing this for business purposes.

Is there maybe another possibility? Something like:

  1. Create the story on a "staging" environment/instance, making use of the normal S3 feature.
  2. Replicate the output bucket's content to a "local" storage.
  3. Reconfigure that instance to use the locally stored files when serving the story, or
  4. Replicate the story into a "production" environment and switch the story's file locations to the local storage during replication.

This way, S3 would be involved only during story creation but not while serving the stories to hundreds of viewers. That would also be quite nice and worth the effort (which could potentially be scripted), if it is possible at all.
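Just to illustrate step 2, something like the following could mirror the output bucket onto local disk using the aws-sdk gem's resource interface (bucket name, region and target directory are made up):

    require 'aws-sdk'  # or 'aws-sdk-s3' with the newer, modular SDK
    require 'fileutils'

    s3 = Aws::S3::Resource.new(region: 'eu-west-1')

    # Walk the output bucket and copy every object onto the local disk,
    # preserving the key structure below the target directory.
    s3.bucket('my-pageflow-output-bucket').objects.each do |object|
      target = File.join('/var/www/pageflow-files', object.key)
      FileUtils.mkdir_p(File.dirname(target))
      object.get(response_target: target)
    end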

Thanks for your inputs.

Cheers,

@tf
Member

tf commented Jul 1, 2014

As it stands, Pageflow supports Zencoder SFTP outputs in addition to S3. Still, we had some bad experiences with Zencoder's SFTP adapter failing to copy files while reporting success via the API. I'll try to get around to editing the Zencoder Options wiki page, though. That would deal with video/audio.
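For illustration, submitting such a job directly via the zencoder gem looks roughly like this; the URLs and credentials are placeholders, and Pageflow assembles the real job payload itself:

    require 'zencoder'

    Zencoder.api_key = ENV['ZENCODER_API_KEY']

    # Submit a job that reads the source from S3 and writes the encoded
    # file to an SFTP server instead of an output bucket.
    Zencoder::Job.create(
      input: 's3://my-input-bucket/videos/source.mp4',
      outputs: [
        {
          label: 'web',
          url: 'sftp://encoder@media.example.com/var/www/videos/source-web.mp4'
        }
      ]
    )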

Full replication of stories can surely be implemented somehow, but I suspect it would be quite complex, and it is definitely out of scope for us. Maybe a custom content delivery setup proxying S3 would be easier to get working.

@digitalkram
Author

The wiki update on using SFTP (just to tinker around with it a bit and see whether we face the same problems as you did) would be very appreciated! Thanks a lot in advance!

@tf
Member

tf commented Jul 4, 2014

Here's the wiki page on using SFTP. If you also experience the Zencoder hiccups we ran into, we might want to add a warning note to the wiki page.

@tf tf closed this as completed Jul 4, 2014
@digitalkram
Author

Hi tf,

Thanks for the wiki update. I configured my playground instance accordingly and will report back here in case I run into hiccups similar to yours. What works now (tested only with a small/fast file) is that the files are put onto my server via SFTP, and I serve the encoded contents myself instead of pointing to the output bucket. I am fine with S3 still being used in addition, since that is a one-time thing and not something that happens for every page visitor.

So that covers the audio/video part. For the images/Paperclip part: is it possible to use paperclip-sftp so that I can also serve the other bucket's content myself?
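Something along these lines is what I have in mind; the sftp_options keys below are my guess from the paperclip-sftp README, and I have not verified them against Pageflow:

    # Gemfile
    gem 'paperclip-sftp'

    # config/initializers/paperclip_defaults.rb -- hypothetical, as above
    Paperclip::Attachment.default_options.merge!(
      storage: :sftp,
      sftp_options: {
        host: 'media.example.com',
        user: 'pageflow',
        password: ENV['SFTP_PASSWORD']
      }
    )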

Thanks and Cheers,

@tf
Member

tf commented Jul 8, 2014

FYI, in #54 I've opened a separate feature issue that begins to outline the steps necessary to allow non-S3 storage for ImageFiles.

Cheers
