
Support for Onlyfans #24465

Open
IJCS opened this issue Mar 24, 2020 · 2 comments
Labels
site-support-request Add extractor(s) for a new domain

Comments

IJCS commented Mar 24, 2020

Checklist

  • I'm reporting a new site support request
  • I've verified that I'm running youtube-dl version 2020.03.24
  • I've checked that all provided URLs are alive and playable in a browser
  • I've checked that none of the provided URLs violate any copyrights
  • I've searched the bugtracker for similar site support requests including closed ones

Example URLs

- Free post: https://onlyfans.com/16391013/killercleavage-free
- Free post: https://onlyfans.com/15584274/hoodedhofree
- Free post: https://onlyfans.com/13634727/lilianlacefree
- Paid post: https://onlyfans.com/16352300/entina_cat
- Free post: https://onlyfans.com/16384958/silvermoonfree

Description

I think it would be very helpful to add this site to the list of supported sites, especially for making mass backups of accounts before they disappear from the internet. The content is always protected behind login credentials.
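For anyone who wants to attempt this, here is a rough sketch of what a youtube-dl extractor skeleton for these post URLs could look like (it would live under youtube_dl/extractor/ and be registered in extractors.py). This is only an illustration: the class name, the URL regex, and the idea that the media URL can be scraped from a <source> tag in the page HTML are my assumptions, and a working extractor would almost certainly need to log in (e.g. via --username/--password or --cookies) and query OnlyFans' internal API instead.

```python
# coding: utf-8
# Hypothetical skeleton only -- not part of youtube-dl. A real extractor
# would need authentication; the HTML scraping below is a placeholder.
from __future__ import unicode_literals

from .common import InfoExtractor


class OnlyFansPostIE(InfoExtractor):
    # Matches post URLs like https://onlyfans.com/16391013/killercleavage-free
    _VALID_URL = r'https?://(?:www\.)?onlyfans\.com/(?P<id>\d+)/[^/?#]+'
    _TESTS = [{
        'url': 'https://onlyfans.com/16391013/killercleavage-free',
        'only_matching': True,
    }]

    def _real_extract(self, url):
        post_id = self._match_id(url)
        webpage = self._download_webpage(url, post_id)
        title = self._og_search_title(webpage, default=None) or post_id
        # Placeholder: assumes the media URL is embedded in a <source> tag,
        # which is unlikely to hold for credential-protected posts.
        video_url = self._html_search_regex(
            r'<source[^>]+src=["\'](?P<url>[^"\']+)', webpage,
            'video URL', group='url')
        return {
            'id': post_id,
            'title': title,
            'url': video_url,
        }
```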

IJCS added the site-support-request label Mar 24, 2020
@interbiznw

> You do realize that a lot of content on OF is copyrighted and that they (the creators) ask people not to download their content?

Yes, as is content on other sites youtube-dl can download from. That doesn't automatically mean it will be used for purposes against the terms of use. As the OP said, it can be used for mass backups/archival of videos the user has already uploaded. People lose their originals all the time, and this would be a good way to retrieve a copy at the highest quality available.

Not to mention, this can clearly already be done by scraping the URLs individually in Chrome DevTools, like on any other site.
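For completeness, the manual approach described above boils down to something like this: find the direct media URL in the Network tab of Chrome DevTools while logged in, copy it along with your session cookie, and download it yourself. The URL and cookie name below are placeholders copied out of the browser by hand, not real endpoints.

```python
# Rough illustration of the manual DevTools workflow; values are placeholders.
import shutil

import requests

MEDIA_URL = 'https://cdn.example.com/path/to/video.mp4'  # copied from the Network tab
COOKIES = {'sess': 'YOUR_SESSION_COOKIE'}  # copied from the logged-in browser

# Stream the file to disk so large videos don't have to fit in memory.
with requests.get(MEDIA_URL, cookies=COOKIES, stream=True, timeout=30) as r:
    r.raise_for_status()
    with open('video.mp4', 'wb') as f:
        shutil.copyfileobj(r.raw, f)
```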

@laichiaheng

Any news?
