I understand the scope of this project and am aware of the known limitations and my idea is not already on the roadmap.
Your Feature Request
Short Description (Metadata Sync)
Stop syncing metadata on a per-video basis. This means I want to exclude videos from syncing by my own decision.
It's already implemented for deactivated and outdated videos.
Please also give users a simple 'true/false' choice.
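As a rough sketch of the idea, the refresh task could check a per-video boolean before re-fetching metadata. Everything below (the `sync_enabled` field name, the function, the data shapes) is a hypothetical illustration, not actual Tube Archivist code:

```python
# Hypothetical sketch: skip videos the user manually excluded from metadata sync.
# The "sync_enabled" field name is an assumption, not a real TA field.

def videos_to_refresh(videos):
    """Return only videos whose metadata should still be synced."""
    return [v for v in videos if v.get("sync_enabled", True)]

videos = [
    {"youtube_id": "abc123", "sync_enabled": True},
    {"youtube_id": "def456", "sync_enabled": False},  # manually excluded
    {"youtube_id": "ghi789"},  # no flag set, defaults to syncing
]

print([v["youtube_id"] for v in videos_to_refresh(videos)])
```

Defaulting to `True` when the flag is missing would keep the current behavior for all existing videos.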
Short Description (Metadata Edit)
Choose the source of metadata (yt | custom/local)
Define read-only fields and the respective fields to sync
CRUD for metadata in the TA web UI (without Create)
Modern inline editing would be nice.
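Conceptually, the source selection plus a set of editable fields could be a small merge step per video. All names below are invented for illustration; this is a sketch of the idea, not TA's implementation:

```python
# Hypothetical: per-video metadata source selection with a whitelist of
# user-editable fields. Field and function names are assumptions.
EDITABLE_FIELDS = {"title", "description", "tags"}

def merge_metadata(yt_meta, local_meta, source="yt"):
    """Pick the metadata per video: "yt" keeps the synced data,
    "custom" overlays user edits, but only on editable fields."""
    if source == "yt":
        return dict(yt_meta)
    merged = dict(yt_meta)
    for field, value in local_meta.items():
        if field in EDITABLE_FIELDS:
            merged[field] = value
    return merged
```

Fields outside the whitelist stay read-only and keep whatever the last sync fetched.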
Additional context
Refreshing metadata is a good starting point to get a synced version of your favorites. On the other hand, as you may have experienced yourself, sometimes information is not present for a long time. And, imho, an archive should prevent this situation.
Is there any kind of history or changelog, something like that? What happens with the info.json after creating and indexing all the stuff? Could it be placed somewhere, e.g. as binary in some DB?
Thanks for your time!
Sorry if I went into too much detail, my real-life job rubs off sometimes 👍
Your help is needed!
Yes I can help with this feature request!
So far this project tries to represent the metadata from YouTube as closely as possible and as automated as possible. This would be quite a big change in how this project works. The current roadmap of pending features is long as it is.
sometimes information are not present long time.
Tube Archivist will keep the last known state, e.g. when a video goes away, you'll have the state of the metadata from your last refresh. So you won't lose anything.
What happens with the info.json
There never was an info.json file; all relevant metadata gets parsed directly and stored in ES. But this could potentially get dumped to a file. There is a good amount of non-relevant data in there, so I'm not sure how useful that would be. That's why you have a database, where you want to store and index all the important stuff so you can search for things. :-)
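If such a dump were added, it could be as simple as serializing the ES document source back to disk, minus the noise. A minimal sketch; the `RELEVANT_KEYS` list, the function name, and the file naming are assumptions for illustration, not TA code:

```python
import json
from pathlib import Path

# Hypothetical: write metadata stored in ES back out as an info.json-style
# file. Which keys count as "relevant" is an assumption.
RELEVANT_KEYS = {"youtube_id", "title", "description", "published", "channel"}

def dump_info_json(es_source, dest_dir="."):
    """Filter an ES document source down to relevant keys and dump it to
    <youtube_id>.info.json in dest_dir. Returns the written path."""
    meta = {k: v for k, v in es_source.items() if k in RELEVANT_KEYS}
    path = Path(dest_dir) / f"{meta['youtube_id']}.info.json"
    path.write_text(json.dumps(meta, indent=2))
    return path
```

The filtering step is where the "good amount of non-relevant data" mentioned above would get dropped.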
In any case, if you want to work on any of these things, please reach out on Discord!
Excluding videos programmatically is implemented for deactivated and outdated videos; a third, manual option should not reinvent the wheel.
I'll look into the code and will drop by on Discord soon.
Beyond the example above: conceptually, this is about enhancing/tweaking already implemented features. That's also the case with the custom metadata feature, as it's already implemented, named offline import.
At least, yes... the 'last state' of a video is safe, and yes, it's been archived. Check!
It's not my intention to change anything fundamental, but rather to confirm 'as close as possible and as automated as possible'.
I would love to have a safe way to archive all the changes along the way, from the initial video upload until it goes offline. (And of course in a language that I can read and understand.)
In case you've not seen it already, editing metadata is also related to translation concerns (sorry I didn't mention that at the beginning).