Rebrand "SickGear PostProcessing script" to "SickGear Process Media extension".

Change improve the setup guide to use the detected NZBGet version to minimise the displayed text.
NZBGet versions prior to v17 are now told to upgrade as those versions are no longer supported - the code has actually exited on start up for some time, but the docs still described how to install.
Change comment out code and the unused option sg_base_path.
-----------------------
Change core system to improve performance and facilitate multi TV info sources.
Change migrate core objects TVShow and TVEpisode and everywhere these objects are used.
Add message to logs and disable ui backlog buttons when no media provider has active and/or scheduled searching enabled.
Change views for py3 compat.
Change set default runtime of 5 mins if none is given for layout Day by Day.
Add OpenSubtitles authentication support to config/Subtitles/Subtitles Plugin.
Add Apprise 0.8.0 (6aa52c3).
Add hachoir_py3 3.0a6 (5b9e05a).
Add sgmllib3k 1.0.0.
Update soupsieve 1.9.1 (24859cc) to soupsieve_py2 1.9.5 (6a38398).
Add soupsieve_py3 2.0.0.dev (69194a2).
Add Tornado_py3 Web Server 6.0.3 (ff985fe).
Add xmlrpclib_to 0.1.1 (c37db9e).
Remove ancient Growl lib 0.1.
Remove xmltodict library.
Change requirements.txt for Cheetah3 to minimum 3.2.4.
Change update sabToSickBeard.
Change update autoProcessTV.
Change remove Twitter notifier.
Update NZBGet Process Media extension, SickGear-NG 1.7 → 2.4.
Update Kodi addon 1.0.3 → 1.0.4.
Update ADBA for py3.
Update Beautiful Soup 4.8.0 (r526) to 4.8.1 (r531).
Update Send2Trash 1.3.0 (a568370) to 1.5.0 (66afce7).
Update soupsieve 1.9.1 (24859cc) to 1.9.5 (6a38398).
Change use GNTP (Growl Notification Transport Protocol) from Apprise.
Change add multi host support to Growl notifier.
Fix Growl notifier when using empty password.
Change update links for Growl notifications.
Change deprecate config/Notifications/Growl password field as these are now stored with the host setting.
Fix prevent infinite MemoryError from a particular jpg data structure.
Change subliminal for py3.
Change enzyme for py3.
Change browser_ua for py3.
Change feedparser for py3 (sgmllib is no longer available in the py3 standard library, so an ext lib was added).
Fix Guessit.
Fix parse_xml for py3.
Fix name parser with multi eps for py3.
Fix tvdb_api for py3 (search show).
Fix config/media process to only display "pattern is invalid" qtip on "Episode naming" tab if the associated field is actually visible. Also, if the field becomes hidden due to a setting change, hide any previously displayed qtip.
Note: Javascript getElementById (or $('tag[id="<name>"]')) is required when searching the DOM for an id because ":" is used in a show's id name.
Change download anidb xml files to main cache folder and use adba lib folder as a last resort.
Change create get anidb show groups as centralised helper func and consolidate dupe code.
Change move anidb related functions to newly renamed anime.py (from blacklistandwhitelist.py).
Change str encode hex no longer exists in py3, use codecs.encode(...) instead (sketch below).
Change fix b64decode on py3 returning bytestrings.
Change use binary read when downloading the log file via browser to prevent any encoding issues (sketch below).
Change add case insensitive ordering to anime black/whitelist.
Fix anime groups list not excluding whitelisted stuff.
Change add Windows utf8 fix ... see: ytdl-org/youtube-dl#820
Change if no qualities are wanted, exit manual search thread.
Fix keepalive for py3 process media.
Add .pyc cleaner if python version is switched (sketch below).
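
To illustrate the two py3 byte-handling notes above (hex encoding and b64decode), a minimal sketch - not SickGear code, the values are illustrative only:

    # str.encode('hex') is py2-only; on py3 use the codecs hex codec on bytes.
    # base64.b64decode always returns bytes on py3, so decode when text is needed.
    import base64
    import codecs

    raw = b'SickGear'

    hex_text = codecs.encode(raw, 'hex').decode('ascii')    # '5369636b47656172'
    b64_text = base64.b64encode(raw).decode('ascii')        # 'U2lja0dlYXI='
    round_trip = base64.b64decode(b64_text)                 # b'SickGear' -- bytes, not str

    print(hex_text, b64_text, round_trip.decode('utf-8'))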
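
The "binary read" change for log downloads can be pictured as below; a hedged sketch with an assumed file name and generator shape, not the actual SickGear view code:

    import io

    def read_log_for_download(path='sickgear.log', chunk_size=64 * 1024):
        # binary mode: bytes pass straight through, so odd encodings in log lines
        # can never raise UnicodeDecodeError while the browser download is served
        with io.open(path, 'rb') as log_file:
            while True:
                chunk = log_file.read(chunk_size)
                if not chunk:
                    break
                yield chunk  # hand raw bytes to the web framework response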
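
The .pyc cleaner works by comparing the interpreter's bytecode magic number against the value saved on the previous run; this is a condensed, rearranged sketch of the new _cleaner.py code in the diff below, with error handling trimmed:

    import io
    import os
    import sys

    if 2 == sys.version_info[0]:
        # noinspection PyDeprecation
        import imp
        magic_number = imp.get_magic().encode('hex').decode('utf-8')
    else:
        import importlib.util
        magic_number = importlib.util.MAGIC_NUMBER.hex()

    parent_dir = os.path.abspath(os.path.dirname(__file__))
    magic_file = os.path.join(parent_dir, '.python_magic.tmp')

    old_magic = ''
    if os.path.isfile(magic_file):
        with io.open(magic_file, 'r', encoding='utf-8') as mf:
            old_magic = mf.read()

    if old_magic != magic_number:
        # Python version (or at least its bytecode format) changed: drop stale caches
        for base in ('sickbeard', 'lib'):
            for dpath, _, fnames in os.walk(os.path.join(parent_dir, base)):
                for fn in fnames:
                    if os.path.splitext(fn)[-1].lower() in ('.pyc', '.pyo'):
                        try:
                            os.remove(os.path.join(dpath, fn))
                        except OSError:
                            pass
        with io.open(magic_file, 'w', encoding='utf-8') as mf:
            mf.write(magic_number)
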
SickGear authored and JackDandy committed Nov 4, 2019
1 parent 887bc18 commit 0afd851
Showing 1,780 changed files with 214,029 additions and 310,496 deletions.
51 changes: 48 additions & 3 deletions CHANGES.md
@@ -1,7 +1,50 @@
### 0.21.0 (2019-xx-xx xx:xx:xx UTC)

* Change core system to improve performance and facilitate multi TV info sources
* Add message to logs and disable ui backlog buttons when no media provider has active and/or scheduled searching enabled
* Change views for py3 compat
* Change set default runtime of 5 mins if none is given for layout Day by Day
* Change if no qualities are wanted, exit manual search thread
* Change add case insensitive ordering to anime black/whitelist
* Fix anime groups list not excluding whitelisted stuff
* Add OpenSubtitles authentication support to config/Subtitles/Subtitles Plugin
* Update NZBGet Process Media extension, SickGear-NG 1.7 to 2.4
* Update Kodi addon 1.0.3 to 1.0.4
* Change requirements.txt for Cheetah3 to minimum 3.2.4
* Change update SABnzbd sabToSickBeard
* Change update autoProcessTV
* Add Apprise 0.8.0 (6aa52c3)
* Change use GNTP (Growl Notification Transport Protocol) from Apprise
* Change add multi host support to Growl notifier
* Fix Growl notifier when using empty password
* Change update links for Growl notifications
* Change config/Notifications/Growl links and guidance
* Change deprecate config/Notifications/Growl password field as these are now stored with host setting
* Add hachoir_py3 3.0a6 (5b9e05a)
* Add sgmllib3k 1.0.0
* Update soupsieve 1.9.1 (24859cc) to soupsieve_py2 1.9.5 (6a38398)
* Add soupsieve_py3 2.0.0.dev (69194a2)
* Add Tornado_py3 Web Server 6.0.3 (ff985fe)
* Add xmlrpclib_to 0.1.1 (c37db9e)
* Remove ancient Growl lib 0.1
* Change remove Twitter notifier
* Remove redundant httplib2
* Remove redundant oauth2
* Fix prevent infinite memoryError from a particular jpg data structure
* Change browser_ua for py3
* Change feedparser for py3
* Change Subliminal for py3
* Change Enzyme for py3
* Fix Guessit
* Fix parse_xml for py3
* Fix name parser with multi eps for py3
* Fix tvdb_api fixes for py3 (search show)
* Fix config/media process to only display "pattern is invalid" qtip on "Episode naming" tab if the associated field is
actually visible. Also, if the field becomes hidden due to a setting change, hide any previously displayed qtip.
* Remove xmltodict library
* Update ADBA for py3
* Update attr 19.2.0.dev0 (154b4e5) to 19.2.0.dev0 (daf2bc8)
* Update Beautiful Soup 4.7.1 (r497) to 4.8.0 (r526)
* Update Beautiful Soup 4.7.1 (r497) to 4.8.1 (r531)
* Update bencode to 2.1.0 (e8290df)
* Update cachecontrol library 0.12.4 (bd94f7e) to 0.12.5 (007e8ca)
* Update Certifi 2019.03.09 (401100f) to 2019.06.16 (84dc766)
@@ -15,21 +58,23 @@
* Update MsgPack 0.6.1 (737f08a) to 0.6.1 (05ff11d)
* Update rarfile 3.0 (2704344) to 3.1 (1b14c85)
* Update Requests library 2.22.0 (0b6c110) to 2.22.0 (3d968ff)
* Update Send2Trash 1.3.0 (a568370) to 1.5.0 (66afce7)
* Update Six compatibility library 1.12.0 (8da94b8) to 1.12.0 (aa4e90b)
* Update soupsieve 1.9.1 (24859cc) to 2.0.0.dev (beeb4ab)
* Update tmdb_api to tmdbsimple 2.2.0 (ff17893)
* Update TZlocal 2.0.0.dev0 (b73a692) to 2.0.0b3 (410a838)
* Update unidecode module 1.0.22 (a5045ab) to 1.1.1 (632af82)
* Update urllib3 release 1.25.2 (49eea80) to 1.25.6 (4a6c288)
* Update xmltodict library 0.12.0 (f3ab7e1) to 0.12.0 (02c9b71)


[develop changelog]

* Update attr 19.2.0.dev0 (de84609) to 19.2.0.dev0 (154b4e5)
* Update Beautiful Soup 4.7.1 (r497) to 4.8.0 (r526)
* Update Requests library 2.22.0 (aeda65b) to 2.22.0 (0b6c110)
* Update urllib3 release 1.25.2 (49eea80) to 1.25.3 (3387b20)
* Update urllib3 release 1.25.3 (3387b20) to 1.25.3 (67715fd)
* Update urllib3 release 1.25.5 (edc3ddb) to 1.25.6 (4a6c288)
* Update xmltodict library 0.12.0 (f3ab7e1) to 0.12.0 (02c9b71)


### 0.20.5 (2019-10-18 00:01:00 UTC)
11 changes: 10 additions & 1 deletion HACKS.txt
@@ -1,7 +1,14 @@
Libs with customisations...

/lib/apprise
/lib/backports/configparser
/lib/browser_ua
/lib/bs4
/lib/dateutil/zoneinfo/__init__.py
/lib/dateutil/tz/tz.py
/lib/enzyme
/lib/feedparser
/lib/guessit
/lib/hachoir_core/config.py
/lib/hachoir_core/stream/input_helpers.py
/lib/hachoir_metadata/jpeg.py
@@ -12,7 +19,9 @@ Libs with customisations...
/lib/lockfile/mkdirlockfile.py
/lib/rtorrent
/lib/scandir/scandir.py
/lib/tmdb_api/tmdb_api.py
/lib/send2trash
/lib/subliminal
/lib/tmdbsimple
/lib/tornado
/lib/tvdb_api/tvdb_api.py
/lib/tzlocal/unix.py
165 changes: 83 additions & 82 deletions _cleaner.py
@@ -1,92 +1,93 @@
# remove this file when no longer needed

import os
import io
import shutil
import sys

parent_dir = os.path.abspath(os.path.dirname(__file__))
cleaned_file = os.path.abspath(os.path.join(parent_dir, '.cleaned004.tmp'))
test = os.path.abspath(os.path.join(parent_dir, 'lib', 'requests', 'packages'))
if not os.path.isfile(cleaned_file) or os.path.exists(test):
dead_dirs = [os.path.abspath(os.path.join(parent_dir, *d)) for d in [
('lib', 'requests', 'packages'),
('lib', 'pynma')
]]

for dirpath, dirnames, filenames in os.walk(parent_dir):
for dead_dir in filter(lambda x: x in dead_dirs, [os.path.abspath(os.path.join(dirpath, d)) for d in dirnames]):
try:
shutil.rmtree(dead_dir)
except (StandardError, Exception):
pass

for filename in [fn for fn in filenames if os.path.splitext(fn)[-1].lower() in ('.pyc', '.pyo')]:
try:
os.remove(os.path.abspath(os.path.join(dirpath, filename)))
except (StandardError, Exception):
pass
if 2 == sys.version_info[0]:
# noinspection PyDeprecation
import imp

with open(cleaned_file, 'wb') as fp:
fp.write('This file exists to prevent a rerun delete of *.pyc, *.pyo files')
fp.flush()
os.fsync(fp.fileno())
# noinspection PyDeprecation
magic_number = imp.get_magic().encode('hex').decode('utf-8')
else:
import importlib.util

cleaned_file = os.path.abspath(os.path.join(parent_dir, '.cleaned003.tmp'))
test = os.path.abspath(os.path.join(parent_dir, 'lib', 'imdb'))
if not os.path.isfile(cleaned_file) or os.path.exists(test):
dead_dirs = [os.path.abspath(os.path.join(parent_dir, *d)) for d in [
('lib', 'imdb'),
]]

for dirpath, dirnames, filenames in os.walk(parent_dir):
for dead_dir in filter(lambda x: x in dead_dirs, [os.path.abspath(os.path.join(dirpath, d)) for d in dirnames]):
try:
shutil.rmtree(dead_dir)
except (StandardError, Exception):
pass

for filename in [fn for fn in filenames if os.path.splitext(fn)[-1].lower() in ('.pyc', '.pyo')]:
try:
os.remove(os.path.abspath(os.path.join(dirpath, filename)))
except (StandardError, Exception):
pass
magic_number = importlib.util.MAGIC_NUMBER.hex()

with open(cleaned_file, 'wb') as fp:
fp.write('This file exists to prevent a rerun delete of *.pyc, *.pyo files')
fp.flush()
os.fsync(fp.fileno())
parent_dir = os.path.abspath(os.path.dirname(__file__))
magic_number_file = os.path.join(parent_dir, '.python_magic.tmp')
old_magic = ''
try:
if os.path.isfile(magic_number_file):
with io.open(magic_number_file, 'r', encoding='utf-8') as mf:
old_magic = mf.read()
except (BaseException, Exception):
pass

cleaned_file = os.path.abspath(os.path.join(parent_dir, '.cleaned002.tmp'))
test = os.path.abspath(os.path.join(parent_dir, 'lib', 'hachoir_core'))
if not os.path.isfile(cleaned_file) or os.path.exists(test):
dead_dirs = [os.path.abspath(os.path.join(parent_dir, *d)) for d in [
('.cleaned.tmp',),
('tornado',),
('lib', 'feedcache'),
('lib', 'hachoir_core'), ('lib', 'hachoir_metadata'), ('lib', 'hachoir_parser'),
('lib', 'jsonrpclib'),
('lib', 'shove'),
('lib', 'trakt'),
('lib', 'tvrage_api'),
('lib', 'unrar2')
]]

for dirpath, dirnames, filenames in os.walk(parent_dir):
for dead_dir in filter(lambda x: x in dead_dirs, [os.path.abspath(os.path.join(dirpath, d)) for d in dirnames]):
try:
shutil.rmtree(dead_dir)
except (StandardError, Exception):
pass

for filename in [fn for fn in filenames if os.path.splitext(fn)[-1].lower() in ('.pyc', '.pyo')]:
try:
os.remove(os.path.abspath(os.path.join(dirpath, filename)))
except (StandardError, Exception):
pass
if old_magic != magic_number:
# print('Python magic changed: removing all .pyc, .pyo files')
for pc in ['sickbeard', 'lib']:
search_dir = os.path.join(parent_dir, pc)
for dpath, dnames, fnames in os.walk(search_dir):
for filename in [fn for fn in fnames if os.path.splitext(fn)[-1].lower() in ('.pyc', '.pyo')]:
try:
os.remove(os.path.abspath(os.path.join(dpath, filename)))
except (BaseException, Exception):
pass

try:
with io.open(magic_number_file, 'w+') as mf:
mf.write(magic_number)
except (BaseException, Exception):
pass
# print('finished')

# skip cleaned005 as used during dev by testers
cleanups = [
['.cleaned006.tmp', ('lib', 'bs4', 'builder'), [
('lib', 'boto'), ('lib', 'bs4', 'builder'), ('lib', 'growl'),
('lib', 'hachoir', 'core'), ('lib', 'hachoir', 'field'), ('lib', 'hachoir', 'metadata'),
('lib', 'hachoir', 'parser', 'archive'), ('lib', 'hachoir', 'parser', 'audio'),
('lib', 'hachoir', 'parser', 'common'), ('lib', 'hachoir', 'parser', 'container'),
('lib', 'hachoir', 'parser', 'image'), ('lib', 'hachoir', 'parser', 'misc'),
('lib', 'hachoir', 'parser', 'network'), ('lib', 'hachoir', 'parser', 'program'),
('lib', 'hachoir', 'parser', 'video'), ('lib', 'hachoir', 'parser'), ('lib', 'hachoir', 'stream'),
('lib', 'httplib2'), ('lib', 'oauth2'), ('lib', 'pythontwitter'), ('lib', 'tmdb_api')]],
['.cleaned004.tmp', ('lib', 'requests', 'packages'), [
('lib', 'requests', 'packages'), ('lib', 'pynma')]],
['.cleaned003.tmp', ('lib', 'imdb'), [
('lib', 'imdb')]],
['.cleaned002.tmp', ('lib', 'hachoir_core'), [
('.cleaned.tmp',), ('tornado',),
('lib', 'feedcache'), ('lib', 'hachoir_core'), ('lib', 'hachoir_metadata'), ('lib', 'hachoir_parser'),
('lib', 'jsonrpclib'), ('lib', 'shove'), ('lib', 'trakt'), ('lib', 'tvrage_api'), ('lib', 'unrar2')]],
]
for cleaned_path, test_path, dir_list in cleanups:
cleaned_file = os.path.abspath(os.path.join(parent_dir, cleaned_path))
test = os.path.abspath(os.path.join(parent_dir, *test_path))

if not os.path.isfile(cleaned_file) or os.path.exists(test):
dead_dirs = [os.path.abspath(os.path.join(parent_dir, *d)) for d in dir_list]

for dpath, dnames, fnames in os.walk(parent_dir):
for dead_dir in filter(lambda x: x in dead_dirs, [os.path.abspath(os.path.join(dpath, d)) for d in dnames]):
try:
shutil.rmtree(dead_dir)
except (BaseException, Exception):
pass

with open(cleaned_file, 'wb') as fp:
fp.write('This file exists to prevent a rerun delete of *.pyc, *.pyo files')
fp.flush()
os.fsync(fp.fileno())
for filename in [fn for fn in fnames if os.path.splitext(fn)[-1].lower() in ('.pyc', '.pyo')]:
try:
os.remove(os.path.abspath(os.path.join(dpath, filename)))
except (BaseException, Exception):
pass

with open(cleaned_file, 'wb') as fp:
fp.write('This file exists to prevent a rerun delete of *.pyc, *.pyo files')
fp.flush()
os.fsync(fp.fileno())

cleaned_file = os.path.abspath(os.path.join(parent_dir, '.cleaned_html5lib.tmp'))
test = os.path.abspath(os.path.join(parent_dir, 'lib', 'html5lib', 'treebuilders', '_base.pyc'))
@@ -99,7 +100,7 @@
]]:
try:
shutil.rmtree(dead_path)
except (StandardError, Exception):
except (BaseException, Exception):
pass

for dead_file in [os.path.abspath(os.path.join(parent_dir, *d)) for d in [
@@ -119,7 +120,7 @@
if os.path.exists(name):
try:
os.remove(name)
except (StandardError, Exception):
except (BaseException, Exception):
bad_files += [name]
if any(bad_files):
swap_name = cleaned_file
@@ -137,5 +138,5 @@

try:
os.remove(danger_output)
except (StandardError, Exception):
except (BaseException, Exception):
pass
18 changes: 3 additions & 15 deletions autoProcessTV/SickGear-NG/INSTALL.txt
@@ -1,5 +1,5 @@
SickGear PostProcessing script for NZBGet
=========================================
SickGear Process Media extension for NZBGet
===========================================

If NZBGet v17+ is installed on the same system as SickGear then as a local install,

@@ -11,19 +11,7 @@ This is the best set up to automatically get script updates from SickGear

#############

If NZBGet v16 or earlier is installed, then as an older install,

1) Copy the directory with/or this single script file to path set in NZBGet Settings/PATHS/ScriptDir

2) Refresh the NZBGet page and navigate to Settings/SickGear-NG

3) Click View -> Compact to remove any tick and un hide tips and suggestions

4) The bare minimum change is the sg_base_path setting or enter `python -m pip install requests` at admin commandline

5) Navigate to any named TV category at Settings/Categories, click "Choose" Category.Extensions then Apply SickGear-NG

You will need to manually update your script with this set up
NZBGet version 16 and earlier are no longer supported, please upgrade.

#############
