This repository has been archived by the owner on Apr 26, 2021. It is now read-only.

Feasibility of merging the experimental branch into master. #21

Closed
DhanrajHira opened this issue Jul 6, 2020 · 16 comments · Fixed by #24
Labels
discussion Discussing ideas and features

Comments

@DhanrajHira
Collaborator

Should we prioritize the merging of the experimental branch into master? The experimental branch is very far ahead of the master branch. I think we should focus on testing before we move on to adding new features. What do you think?

Should we start a new project on this?

@DhanrajHira DhanrajHira added the discussion Discussing ideas and features label Jul 6, 2020
@notmarek
Owner

notmarek commented Jul 6, 2020

I think we should merge the experimental branch into master.

@notmarek
Owner

notmarek commented Jul 7, 2020

I think we are ready to merge now, what do you think?

@DhanrajHira
Collaborator Author

Downloads are still broken for me.

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Python\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "C:\Python\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "C:\Python\lib\multiprocessing\spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "C:\Python\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "C:\Python\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "C:\Python\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "C:\Python\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "E:\Active Projectcs\test.py", line 7, in <module>
    client.get_anime(2506).get_episodes().get_episode_by_number(1).download('hd', multi_threading = True)
  File "E:\Active Projectcs\Sakurajima\Sakurajima\models\base_models.py", line 373, in download
    p.start()
  File "C:\Python\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)
  File "C:\Python\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Python\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "C:\Python\lib\multiprocessing\popen_spawn_win32.py", line 46, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "C:\Python\lib\multiprocessing\spawn.py", line 143, in get_preparation_data
    _check_not_importing_main()
  File "C:\Python\lib\multiprocessing\spawn.py", line 136, in _check_not_importing_main
    is not going to be frozen to produce an executable.''')
RuntimeError:
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.

Traceback (most recent call last):
  File "E:\Active Projectcs\test.py", line 7, in <module>
    client.get_anime(2506).get_episodes().get_episode_by_number(1).download('hd', multi_threading = True)
  File "E:\Active Projectcs\Sakurajima\Sakurajima\models\base_models.py", line 373, in download
    p.start()
  File "C:\Python\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)
  File "C:\Python\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Python\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "C:\Python\lib\multiprocessing\popen_spawn_win32.py", line 89, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Python\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
BrokenPipeError: [Errno 32] Broken pipe

@notmarek
Owner

notmarek commented Jul 7, 2020

Downloads are still broken for me.

In the file you are executing, you need to wrap the top-level code in an if __name__ == "__main__": block. This is caused by the multiprocessing module re-importing the startup script when it spawns child processes.
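A minimal, self-contained sketch of that guard idiom (a generic doubling worker for illustration, not the Sakurajima downloader):

```python
import multiprocessing

def work(x, queue):
    # Runs in the child process.
    queue.put(x * 2)

def main():
    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=work, args=(21, queue))
    p.start()
    p.join()
    return queue.get()

# Without this guard, the "spawn" start method (the default on
# Windows) re-imports this file in the child process, which would
# try to start another Process and raise the RuntimeError shown
# in the traceback above.
if __name__ == "__main__":
    print(main())  # prints 42
```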

@DhanrajHira
Collaborator Author

I added that to base_models.py and was wondering why it didn't fix the issue. Haha, turns out you have to add it to the script you are executing. Thanks, it works now.

@notmarek
Owner

notmarek commented Jul 7, 2020

Should I merge the branches?

@DhanrajHira
Collaborator Author

Another error coming your way:

[2020-07-07 16:38:03.269098] Started download.
Process Process-3:
Traceback (most recent call last):
  File "C:\Python\lib\site-packages\urllib3\response.py", line 685, in _update_chunk_length
    self.chunk_left = int(line, 16)
ValueError: invalid literal for int() with base 16: b''

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Python\lib\site-packages\urllib3\response.py", line 425, in _error_catcher
    yield
  File "C:\Python\lib\site-packages\urllib3\response.py", line 752, in read_chunked
    self._update_chunk_length()
  File "C:\Python\lib\site-packages\urllib3\response.py", line 689, in _update_chunk_length
    raise httplib.IncompleteRead(line)
http.client.IncompleteRead: IncompleteRead(0 bytes read)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Python\lib\site-packages\requests\models.py", line 750, in generate
    for chunk in self.raw.stream(chunk_size, decode_content=True):
  File "C:\Python\lib\site-packages\urllib3\response.py", line 560, in stream
    for line in self.read_chunked(amt, decode_content=decode_content):
  File "C:\Python\lib\site-packages\urllib3\response.py", line 781, in read_chunked
    self._original_response.close()
  File "C:\Python\lib\contextlib.py", line 130, in __exit__
    self.gen.throw(type, value, traceback)
  File "C:\Python\lib\site-packages\urllib3\response.py", line 443, in _error_catcher
    raise ProtocolError("Connection broken: %r" % e, e)
urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(0 bytes read)', IncompleteRead(0 bytes read))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Python\lib\multiprocessing\process.py", line 297, in _bootstrap
    self.run()
  File "C:\Python\lib\multiprocessing\process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "E:\Active Projectcs\Sakurajima\Sakurajima\models\base_models.py", line 317, in download_chunk
    res = requests.get(segment["uri"], cookies=self.__cookies, headers=headers)
  File "C:\Python\lib\site-packages\requests\api.py", line 75, in get
    return request('get', url, params=params, **kwargs)
  File "C:\Python\lib\site-packages\requests\api.py", line 60, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Python\lib\site-packages\requests\sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Python\lib\site-packages\requests\sessions.py", line 686, in send
    r.content
  File "C:\Python\lib\site-packages\requests\models.py", line 828, in content
    self._content = b''.join(self.iter_content(CONTENT_CHUNK_SIZE)) or b''
  File "C:\Python\lib\site-packages\requests\models.py", line 753, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(0 bytes read)', IncompleteRead(0 bytes read))

@notmarek notmarek linked a pull request Jul 7, 2020 that will close this issue
@notmarek
Owner

notmarek commented Jul 7, 2020

requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(0 bytes read)', IncompleteRead(0 bytes read))

Seems like your connection got cut while downloading; that never happened in my testing.
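For flaky connections like this, the per-segment request could be wrapped in a simple retry helper. This is only a sketch under stated assumptions: fetch_segment is a hypothetical name, not part of Sakurajima's API, and the session parameter exists purely so the retry logic can be exercised without a network.

```python
import requests

def fetch_segment(url, retries=3, session=None, **kwargs):
    # Hypothetical helper (not in Sakurajima): retry transient
    # connection drops such as the ChunkedEncodingError above.
    get = (session or requests).get
    last_error = None
    for _ in range(retries):
        try:
            response = get(url, timeout=30, **kwargs)
            response.raise_for_status()
            return response.content
        except (requests.exceptions.ChunkedEncodingError,
                requests.exceptions.ConnectionError) as error:
            last_error = error
    # All attempts failed; surface the last transient error.
    raise last_error
```

A drop like IncompleteRead(0 bytes read) usually means the server or a middlebox closed the connection mid-chunk, so a bounded retry is often enough to get the segment through.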

@DhanrajHira
Collaborator Author

I was talking about the first one (the ValueError). Do you reckon that was a connection issue too? To be fair, my internet is not too good.

@notmarek
Owner

notmarek commented Jul 7, 2020

I was talking about the first (ValueError) do you reckon that that was a connection issue too? To be fair my internet is not too good.

Pretty sure, yes.

@notmarek
Owner

notmarek commented Jul 7, 2020

Could you tell me what you were downloading?

@DhanrajHira
Collaborator Author

Anime ID : 2506 (The God of Highschool)
Episode : 1
Quality : HD

@DhanrajHira
Collaborator Author

I tried "ld" quality and the download completed without issues, but the resulting .mp4 file is unplayable.

@DhanrajHira
Collaborator Author

I'll open an issue regarding this.

@notmarek
Owner

notmarek commented Jul 7, 2020

This is very weird; I had no issues downloading this:

Anime ID : 2506 (The God of Highschool)
Episode : 1
Quality : HD

@DhanrajHira
Collaborator Author

With issue #25 solved, I think the experimental branch is ready to be merged.
