
pyllama/downloads returns empty folders #47

Open
flyjgh opened this issue Mar 22, 2023 · 34 comments
Labels
question Further information is requested

Comments

@flyjgh

flyjgh commented Mar 22, 2023

Hello, when running:

python3 -m llama.download

the command runs almost instantly but only creates empty folders named 7B, 13B, etc...
I also tried by specifying --model-size and --folder with the same result

@juncongmoo
Owner

Just updated the code base. Can you reinstall pyllama and try?

@flyjgh
Author

flyjgh commented Mar 23, 2023

I just reinstalled and I still got the same behavior

@lucascr91

Same problem on a Mac M1

@ivanstepanovftw

Same in Fedora

@lesurJ

lesurJ commented Mar 23, 2023

Same problem for me here (Intel Mac)

@shadowwalker2718

Do you have wget installed? You need to install wget, otherwise you will only get empty folders.
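For anyone unsure whether wget is actually visible to the downloader, here is a minimal PATH check; the check_cmd helper is purely illustrative and not part of pyllama:

```shell
# Report whether a command is available on PATH (illustrative helper)
check_cmd() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 found at $(command -v "$1")"
  else
    echo "$1 missing"
  fi
}

check_cmd sh    # present on any POSIX system
check_cmd wget  # the tool the downloader shells out to
```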

@lucascr91

I have wget installed and on my PATH. It doesn't work

@juncongmoo
Owner

I cannot reproduce the error. It always works for me.

[screenshot: successful download output]

@juncongmoo juncongmoo added the question Further information is requested label Mar 24, 2023
@juncongmoo
Owner

Try pip install hiq-python -U ?

@lesurJ

lesurJ commented Mar 24, 2023

I tried after installing wget and hiq-python, but it still does not work.

I do not get any warnings.

❯ python3 -m llama.download --model_size 7B
❤️ Resume download is supported. You can ctrl-c and rerun the program to resume the downloading
Downloading tokenizer...
✅ pyllama_data/tokenizer.model
✅ pyllama_data/tokenizer_checklist.chk
Downloading 7B
✅ pyllama_data/7B/params.json
✅ pyllama_data/7B/checklist.chk
Checking checksums

@sharlec

sharlec commented Mar 24, 2023

Same on Ubuntu; I can only get the tokenizer but not the model

@mldevorg
Collaborator

> same on ubuntu, I can only get the token but not the model

I am using Ubuntu and it works well for me, though.

@alexch33

Works well on Ubuntu; I pulled the latest code from the repo and reinstalled pyllama (pip install pyllama -U)

@txdywy

txdywy commented Mar 31, 2023

Works on Mac (Intel); empty folder on Mac (M1)

@llimllib
Contributor

llimllib commented Apr 4, 2023

wget is on my path, and hiq-python is up to date. m1 mac, python 3.10, pyllama v0.0.18:

$ python -m llama.download --model_size 7B --folder llama
❤️ Resume download is supported. You can ctrl-c and rerun the program to resume the downloading
Downloading tokenizer...
✅ llama/tokenizer.model
✅ llama/tokenizer_checklist.chk
Downloading 7B
✅ llama/7B/params.json
✅ llama/7B/checklist.chk
Checking checksums

$ ls llama/
7B/

$ du -sh llama/
  0B	llama/
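A quick way to tell a complete download apart from the empty-folder failure above, assuming the standard LLaMA file layout where the 7B folder should contain a multi-gigabyte consolidated.00.pth weight file (the helper and paths are just a sketch):

```shell
# Check that a weight file exists and is non-empty
check_weights() {
  # $1: expected weight file, e.g. llama/7B/consolidated.00.pth
  if [ -s "$1" ]; then
    echo "weights present"
  else
    echo "weights missing or empty"
  fi
}

check_weights "llama/7B/consolidated.00.pth"
```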

@llimllib
Contributor

llimllib commented Apr 4, 2023

Clearing pyllama out from my site-packages folder, cloning this repo, and running the same command as the previous comment in the root directory of this repository works as it is supposed to.

@michael-erasmus

michael-erasmus commented Apr 5, 2023

Can confirm what @llimllib said. I had to do:

pip uninstall pyllama
git clone https://github.com/juncongmoo/pyllama
pip install -e pyllama

After that it works for me.

I also had another issue with py_itree, as reported here. I think this is happening on Mac M1 machines.

This was fixed by uninstalling py_itree first, then installing it from source:

pip uninstall py_itree
pip install https://github.com/juncongmoo/itree/archive/refs/tags/v0.0.18.tar.gz

@anentropic

anentropic commented Apr 6, 2023

If you are on macos there is a problem with the download_community.sh script that is called from download.py:

  • it uses declare -A, which needs bash v4+, but macOS only ships bash 3.x
  • the default shell on recent macOS is zsh, so we can install a newer bash without breaking anything: just brew install bash
  • but download_community.sh has #!/bin/bash at the top, which points at the system bash instead of the Homebrew bash (bash --version now reports 5.2.15 for me, but it lives in a different location)
    • we can fix this by changing the first line of download_community.sh to #!/usr/bin/env bash (which should work for everyone, I think)
  • now we just need to brew install wget

After these steps I am now in the process of downloading the 7B checkpoints 🎉

I'm on an M1, Ventura 13.3
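The bash-version constraint described above can be checked up front. A minimal sketch using pure string logic (the helper name and version strings are illustrative, not from the script):

```shell
# declare -A needs bash 4+; macOS ships 3.2 as /bin/bash
supports_assoc_arrays() {
  # $1: a bash version string such as "3.2.57" or "5.2.15"
  case "${1%%.*}" in
    [4-9]|[1-9][0-9]) echo "ok" ;;
    *) echo "too old for declare -A" ;;
  esac
}

supports_assoc_arrays "3.2.57"   # the /bin/bash that macOS ships
supports_assoc_arrays "5.2.15"   # a Homebrew bash
```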

@llimllib
Contributor

llimllib commented Apr 7, 2023

Hopefully this is fixed following #70 and #71, let me know if you see any problems with the updated script!

@george-adams1
Contributor

@llimllib I'm getting the same issue on windows

@AstroWa3l

AstroWa3l commented Apr 13, 2023

On macOS running ARM/Apple Silicon (M1), the empty folder persists no matter which of the steps listed here or in the linked commits I try.

@Genie-Liu

Still encounter this problem after upgrading bash to v5.

@llimllib
Contributor

@Genie-Liu @AstroWa3l if you clone this repository and run llama/download_community.sh 7B /tmp/llama-models, hopefully you'll get the 7B model in /tmp/llama-models. It looks like there hasn't been a new release yet, so this issue will persist at least until then

@AstroWa3l

@llimllib Unfortunately it did not fix the issue

@llimllib
Contributor

@AstroWa3l can you elaborate? Were there errors? What was the output?

@anentropic

the script might benefit from having set -e at the top so it exits early instead of continuing after errors
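A tiny demonstration of the difference set -e makes, independent of the script itself:

```shell
# Without set -e the shell keeps going after a failing command...
bash -c 'false; echo "still running"'

# ...with set -e it stops at the first failure, so later steps never run
bash -c 'set -e; false; echo "this line never runs"' || echo "child exit status: $?"
```

The first child prints "still running"; the second exits with status 1 before reaching its echo.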

@llimllib
Contributor

@anentropic I do that with all my scripts, but I've never worked on this project before and wasn't sure how the script was being called, so I didn't add it

@AstroWa3l

AstroWa3l commented Apr 14, 2023

@llimllib I found one error after following your instructions: md5sum was missing from my system, so the checksum step could not run. After installing it, the download is working. Thank you!!!

@Divjyot

Divjyot commented Apr 15, 2023

@llimllib

✅ Worked on a Mac (2012), OS Catalina, x86_64 Intel chip

  1. Clone the repo:

pip uninstall pyllama
git clone https://github.com/juncongmoo/pyllama
pip install -e pyllama

  2. cd pyllama
  3. Run llama/download_community.sh 7B /tmp/llama-models

🔴
However, using the python3 -m llama.download --model_size 7B --folder llama/ command, it fails with a recursion error.

% pipenv run python3 -m llama.download --model_size 7B --folder llama/
Traceback (most recent call last):
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/Users/my_user/pyllama/llama/download.py", line 87, in <module>
    download(args)
  File "/Users/my_user/pyllama/llama/download.py", line 20, in download
    download(args)
  File "/Users/my_user/pyllama/llama/download.py", line 20, in download
    download(args)
  File "/Users/my_user/pyllama/llama/download.py", line 20, in download
    download(args)
  [Previous line repeated 985 more times]
  File "/Users/my_user/pyllama/llama/download.py", line 17, in download
    retcode = hiq.execute_cmd(cmd, verbose=False, shell=True, runtime_output=True, env=os.environ)
  File "/Users/my_user/.local/share/virtualenvs/my-env-cIchWPfI/lib/python3.9/site-packages/hiq/utils.py", line 101, in execute_cmd
    proc = subprocess.Popen(
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/subprocess.py", line 951, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/subprocess.py", line 1737, in _execute_child
    for k, v in env.items():
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/_collections_abc.py", line 851, in __iter__
    for key in self._mapping:
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/os.py", line 701, in __iter__
    yield self.decodekey(key)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/os.py", line 759, in decode
    return value.decode(encoding, 'surrogateescape')
RecursionError: maximum recursion depth exceeded while calling a Python object

> @Genie-Liu @AstroWa3l if you clone this repository and run llama/download_community.sh 7B /tmp/llama-models hopefully you'll get the 7B model in /tmp/llama-models. Looks like there hasn't been a new release yet, so this issue will persist at least until then
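The traceback suggests download.py retries by calling download() recursively, so a command that keeps failing (for example because md5sum is missing) eventually exhausts Python's recursion limit. An iterative, bounded retry avoids that failure mode; a hedged sketch of the idea (the retry helper is hypothetical, not pyllama code):

```shell
# Retry a command a bounded number of times instead of recursing forever
retry() {
  attempts=$1; shift
  i=1
  while [ "$i" -le "$attempts" ]; do
    if "$@"; then
      echo "succeeded on attempt $i"
      return 0
    fi
    i=$((i + 1))
  done
  echo "gave up after $attempts attempts"
  return 1
}

# A command that always fails stops cleanly after 3 tries
retry 3 false || echo "final status: failed (as expected)"
```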

@anentropic

Ah! I already had this via brew install coreutils

@CefBoud

CefBoud commented May 2, 2023

The script failed in the verify function because md5sum was not available on my Mac M1. brew install md5sha1sum fixed it.
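The md5sum gap can be probed up front the same way as the wget one. A sketch that falls back across common tool names (macOS ships md5; GNU coreutils provides md5sum or gmd5sum — the candidate list here is illustrative):

```shell
# Echo the first checksum tool found on PATH, or "none"
pick_md5_tool() {
  for tool in md5sum gmd5sum md5; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "$tool"
      return 0
    fi
  done
  echo "none"
}

echo "md5 tool: $(pick_md5_tool)"
```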

tristanvdb pushed a commit to tristanvdb/pyllama that referenced this issue May 9, 2023
@Huasito-Appel

@mldevorg Can you tell me all the steps that you took so that I can try to reproduce it? It doesn't work for me on Ubuntu

@GreysTone

I tried the approach from @CefBoud on my MacBook Air M2 (24 GB); it solves the download loop mentioned above

@yufanghui

> Can confirm what @llimllib said. I had to do:
>
> pip uninstall pyllama
> git clone https://github.com/juncongmoo/pyllama
> pip install -e pyllama
>
> After that it works for me.
>
> I also had another issue with py_itree, as reported here. I think this is happening on Mac M1 machines.
>
> This was fixed by uninstalling py_itree first, then installing it from source:
>
> pip uninstall py_itree
> pip install https://github.com/juncongmoo/itree/archive/refs/tags/v0.0.18.tar.gz

Works for me, thanks 👍
