remove HF token from build log #253
Conversation
@yutianchen666 https://github.com/intel/llm-on-ray/actions/runs/9543290317/job/26299745246?pr=253
"NumPy 2.0 is planned to be released on June 16" (https://numpy.org/news/). So numpy 2.0.0 has a conflict with skimage.
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
CI has passed, except for the UT case that Tiancheng will submit a new PR to fix. Please help review.
I think in this PR we can modify pyproject.toml directly to add "numpy<2.0.0" and pass all CI. To adapt to the latest numpy (2.0.0), I tried various versions of the installation packages, including skimage, torch, ipex, oneccl_bind_pt, etc. However, there are many incompatibilities: in particular, Python 3.9 does not support the latest skimage, and oneccl_bind_pt has not yet been adapted to torch 2.3. So we can only limit numpy to versions below 2.0.0 for now.
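A minimal sketch of the kind of pin being discussed; the actual dependency table layout and other entries in llm-on-ray's pyproject.toml may differ:

```toml
[project]
# Pin numpy below 2.0.0 until skimage / torch / ipex / oneccl_bind_pt
# publish releases compatible with the NumPy 2.0 ABI.
dependencies = [
    "numpy<2.0.0",
]
```

With this constraint in place, `pip install .` will resolve numpy to the latest 1.x release instead of 2.0.0.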
Sounds good, let me try.
  model_description:
-   model_id_or_path: meta-llama/Llama-2-7b-chat-hf
+   model_id_or_path: NousResearch/Llama-2-7b-chat-hf
    tokenizer_name_or_path: meta-llama/Llama-2-7b-chat-hf
Can you revert all the changes for the yamls in inference/models and still use meta-llama?
No, they are actually referenced by CI inference test cases.
carsonwang left a comment
Thanks for the quick fix.
No description provided.