
Conversation

@jiafuzha
Contributor

No description provided.

@jiafuzha
Contributor Author

@yutianchen666
There is a UT test failure that seems unrelated to my changes. Please help check the log below.

https://github.com/intel/llm-on-ray/actions/runs/9543290317/job/26299745246?pr=253

@yutianchen666
Contributor

@yutianchen666 There is a UT test failure that seems unrelated to my changes. Please help check the log below.

https://github.com/intel/llm-on-ray/actions/runs/9543290317/job/26299745246?pr=253

“NumPy 2.0 is planned to be released on June 16” (https://numpy.org/news/). So numpy 2.0.0 has a conflict with skimage.

jiafuzha added 11 commits June 17, 2024 18:29
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
@jiafuzha
Contributor Author

CI has passed except for the UT case, which tiancheng will fix in a new PR. Please help review.

@yutianchen666
Contributor

I think in this PR you can directly modify pyproject.toml L23 to add "numpy<2.0.0" so all CI passes. To adapt to the latest numpy (2.0.0), I tried various versions of the installation packages, including skimage, torch, ipex, oneccl_bind_pt, etc., but there are a lot of incompatibilities: in particular, Python 3.9 does not support the latest skimage, and oneccl_bind_pt has not yet been adapted to torch 2.3. So we can only limit numpy to < 2.0.0 for now.
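For reference, a minimal sketch of what the suggested pin in pyproject.toml might look like. The surrounding entries here (torch, ray) are illustrative placeholders rather than the repo's actual dependency list; only the "numpy<2.0.0" constraint comes from the suggestion above.

```toml
[project]
name = "llm-on-ray"  # illustrative; see the repo's actual pyproject.toml
dependencies = [
    # Placeholder entries standing in for the project's real dependency list.
    "torch>=2.0.0",
    "ray[serve]",
    # Suggested cap: NumPy 2.0.0 is currently incompatible with the
    # skimage / ipex / oneccl_bind_pt stack, so keep NumPy below 2.0.0 for now.
    "numpy<2.0.0",
]
```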

@jiafuzha
Contributor Author

@yutianchen666 There is a UT test failure that seems unrelated to my changes. Please help check the log below.
https://github.com/intel/llm-on-ray/actions/runs/9543290317/job/26299745246?pr=253

“NumPy 2.0 is planned to be released on June 16” (https://numpy.org/news/). So numpy 2.0.0 has a conflict with skimage.

Sounds good, let me try.

model_description:
-  model_id_or_path: meta-llama/Llama-2-7b-chat-hf
   tokenizer_name_or_path: meta-llama/Llama-2-7b-chat-hf
+  model_id_or_path: NousResearch/Llama-2-7b-chat-hf
Contributor


Can you revert all the changes to the YAML files in inference/models and still use meta-llama?

Contributor Author


No, they are actually referenced by the CI inference test cases.

Contributor

@carsonwang left a comment


Thanks for the quick fix.

@carsonwang merged commit 320922f into intel:main on Jun 18, 2024
jiafuzha added 2 commits June 18, 2024 17:10
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>
Signed-off-by: Jiafu Zhang <jiafu.zhang@intel.com>