Evaluate: jetmoe/jetmoe-8b | -8b-sft | -8b-chat #47

Open
3 of 7 tasks
ggbetz opened this issue Apr 11, 2024 · 2 comments
Comments

ggbetz (Contributor) commented Apr 11, 2024

Check upon issue creation:

  • The model has not been evaluated yet and doesn't show up on the CoT Leaderboard.
  • There is no evaluation request issue for the model in the repo.
  • The parameters below have been adapted and shall be used.
  • Wait for jetmoe support in VLLM

Parameters:

NEXT_MODEL_PATH=jetmoe/jetmoe-8b
NEXT_MODEL_REVISION=main
NEXT_MODEL_PRECISION=bfloat16
MAX_LENGTH=2048 
GPU_MEMORY_UTILIZATION=0.5
VLLM_SWAP_SPACE=8
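
For context, a minimal sketch of how these settings could be mapped onto the langchain VLLM wrapper that the traceback below shows the pipeline instantiating. This is an assumption about the wiring, not the actual cot-eval code; the env-var names come from this issue, while the keyword arguments are the usual langchain/vLLM ones:

```python
import os

from langchain_community.llms import VLLM

# Hypothetical mapping of the parameters above onto the langchain VLLM wrapper;
# cot-eval's real implementation may differ.
llm = VLLM(
    model=os.environ.get("NEXT_MODEL_PATH", "jetmoe/jetmoe-8b"),
    dtype=os.environ.get("NEXT_MODEL_PRECISION", "bfloat16"),
    trust_remote_code=True,  # assumption: likely needed while jetmoe was not in Transformers
    vllm_kwargs={
        # extra arguments passed through to vllm.LLM / EngineArgs
        "revision": os.environ.get("NEXT_MODEL_REVISION", "main"),
        "max_model_len": int(os.environ.get("MAX_LENGTH", 2048)),
        "gpu_memory_utilization": float(os.environ.get("GPU_MEMORY_UTILIZATION", 0.5)),
        "swap_space": int(os.environ.get("VLLM_SWAP_SPACE", 8)),
    },
)
```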

ToDos:

  • Run cot-eval pipeline
  • Merge pull requests for cot-eval results datasets (> @ggbetz)
  • Create eval request record to update metadata on leaderboard (> @ggbetz)
yakazimir (Collaborator) commented:

I think our transformers version might be out of date: see here: huggingface/text-generation-inference#1620

```
2024-05-13T17:16:46.166834523Z 2024-05-13 17:16:46,166 - root - INFO - Loading vLLM model jetmoe/jetmoe-8b
2024-05-13T17:16:48.518010161Z Traceback (most recent call last):
2024-05-13T17:16:48.518059680Z   File "/usr/local/bin/cot-eval", line 8, in <module>
2024-05-13T17:16:48.518069710Z     sys.exit(main())
2024-05-13T17:16:48.518076080Z   File "/workspace/cot-eval/src/cot_eval/__main__.py", line 149, in main
2024-05-13T17:16:48.518202040Z     llm = VLLM(
2024-05-13T17:16:48.518229400Z   File "/usr/local/lib/python3.10/dist-packages/langchain_core/load/serializable.py", line 120, in __init__
2024-05-13T17:16:48.518319969Z     super().__init__(**kwargs)
2024-05-13T17:16:48.518333769Z   File "/usr/local/lib/python3.10/dist-packages/pydantic/v1/main.py", line 341, in __init__
2024-05-13T17:16:48.518495708Z     raise validation_error
2024-05-13T17:16:48.518561108Z pydantic.v1.error_wrappers.ValidationError: 1 validation error for VLLM
2024-05-13T17:16:48.518565458Z __root__
2024-05-13T17:16:48.518568898Z   The checkpoint you are trying to load has model type `jetmoe` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date. (type=value_error)
```
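
A quick way to check whether the installed Transformers release recognizes the `jetmoe` architecture (support for it landed in a 2024 release, around v4.40, so older versions raise exactly this error) is a diagnostic snippet along these lines; this is only a sketch, not part of cot-eval:

```python
import transformers
from transformers import AutoConfig

print(transformers.__version__)

try:
    cfg = AutoConfig.from_pretrained("jetmoe/jetmoe-8b")
    print("model_type recognized:", cfg.model_type)
except ValueError as err:
    # Same symptom as in the log above: the checkpoint declares model type `jetmoe`,
    # but the installed Transformers version has no such architecture registered.
    print("jetmoe not supported by this Transformers version:", err)
```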

ggbetz (Contributor, Author) commented May 14, 2024

#54

and:

vllm-project/vllm#3995
