
[Bug] - CodeGen: CodeLlama-7b-hf model is unstable to use #1952

@hsyrjaos


Priority

P1-Stopper

OS type

Ubuntu

Hardware type

Xeon-SPR

Installation method

  • Pull docker images from hub.docker.com
  • Build docker images from source
  • Other
  • N/A

Deploy method

  • Docker
  • Docker Compose
  • Kubernetes Helm Charts
  • Kubernetes GMC
  • Other
  • N/A

Running nodes

Single Node

What's the version?

NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
codegen-xeon-backend-server opea/codegen:latest "python codegen.py" codegen-xeon-backend-server 12 minutes ago Up 11 minutes 0.0.0.0:7778->7778/tcp, :::7778->7778/tcp
codegen-xeon-ui-server opea/codegen-gradio-ui:latest "python codegen_ui_g…" codegen-xeon-ui-server 12 minutes ago Up 11 minutes 0.0.0.0:5173->5173/tcp, :::5173->5173/tcp
dataprep-redis-server opea/dataprep:latest "sh -c 'python $( [ …" dataprep-redis-server 12 minutes ago Up 12 minutes (healthy) 0.0.0.0:6007->5000/tcp, [::]:6007->5000/tcp
llm-codegen-vllm-server opea/llm-textgen:latest "bash entrypoint.sh" llm-vllm-service 12 minutes ago Up 11 minutes 0.0.0.0:9000->9000/tcp, :::9000->9000/tcp
llm-textgen-server opea/llm-textgen:latest "bash entrypoint.sh" llm-base 12 minutes ago Up 12 minutes
redis-vector-db redis/redis-stack:7.2.0-v9 "/entrypoint.sh" redis-vector-db 12 minutes ago Up 12 minutes 0.0.0.0:6379->6379/tcp, :::6379->6379/tcp, 0.0.0.0:8001->8001/tcp, :::8001->8001/tcp
retriever-redis opea/retriever:latest "python opea_retriev…" retriever-redis 12 minutes ago Up 12 minutes 0.0.0.0:7000->7000/tcp, :::7000->7000/tcp
tei-embedding-server opea/embedding:latest "sh -c 'python $( [ …" tei-embedding-server 12 minutes ago Up 12 minutes 0.0.0.0:6000->6000/tcp, :::6000->6000/tcp
tei-embedding-serving ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 "/bin/sh -c 'apt-get…" tei-embedding-serving 12 minutes ago Up 12 minutes (healthy) 0.0.0.0:8090->80/tcp, [::]:8090->80/tcp
vllm-server opea/vllm:latest "python3 -m vllm.ent…" vllm-service 12 minutes ago Up 12 minutes (healthy) 0.0.0.0:8028->80/tcp, [::]:8028->80/tcp

Description

Model="codellama/CodeLlama-7b-hf"

Prompt="Write a Python function that generates fibonnaci sequence."

Run in the Gradio UI.

Output:
"fib(0) ➞ 0

fib(1) ➞ 1

fib(2) ➞ 1

fib(3) ➞ 2

fib(4) ➞ 3

fib(5) ➞ 5

fib(6) ➞ 8

fib(7) ➞ 13

fib(8) ➞ 21

fib(9) ➞ 34

fib(10) ➞ 55

fib(11) ➞ 89

fib(12) ➞ 144

fib(13) ➞ 233

fib(14) ➞ 377

fib(15) ➞ 610

fib(16) ➞ 987

fib(17) ➞ 1597

fib(18) ➞ 2584

fib(19) ➞ 4181

fib(20) ➞ 6765

fib(21) ➞ 10946

fib(22) ➞ 17711

fib(23) ➞ 28657

fib(24) ➞ 46368

fib(25) ➞ 75025

fib(26) ➞ 121393

fib(27) ➞ 196418

fib(28) ➞ 317811

fib(29) ➞ 514229

fib(30) ➞ 832040

fib(31) ➞ 1346269

fib(32) ➞ 2178309

fib(33) ➞ 3524578

fib(34) ➞ 5702887

fib(35) ➞ 9227465

fib(36) ➞ 14930352

fib(37) ➞ 24157817

fib(38) ➞ 39088169

fib(39) ➞ 63245986

fib(40) ➞ 102334155

fib(41) ➞ 165580141

fib(42) ➞ 267914296

fib(43) ➞ 433494437

fib(44) ➞ 701408733

fib(45) ➞ 1134903170

fib(46) ➞ 1836311903

fib(47) ➞ 2971215073

fib(48) ➞ 4807526976

fib(49) ➞ 7778742049

fib(50) ➞ 12586269025

fib(51) ➞ 20365011074

fib(52) ➞ 32951280099

fib(53) ➞ 53316291173

fib(54) ➞ 86"

Repeated a few times; the behavior always seems to be the same.
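For comparison, the prompt asks for a function. Rather than a plain listing of values, a correct completion would be expected to look roughly like the sketch below (illustrative only, not output produced by the model):

import sys

def fibonacci(n: int) -> list[int]:
    # Return the first n Fibonacci numbers.
    seq = []
    a, b = 0, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

print(fibonacci(10))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]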

Another example:
Prompt="Write a C++ function that generates fibonnaci sequence."

Output=
"

\begin{code}
#include

int fib(int n)
{
if (n == 0)
return 0;
else if (n == 1)
return 1;
else
return fib(n - 1) + fib(n - 2);
}

int main()
{
int n;
cin >> n;
cout << fib(n);
return 0;
}
\end{code}

Comment: What is your question?

Comment: I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.

Comment: @barmar I'm voting to close this question as off-topic because it is a homework dump.
"

Should CodeLlama work with CodeGen?
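To narrow down whether the degenerate output comes from the CodeGen pipeline or from the model serving itself, the vLLM endpoint could be queried directly. A minimal sketch, assuming the vllm-service host port 8028 from the container list above and vLLM's standard OpenAI-compatible /v1/completions route:

import requests

payload = {
    "model": "codellama/CodeLlama-7b-hf",
    # Prompt copied verbatim from the report (typo included) so the comparison is like for like.
    "prompt": "Write a Python function that generates fibonnaci sequence.",
    "max_tokens": 256,
    "temperature": 0.0,
}
resp = requests.post("http://localhost:8028/v1/completions", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])

If the raw completion shows the same continuation-style output, the behavior would point at the model or the prompting rather than at the OPEA pipeline.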

Reproduce steps

Use the given prompts in the Gradio UI, or send a prompt directly to the backend as sketched below.
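A sketch for reproducing without the UI, assuming the megaservice's default /v1/codegen route on host port 7778 as shown in the container list (the reply may be streamed, so the response is read line by line):

import requests

payload = {"messages": "Write a Python function that generates fibonnaci sequence."}
with requests.post("http://localhost:7778/v1/codegen", json=payload, stream=True, timeout=300) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        if line:
            print(line)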

Raw log

Attachments

No response
