[Bug]: Processed prompts: 5%|▌ | 429/8535 [00:27<08:37, 15.68it/s] RuntimeError: probability tensor contains either inf, nan or element < 0 #4151

pangpang-xuan opened this issue Apr 17, 2024
Your current environment

This is a school server, so the environment cannot be reconfigured. The current environment is:
vllm==0.2.2+cu118
transformers==4.34.1
python==3.10

🐛 Describe the bug

sampling_params = SamplingParams(temperature=0.8, top_p=0.95)
llm = LLM(model=args.model_dir, trust_remote_code=True)
print('model loaded!!!!')
good_samples_results = llm.generate(good_samples, sampling_params)
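For context, with temperature=0.8 and top_p=0.95 vLLM takes the random-sampling path, so any non-finite value in the softmaxed logits crashes torch.multinomial. A minimal sketch of the failure mode in plain PyTorch (illustrative only, not vLLM's actual code):

import torch

# If the model emits a NaN/inf logit, softmax propagates it into the
# probability tensor, and torch.multinomial rejects it with this exact error.
probs = torch.softmax(torch.tensor([float("nan"), 1.0, 2.0]), dim=-1)
torch.multinomial(probs, num_samples=1)
# RuntimeError: probability tensor contains either `inf`, `nan` or element < 0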

Some of the code is not shown. Inference needs to run over roughly 8,000 samples in total, and the error is raised at sample 429. The error output is as follows:

File "/home/sft/src/eval_ckpt.py", line 197, in <module>
    good_samples_results= llm.generate(good_samples, sampling_params)
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/vllm/entrypoints/llm.py", line 157, in generate
    return self._run_engine(use_tqdm)
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/vllm/entrypoints/llm.py", line 177, in _run_engine
    step_outputs = self.llm_engine.step()
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/vllm/engine/llm_engine.py", line 562, in step
    output = self._run_workers(
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/vllm/engine/llm_engine.py", line 700, in _run_workers
    output = executor(*args, **kwargs)
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/vllm/worker/worker.py", line 370, in execute_model
    output = self.model(
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/vllm/model_executor/models/bluelm.py", line 298, in forward
    next_tokens = self.sampler(self.lm_head.weight, hidden_states,
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/vllm/model_executor/layers/sampler.py", line 719, in forward
    sample_results = _sample(probs, logprobs, input_metadata)
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/vllm/model_executor/layers/sampler.py", line 1082, in _sample
    sample_results = _random_sample(seq_groups, is_prompts,
  File "/home/.conda/envs/python310/lib/python3.10/site-packages/vllm/model_executor/layers/sampler.py", line 977, in _random_sample
    random_samples = torch.multinomial(probs,
RuntimeError: probability tensor contains either `inf`, `nan` or element < 0
Processed prompts:   5%|▌         | 429/8535 [00:27<08:37, 15.68it/s]
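One way to narrow this down, assuming the same llm and good_samples objects as in the snippet above: generate one prompt at a time inside try/except to find the offending input(s), and as a possible workaround switch to greedy decoding (temperature=0), which samples via argmax and bypasses torch.multinomial entirely. A hedged sketch:

from vllm import SamplingParams

sampling_params = SamplingParams(temperature=0.8, top_p=0.95)

# Isolate which prompt(s) produce non-finite probabilities.
bad_prompts = []
for i, prompt in enumerate(good_samples):
    try:
        llm.generate([prompt], sampling_params, use_tqdm=False)
    except RuntimeError as e:
        print(f"prompt {i} failed: {e}")
        bad_prompts.append(prompt)

# Possible workaround: greedy decoding avoids the multinomial sampler.
greedy_params = SamplingParams(temperature=0)
good_samples_results = llm.generate(good_samples, greedy_params)

This only sidesteps the crash; if the underlying logits are overflowing (e.g. an fp16 numerical issue in this old vllm==0.2.2 build), upgrading vLLM or checking the offending inputs would still be worth trying.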