Example 3 code throws an error when run #8

@Halflifefa

Description


from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("./deepseek-coder-1.3b-instruct", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("./deepseek-coder-1.3b-instruct", trust_remote_code=True).cuda()

messages = [
    {'role': 'user', 'content': "write a quick sort algorithm in python."}
]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'LlamaTokenizerFast' object has no attribute 'apply_chat_template'
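
For context, apply_chat_template was only added to transformers tokenizers in release 4.34, so an older install raises exactly this AttributeError. Below is a minimal sketch of how the example could guard against that, reusing the local model path from the report; the "### Instruction / ### Response" fallback prompt is an illustrative assumption, not the model's official chat template, and upgrading transformers is the cleaner fix.

import transformers
from packaging import version
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("./deepseek-coder-1.3b-instruct", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("./deepseek-coder-1.3b-instruct", trust_remote_code=True).cuda()

messages = [{'role': 'user', 'content': "write a quick sort algorithm in python."}]

if version.parse(transformers.__version__) >= version.parse("4.34.0"):
    # Recent transformers: the tokenizer exposes apply_chat_template directly.
    inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
else:
    # Older transformers: build the prompt by hand (assumed instruct-style layout,
    # not the model's official chat template). Upgrading avoids this branch:
    #   pip install -U "transformers>=4.34"
    prompt = "### Instruction:\n" + messages[0]['content'] + "\n### Response:\n"
    inputs = tokenizer(prompt, return_tensors="pt").input_ids.to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True))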


    Labels

    ENV: Environment/Dependency related questions.
