
config.json file is missing. #1587

Closed
KaifAhmad1 opened this issue Apr 25, 2024 · 4 comments
KaifAhmad1 commented Apr 25, 2024

Hey there @muellerzr @pacman100, can you help me out?

from transformers import TrainingArguments
from trl import SFTTrainer

# Training arguments
training_arguments = TrainingArguments(
    output_dir='Phi-3-hindi-3.4k-history',
    per_device_train_batch_size=4,
    gradient_accumulation_steps=4,
    optim='paged_adamw_32bit',
    learning_rate=2e-4,
    lr_scheduler_type='cosine',
    save_strategy='epoch',   # with save_strategy='epoch', save_steps below is ignored
    logging_steps=10,
    save_steps=10,
    num_train_epochs=10,
    max_steps=50,            # max_steps overrides num_train_epochs when both are set
    fp16=True,
    warmup_ratio=0.05,
    push_to_hub=True,
)

# SFTTrainer arguments (model, train_dataset, peft_config, tokenizer defined earlier)
trainer = SFTTrainer(
    model=model,
    train_dataset=train_dataset,
    peft_config=peft_config,
    dataset_text_field='text',
    args=training_arguments,
    tokenizer=tokenizer,
    packing=False,
    max_seq_length=512,
)

trainer.train()


from huggingface_hub import HfApi

username = "kaifahmad"
MODEL_NAME = "microsoft/Phi-3-mini-128k-instruct"

api = HfApi(token="hf_***")  # token redacted — never paste real tokens into issues

output_model_dir = "Phi-3-hindi-3.4k-history"
trainer.model.save_pretrained(output_model_dir)
tokenizer.save_pretrained(output_model_dir)
('Phi-3-hindi-3.4k-history/tokenizer_config.json',
 'Phi-3-hindi-3.4k-history/special_tokens_map.json',
 'Phi-3-hindi-3.4k-history/tokenizer.model',
 'Phi-3-hindi-3.4k-history/added_tokens.json',
 'Phi-3-hindi-3.4k-history/tokenizer.json')

!ls Phi-3-hindi-3.4k-history
     
adapter_config.json	   checkpoint-50  special_tokens_map.json  tokenizer.model
adapter_model.safetensors  README.md	  tokenizer_config.json    training_args.bin
added_tokens.json	   runs		  tokenizer.json
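For what it's worth, the listing above can be checked mechanically. In this sketch, the `missing_files` helper and the `REQUIRED` set are illustrative (not part of any library); it just shows that the adapter checkpoint contains tokenizer and adapter files but not the full-model `config.json` that inference expects:

```python
# Hypothetical helper: given a model repo's file listing, report which
# files inference needs but cannot find.
REQUIRED = {"config.json"}

def missing_files(repo_files):
    """Return the required files absent from the repo listing, sorted."""
    return sorted(REQUIRED - set(repo_files))

# The files shown by `ls Phi-3-hindi-3.4k-history` above:
repo_files = [
    "adapter_config.json", "adapter_model.safetensors", "README.md",
    "added_tokens.json", "special_tokens_map.json", "tokenizer.json",
    "tokenizer.model", "tokenizer_config.json", "training_args.bin",
]
print(missing_files(repo_files))  # → ['config.json']
```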

I get this error when running the model on a Hugging Face Space:

BadRequestError: (Request ID: SZAuB-5QQOf9zY3M7d5Ri) Bad request: Can't load config for 'None'. Make sure that: - 'None' is a correct model identifier listed on 'https://huggingface.co/models' - or 'None' is the correct path to a directory containing a config.json file
Traceback:
File "/usr/local/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 584, in _run_script
    exec(code, module.__dict__)
File "/home/user/app/app.py", line 3, in <module>
    gr.load("models/kaifahmad/Phi-3-hindi-3.4k-history").launch()
File "/usr/local/lib/python3.10/site-packages/gradio/external.py", line 60, in load
    return load_blocks_from_repo(
File "/usr/local/lib/python3.10/site-packages/gradio/external.py", line 99, in load_blocks_from_repo
    blocks: gradio.Blocks = factory_methods[src](name, hf_token, alias, **kwargs)
File "/usr/local/lib/python3.10/site-packages/gradio/external.py", line 373, in from_model
    interface = gradio.Interface(**kwargs)
File "/usr/local/lib/python3.10/site-packages/gradio/interface.py", line 515, in __init__
    self.render_examples()
File "/usr/local/lib/python3.10/site-packages/gradio/interface.py", line 861, in render_examples
    self.examples_handler = Examples(
File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 74, in create_examples
    examples_obj.create()
File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 307, in create
    self._start_caching()
File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 358, in _start_caching
    client_utils.synchronize_async(self.cache)
File "/usr/local/lib/python3.10/site-packages/gradio_client/utils.py", line 858, in synchronize_async
    return fsspec.asyn.sync(fsspec.asyn.get_loop(), func, *args, **kwargs)  # type: ignore
File "/usr/local/lib/python3.10/site-packages/fsspec/asyn.py", line 103, in sync
    raise return_result
File "/usr/local/lib/python3.10/site-packages/fsspec/asyn.py", line 56, in _runner
    result[0] = await coro
File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 479, in cache
    prediction = await Context.root_block.process_api(
File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1788, in process_api
    result = await self.call_function(
File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1340, in call_function
    prediction = await anyio.to_thread.run_sync(
File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
    return await future
File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 851, in run
    result = context.run(func, *args)
File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 759, in wrapper
    response = f(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/gradio/external.py", line 357, in query_huggingface_inference_endpoints
    data = fn(*data)  # type: ignore
File "/usr/local/lib/python3.10/site-packages/huggingface_hub/inference/_client.py", line 1208, in question_answering
    response = self.post(
File "/usr/local/lib/python3.10/site-packages/huggingface_hub/inference/_client.py", line 267, in post
    hf_raise_for_status(response)
File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 358, in hf_raise_for_status
    raise BadRequestError(message, response=response) from e

Here is the Model folder info

[Screenshot: file listing of kaifahmad/Phi-3-hindi-3.4k-history at main on huggingface.co]

@muellerzr
Contributor

@KaifAhmad1 please don't expose your tokens in GitHub issues. We're going to have to revoke it, since the edit history can still be found on GitHub.

@muellerzr
Contributor

And you never uploaded a config.json, as the error implies
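A hedged sketch of one way to get a `config.json`, not a confirmed fix: `trainer.model` here is a PeftModel, and PeftModel's `save_pretrained()` only writes the adapter files seen in the listing above. Merging the LoRA weights back into the base model with `merge_and_unload()` and saving the merged model does write a `config.json`. The function name and output path are illustrative; the heavy imports are deferred into the function body so the snippet can be defined without `transformers`/`peft` installed.

```python
def merge_and_export(base_id, adapter_dir, out_dir):
    """Merge a LoRA adapter into its base model and save a full checkpoint."""
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    # Load the base model, apply the adapter, then fold the LoRA weights in.
    base = AutoModelForCausalLM.from_pretrained(base_id, trust_remote_code=True)
    merged = PeftModel.from_pretrained(base, adapter_dir).merge_and_unload()

    # A merged (non-PEFT) model writes config.json alongside its weights.
    merged.save_pretrained(out_dir)
    AutoTokenizer.from_pretrained(base_id).save_pretrained(out_dir)

# e.g. merge_and_export("microsoft/Phi-3-mini-128k-instruct",
#                       "Phi-3-hindi-3.4k-history",
#                       "Phi-3-hindi-3.4k-history-merged")
```

The merged directory can then be pushed to the Hub as a standalone model repo, which is what `gr.load("models/...")` and the Inference API expect.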


KaifAhmad1 commented Apr 25, 2024

Hey, @muellerzr
Where can I locate my config.json file? Also, I've revoked my token, so no worries about security.


This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
