Ygr avsr #13
Conversation
@@ -33,8 +34,33 @@ class custom_dataset:
    train_split: str = "train"
    test_split: str = "validation"
    data_path: str = NotImplemented
    max_words: int = NotImplemented
duplication
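One way to address the duplication flagged above is to factor the shared fields into a single base dataclass that each dataset config inherits, so per-dataset configs only declare their overrides. A minimal sketch (class names follow the diff; the `base_dataset_config` name and the override values are illustrative assumptions, not the repo's actual API):

```python
from dataclasses import dataclass

# Shared dataset fields live in one base class instead of being
# re-declared in every dataset config (sketch; names are illustrative).
@dataclass
class base_dataset_config:
    train_split: str = "train"
    test_split: str = "validation"
    data_path: str = ""
    max_words: int = 0

@dataclass
class custom_dataset(base_dataset_config):
    # Only dataset-specific overrides go here (value is illustrative).
    data_path: str = "examples/custom_dataset.py"

cfg = custom_dataset()
print(cfg.train_split)   # inherited default: "train"
print(cfg.data_path)     # overridden per dataset
```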
TRAIN_LRS3_MODEL_FILE: str = "/nfs/yangguanrou.ygr/AVSR/train-step_0108-wer_0.058.ckpt"  # "/home/oss/yangguanrou.ygr/AVSR/train-step_0108-wer_0.058.ckpt"  # single-modality uses this one
TRAINED_AO_FILE: str = "/nfs/yangguanrou.ygr/AVSR/check/train-step_0604-wer_0.054.ckpt"  # "/home/oss/yangguanrou.ygr/AVSR/check/train-step_0604-wer_0.054.ckpt"
TRAINED_VO_FILE: str = "/nfs/yangguanrou.ygr/AVSR/check/train-step_1191-wer_0.674.ckpt"  # "/home/oss/yangguanrou.ygr/AVSR/check/train-step_1191-wer_0.674.ckpt"
Maybe we can customize each model's config instead of merging all models together.
@LauraGPT Can this be resolved with Hydra?
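The suggestion above — one config per model rather than a single merged config carrying every checkpoint path — could be sketched as a small registry keyed by modality; Hydra's structured configs would serve the same role via config groups. A hedged sketch using the checkpoint paths from the diff (the `avsr_model_config` dataclass and `get_model_config` helper are hypothetical names, not the repo's API):

```python
from dataclasses import dataclass

# Sketch: one config object per modality instead of one merged config
# with TRAIN_LRS3_MODEL_FILE / TRAINED_AO_FILE / TRAINED_VO_FILE fields.
@dataclass
class avsr_model_config:       # hypothetical name
    ckpt_path: str

MODEL_CONFIGS = {
    "av": avsr_model_config("/nfs/yangguanrou.ygr/AVSR/train-step_0108-wer_0.058.ckpt"),
    "ao": avsr_model_config("/nfs/yangguanrou.ygr/AVSR/check/train-step_0604-wer_0.054.ckpt"),
    "vo": avsr_model_config("/nfs/yangguanrou.ygr/AVSR/check/train-step_1191-wer_0.674.ckpt"),
}

def get_model_config(modality: str) -> avsr_model_config:
    # Each run selects exactly one modality's config.
    return MODEL_CONFIGS[modality]
```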
@@ -222,7 +222,7 @@ def main(**kwargs):
        lr=train_config.lr,
        weight_decay=train_config.weight_decay,
    )
-    scheduler = StepLR(optimizer, step_size=1, gamma=train_config.gamma)
+    scheduler = StepLR(optimizer, step_size=1/root/SLAM-LLM/src/llama_recipes/models, gamma=train_config.gamma)
to fix — an absolute path (`/root/SLAM-LLM/src/llama_recipes/models`) was accidentally pasted into the `step_size` argument; it should remain `step_size=1` as in the original line.
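For reference, `StepLR(optimizer, step_size=1, gamma=train_config.gamma)` multiplies the learning rate by `gamma` every `step_size` epochs. A dependency-free sketch of that decay rule (pure Python, mirroring my understanding of `torch.optim.lr_scheduler.StepLR`; the `gamma` value below is illustrative):

```python
def step_lr(base_lr: float, epoch: int, step_size: int = 1, gamma: float = 0.85) -> float:
    # Multiply the base LR by gamma once per completed step_size epochs,
    # as StepLR does (gamma here is an illustrative value).
    return base_lr * gamma ** (epoch // step_size)

print(step_lr(1e-4, 0))  # epoch 0: base LR unchanged
print(step_lr(1e-4, 2))  # epoch 2 with step_size=1: base LR * gamma**2
```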
logger.info("model_config: {}".format(model_config))
to clean up
[2023-12-15 16:38:36][root][INFO] - --> Training Set Length = 280879
[2023-12-15 16:38:36][root][INFO] - --> Validation Set Length = 2864
[2023-12-15 16:38:36][llama_recipes.utils.config_utils][INFO] - Using batching strategy: custom
[2023-12-15 16:38:36][llama_recipes.utils.config_utils][INFO] - Using batching strategy: custom
need to remove — the "Using batching strategy" line is logged twice.
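A doubled log line like the one above usually means either the statement runs twice or the logger picked up a second handler. One common guard (a generic sketch, not the repo's actual logging setup — the `get_logger` helper is hypothetical) is to attach a handler only if none exists yet:

```python
import logging

def get_logger(name: str) -> logging.Logger:
    # Attach a StreamHandler only on first use; a second handler on the
    # same logger would print every record twice (one doubled-line cause).
    logger = logging.getLogger(name)
    if not logger.handlers:
        logger.addHandler(logging.StreamHandler())
        logger.setLevel(logging.INFO)
    return logger

log = get_logger("llama_recipes.utils.config_utils")
log = get_logger("llama_recipes.utils.config_utils")  # repeated call is safe
print(len(log.handlers))  # still one handler
```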
try:
    #loss = model(**batch).loss
    outputs, *rest = model(**batch)
except Exception as e:
to clean up
- clean up the duplication and your own marks.
- fix the wandb bug with the config update.
add log & wandb