Training multi-model #14

Closed
atharvou23 opened this issue Sep 14, 2022 · 2 comments

atharvou23 commented Sep 14, 2022

@jiaaoc
Used versions:
torch==1.3.0
pytorch-lightning==0.8.1
rouge==1.0.0
transformers==3.2.0
--editable 'transformers/'
Running the bash script train_multi_graph.sh throws the following error:

GPU available: True, used: True
INFO:lightning:GPU available: True, used: True
TPU available: False, using: 0 TPU cores
INFO:lightning:TPU available: False, using: 0 TPU cores
CUDA_VISIBLE_DEVICES: [0]
INFO:lightning:CUDA_VISIBLE_DEVICES: [0]
normal graph
dicsource_graph
Traceback (most recent call last):
  File "/Structure-Aware-BART-main/src/train.py", line 540, in <module>
    main(args)
  File "/Structure-Aware-BART-main/src/train.py", line 506, in main
    logger=logger,
  File "/Structure-Aware-BART-main/src/lightning_base.py", line 700, in generic_train
    trainer.fit(model)
  File "/usr/local/lib/python3.7/dist-packages/pytorch_lightning/trainer/trainer.py", line 918, in fit
    self.single_gpu_train(model)
  File "/usr/local/lib/python3.7/dist-packages/pytorch_lightning/trainer/distrib_parts.py", line 167, in single_gpu_train
    self.optimizers, self.lr_schedulers, self.optimizer_frequencies = self.init_optimizers(model)
  File "/usr/local/lib/python3.7/dist-packages/pytorch_lightning/trainer/optimizers.py", line 18, in init_optimizers
    optim_conf = model.configure_optimizers()
  File "/Structure-Aware-BART-main/src/lightning_base.py", line 181, in configure_optimizers
    new_params_id += list(map(id, model.model.discourse_encoder.parameters()))  +\
  File "/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py", line 585, in __getattr__
    type(self).__name__, name))
AttributeError: 'BartModel' object has no attribute 'discourse_encoder'

jiaaoc commented Sep 21, 2022

You need to run pip install -e transformers, i.e., install the modified transformers provided in this repo (not the stock PyPI package).
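
For reference, a minimal sketch of that step, assuming the repository has been cloned and the command is run from its root (the Structure-Aware-BART-main directory name is taken from the traceback above):

cd Structure-Aware-BART-main
pip install -e ./transformers

After this, import transformers should resolve to the in-repo copy, which is the version that adds discourse_encoder to BartModel; if a stock PyPI transformers is still being picked up instead, the attribute is missing, which matches the AttributeError above.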

jiaaoc closed this as completed Sep 21, 2022

atharvou23 commented Sep 27, 2022

@jiaaoc

#libraries
! pip install torch==1.3.0
! pip install pytorch-lightning==0.8.1
! pip install rouge==1.0.0
! pip install rouge_score
! pip install --user --editable "./transformers"

! python3 "/src/train.py" \
    --data_dir '/src/data' \
    --learning_rate=3e-5 \
    --gpus 1 \
    --do_train \
    --do_predict \
    --check_val_every_n_epoch 1 \
    --early_stopping_patience 4 \
    --max_source_length 800 \
    --task summarization \
    --label_smoothing 0.1 \
    --model_name_or_path facebook/bart-base \
    --cache_dir "./cache" \
    --output_dir "/composit" \
    --lr_scheduler polynomial \
    --weight_decay 0.01 --warmup_steps 120 --num_train_epochs 2 \
    --max_grad_norm 0.1 \
    --dropout 0.1 --attention_dropout 0.1 \
    --train_batch_size 4 \
    --eval_batch_size 2 \
    --gradient_accumulation_steps 8 \
    --sortish_sampler \
    --seed 42 \
    --val_metric loss \
    --discourse_graph \
    --relation \
    --lr_new 10 \
    --warm_up_new 60 \
    --warm_up_new_2 60 \
    --discourse_attn_head 2 \
    --action_graph \
    --action_encoder_attn_head 2 --composit \
    --config_name "./src/bart_config_10_7.json" \
    --num_workers 1

When I run this, I get the following import error:

Traceback (most recent call last):
  File "/src/train.py", line 20, in <module>
    from callbacks import Seq2SeqLoggingCallback, get_checkpoint_callback, get_early_stopping_callback
  File "/src/callbacks.py", line 11, in <module>
    from utils import save_json
  File "/src/utils.py", line 21, in <module>
    from transformers import BartTokenizer
ModuleNotFoundError: No module named 'transformers'
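
A quick way to narrow this down, sketched on the assumption that the commands above are run in a notebook (hence the ! prefix) from the directory that actually contains the repository's transformers/ folder:

! pip show transformers
! python3 -c "import transformers; print(transformers.__file__)"

If pip show reports nothing, the pip install --user --editable "./transformers" step most likely did not succeed (for example because the working directory did not contain the transformers/ folder); if it does report a location but python3 still cannot import the package, the editable install may have gone to a different interpreter or user site than the python3 that launches train.py.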
