
Evaluation causing "could not infer language pair" error on pretrained cnndm #16

Closed

amiyamandal-dev opened this issue Jun 29, 2020 · 1 comment
amiyamandal-dev commented Jun 29, 2020

fairseq-generate cnndm/processed --path /e/workspace/ProphetNet/a.pt --user-dir prophetnet --task translation_prophetnet --batch-size 32 --gen-subset test --beam 5 --num-workers 4 --min-len 45 --max-len-b 110  --no-repeat-ngram-size 3 --lenpen 1.2 2>&1 > cnndm/output-ck9-pelt1.2-test-beam5.txt

I am using the above command for inference and evaluation, but it raises an error with the pre-trained CNN/Daily Mail model:

Traceback (most recent call last):
  File "D:\windows_program\conda\envs\p\Scripts\fairseq-generate-script.py", line 33, in <module>
    sys.exit(load_entry_point('fairseq', 'console_scripts', 'fairseq-generate')())
  File "e:\fairseq\fairseq_cli\generate.py", line 270, in cli_main
    main(args)
  File "e:\fairseq\fairseq_cli\generate.py", line 36, in main
    return _main(args, sys.stdout)
  File "e:\fairseq\fairseq_cli\generate.py", line 57, in _main
    task = tasks.setup_task(args)
  File "e:\fairseq\fairseq\tasks\__init__.py", line 17, in setup_task
    return TASK_REGISTRY[args.task].setup_task(args, **kwargs)
  File "e:\fairseq\fairseq\tasks\translation.py", line 226, in setup_task
    raise Exception('Could not infer language pair, please provide it explicitly')
Exception: Could not infer language pair, please provide it explicitly

but the docs do not list any such arguments for fairseq-generate.

fairseq: 0.9.0
torch: 1.5.1
model: prophetnet_large_160G_cnndm_model.pt
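One likely workaround, sketched below: fairseq's translation task does accept explicit language-pair flags (`-s`/`--source-lang` and `-t`/`--target-lang`), even though they are easy to miss in the fairseq-generate docs. The `src`/`tgt` names below are an assumption based on how ProphetNet's CNN/DM data is usually binarized; they must match the suffixes of the files in `cnndm/processed` (e.g. `test.src-tgt.src.bin`).

```shell
# Hedged sketch, not a confirmed fix: pass the language pair explicitly so
# translation.setup_task does not have to infer it from filenames.
# "src" and "tgt" are assumptions; check the file names in cnndm/processed.
fairseq-generate cnndm/processed \
    --path /e/workspace/ProphetNet/a.pt \
    --user-dir prophetnet \
    --task translation_prophetnet \
    --source-lang src --target-lang tgt \
    --batch-size 32 --gen-subset test --beam 5 --num-workers 4 \
    --min-len 45 --max-len-b 110 --no-repeat-ngram-size 3 --lenpen 1.2
```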

amiyamandal-dev (Author)

The same issue occurs when training from scratch:

sh-4.4$ sh run.sh
2020-06-30 14:00:05 | INFO | fairseq_cli.train | Namespace(activation_dropout=0.1, activation_fn='gelu', adam_betas='(0.9, 0.999)', adam_eps=1e-08, all_gather_list_size=16384, arch='ngram_transformer_prophet_large', attention_dropout=0.1, best_checkpoint_metric='loss', bf16=False, bpe=None, broadcast_buffers=False, bucket_cap_mb=25, checkpoint_suffix='', clip_norm=0.1, cpu=False, criterion='ngram_language_loss', curriculum=0, data='cnndm/processed', data_buffer_size=10, dataset_impl=None, ddp_backend='no_c10d', decoder_attention_heads=16, decoder_embed_dim=1024, decoder_ffn_embed_dim=4096, decoder_layers=12, device_id=0, disable_ngram_loss=False, disable_validation=False, distributed_backend='nccl', distributed_init_method=None, distributed_no_spawn=False, distributed_port=-1, distributed_rank=0, distributed_world_size=1, distributed_wrapper='DDP', dropout=0.1, empty_cache_freq=0, encoder_attention_heads=16, encoder_embed_dim=1024, encoder_ffn_embed_dim=4096, encoder_layers=12, eval_bleu=False, eval_bleu_args=None, eval_bleu_detok='space', eval_bleu_detok_args=None, eval_bleu_print_samples=False, eval_bleu_remove_bpe=None, eval_tokenized_bleu=False, fast_stat_sync=False, find_unused_parameters=False, fix_batches_to_gpus=False, fixed_validation_seed=None, fp16=True, fp16_init_scale=128, fp16_no_flatten_grads=False, fp16_scale_tolerance=0.0, fp16_scale_window=None, keep_best_checkpoints=-1, keep_interval_updates=-1, keep_last_epochs=10, label_smoothing=0.1, left_pad_source='True', left_pad_target='False', load_alignments=False, load_from_pretrained_model='../prophetnet_large_pretrained_160G_14epoch_model.pt', load_sep=True, localsgd_frequency=3, log_format=None, log_interval=100, lr=[0.0001], lr_scheduler='inverse_sqrt', max_epoch=10, max_sentences=2, max_sentences_valid=2, max_source_positions=512, max_target_positions=512, max_tokens=None, max_tokens_valid=None, max_update=0, maximize_best_checkpoint_metric=False, memory_efficient_bf16=False, 
memory_efficient_fp16=False, min_loss_scale=0.0001, min_lr=-1, model_parallel_size=1, ngram=2, no_epoch_checkpoints=False, no_last_checkpoints=False, no_progress_bar=False, no_save=False, no_save_optimizer_state=False, nprocs_per_node=1, num_batch_buckets=0, num_buckets=32, num_workers=4, optimizer='adam', optimizer_overrides='{}', patience=-1, profile=False, quantization_config_path=None, relative_max_distance=128, required_batch_size_multiple=8, reset_dataloader=False, reset_lr_scheduler=False, reset_meters=False, reset_optimizer=False, restore_file='checkpoint_last.pt', save_dir='cnndm/finetune_cnndm_checkpoints', save_interval=1, save_interval_updates=0, seed=1, sentence_avg=False, share_all_embeddings=True, share_decoder_input_output_embed=True, skip_invalid_size_inputs_valid_test=True, slowmo_algorithm='LocalSGD', slowmo_momentum=None, source_lang=None, target_lang=None, task='translation_prophetnet', tensorboard_logdir='cnndm/finetune_cnndm_tensorboard', threshold_loss_scale=None, tokenizer=None, tpu=False, train_subset='train', truncate_source=False, update_freq=[32], upsample_primary=1, use_bmuf=False, use_old_adam=False, user_dir='./prophetnet', valid_subset='valid', validate_interval=1, warmup_init_lr=1e-07, warmup_updates=1000, weight_decay=0.01)
Traceback (most recent call last):
  File "D:\windows_program\conda\envs\p\Scripts\fairseq-train-script.py", line 33, in <module>
    sys.exit(load_entry_point('fairseq', 'console_scripts', 'fairseq-train')())
  File "e:\fairseq\fairseq_cli\train.py", line 347, in cli_main
    cli_main_helper(args)
  File "e:\fairseq\fairseq_cli\train.py", line 385, in cli_main_helper
    main(args)
  File "e:\fairseq\fairseq_cli\train.py", line 64, in main
    task = tasks.setup_task(args)
  File "e:\fairseq\fairseq\tasks\__init__.py", line 17, in setup_task
    return TASK_REGISTRY[args.task].setup_task(args, **kwargs)
  File "e:\fairseq\fairseq\tasks\translation.py", line 226, in setup_task
    raise Exception('Could not infer language pair, please provide it explicitly')
Exception: Could not infer language pair, please provide it explicitly
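The Namespace dump above shows `source_lang=None, target_lang=None`, so `translation.setup_task` falls back to inferring the pair from the binarized filenames and fails. A hedged sketch of the same workaround for training, again assuming the data was binarized with `src`/`tgt` as the pair:

```shell
# Hedged sketch: pass the language pair explicitly to fairseq-train as well.
# "src"/"tgt" are assumptions; they must match the suffixes in cnndm/processed.
fairseq-train cnndm/processed \
    --user-dir ./prophetnet \
    --task translation_prophetnet \
    --source-lang src --target-lang tgt \
    ... # remaining run.sh arguments unchanged
```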
