This repository has been archived by the owner on Dec 16, 2022. It is now read-only.

MacOS: "AssertionError: Torch not compiled with CUDA enabled" #877

Closed
danyaljj opened this issue Feb 18, 2018 · 3 comments


danyaljj commented Feb 18, 2018

I do NOT have a GPU on my MacOS machine. I assumed that setting "cuda_device": 0 in my config file would make everything work fine (i.e. use the CPU instead of trying to use my non-existent GPU). However, I'm getting this error:


huntsman-ve501-0123:Desktop daniel$ python3.6 -m allennlp.run train ~/Desktop/bidaf.json -s ~/
/usr/local/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
2018-02-17 19:21:36,531 - INFO - allennlp.common.params - random_seed = 13370
2018-02-17 19:21:36,532 - INFO - allennlp.common.params - numpy_seed = 1337
2018-02-17 19:21:36,532 - INFO - allennlp.common.params - pytorch_seed = 133
2018-02-17 19:21:36,533 - INFO - allennlp.common.checks - Pytorch version: 0.3.0.post4
2018-02-17 19:21:36,535 - INFO - allennlp.common.params - dataset_reader.type = squad
2018-02-17 19:21:36,535 - INFO - allennlp.common.params - dataset_reader.tokenizer.type = word
2018-02-17 19:21:36,535 - INFO - allennlp.common.params - dataset_reader.tokenizer.word_splitter.type = spacy
2018-02-17 19:21:36,535 - INFO - allennlp.common.params - dataset_reader.tokenizer.word_splitter.language = en_core_web_sm
2018-02-17 19:21:36,535 - INFO - allennlp.common.params - dataset_reader.tokenizer.word_splitter.pos_tags = False
2018-02-17 19:21:36,535 - INFO - allennlp.common.params - dataset_reader.tokenizer.word_splitter.parse = False
2018-02-17 19:21:36,535 - INFO - allennlp.common.params - dataset_reader.tokenizer.word_splitter.ner = False
2018-02-17 19:21:37,069 - INFO - allennlp.common.params - dataset_reader.tokenizer.word_filter.type = pass_through
2018-02-17 19:21:37,069 - INFO - allennlp.common.params - dataset_reader.tokenizer.word_stemmer.type = pass_through
2018-02-17 19:21:37,069 - INFO - allennlp.common.params - dataset_reader.tokenizer.start_tokens = None
2018-02-17 19:21:37,069 - INFO - allennlp.common.params - dataset_reader.tokenizer.end_tokens = None
2018-02-17 19:21:37,070 - INFO - allennlp.common.params - dataset_reader.token_indexers.tokens.type = single_id
2018-02-17 19:21:37,070 - INFO - allennlp.common.params - dataset_reader.token_indexers.tokens.namespace = tokens
2018-02-17 19:21:37,070 - INFO - allennlp.common.params - dataset_reader.token_indexers.tokens.lowercase_tokens = True
2018-02-17 19:21:37,070 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.type = characters
2018-02-17 19:21:37,070 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.namespace = token_characters
2018-02-17 19:21:37,071 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.character_tokenizer.byte_encoding = utf-8
2018-02-17 19:21:37,071 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.character_tokenizer.lowercase_characters = False
2018-02-17 19:21:37,071 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.character_tokenizer.start_tokens = [259]
2018-02-17 19:21:37,071 - INFO - allennlp.common.params - dataset_reader.token_indexers.token_characters.character_tokenizer.end_tokens = [260]
2018-02-17 19:21:37,071 - INFO - allennlp.common.params - train_data_path = /Users/daniel/Desktop/train_all_RemediaOnly_squadFormat.json
2018-02-17 19:21:37,071 - INFO - allennlp.commands.train - Reading training data from /Users/daniel/Desktop/train_all_RemediaOnly_squadFormat.json
2018-02-17 19:21:37,072 - INFO - allennlp.data.dataset_readers.reading_comprehension.squad - Reading file at /Users/daniel/Desktop/train_all_RemediaOnly_squadFormat.json
2018-02-17 19:21:37,073 - INFO - allennlp.data.dataset_readers.reading_comprehension.squad - Reading the dataset
100%|##########| 55/55 [00:00<00:00, 181.70it/s]
2018-02-17 19:21:37,379 - INFO - allennlp.common.params - validation_data_path = /Users/daniel/Desktop/train_all_RemediaOnly_squadFormat.json
2018-02-17 19:21:37,379 - INFO - allennlp.commands.train - Reading validation data from /Users/daniel/Desktop/train_all_RemediaOnly_squadFormat.json
2018-02-17 19:21:37,379 - INFO - allennlp.data.dataset_readers.reading_comprehension.squad - Reading file at /Users/daniel/Desktop/train_all_RemediaOnly_squadFormat.json
2018-02-17 19:21:37,380 - INFO - allennlp.data.dataset_readers.reading_comprehension.squad - Reading the dataset
100%|##########| 55/55 [00:00<00:00, 222.10it/s]
2018-02-17 19:21:37,629 - INFO - allennlp.common.params - test_data_path = None
2018-02-17 19:21:37,630 - INFO - allennlp.commands.train - Creating a vocabulary using validation, train data.
2018-02-17 19:21:37,631 - INFO - allennlp.common.params - vocabulary.directory_path = None
2018-02-17 19:21:37,631 - INFO - allennlp.common.params - vocabulary.min_count = 1
2018-02-17 19:21:37,631 - INFO - allennlp.common.params - vocabulary.max_vocab_size = None
2018-02-17 19:21:37,631 - INFO - allennlp.common.params - vocabulary.non_padded_namespaces = ('*tags', '*labels')
2018-02-17 19:21:37,631 - INFO - allennlp.common.params - vocabulary.only_include_pretrained_words = False
2018-02-17 19:21:37,631 - INFO - allennlp.data.vocabulary - Fitting token dictionary from dataset.
100%|##########| 544/544 [00:01<00:00, 354.94it/s]
2018-02-17 19:21:39,172 - WARNING - root - vocabulary serialization directory /Users/daniel/vocabulary is not empty
2018-02-17 19:21:39,182 - INFO - allennlp.common.params - model.type = bidaf
2018-02-17 19:21:39,183 - INFO - allennlp.common.params - model.text_field_embedder.type = basic
2018-02-17 19:21:39,183 - INFO - allennlp.common.params - model.text_field_embedder.tokens.type = embedding
2018-02-17 19:21:39,183 - INFO - allennlp.common.params - model.text_field_embedder.tokens.num_embeddings = None
2018-02-17 19:21:39,183 - INFO - allennlp.common.params - model.text_field_embedder.tokens.vocab_namespace = tokens
2018-02-17 19:21:39,183 - INFO - allennlp.common.params - model.text_field_embedder.tokens.embedding_dim = 100
2018-02-17 19:21:39,184 - INFO - allennlp.common.params - model.text_field_embedder.tokens.pretrained_file = https://s3-us-west-2.amazonaws.com/allennlp/datasets/glove/glove.6B.100d.txt.gz
2018-02-17 19:21:39,184 - INFO - allennlp.common.params - model.text_field_embedder.tokens.projection_dim = None
2018-02-17 19:21:39,184 - INFO - allennlp.common.params - model.text_field_embedder.tokens.trainable = False
2018-02-17 19:21:39,184 - INFO - allennlp.common.params - model.text_field_embedder.tokens.padding_index = None
2018-02-17 19:21:39,184 - INFO - allennlp.common.params - model.text_field_embedder.tokens.max_norm = None
2018-02-17 19:21:39,184 - INFO - allennlp.common.params - model.text_field_embedder.tokens.norm_type = 2.0
2018-02-17 19:21:39,184 - INFO - allennlp.common.params - model.text_field_embedder.tokens.scale_grad_by_freq = False
2018-02-17 19:21:39,184 - INFO - allennlp.common.params - model.text_field_embedder.tokens.sparse = False
2018-02-17 19:21:39,185 - INFO - allennlp.modules.token_embedders.embedding - Reading embeddings from file
2018-02-17 19:21:46,334 - INFO - allennlp.modules.token_embedders.embedding - Initializing pre-trained embedding layer
2018-02-17 19:21:46,355 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.type = character_encoding
2018-02-17 19:21:46,355 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.num_embeddings = 262
2018-02-17 19:21:46,355 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.vocab_namespace = token_characters
2018-02-17 19:21:46,355 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.embedding_dim = 16
2018-02-17 19:21:46,355 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.pretrained_file = None
2018-02-17 19:21:46,355 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.projection_dim = None
2018-02-17 19:21:46,355 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.trainable = True
2018-02-17 19:21:46,355 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.padding_index = None
2018-02-17 19:21:46,356 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.max_norm = None
2018-02-17 19:21:46,356 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.norm_type = 2.0
2018-02-17 19:21:46,356 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.scale_grad_by_freq = False
2018-02-17 19:21:46,356 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.embedding.sparse = False
2018-02-17 19:21:46,356 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.type = cnn
2018-02-17 19:21:46,356 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.embedding_dim = 16
2018-02-17 19:21:46,357 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.output_dim = None
2018-02-17 19:21:46,357 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.num_filters = 100
2018-02-17 19:21:46,357 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.conv_layer_activation = relu
2018-02-17 19:21:46,357 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.encoder.ngram_filter_sizes = [5]
2018-02-17 19:21:46,359 - INFO - allennlp.common.params - model.text_field_embedder.token_characters.dropout = 0.2
2018-02-17 19:21:46,359 - INFO - allennlp.common.params - model.num_highway_layers = 2
2018-02-17 19:21:46,360 - INFO - allennlp.common.params - model.phrase_layer.type = lstm
2018-02-17 19:21:46,360 - INFO - allennlp.common.params - model.phrase_layer.batch_first = True
2018-02-17 19:21:46,360 - INFO - allennlp.common.params - Converting Params object to dict; logging of default values will not occur when dictionary parameters are used subsequently.
2018-02-17 19:21:46,360 - INFO - allennlp.common.params - CURRENTLY DEFINED PARAMETERS: 
2018-02-17 19:21:46,360 - INFO - allennlp.common.params - model.phrase_layer.bidirectional = True
2018-02-17 19:21:46,361 - INFO - allennlp.common.params - model.phrase_layer.input_size = 200
2018-02-17 19:21:46,361 - INFO - allennlp.common.params - model.phrase_layer.hidden_size = 100
2018-02-17 19:21:46,361 - INFO - allennlp.common.params - model.phrase_layer.num_layers = 1
2018-02-17 19:21:46,361 - INFO - allennlp.common.params - model.phrase_layer.dropout = 0.2
2018-02-17 19:21:46,361 - INFO - allennlp.common.params - model.phrase_layer.batch_first = True
2018-02-17 19:21:46,364 - INFO - allennlp.common.params - model.similarity_function.type = linear
2018-02-17 19:21:46,364 - INFO - allennlp.common.params - model.similarity_function.tensor_1_dim = 200
2018-02-17 19:21:46,364 - INFO - allennlp.common.params - model.similarity_function.tensor_2_dim = 200
2018-02-17 19:21:46,364 - INFO - allennlp.common.params - model.similarity_function.combination = x,y,x*y
2018-02-17 19:21:46,365 - INFO - allennlp.common.params - model.similarity_function.activation = linear
2018-02-17 19:21:46,365 - INFO - allennlp.common.params - model.modeling_layer.type = lstm
2018-02-17 19:21:46,365 - INFO - allennlp.common.params - model.modeling_layer.batch_first = True
2018-02-17 19:21:46,365 - INFO - allennlp.common.params - Converting Params object to dict; logging of default values will not occur when dictionary parameters are used subsequently.
2018-02-17 19:21:46,365 - INFO - allennlp.common.params - CURRENTLY DEFINED PARAMETERS: 
2018-02-17 19:21:46,366 - INFO - allennlp.common.params - model.modeling_layer.bidirectional = True
2018-02-17 19:21:46,366 - INFO - allennlp.common.params - model.modeling_layer.input_size = 800
2018-02-17 19:21:46,366 - INFO - allennlp.common.params - model.modeling_layer.hidden_size = 100
2018-02-17 19:21:46,366 - INFO - allennlp.common.params - model.modeling_layer.num_layers = 2
2018-02-17 19:21:46,366 - INFO - allennlp.common.params - model.modeling_layer.dropout = 0.2
2018-02-17 19:21:46,366 - INFO - allennlp.common.params - model.modeling_layer.batch_first = True
2018-02-17 19:21:46,376 - INFO - allennlp.common.params - model.span_end_encoder.type = lstm
2018-02-17 19:21:46,377 - INFO - allennlp.common.params - model.span_end_encoder.batch_first = True
2018-02-17 19:21:46,377 - INFO - allennlp.common.params - Converting Params object to dict; logging of default values will not occur when dictionary parameters are used subsequently.
2018-02-17 19:21:46,377 - INFO - allennlp.common.params - CURRENTLY DEFINED PARAMETERS: 
2018-02-17 19:21:46,377 - INFO - allennlp.common.params - model.span_end_encoder.bidirectional = True
2018-02-17 19:21:46,377 - INFO - allennlp.common.params - model.span_end_encoder.input_size = 1400
2018-02-17 19:21:46,377 - INFO - allennlp.common.params - model.span_end_encoder.hidden_size = 100
2018-02-17 19:21:46,378 - INFO - allennlp.common.params - model.span_end_encoder.num_layers = 1
2018-02-17 19:21:46,378 - INFO - allennlp.common.params - model.span_end_encoder.dropout = 0.2
2018-02-17 19:21:46,378 - INFO - allennlp.common.params - model.span_end_encoder.batch_first = True
2018-02-17 19:21:46,392 - INFO - allennlp.common.params - model.dropout = 0.2
2018-02-17 19:21:46,392 - INFO - allennlp.common.params - model.initializer = []
2018-02-17 19:21:46,392 - INFO - allennlp.common.params - model.regularizer = []
2018-02-17 19:21:46,392 - INFO - allennlp.common.params - model.mask_lstms = True
2018-02-17 19:21:46,395 - INFO - allennlp.nn.initializers - Initializing parameters
2018-02-17 19:21:46,396 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code
2018-02-17 19:21:46,396 - INFO - allennlp.nn.initializers -    _highway_layer._module._layers.0.bias
2018-02-17 19:21:46,396 - INFO - allennlp.nn.initializers -    _highway_layer._module._layers.0.weight
2018-02-17 19:21:46,396 - INFO - allennlp.nn.initializers -    _highway_layer._module._layers.1.bias
2018-02-17 19:21:46,397 - INFO - allennlp.nn.initializers -    _highway_layer._module._layers.1.weight
2018-02-17 19:21:46,397 - INFO - allennlp.nn.initializers -    _matrix_attention._similarity_function._bias
2018-02-17 19:21:46,397 - INFO - allennlp.nn.initializers -    _matrix_attention._similarity_function._weight_vector
2018-02-17 19:21:46,397 - INFO - allennlp.nn.initializers -    _modeling_layer._module.bias_hh_l0
2018-02-17 19:21:46,398 - INFO - allennlp.nn.initializers -    _modeling_layer._module.bias_hh_l0_reverse
2018-02-17 19:21:46,398 - INFO - allennlp.nn.initializers -    _modeling_layer._module.bias_hh_l1
2018-02-17 19:21:46,398 - INFO - allennlp.nn.initializers -    _modeling_layer._module.bias_hh_l1_reverse
2018-02-17 19:21:46,399 - INFO - allennlp.nn.initializers -    _modeling_layer._module.bias_ih_l0
2018-02-17 19:21:46,399 - INFO - allennlp.nn.initializers -    _modeling_layer._module.bias_ih_l0_reverse
2018-02-17 19:21:46,399 - INFO - allennlp.nn.initializers -    _modeling_layer._module.bias_ih_l1
2018-02-17 19:21:46,399 - INFO - allennlp.nn.initializers -    _modeling_layer._module.bias_ih_l1_reverse
2018-02-17 19:21:46,399 - INFO - allennlp.nn.initializers -    _modeling_layer._module.weight_hh_l0
2018-02-17 19:21:46,399 - INFO - allennlp.nn.initializers -    _modeling_layer._module.weight_hh_l0_reverse
2018-02-17 19:21:46,400 - INFO - allennlp.nn.initializers -    _modeling_layer._module.weight_hh_l1
2018-02-17 19:21:46,400 - INFO - allennlp.nn.initializers -    _modeling_layer._module.weight_hh_l1_reverse
2018-02-17 19:21:46,400 - INFO - allennlp.nn.initializers -    _modeling_layer._module.weight_ih_l0
2018-02-17 19:21:46,400 - INFO - allennlp.nn.initializers -    _modeling_layer._module.weight_ih_l0_reverse
2018-02-17 19:21:46,400 - INFO - allennlp.nn.initializers -    _modeling_layer._module.weight_ih_l1
2018-02-17 19:21:46,401 - INFO - allennlp.nn.initializers -    _modeling_layer._module.weight_ih_l1_reverse
2018-02-17 19:21:46,401 - INFO - allennlp.nn.initializers -    _phrase_layer._module.bias_hh_l0
2018-02-17 19:21:46,401 - INFO - allennlp.nn.initializers -    _phrase_layer._module.bias_hh_l0_reverse
2018-02-17 19:21:46,401 - INFO - allennlp.nn.initializers -    _phrase_layer._module.bias_ih_l0
2018-02-17 19:21:46,401 - INFO - allennlp.nn.initializers -    _phrase_layer._module.bias_ih_l0_reverse
2018-02-17 19:21:46,402 - INFO - allennlp.nn.initializers -    _phrase_layer._module.weight_hh_l0
2018-02-17 19:21:46,402 - INFO - allennlp.nn.initializers -    _phrase_layer._module.weight_hh_l0_reverse
2018-02-17 19:21:46,402 - INFO - allennlp.nn.initializers -    _phrase_layer._module.weight_ih_l0
2018-02-17 19:21:46,402 - INFO - allennlp.nn.initializers -    _phrase_layer._module.weight_ih_l0_reverse
2018-02-17 19:21:46,402 - INFO - allennlp.nn.initializers -    _span_end_encoder._module.bias_hh_l0
2018-02-17 19:21:46,402 - INFO - allennlp.nn.initializers -    _span_end_encoder._module.bias_hh_l0_reverse
2018-02-17 19:21:46,402 - INFO - allennlp.nn.initializers -    _span_end_encoder._module.bias_ih_l0
2018-02-17 19:21:46,402 - INFO - allennlp.nn.initializers -    _span_end_encoder._module.bias_ih_l0_reverse
2018-02-17 19:21:46,403 - INFO - allennlp.nn.initializers -    _span_end_encoder._module.weight_hh_l0
2018-02-17 19:21:46,403 - INFO - allennlp.nn.initializers -    _span_end_encoder._module.weight_hh_l0_reverse
2018-02-17 19:21:46,403 - INFO - allennlp.nn.initializers -    _span_end_encoder._module.weight_ih_l0
2018-02-17 19:21:46,403 - INFO - allennlp.nn.initializers -    _span_end_encoder._module.weight_ih_l0_reverse
2018-02-17 19:21:46,403 - INFO - allennlp.nn.initializers -    _span_end_predictor._module.bias
2018-02-17 19:21:46,403 - INFO - allennlp.nn.initializers -    _span_end_predictor._module.weight
2018-02-17 19:21:46,403 - INFO - allennlp.nn.initializers -    _span_start_predictor._module.bias
2018-02-17 19:21:46,403 - INFO - allennlp.nn.initializers -    _span_start_predictor._module.weight
2018-02-17 19:21:46,403 - INFO - allennlp.nn.initializers -    _text_field_embedder.token_embedder_token_characters._embedding._module.weight
2018-02-17 19:21:46,404 - INFO - allennlp.nn.initializers -    _text_field_embedder.token_embedder_token_characters._encoder._module.conv_layer_0.bias
2018-02-17 19:21:46,404 - INFO - allennlp.nn.initializers -    _text_field_embedder.token_embedder_token_characters._encoder._module.conv_layer_0.weight
2018-02-17 19:21:46,404 - INFO - allennlp.nn.initializers -    _text_field_embedder.token_embedder_tokens.weight
2018-02-17 19:21:46,404 - INFO - allennlp.common.params - iterator.type = bucket
2018-02-17 19:21:46,404 - INFO - allennlp.common.params - iterator.sorting_keys = [['passage', 'num_tokens'], ['question', 'num_tokens']]
2018-02-17 19:21:46,404 - INFO - allennlp.common.params - iterator.padding_noise = 0.1
2018-02-17 19:21:46,404 - INFO - allennlp.common.params - iterator.biggest_batch_first = False
2018-02-17 19:21:46,404 - INFO - allennlp.common.params - iterator.batch_size = 40
2018-02-17 19:21:46,404 - INFO - allennlp.data.dataset - Indexing dataset
100%|##########| 272/272 [00:00<00:00, 292.86it/s]
2018-02-17 19:21:47,334 - INFO - allennlp.data.dataset - Indexing dataset
100%|##########| 272/272 [00:00<00:00, 312.04it/s]
2018-02-17 19:21:48,219 - INFO - allennlp.common.params - trainer.patience = 10
2018-02-17 19:21:48,219 - INFO - allennlp.common.params - trainer.validation_metric = +em
2018-02-17 19:21:48,219 - INFO - allennlp.common.params - trainer.num_epochs = 10
2018-02-17 19:21:48,219 - INFO - allennlp.common.params - trainer.cuda_device = 0
2018-02-17 19:21:48,220 - INFO - allennlp.common.params - trainer.grad_norm = 5.0
2018-02-17 19:21:48,220 - INFO - allennlp.common.params - trainer.grad_clipping = None
Traceback (most recent call last):
  File "/usr/local/Cellar/python3/3.6.4_2/Frameworks/Python.framework/Versions/3.6/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/local/Cellar/python3/3.6.4_2/Frameworks/Python.framework/Versions/3.6/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.6/site-packages/allennlp/run.py", line 13, in <module>
    main(prog="python -m allennlp.run")
  File "/usr/local/lib/python3.6/site-packages/allennlp/commands/__init__.py", line 77, in main
    args.func(args)
  File "/usr/local/lib/python3.6/site-packages/allennlp/commands/train.py", line 73, in train_model_from_args
    train_model_from_file(args.param_path, args.serialization_dir)
  File "/usr/local/lib/python3.6/site-packages/allennlp/commands/train.py", line 89, in train_model_from_file
    return train_model(params, serialization_dir)
  File "/usr/local/lib/python3.6/site-packages/allennlp/commands/train.py", line 174, in train_model
    trainer_params)
  File "/usr/local/lib/python3.6/site-packages/allennlp/training/trainer.py", line 517, in from_params
    model = model.cuda(cuda_device)
  File "/usr/local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 216, in cuda
    return self._apply(lambda t: t.cuda(device))
  File "/usr/local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 146, in _apply
    module._apply(fn)
  File "/usr/local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 146, in _apply
    module._apply(fn)
  File "/usr/local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 152, in _apply
    param.data = fn(param.data)
  File "/usr/local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 216, in <lambda>
    return self._apply(lambda t: t.cuda(device))
  File "/usr/local/lib/python3.6/site-packages/torch/_utils.py", line 61, in _cuda
    with torch.cuda.device(device):
  File "/usr/local/lib/python3.6/site-packages/torch/cuda/__init__.py", line 186, in __enter__
    _lazy_init()
  File "/usr/local/lib/python3.6/site-packages/torch/cuda/__init__.py", line 120, in _lazy_init
    _check_driver()
  File "/usr/local/lib/python3.6/site-packages/torch/cuda/__init__.py", line 55, in _check_driver
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

In the log, note the mention of "trainer.cuda_device = 0".

@DeNeutoy (Contributor) commented:

Ah - CUDA device IDs are zero-indexed, so "cuda_device": 0 means "use the first GPU". To run on the CPU you actually need to set it to -1 (or remove the flag entirely from your config, since the default value for this flag is -1).
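For reference, a minimal sketch of the relevant trainer section of the config, assuming the standard BiDAF layout shown in the log above (the other values are illustrative; only cuda_device matters for this error):

```json
{
  "trainer": {
    "num_epochs": 10,
    "patience": 10,
    "cuda_device": -1
  }
}
```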

@danyaljj (Author) commented:

Oh yikes

@donpaul999 commented:

Hi! I have the same error, but I can't manage to solve it. I'm on macOS too.
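For anyone hitting this on a CPU-only Mac: the error means the installed PyTorch build has no CUDA support, so any attempt to move the model to a GPU fails. A small sketch of a helper (hypothetical, not part of AllenNLP) that picks a safe value to put in the config:

```python
def pick_cuda_device():
    """Return a safe cuda_device value for the trainer config:
    0 (the first GPU) only if PyTorch is installed with a usable
    CUDA build, otherwise -1 (CPU)."""
    try:
        import torch
        if torch.cuda.is_available():
            return 0
    except ImportError:
        # PyTorch not installed at all; CPU is the only option.
        pass
    return -1

print(pick_cuda_device())
```

On a Mac without an NVIDIA GPU this returns -1, which is the value cuda_device should have (or the flag can simply be omitted, since -1 is the default).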
