C:\Users\chuan\python37\Scripts\python.exe C:/Users/chuan/Desktop/ABSA-PyTorch/train.py
loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at C:\Users\chuan\.cache\torch\pytorch_transformers\26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at C:\Users\chuan\.cache\torch\pytorch_transformers\4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.bf3b9ea126d8c0001ee8a1e8b92229871d06d36d8808208cc2449280da87785c
Model config {
  "attention_probs_dropout_prob": 0.1,
  "finetuning_task": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "num_labels": 2,
  "output_attentions": false,
  "output_hidden_states": false,
  "pruned_heads": {},
  "torchscript": false,
  "type_vocab_size": 2,
  "vocab_size": 30522
}

https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin not found in cache or force_download set to True, downloading to C:\Users\chuan\AppData\Local\Temp\tmpwlopmhrz
100%|██████████| 440473133/440473133 [11:57<00:00, 613680.01B/s]
copying C:\Users\chuan\AppData\Local\Temp\tmpwlopmhrz to cache at C:\Users\chuan\.cache\torch\pytorch_transformers\aa1ef1aede4482d0dbcd4d52baad8ae300e60902e88fcb0bebdec09afd232066.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157
creating metadata file for C:\Users\chuan\.cache\torch\pytorch_transformers\aa1ef1aede4482d0dbcd4d52baad8ae300e60902e88fcb0bebdec09afd232066.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157
removing temp file C:\Users\chuan\AppData\Local\Temp\tmpwlopmhrz
loading weights file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin from cache at C:\Users\chuan\.cache\torch\pytorch_transformers\aa1ef1aede4482d0dbcd4d52baad8ae300e60902e88fcb0bebdec09afd232066.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157
cuda memory allocated: 452890112
n_trainable_params: 112937661, n_nontrainable_params: 0
> training arguments:
>>> model_name: aen_bert
>>> dataset: restaurant
>>> optimizer:
>>> initializer:
>>> learning_rate: 2e-05
>>> dropout: 0.1
>>> l2reg: 0.01
>>> num_epoch: 10
>>> batch_size: 16
>>> log_step: 5
>>> embed_dim: 300
>>> hidden_dim: 300
>>> bert_dim: 768
>>> pretrained_bert_name: bert-base-uncased
>>> max_seq_len: 80
>>> polarities_dim: 3
>>> hops: 3
>>> device: cuda
>>> seed: None
>>> valset_ratio: 0
>>> local_context_focus: cdm
>>> SRD: 3
>>> model_class:
>>> dataset_file: {'train': './datasets/semeval14/Restaurants_Train.xml.seg', 'test': './datasets/semeval14/Restaurants_Test_Gold.xml.seg'}
>>> inputs_cols: ['text_raw_bert_indices', 'aspect_bert_indices']
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
epoch: 0
C:\Program Files\Python37\lib\site-packages\torch\nn\functional.py:1339: UserWarning: nn.functional.tanh is deprecated. Use torch.tanh instead.
  warnings.warn("nn.functional.tanh is deprecated. Use torch.tanh instead.")
C:\Users\chuan\Desktop\ABSA-PyTorch\models\aen.py:120: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
  context_len = torch.tensor(context_len, dtype=torch.float).to(self.opt.device)
C:\Users\chuan\Desktop\ABSA-PyTorch\models\aen.py:121: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
  target_len = torch.tensor(target_len, dtype=torch.float).to(self.opt.device)
loss: 1.2213, acc: 0.4500
loss: 1.0841, acc: 0.5188
loss: 1.0744, acc: 0.5250
loss: 1.0750, acc: 0.4969
loss: 1.0166, acc: 0.5400
loss: 1.0194, acc: 0.5563
loss: 1.0168, acc: 0.5643
loss: 1.0129, acc: 0.5672
loss: 0.9995, acc: 0.5764
loss: 0.9976, acc: 0.5737
loss: 0.9865, acc: 0.5830
loss: 0.9755, acc: 0.5875
loss: 0.9603, acc: 0.5942
loss: 0.9405, acc: 0.6027
loss: 0.9283, acc: 0.6050
loss: 0.9243, acc: 0.6039
loss: 0.9179, acc: 0.6088
loss: 0.9029, acc: 0.6139
loss: 0.8975, acc: 0.6138
loss: 0.8862, acc: 0.6188
loss: 0.8794, acc: 0.6220
loss: 0.8752, acc: 0.6250
loss: 0.8703, acc: 0.6283
loss: 0.8634, acc: 0.6323
loss: 0.8564, acc: 0.6360
loss: 0.8537, acc: 0.6370
loss: 0.8479, acc: 0.6389
loss: 0.8509, acc: 0.6388
loss: 0.8473, acc: 0.6427
loss: 0.8413, acc: 0.6479
loss: 0.8353, acc: 0.6500
loss: 0.8266, acc: 0.6523
loss: 0.8204, acc: 0.6561
loss: 0.8177, acc: 0.6566
loss: 0.8136, acc: 0.6582
loss: 0.8075, acc: 0.6604
loss: 0.8010, acc: 0.6622
loss: 0.7975, acc: 0.6635
loss: 0.7915, acc: 0.6673
loss: 0.7877, acc: 0.6700
loss: 0.7837, acc: 0.6732
loss: 0.7764, acc: 0.6774
loss: 0.7719, acc: 0.6797
loss: 0.7703, acc: 0.6804
loss: 0.7683, acc: 0.6806
> val_acc: 0.7848, val_f1: 0.6097
>> saved: state_dict/aen_bert_restaurant_val_acc0.7848
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
epoch: 1
loss: 0.4682, acc: 0.8125
loss: 0.5179, acc: 0.7778
loss: 0.5050, acc: 0.8036
loss: 0.5176, acc: 0.7895
loss: 0.5547, acc: 0.7760
loss: 0.5658, acc: 0.7629
loss: 0.5527, acc: 0.7721
loss: 0.5593, acc: 0.7660
loss: 0.5706, acc: 0.7642
loss: 0.5763, acc: 0.7679
loss: 0.5858, acc: 0.7662
loss: 0.5843, acc: 0.7691
loss: 0.5873, acc: 0.7666
loss: 0.5859, acc: 0.7672
loss: 0.5853, acc: 0.7677
loss: 0.5709, acc: 0.7737
loss: 0.5672, acc: 0.7738
loss: 0.5686, acc: 0.7739
loss: 0.5657, acc: 0.7759
loss: 0.5598, acc: 0.7784
loss: 0.5624, acc: 0.7770
loss: 0.5590, acc: 0.7804
loss: 0.5582, acc: 0.7818
loss: 0.5610, acc: 0.7820
loss: 0.5609, acc: 0.7807
loss: 0.5591, acc: 0.7810
loss: 0.5617, acc: 0.7803
loss: 0.5579, acc: 0.7819
loss: 0.5587, acc: 0.7804
loss: 0.5541, acc: 0.7819
loss: 0.5497, acc: 0.7841
loss: 0.5494, acc: 0.7854
loss: 0.5465, acc: 0.7870
loss: 0.5477, acc: 0.7870
loss: 0.5459, acc: 0.7884
loss: 0.5465, acc: 0.7870
loss: 0.5451, acc: 0.7877
loss: 0.5439, acc: 0.7877
loss: 0.5445, acc: 0.7867
loss: 0.5468, acc: 0.7855
loss: 0.5478, acc: 0.7846
loss: 0.5481, acc: 0.7841
loss: 0.5515, acc: 0.7836
loss: 0.5491, acc: 0.7842
loss: 0.5494, acc: 0.7840
> val_acc: 0.8009, val_f1: 0.6679
>> saved: state_dict/aen_bert_restaurant_val_acc0.8009
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
epoch: 2
loss: 0.5613, acc: 0.7708
loss: 0.5198, acc: 0.7656
loss: 0.5128, acc: 0.7644
loss: 0.4801, acc: 0.7986
loss: 0.4553, acc: 0.8071
loss: 0.4462, acc: 0.8147
loss: 0.4397, acc: 0.8220
loss: 0.4442, acc: 0.8191
loss: 0.4394, acc: 0.8212
loss: 0.4320, acc: 0.8242
loss: 0.4283, acc: 0.8267
loss: 0.4252, acc: 0.8297
loss: 0.4400, acc: 0.8234
loss: 0.4410, acc: 0.8244
loss: 0.4413, acc: 0.8262
loss: 0.4406, acc: 0.8269
loss: 0.4368, acc: 0.8283
loss: 0.4284, acc: 0.8288
loss: 0.4290, acc: 0.8286
loss: 0.4423, acc: 0.8240
loss: 0.4496, acc: 0.8192
loss: 0.4598, acc: 0.8154
loss: 0.4600, acc: 0.8175
loss: 0.4629, acc: 0.8178
loss: 0.4622, acc: 0.8186
loss: 0.4663, acc: 0.8159
loss: 0.4640, acc: 0.8172
loss: 0.4665, acc: 0.8143
loss: 0.4659, acc: 0.8142
loss: 0.4724, acc: 0.8117
loss: 0.4741, acc: 0.8109
loss: 0.4750, acc: 0.8113
loss: 0.4735, acc: 0.8113
loss: 0.4746, acc: 0.8095
loss: 0.4798, acc: 0.8089
loss: 0.4763, acc: 0.8104
loss: 0.4775, acc: 0.8091
loss: 0.4807, acc: 0.8075
loss: 0.4824, acc: 0.8073
loss: 0.4860, acc: 0.8059
loss: 0.4889, acc: 0.8039
loss: 0.4900, acc: 0.8035
loss: 0.4883, acc: 0.8046
loss: 0.4897, acc: 0.8036
loss: 0.4906, acc: 0.8027
> val_acc: 0.8107, val_f1: 0.7287
>> saved: state_dict/aen_bert_restaurant_val_acc0.8107
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
epoch: 3
loss: 0.6435, acc: 0.6562
loss: 0.4175, acc: 0.8214
loss: 0.4176, acc: 0.8333
loss: 0.4351, acc: 0.8199
loss: 0.4382, acc: 0.8153
loss: 0.4015, acc: 0.8333
loss: 0.4070, acc: 0.8359
loss: 0.4010, acc: 0.8361
loss: 0.4231, acc: 0.8259
loss: 0.4081, acc: 0.8338
loss: 0.3972, acc: 0.8389
loss: 0.3895, acc: 0.8443
loss: 0.4001, acc: 0.8397
loss: 0.3966, acc: 0.8386
loss: 0.3932, acc: 0.8403
loss: 0.3954, acc: 0.8385
loss: 0.4020, acc: 0.8354
loss: 0.4100, acc: 0.8297
loss: 0.4074, acc: 0.8329
loss: 0.4081, acc: 0.8344
loss: 0.4133, acc: 0.8315
loss: 0.4166, acc: 0.8289
loss: 0.4144, acc: 0.8292
loss: 0.4188, acc: 0.8264
loss: 0.4218, acc: 0.8253
loss: 0.4195, acc: 0.8278
loss: 0.4221, acc: 0.8272
loss: 0.4214, acc: 0.8266
loss: 0.4216, acc: 0.8261
loss: 0.4266, acc: 0.8253
loss: 0.4282, acc: 0.8248
loss: 0.4241, acc: 0.8272
loss: 0.4215, acc: 0.8275
loss: 0.4204, acc: 0.8286
loss: 0.4210, acc: 0.8296
loss: 0.4240, acc: 0.8291
loss: 0.4251, acc: 0.8286
loss: 0.4261, acc: 0.8282
loss: 0.4270, acc: 0.8281
loss: 0.4277, acc: 0.8277
loss: 0.4279, acc: 0.8283
loss: 0.4279, acc: 0.8276
loss: 0.4275, acc: 0.8272
loss: 0.4255, acc: 0.8283
loss: 0.4304, acc: 0.8263
> val_acc: 0.8045, val_f1: 0.7087
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
epoch: 4
loss: 0.5197, acc: 0.7500
loss: 0.3725, acc: 0.8542
loss: 0.3921, acc: 0.8523
loss: 0.3860, acc: 0.8594
loss: 0.3940, acc: 0.8452
loss: 0.3950, acc: 0.8462
loss: 0.3890, acc: 0.8427
loss: 0.3777, acc: 0.8524
loss: 0.3634, acc: 0.8567
loss: 0.3536, acc: 0.8614
loss: 0.3584, acc: 0.8615
loss: 0.3643, acc: 0.8549
loss: 0.3757, acc: 0.8535
loss: 0.3747, acc: 0.8542
loss: 0.3742, acc: 0.8530
loss: 0.3667, acc: 0.8544
loss: 0.3659, acc: 0.8557
loss: 0.3775, acc: 0.8496
loss: 0.3817, acc: 0.8468
loss: 0.3828, acc: 0.8464
loss: 0.3816, acc: 0.8465
loss: 0.3896, acc: 0.8443
loss: 0.3892, acc: 0.8435
loss: 0.3920, acc: 0.8421
loss: 0.3946, acc: 0.8409
loss: 0.3887, acc: 0.8428
loss: 0.3878, acc: 0.8416
loss: 0.3872, acc: 0.8419
loss: 0.3885, acc: 0.8426
loss: 0.3861, acc: 0.8442
loss: 0.3825, acc: 0.8460
loss: 0.3796, acc: 0.8474
loss: 0.3819, acc: 0.8455
loss: 0.3831, acc: 0.8460
loss: 0.3852, acc: 0.8436
loss: 0.3846, acc: 0.8445
loss: 0.3859, acc: 0.8436
loss: 0.3859, acc: 0.8438
loss: 0.3880, acc: 0.8449
loss: 0.3896, acc: 0.8447
loss: 0.3922, acc: 0.8430
loss: 0.3951, acc: 0.8404
loss: 0.4025, acc: 0.8365
loss: 0.4056, acc: 0.8348
loss: 0.4055, acc: 0.8340
loss: 0.4054, acc: 0.8331
> val_acc: 0.8161, val_f1: 0.7049
>> saved: state_dict/aen_bert_restaurant_val_acc0.8161
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
epoch: 5
loss: 0.3374, acc: 0.9000
loss: 0.3085, acc: 0.8875
loss: 0.3090, acc: 0.8833
loss: 0.3287, acc: 0.8781
loss: 0.3636, acc: 0.8650
loss: 0.3758, acc: 0.8562
loss: 0.3770, acc: 0.8536
loss: 0.3972, acc: 0.8391
loss: 0.3964, acc: 0.8333
loss: 0.3890, acc: 0.8375
loss: 0.4025, acc: 0.8295
loss: 0.4074, acc: 0.8281
loss: 0.4140, acc: 0.8298
loss: 0.4074, acc: 0.8321
loss: 0.4095, acc: 0.8325
loss: 0.4069, acc: 0.8336
loss: 0.4119, acc: 0.8316
loss: 0.4247, acc: 0.8264
loss: 0.4225, acc: 0.8276
loss: 0.4256, acc: 0.8287
loss: 0.4300, acc: 0.8262
loss: 0.4296, acc: 0.8273
loss: 0.4276, acc: 0.8283
loss: 0.4244, acc: 0.8292
loss: 0.4224, acc: 0.8305
loss: 0.4199, acc: 0.8313
loss: 0.4131, acc: 0.8329
loss: 0.4124, acc: 0.8335
loss: 0.4134, acc: 0.8328
loss: 0.4111, acc: 0.8337
loss: 0.4067, acc: 0.8355
loss: 0.4068, acc: 0.8352
loss: 0.4019, acc: 0.8379
loss: 0.4013, acc: 0.8386
loss: 0.3973, acc: 0.8407
loss: 0.3940, acc: 0.8417
loss: 0.3898, acc: 0.8429
loss: 0.3899, acc: 0.8438
loss: 0.3931, acc: 0.8420
loss: 0.3925, acc: 0.8416
loss: 0.3925, acc: 0.8402
loss: 0.3923, acc: 0.8411
loss: 0.3938, acc: 0.8416
loss: 0.3935, acc: 0.8415
loss: 0.3956, acc: 0.8403
> val_acc: 0.7348, val_f1: 0.6774
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
epoch: 6
loss: 0.4471, acc: 0.7812
loss: 0.3396, acc: 0.8472
loss: 0.3602, acc: 0.8527
loss: 0.3629, acc: 0.8586
loss: 0.3521, acc: 0.8672
loss: 0.3501, acc: 0.8707
loss: 0.3526, acc: 0.8640
loss: 0.3593, acc: 0.8526
loss: 0.3573, acc: 0.8537
loss: 0.3481, acc: 0.8546
loss: 0.3443, acc: 0.8565
loss: 0.3483, acc: 0.8581
loss: 0.3500, acc: 0.8545
loss: 0.3473, acc: 0.8542
loss: 0.3486, acc: 0.8539
loss: 0.3475, acc: 0.8560
loss: 0.3525, acc: 0.8519
loss: 0.3483, acc: 0.8553
loss: 0.3464, acc: 0.8570
loss: 0.3471, acc: 0.8573
loss: 0.3516, acc: 0.8552
loss: 0.3504, acc: 0.8549
loss: 0.3550, acc: 0.8536
loss: 0.3603, acc: 0.8498
loss: 0.3641, acc: 0.8483
loss: 0.3647, acc: 0.8484
loss: 0.3608, acc: 0.8503
loss: 0.3597, acc: 0.8498
loss: 0.3602, acc: 0.8498
loss: 0.3709, acc: 0.8452
loss: 0.3743, acc: 0.8446
loss: 0.3775, acc: 0.8436
loss: 0.3776, acc: 0.8438
loss: 0.3808, acc: 0.8428
loss: 0.3816, acc: 0.8423
loss: 0.3863, acc: 0.8408
loss: 0.3845, acc: 0.8414
loss: 0.3856, acc: 0.8403
loss: 0.3852, acc: 0.8405
loss: 0.3825, acc: 0.8420
loss: 0.3816, acc: 0.8428
loss: 0.3806, acc: 0.8430
loss: 0.3813, acc: 0.8429
loss: 0.3794, acc: 0.8442
loss: 0.3826, acc: 0.8429
> val_acc: 0.8116, val_f1: 0.6815
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
epoch: 7
loss: 0.4183, acc: 0.7917
loss: 0.4371, acc: 0.8125
loss: 0.4255, acc: 0.8173
loss: 0.4186, acc: 0.8125
loss: 0.4227, acc: 0.8098
loss: 0.4162, acc: 0.8147
loss: 0.3993, acc: 0.8201
loss: 0.4205, acc: 0.8174
loss: 0.4115, acc: 0.8212
loss: 0.4010, acc: 0.8281
loss: 0.3997, acc: 0.8278
loss: 0.4042, acc: 0.8211
loss: 0.3919, acc: 0.8274
loss: 0.3787, acc: 0.8336
loss: 0.3689, acc: 0.8390
loss: 0.3843, acc: 0.8365
loss: 0.3774, acc: 0.8389
loss: 0.3760, acc: 0.8402
loss: 0.3802, acc: 0.8387
loss: 0.3762, acc: 0.8406
loss: 0.3745, acc: 0.8422
loss: 0.3757, acc: 0.8432
loss: 0.3747, acc: 0.8440
loss: 0.3765, acc: 0.8406
loss: 0.3862, acc: 0.8359
loss: 0.3870, acc: 0.8364
loss: 0.3851, acc: 0.8360
loss: 0.3860, acc: 0.8370
loss: 0.3909, acc: 0.8357
loss: 0.3890, acc: 0.8366
loss: 0.3872, acc: 0.8362
loss: 0.3861, acc: 0.8362
loss: 0.3860, acc: 0.8359
loss: 0.3877, acc: 0.8348
loss: 0.3852, acc: 0.8363
loss: 0.3867, acc: 0.8360
loss: 0.3881, acc: 0.8357
loss: 0.3877, acc: 0.8364
loss: 0.3862, acc: 0.8368
loss: 0.3842, acc: 0.8368
loss: 0.3882, acc: 0.8359
loss: 0.3894, acc: 0.8353
loss: 0.3873, acc: 0.8360
loss: 0.3824, acc: 0.8377
loss: 0.3820, acc: 0.8377
> val_acc: 0.7768, val_f1: 0.6777
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
epoch: 8
loss: 0.2476, acc: 0.9062
loss: 0.3246, acc: 0.8839
loss: 0.3488, acc: 0.8802
loss: 0.3064, acc: 0.8934
loss: 0.2934, acc: 0.8949
loss: 0.3049, acc: 0.8866
loss: 0.2845, acc: 0.8945
loss: 0.2721, acc: 0.9003
loss: 0.3037, acc: 0.8869
loss: 0.3034, acc: 0.8856
loss: 0.3073, acc: 0.8834
loss: 0.3138, acc: 0.8794
loss: 0.3105, acc: 0.8790
loss: 0.3215, acc: 0.8759
loss: 0.3212, acc: 0.8750
loss: 0.3160, acc: 0.8766
loss: 0.3166, acc: 0.8735
loss: 0.3174, acc: 0.8750
loss: 0.3208, acc: 0.8750
loss: 0.3183, acc: 0.8756
loss: 0.3233, acc: 0.8713
loss: 0.3249, acc: 0.8703
loss: 0.3287, acc: 0.8689
loss: 0.3301, acc: 0.8665
loss: 0.3295, acc: 0.8663
loss: 0.3331, acc: 0.8627
loss: 0.3336, acc: 0.8627
loss: 0.3317, acc: 0.8631
loss: 0.3325, acc: 0.8636
loss: 0.3331, acc: 0.8627
loss: 0.3359, acc: 0.8610
loss: 0.3342, acc: 0.8623
loss: 0.3374, acc: 0.8603
loss: 0.3373, acc: 0.8604
loss: 0.3438, acc: 0.8568
loss: 0.3449, acc: 0.8559
loss: 0.3451, acc: 0.8554
loss: 0.3483, acc: 0.8539
loss: 0.3502, acc: 0.8535
loss: 0.3512, acc: 0.8534
loss: 0.3528, acc: 0.8530
loss: 0.3543, acc: 0.8511
loss: 0.3584, acc: 0.8494
loss: 0.3593, acc: 0.8488
loss: 0.3628, acc: 0.8474
> val_acc: 0.7821, val_f1: 0.6973
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
epoch: 9
loss: 0.3592, acc: 0.8750
loss: 0.3833, acc: 0.8021
loss: 0.3272, acc: 0.8466
loss: 0.3370, acc: 0.8633
loss: 0.3543, acc: 0.8601
loss: 0.3603, acc: 0.8534
loss: 0.3469, acc: 0.8528
loss: 0.3412, acc: 0.8542
loss: 0.3396, acc: 0.8537
loss: 0.3305, acc: 0.8560
loss: 0.3338, acc: 0.8529
loss: 0.3350, acc: 0.8527
loss: 0.3412, acc: 0.8463
loss: 0.3462, acc: 0.8400
loss: 0.3459, acc: 0.8433
loss: 0.3506, acc: 0.8429
loss: 0.3564, acc: 0.8403
loss: 0.3521, acc: 0.8423
loss: 0.3538, acc: 0.8400
loss: 0.3584, acc: 0.8372
loss: 0.3544, acc: 0.8422
loss: 0.3552, acc: 0.8426
loss: 0.3546, acc: 0.8440
loss: 0.3601, acc: 0.8416
loss: 0.3648, acc: 0.8378
loss: 0.3663, acc: 0.8393
loss: 0.3651, acc: 0.8397
loss: 0.3642, acc: 0.8401
loss: 0.3646, acc: 0.8409
loss: 0.3636, acc: 0.8408
loss: 0.3686, acc: 0.8394
loss: 0.3672, acc: 0.8409
loss: 0.3685, acc: 0.8405
loss: 0.3695, acc: 0.8404
loss: 0.3702, acc: 0.8403
loss: 0.3703, acc: 0.8398
loss: 0.3665, acc: 0.8419
loss: 0.3651, acc: 0.8421
loss: 0.3642, acc: 0.8420
loss: 0.3635, acc: 0.8415
loss: 0.3632, acc: 0.8417
loss: 0.3628, acc: 0.8416
loss: 0.3696, acc: 0.8383
loss: 0.3693, acc: 0.8380
loss: 0.3726, acc: 0.8388
loss: 0.3729, acc: 0.8390
> val_acc: 0.7866, val_f1: 0.6611
>> test_acc: 0.8161, test_f1: 0.7049

Process finished with exit code 0
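The `functional.py` warning printed at the start of epoch 0 is mechanical to resolve: `nn.functional.tanh` is deprecated in favor of `torch.tanh`, which is a drop-in replacement with identical values. A minimal sketch (the input values here are arbitrary, not taken from the run):

```python
import torch
import torch.nn.functional as F  # noqa: F401 -- shown only for the deprecated form

# Arbitrary example input; in the model this would be an attention score tensor.
x = torch.linspace(-2.0, 2.0, steps=5)

# Deprecated form that triggers the UserWarning in torch/nn/functional.py:
#   y = F.tanh(x)
# Recommended replacement -- same math, no warning:
y = torch.tanh(x)
```

Since the two functions compute the same values, the swap changes nothing about training behavior; it only silences the warning.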
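The two `aen.py` UserWarnings likewise point at a concrete fix: when the source is already a tensor, PyTorch recommends `sourceTensor.clone().detach()` over `torch.tensor(sourceTensor)`. A sketch of the suggested rewrite of lines 120-121, with made-up length values standing in for the ones computed inside the model's `forward` pass, and a CPU fallback in place of `self.opt.device`:

```python
import torch

# Hypothetical stand-ins: in models/aen.py these are per-example sequence lengths.
context_len = torch.tensor([12, 7, 30])
target_len = torch.tensor([2, 1, 4])

# Stand-in for self.opt.device from the training script.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Warned pattern:
#   context_len = torch.tensor(context_len, dtype=torch.float).to(self.opt.device)
# Recommended replacement -- clone().detach() instead of re-wrapping a Tensor:
context_len = context_len.clone().detach().to(device=device, dtype=torch.float)
target_len = target_len.clone().detach().to(device=device, dtype=torch.float)
```

The result is the same float tensor on the target device, but without the extra copy-construction warning (and with explicit, predictable autograd semantics: `detach()` cuts any gradient history the source might carry).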