2023-10-19 20:48:01,185 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:01,185 Model: "SequenceTagger(
(embeddings): TransformerWordEmbeddings(
(model): BertModel(
(embeddings): BertEmbeddings(
(word_embeddings): Embedding(32001, 128)
(position_embeddings): Embedding(512, 128)
(token_type_embeddings): Embedding(2, 128)
(LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
(encoder): BertEncoder(
(layer): ModuleList(
(0-1): 2 x BertLayer(
(attention): BertAttention(
(self): BertSelfAttention(
(query): Linear(in_features=128, out_features=128, bias=True)
(key): Linear(in_features=128, out_features=128, bias=True)
(value): Linear(in_features=128, out_features=128, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
)
(output): BertSelfOutput(
(dense): Linear(in_features=128, out_features=128, bias=True)
(LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
)
(intermediate): BertIntermediate(
(dense): Linear(in_features=128, out_features=512, bias=True)
(intermediate_act_fn): GELUActivation()
)
(output): BertOutput(
(dense): Linear(in_features=512, out_features=128, bias=True)
(LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
)
)
)
(pooler): BertPooler(
(dense): Linear(in_features=128, out_features=128, bias=True)
(activation): Tanh()
)
)
)
(locked_dropout): LockedDropout(p=0.5)
(linear): Linear(in_features=128, out_features=17, bias=True)
(loss_function): CrossEntropyLoss()
)"
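The layer shapes printed in the model summary above pin down the parameter count of this bert-tiny encoder. The following sketch recomputes it from those shapes alone (derived from the printed dimensions, not read from the checkpoint), which is a quick sanity check that the dump really describes a ~4.6M-parameter model:

```python
# Parameter count implied by the architecture dump above.
# All sizes are taken from the printed layer shapes.

def linear_params(n_in, n_out, bias=True):
    """Weights (+ optional bias) of a Linear layer."""
    return n_in * n_out + (n_out if bias else 0)

def layer_norm_params(dim):
    """LayerNorm carries a weight vector and a bias vector."""
    return 2 * dim

def bert_tiny_param_count(vocab=32001, hidden=128, max_pos=512,
                          type_vocab=2, intermediate=512, layers=2):
    emb = (vocab * hidden              # word embeddings
           + max_pos * hidden          # position embeddings
           + type_vocab * hidden       # token-type embeddings
           + layer_norm_params(hidden))
    per_layer = (3 * linear_params(hidden, hidden)       # query/key/value
                 + linear_params(hidden, hidden)         # attention output
                 + layer_norm_params(hidden)
                 + linear_params(hidden, intermediate)   # feed-forward up
                 + linear_params(intermediate, hidden)   # feed-forward down
                 + layer_norm_params(hidden))
    pooler = linear_params(hidden, hidden)
    return emb + layers * per_layer + pooler

encoder_total = bert_tiny_param_count()   # about 4.6M parameters
head_total = linear_params(128, 17)       # tagging head: Linear(128 -> 17)
```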
2023-10-19 20:48:01,185 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:01,185 MultiCorpus: 7142 train + 698 dev + 2570 test sentences
- NER_HIPE_2022 Corpus: 7142 train + 698 dev + 2570 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/fr/with_doc_seperator
2023-10-19 20:48:01,185 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:01,185 Train: 7142 sentences
2023-10-19 20:48:01,185 (train_with_dev=False, train_with_test=False)
2023-10-19 20:48:01,185 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:01,185 Training Params:
2023-10-19 20:48:01,186 - learning_rate: "5e-05"
2023-10-19 20:48:01,186 - mini_batch_size: "8"
2023-10-19 20:48:01,186 - max_epochs: "10"
2023-10-19 20:48:01,186 - shuffle: "True"
2023-10-19 20:48:01,186 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:01,186 Plugins:
2023-10-19 20:48:01,186 - TensorboardLogger
2023-10-19 20:48:01,186 - LinearScheduler | warmup_fraction: '0.1'
2023-10-19 20:48:01,186 ----------------------------------------------------------------------------------------------------
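The LinearScheduler plugin with warmup_fraction 0.1 explains the lr column in the per-iteration lines below: with 893 batches per epoch over 10 epochs, the first 893 steps (all of epoch 1) ramp linearly up to 5e-05, then the rate decays linearly to zero. A minimal sketch of that schedule, written to mirror the logged values (the exact Flair implementation may differ in off-by-one details):

```python
# Linear warmup/decay schedule matching the logged lr column:
# peak 5e-05, warmup_fraction 0.1, 893 batches/epoch x 10 epochs.

PEAK_LR = 5e-05
STEPS_PER_EPOCH = 893
TOTAL_STEPS = STEPS_PER_EPOCH * 10      # 8930
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)   # 893, i.e. exactly epoch 1

def lr_at(step):
    """Learning rate after `step` optimizer updates."""
    if step < WARMUP_STEPS:
        # linear warmup from 0 to PEAK_LR
        return PEAK_LR * step / WARMUP_STEPS
    # linear decay from PEAK_LR down to 0 at the final step
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
```

For example, `lr_at(89)` gives about 5e-06, matching the first logged iteration of epoch 1, and `lr_at(8930)` is 0, matching the end of epoch 10.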
2023-10-19 20:48:01,186 Final evaluation on model from best epoch (best-model.pt)
2023-10-19 20:48:01,186 - metric: "('micro avg', 'f1-score')"
2023-10-19 20:48:01,186 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:01,186 Computation:
2023-10-19 20:48:01,186 - compute on device: cuda:0
2023-10-19 20:48:01,186 - embedding storage: none
2023-10-19 20:48:01,186 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:01,186 Model training base path: "hmbench-newseye/fr-dbmdz/bert-tiny-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4"
2023-10-19 20:48:01,186 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:01,186 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:01,186 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-19 20:48:03,348 epoch 1 - iter 89/893 - loss 3.31788487 - time (sec): 2.16 - samples/sec: 11311.02 - lr: 0.000005 - momentum: 0.000000
2023-10-19 20:48:05,631 epoch 1 - iter 178/893 - loss 3.00220485 - time (sec): 4.44 - samples/sec: 11232.29 - lr: 0.000010 - momentum: 0.000000
2023-10-19 20:48:07,923 epoch 1 - iter 267/893 - loss 2.52761114 - time (sec): 6.74 - samples/sec: 11190.98 - lr: 0.000015 - momentum: 0.000000
2023-10-19 20:48:10,295 epoch 1 - iter 356/893 - loss 2.08600669 - time (sec): 9.11 - samples/sec: 11235.03 - lr: 0.000020 - momentum: 0.000000
2023-10-19 20:48:12,581 epoch 1 - iter 445/893 - loss 1.83262530 - time (sec): 11.39 - samples/sec: 11102.08 - lr: 0.000025 - momentum: 0.000000
2023-10-19 20:48:15,291 epoch 1 - iter 534/893 - loss 1.65539925 - time (sec): 14.10 - samples/sec: 10637.43 - lr: 0.000030 - momentum: 0.000000
2023-10-19 20:48:17,545 epoch 1 - iter 623/893 - loss 1.51778252 - time (sec): 16.36 - samples/sec: 10597.71 - lr: 0.000035 - momentum: 0.000000
2023-10-19 20:48:19,817 epoch 1 - iter 712/893 - loss 1.39687716 - time (sec): 18.63 - samples/sec: 10637.02 - lr: 0.000040 - momentum: 0.000000
2023-10-19 20:48:22,116 epoch 1 - iter 801/893 - loss 1.30006566 - time (sec): 20.93 - samples/sec: 10682.81 - lr: 0.000045 - momentum: 0.000000
2023-10-19 20:48:24,252 epoch 1 - iter 890/893 - loss 1.22292912 - time (sec): 23.07 - samples/sec: 10766.58 - lr: 0.000050 - momentum: 0.000000
2023-10-19 20:48:24,309 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:24,309 EPOCH 1 done: loss 1.2221 - lr: 0.000050
2023-10-19 20:48:25,279 DEV : loss 0.31805744767189026 - f1-score (micro avg) 0.1418
2023-10-19 20:48:25,292 saving best model
2023-10-19 20:48:25,326 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:27,490 epoch 2 - iter 89/893 - loss 0.44839489 - time (sec): 2.16 - samples/sec: 12092.79 - lr: 0.000049 - momentum: 0.000000
2023-10-19 20:48:29,745 epoch 2 - iter 178/893 - loss 0.45190939 - time (sec): 4.42 - samples/sec: 11426.89 - lr: 0.000049 - momentum: 0.000000
2023-10-19 20:48:32,002 epoch 2 - iter 267/893 - loss 0.43792015 - time (sec): 6.68 - samples/sec: 11102.76 - lr: 0.000048 - momentum: 0.000000
2023-10-19 20:48:34,385 epoch 2 - iter 356/893 - loss 0.43913817 - time (sec): 9.06 - samples/sec: 11081.95 - lr: 0.000048 - momentum: 0.000000
2023-10-19 20:48:36,597 epoch 2 - iter 445/893 - loss 0.42993763 - time (sec): 11.27 - samples/sec: 11119.56 - lr: 0.000047 - momentum: 0.000000
2023-10-19 20:48:38,813 epoch 2 - iter 534/893 - loss 0.43160330 - time (sec): 13.49 - samples/sec: 11151.24 - lr: 0.000047 - momentum: 0.000000
2023-10-19 20:48:41,016 epoch 2 - iter 623/893 - loss 0.42662350 - time (sec): 15.69 - samples/sec: 11100.26 - lr: 0.000046 - momentum: 0.000000
2023-10-19 20:48:43,209 epoch 2 - iter 712/893 - loss 0.42179532 - time (sec): 17.88 - samples/sec: 11134.96 - lr: 0.000046 - momentum: 0.000000
2023-10-19 20:48:45,471 epoch 2 - iter 801/893 - loss 0.41961257 - time (sec): 20.14 - samples/sec: 11127.37 - lr: 0.000045 - momentum: 0.000000
2023-10-19 20:48:47,717 epoch 2 - iter 890/893 - loss 0.41430294 - time (sec): 22.39 - samples/sec: 11086.08 - lr: 0.000044 - momentum: 0.000000
2023-10-19 20:48:47,788 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:47,789 EPOCH 2 done: loss 0.4144 - lr: 0.000044
2023-10-19 20:48:50,606 DEV : loss 0.2429792881011963 - f1-score (micro avg) 0.4008
2023-10-19 20:48:50,619 saving best model
2023-10-19 20:48:50,656 ----------------------------------------------------------------------------------------------------
2023-10-19 20:48:52,917 epoch 3 - iter 89/893 - loss 0.34817659 - time (sec): 2.26 - samples/sec: 10227.98 - lr: 0.000044 - momentum: 0.000000
2023-10-19 20:48:55,197 epoch 3 - iter 178/893 - loss 0.34419809 - time (sec): 4.54 - samples/sec: 10597.89 - lr: 0.000043 - momentum: 0.000000
2023-10-19 20:48:57,484 epoch 3 - iter 267/893 - loss 0.34570190 - time (sec): 6.83 - samples/sec: 10723.08 - lr: 0.000043 - momentum: 0.000000
2023-10-19 20:48:59,738 epoch 3 - iter 356/893 - loss 0.35673527 - time (sec): 9.08 - samples/sec: 10912.76 - lr: 0.000042 - momentum: 0.000000
2023-10-19 20:49:01,958 epoch 3 - iter 445/893 - loss 0.35787478 - time (sec): 11.30 - samples/sec: 10929.46 - lr: 0.000042 - momentum: 0.000000
2023-10-19 20:49:04,293 epoch 3 - iter 534/893 - loss 0.34975742 - time (sec): 13.64 - samples/sec: 10906.45 - lr: 0.000041 - momentum: 0.000000
2023-10-19 20:49:06,578 epoch 3 - iter 623/893 - loss 0.34623751 - time (sec): 15.92 - samples/sec: 10910.39 - lr: 0.000041 - momentum: 0.000000
2023-10-19 20:49:08,859 epoch 3 - iter 712/893 - loss 0.34105492 - time (sec): 18.20 - samples/sec: 10933.97 - lr: 0.000040 - momentum: 0.000000
2023-10-19 20:49:11,148 epoch 3 - iter 801/893 - loss 0.33634117 - time (sec): 20.49 - samples/sec: 10927.23 - lr: 0.000039 - momentum: 0.000000
2023-10-19 20:49:13,428 epoch 3 - iter 890/893 - loss 0.33396920 - time (sec): 22.77 - samples/sec: 10869.31 - lr: 0.000039 - momentum: 0.000000
2023-10-19 20:49:13,514 ----------------------------------------------------------------------------------------------------
2023-10-19 20:49:13,514 EPOCH 3 done: loss 0.3337 - lr: 0.000039
2023-10-19 20:49:16,351 DEV : loss 0.21542218327522278 - f1-score (micro avg) 0.4361
2023-10-19 20:49:16,366 saving best model
2023-10-19 20:49:16,401 ----------------------------------------------------------------------------------------------------
2023-10-19 20:49:18,738 epoch 4 - iter 89/893 - loss 0.30607610 - time (sec): 2.34 - samples/sec: 10539.75 - lr: 0.000038 - momentum: 0.000000
2023-10-19 20:49:21,031 epoch 4 - iter 178/893 - loss 0.30905617 - time (sec): 4.63 - samples/sec: 10680.88 - lr: 0.000038 - momentum: 0.000000
2023-10-19 20:49:23,318 epoch 4 - iter 267/893 - loss 0.29857101 - time (sec): 6.92 - samples/sec: 10720.72 - lr: 0.000037 - momentum: 0.000000
2023-10-19 20:49:25,577 epoch 4 - iter 356/893 - loss 0.29476408 - time (sec): 9.17 - samples/sec: 10783.77 - lr: 0.000037 - momentum: 0.000000
2023-10-19 20:49:27,831 epoch 4 - iter 445/893 - loss 0.29327210 - time (sec): 11.43 - samples/sec: 10831.79 - lr: 0.000036 - momentum: 0.000000
2023-10-19 20:49:30,154 epoch 4 - iter 534/893 - loss 0.29218114 - time (sec): 13.75 - samples/sec: 10873.04 - lr: 0.000036 - momentum: 0.000000
2023-10-19 20:49:32,370 epoch 4 - iter 623/893 - loss 0.29353192 - time (sec): 15.97 - samples/sec: 10834.37 - lr: 0.000035 - momentum: 0.000000
2023-10-19 20:49:34,613 epoch 4 - iter 712/893 - loss 0.29422867 - time (sec): 18.21 - samples/sec: 10899.70 - lr: 0.000034 - momentum: 0.000000
2023-10-19 20:49:36,847 epoch 4 - iter 801/893 - loss 0.29355665 - time (sec): 20.45 - samples/sec: 10841.59 - lr: 0.000034 - momentum: 0.000000
2023-10-19 20:49:39,153 epoch 4 - iter 890/893 - loss 0.29116940 - time (sec): 22.75 - samples/sec: 10896.21 - lr: 0.000033 - momentum: 0.000000
2023-10-19 20:49:39,226 ----------------------------------------------------------------------------------------------------
2023-10-19 20:49:39,227 EPOCH 4 done: loss 0.2910 - lr: 0.000033
2023-10-19 20:49:41,610 DEV : loss 0.20385973155498505 - f1-score (micro avg) 0.4589
2023-10-19 20:49:41,624 saving best model
2023-10-19 20:49:41,659 ----------------------------------------------------------------------------------------------------
2023-10-19 20:49:43,712 epoch 5 - iter 89/893 - loss 0.25219515 - time (sec): 2.05 - samples/sec: 11876.44 - lr: 0.000033 - momentum: 0.000000
2023-10-19 20:49:45,968 epoch 5 - iter 178/893 - loss 0.26468865 - time (sec): 4.31 - samples/sec: 11290.43 - lr: 0.000032 - momentum: 0.000000
2023-10-19 20:49:48,261 epoch 5 - iter 267/893 - loss 0.27013532 - time (sec): 6.60 - samples/sec: 11053.64 - lr: 0.000032 - momentum: 0.000000
2023-10-19 20:49:50,577 epoch 5 - iter 356/893 - loss 0.26947573 - time (sec): 8.92 - samples/sec: 11028.19 - lr: 0.000031 - momentum: 0.000000
2023-10-19 20:49:52,776 epoch 5 - iter 445/893 - loss 0.27194910 - time (sec): 11.12 - samples/sec: 11145.21 - lr: 0.000031 - momentum: 0.000000
2023-10-19 20:49:54,655 epoch 5 - iter 534/893 - loss 0.26870618 - time (sec): 13.00 - samples/sec: 11451.82 - lr: 0.000030 - momentum: 0.000000
2023-10-19 20:49:56,476 epoch 5 - iter 623/893 - loss 0.26893292 - time (sec): 14.82 - samples/sec: 11703.51 - lr: 0.000029 - momentum: 0.000000
2023-10-19 20:49:58,730 epoch 5 - iter 712/893 - loss 0.26943529 - time (sec): 17.07 - samples/sec: 11640.91 - lr: 0.000029 - momentum: 0.000000
2023-10-19 20:50:01,091 epoch 5 - iter 801/893 - loss 0.26558780 - time (sec): 19.43 - samples/sec: 11484.67 - lr: 0.000028 - momentum: 0.000000
2023-10-19 20:50:03,363 epoch 5 - iter 890/893 - loss 0.26382300 - time (sec): 21.70 - samples/sec: 11422.87 - lr: 0.000028 - momentum: 0.000000
2023-10-19 20:50:03,435 ----------------------------------------------------------------------------------------------------
2023-10-19 20:50:03,435 EPOCH 5 done: loss 0.2639 - lr: 0.000028
2023-10-19 20:50:06,328 DEV : loss 0.19422198832035065 - f1-score (micro avg) 0.4997
2023-10-19 20:50:06,349 saving best model
2023-10-19 20:50:06,385 ----------------------------------------------------------------------------------------------------
2023-10-19 20:50:08,645 epoch 6 - iter 89/893 - loss 0.24073724 - time (sec): 2.26 - samples/sec: 11025.22 - lr: 0.000027 - momentum: 0.000000
2023-10-19 20:50:10,868 epoch 6 - iter 178/893 - loss 0.23869287 - time (sec): 4.48 - samples/sec: 10742.28 - lr: 0.000027 - momentum: 0.000000
2023-10-19 20:50:13,108 epoch 6 - iter 267/893 - loss 0.23653702 - time (sec): 6.72 - samples/sec: 10700.65 - lr: 0.000026 - momentum: 0.000000
2023-10-19 20:50:15,366 epoch 6 - iter 356/893 - loss 0.23682304 - time (sec): 8.98 - samples/sec: 10740.98 - lr: 0.000026 - momentum: 0.000000
2023-10-19 20:50:17,617 epoch 6 - iter 445/893 - loss 0.23901517 - time (sec): 11.23 - samples/sec: 10742.46 - lr: 0.000025 - momentum: 0.000000
2023-10-19 20:50:19,888 epoch 6 - iter 534/893 - loss 0.24103990 - time (sec): 13.50 - samples/sec: 10774.77 - lr: 0.000024 - momentum: 0.000000
2023-10-19 20:50:22,139 epoch 6 - iter 623/893 - loss 0.24250770 - time (sec): 15.75 - samples/sec: 10852.32 - lr: 0.000024 - momentum: 0.000000
2023-10-19 20:50:24,407 epoch 6 - iter 712/893 - loss 0.24499793 - time (sec): 18.02 - samples/sec: 10936.70 - lr: 0.000023 - momentum: 0.000000
2023-10-19 20:50:26,664 epoch 6 - iter 801/893 - loss 0.24412070 - time (sec): 20.28 - samples/sec: 10977.66 - lr: 0.000023 - momentum: 0.000000
2023-10-19 20:50:28,983 epoch 6 - iter 890/893 - loss 0.24378175 - time (sec): 22.60 - samples/sec: 10963.58 - lr: 0.000022 - momentum: 0.000000
2023-10-19 20:50:29,062 ----------------------------------------------------------------------------------------------------
2023-10-19 20:50:29,062 EPOCH 6 done: loss 0.2434 - lr: 0.000022
2023-10-19 20:50:31,413 DEV : loss 0.1892632693052292 - f1-score (micro avg) 0.5121
2023-10-19 20:50:31,426 saving best model
2023-10-19 20:50:31,462 ----------------------------------------------------------------------------------------------------
2023-10-19 20:50:34,181 epoch 7 - iter 89/893 - loss 0.21102632 - time (sec): 2.72 - samples/sec: 8398.67 - lr: 0.000022 - momentum: 0.000000
2023-10-19 20:50:36,427 epoch 7 - iter 178/893 - loss 0.22949609 - time (sec): 4.96 - samples/sec: 9490.59 - lr: 0.000021 - momentum: 0.000000
2023-10-19 20:50:38,659 epoch 7 - iter 267/893 - loss 0.22714217 - time (sec): 7.20 - samples/sec: 10067.28 - lr: 0.000021 - momentum: 0.000000
2023-10-19 20:50:40,892 epoch 7 - iter 356/893 - loss 0.22411998 - time (sec): 9.43 - samples/sec: 10212.94 - lr: 0.000020 - momentum: 0.000000
2023-10-19 20:50:43,256 epoch 7 - iter 445/893 - loss 0.22268983 - time (sec): 11.79 - samples/sec: 10343.03 - lr: 0.000019 - momentum: 0.000000
2023-10-19 20:50:45,358 epoch 7 - iter 534/893 - loss 0.22316819 - time (sec): 13.90 - samples/sec: 10467.56 - lr: 0.000019 - momentum: 0.000000
2023-10-19 20:50:47,671 epoch 7 - iter 623/893 - loss 0.22564912 - time (sec): 16.21 - samples/sec: 10384.21 - lr: 0.000018 - momentum: 0.000000
2023-10-19 20:50:50,031 epoch 7 - iter 712/893 - loss 0.22635014 - time (sec): 18.57 - samples/sec: 10611.04 - lr: 0.000018 - momentum: 0.000000
2023-10-19 20:50:52,321 epoch 7 - iter 801/893 - loss 0.22691152 - time (sec): 20.86 - samples/sec: 10735.21 - lr: 0.000017 - momentum: 0.000000
2023-10-19 20:50:54,568 epoch 7 - iter 890/893 - loss 0.22770015 - time (sec): 23.10 - samples/sec: 10734.74 - lr: 0.000017 - momentum: 0.000000
2023-10-19 20:50:54,639 ----------------------------------------------------------------------------------------------------
2023-10-19 20:50:54,639 EPOCH 7 done: loss 0.2274 - lr: 0.000017
2023-10-19 20:50:56,997 DEV : loss 0.18830116093158722 - f1-score (micro avg) 0.5288
2023-10-19 20:50:57,012 saving best model
2023-10-19 20:50:57,045 ----------------------------------------------------------------------------------------------------
2023-10-19 20:50:59,273 epoch 8 - iter 89/893 - loss 0.19916108 - time (sec): 2.23 - samples/sec: 10591.83 - lr: 0.000016 - momentum: 0.000000
2023-10-19 20:51:01,665 epoch 8 - iter 178/893 - loss 0.21119586 - time (sec): 4.62 - samples/sec: 10765.32 - lr: 0.000016 - momentum: 0.000000
2023-10-19 20:51:03,984 epoch 8 - iter 267/893 - loss 0.21979280 - time (sec): 6.94 - samples/sec: 10695.84 - lr: 0.000015 - momentum: 0.000000
2023-10-19 20:51:06,299 epoch 8 - iter 356/893 - loss 0.21652746 - time (sec): 9.25 - samples/sec: 10710.65 - lr: 0.000014 - momentum: 0.000000
2023-10-19 20:51:08,560 epoch 8 - iter 445/893 - loss 0.22339324 - time (sec): 11.51 - samples/sec: 10656.55 - lr: 0.000014 - momentum: 0.000000
2023-10-19 20:51:10,778 epoch 8 - iter 534/893 - loss 0.22339473 - time (sec): 13.73 - samples/sec: 10739.10 - lr: 0.000013 - momentum: 0.000000
2023-10-19 20:51:13,062 epoch 8 - iter 623/893 - loss 0.22183591 - time (sec): 16.02 - samples/sec: 10697.70 - lr: 0.000013 - momentum: 0.000000
2023-10-19 20:51:15,299 epoch 8 - iter 712/893 - loss 0.21960297 - time (sec): 18.25 - samples/sec: 10714.14 - lr: 0.000012 - momentum: 0.000000
2023-10-19 20:51:17,631 epoch 8 - iter 801/893 - loss 0.22015520 - time (sec): 20.59 - samples/sec: 10823.07 - lr: 0.000012 - momentum: 0.000000
2023-10-19 20:51:19,875 epoch 8 - iter 890/893 - loss 0.21912504 - time (sec): 22.83 - samples/sec: 10865.83 - lr: 0.000011 - momentum: 0.000000
2023-10-19 20:51:19,943 ----------------------------------------------------------------------------------------------------
2023-10-19 20:51:19,943 EPOCH 8 done: loss 0.2197 - lr: 0.000011
2023-10-19 20:51:22,761 DEV : loss 0.18704333901405334 - f1-score (micro avg) 0.5266
2023-10-19 20:51:22,774 ----------------------------------------------------------------------------------------------------
2023-10-19 20:51:25,008 epoch 9 - iter 89/893 - loss 0.22409608 - time (sec): 2.23 - samples/sec: 10905.75 - lr: 0.000011 - momentum: 0.000000
2023-10-19 20:51:27,275 epoch 9 - iter 178/893 - loss 0.21545283 - time (sec): 4.50 - samples/sec: 10899.19 - lr: 0.000010 - momentum: 0.000000
2023-10-19 20:51:29,461 epoch 9 - iter 267/893 - loss 0.22001584 - time (sec): 6.69 - samples/sec: 10951.22 - lr: 0.000009 - momentum: 0.000000
2023-10-19 20:51:31,753 epoch 9 - iter 356/893 - loss 0.22377309 - time (sec): 8.98 - samples/sec: 10998.28 - lr: 0.000009 - momentum: 0.000000
2023-10-19 20:51:34,036 epoch 9 - iter 445/893 - loss 0.22189321 - time (sec): 11.26 - samples/sec: 11162.57 - lr: 0.000008 - momentum: 0.000000
2023-10-19 20:51:36,280 epoch 9 - iter 534/893 - loss 0.21704728 - time (sec): 13.50 - samples/sec: 11064.48 - lr: 0.000008 - momentum: 0.000000
2023-10-19 20:51:38,531 epoch 9 - iter 623/893 - loss 0.21551115 - time (sec): 15.76 - samples/sec: 11080.72 - lr: 0.000007 - momentum: 0.000000
2023-10-19 20:51:40,762 epoch 9 - iter 712/893 - loss 0.21465534 - time (sec): 17.99 - samples/sec: 11037.93 - lr: 0.000007 - momentum: 0.000000
2023-10-19 20:51:43,131 epoch 9 - iter 801/893 - loss 0.21346279 - time (sec): 20.36 - samples/sec: 10987.62 - lr: 0.000006 - momentum: 0.000000
2023-10-19 20:51:45,480 epoch 9 - iter 890/893 - loss 0.21290729 - time (sec): 22.70 - samples/sec: 10917.64 - lr: 0.000006 - momentum: 0.000000
2023-10-19 20:51:45,552 ----------------------------------------------------------------------------------------------------
2023-10-19 20:51:45,552 EPOCH 9 done: loss 0.2128 - lr: 0.000006
2023-10-19 20:51:47,908 DEV : loss 0.18491902947425842 - f1-score (micro avg) 0.5228
2023-10-19 20:51:47,922 ----------------------------------------------------------------------------------------------------
2023-10-19 20:51:50,065 epoch 10 - iter 89/893 - loss 0.19783421 - time (sec): 2.14 - samples/sec: 12237.43 - lr: 0.000005 - momentum: 0.000000
2023-10-19 20:51:52,362 epoch 10 - iter 178/893 - loss 0.19939993 - time (sec): 4.44 - samples/sec: 11666.48 - lr: 0.000004 - momentum: 0.000000
2023-10-19 20:51:55,147 epoch 10 - iter 267/893 - loss 0.20297035 - time (sec): 7.22 - samples/sec: 10795.24 - lr: 0.000004 - momentum: 0.000000
2023-10-19 20:51:57,423 epoch 10 - iter 356/893 - loss 0.20510717 - time (sec): 9.50 - samples/sec: 10783.91 - lr: 0.000003 - momentum: 0.000000
2023-10-19 20:51:59,726 epoch 10 - iter 445/893 - loss 0.20467266 - time (sec): 11.80 - samples/sec: 10781.26 - lr: 0.000003 - momentum: 0.000000
2023-10-19 20:52:01,947 epoch 10 - iter 534/893 - loss 0.20537293 - time (sec): 14.02 - samples/sec: 10806.69 - lr: 0.000002 - momentum: 0.000000
2023-10-19 20:52:04,321 epoch 10 - iter 623/893 - loss 0.20318246 - time (sec): 16.40 - samples/sec: 10731.05 - lr: 0.000002 - momentum: 0.000000
2023-10-19 20:52:06,590 epoch 10 - iter 712/893 - loss 0.20334880 - time (sec): 18.67 - samples/sec: 10705.27 - lr: 0.000001 - momentum: 0.000000
2023-10-19 20:52:08,869 epoch 10 - iter 801/893 - loss 0.20499649 - time (sec): 20.95 - samples/sec: 10675.00 - lr: 0.000001 - momentum: 0.000000
2023-10-19 20:52:11,144 epoch 10 - iter 890/893 - loss 0.20573355 - time (sec): 23.22 - samples/sec: 10664.10 - lr: 0.000000 - momentum: 0.000000
2023-10-19 20:52:11,215 ----------------------------------------------------------------------------------------------------
2023-10-19 20:52:11,215 EPOCH 10 done: loss 0.2059 - lr: 0.000000
2023-10-19 20:52:13,598 DEV : loss 0.18561328947544098 - f1-score (micro avg) 0.5273
2023-10-19 20:52:13,641 ----------------------------------------------------------------------------------------------------
2023-10-19 20:52:13,642 Loading model from best epoch ...
2023-10-19 20:52:13,719 SequenceTagger predicts: Dictionary with 17 tags: O, S-PER, B-PER, E-PER, I-PER, S-LOC, B-LOC, E-LOC, I-LOC, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
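The 17-tag dictionary above is the BIOES tagging scheme over the four NewsEye entity types: a single O tag plus S- (single), B- (begin), E- (end) and I- (inside) variants of PER, LOC, ORG and HumanProd. A one-liner reproducing that tag set:

```python
# BIOES tag set over the four entity types, as listed by the tagger above.
ENTITY_TYPES = ["PER", "LOC", "ORG", "HumanProd"]

tags = ["O"] + [f"{prefix}-{etype}"
                for etype in ENTITY_TYPES
                for prefix in "SBEI"]
```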
2023-10-19 20:52:18,262
Results:
- F-score (micro) 0.4184
- F-score (macro) 0.256
- Accuracy 0.2729
By class:
                precision    recall  f1-score   support

          LOC      0.4267    0.4968    0.4591      1095
          PER      0.4579    0.4733    0.4655      1012
          ORG      0.1359    0.0784    0.0995       357
    HumanProd      0.0000    0.0000    0.0000        33

    micro avg      0.4159    0.4209    0.4184      2497
    macro avg      0.2551    0.2621    0.2560      2497
 weighted avg      0.3921    0.4209    0.4042      2497
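The summary rows follow directly from the per-class rows: macro F1 is the unweighted mean of the class F1 scores, while micro F1 is the harmonic mean of micro precision and micro recall. A quick check using the printed values (rounded figures from the report, not raw prediction counts):

```python
# Recomputing the macro and micro F1 rows from the per-class results above.
per_class_f1 = {"LOC": 0.4591, "PER": 0.4655, "ORG": 0.0995, "HumanProd": 0.0}

# Macro F1: unweighted mean over classes (heavily pulled down here by
# the weak ORG and zero HumanProd scores).
macro_f1 = sum(per_class_f1.values()) / len(per_class_f1)

# Micro F1: harmonic mean of micro precision and micro recall.
micro_p, micro_r = 0.4159, 0.4209
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)
```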
2023-10-19 20:52:18,262 ----------------------------------------------------------------------------------------------------