2023-10-15 21:09:01,347 ----------------------------------------------------------------------------------------------------
2023-10-15 21:09:01,348 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-15 21:09:01,348 ----------------------------------------------------------------------------------------------------
2023-10-15 21:09:01,348 MultiCorpus: 20847 train + 1123 dev + 3350 test sentences
 - NER_HIPE_2022 Corpus: 20847 train + 1123 dev + 3350 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/de/with_doc_seperator
2023-10-15 21:09:01,348 ----------------------------------------------------------------------------------------------------
2023-10-15 21:09:01,348 Train: 20847 sentences
2023-10-15 21:09:01,348 (train_with_dev=False, train_with_test=False)
2023-10-15 21:09:01,348 ----------------------------------------------------------------------------------------------------
2023-10-15 21:09:01,348 Training Params:
2023-10-15 21:09:01,349  - learning_rate: "5e-05"
2023-10-15 21:09:01,349  - mini_batch_size: "4"
2023-10-15 21:09:01,349  - max_epochs: "10"
2023-10-15 21:09:01,349  - shuffle: "True"
2023-10-15 21:09:01,349 ----------------------------------------------------------------------------------------------------
2023-10-15 21:09:01,349 Plugins:
2023-10-15 21:09:01,349  - LinearScheduler | warmup_fraction: '0.1'
2023-10-15 21:09:01,349 ----------------------------------------------------------------------------------------------------
2023-10-15 21:09:01,349 Final evaluation on model from best epoch (best-model.pt)
2023-10-15 21:09:01,349  - metric: "('micro avg', 'f1-score')"
2023-10-15 21:09:01,349 ----------------------------------------------------------------------------------------------------
2023-10-15 21:09:01,349 Computation:
2023-10-15 21:09:01,349  - compute on device: cuda:0
2023-10-15 21:09:01,349  - embedding storage: none
2023-10-15 21:09:01,349 ----------------------------------------------------------------------------------------------------
2023-10-15 21:09:01,349 Model training base path: "hmbench-newseye/de-dbmdz/bert-base-historic-multilingual-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4"
2023-10-15 21:09:01,349 ----------------------------------------------------------------------------------------------------
2023-10-15 21:09:01,349 ----------------------------------------------------------------------------------------------------
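For reference, the configuration above corresponds roughly to the following Flair fine-tuning script. This is a minimal sketch reconstructed from the logged parameters and the base path; the corpus arguments and the SequenceTagger constructor defaults are assumptions, not the original training code.

```python
# Minimal sketch of a Flair fine-tuning script matching the logged run.
# Dataset, model name and hyperparameters are read off the log/base path;
# the exact constructor arguments of the original script are assumptions.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

corpus = NER_HIPE_2022(dataset_name="newseye", language="de")
label_dict = corpus.make_label_dictionary(label_type="ner")

embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-cased",
    layers="-1",               # "layers-1" in the base path: last layer only
    subtoken_pooling="first",  # "poolingfirst" in the base path
    fine_tune=True,
)

tagger = SequenceTagger(
    hidden_size=256,           # unused here since use_rnn=False
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,             # "crfFalse" in the base path
    use_rnn=False,             # plain linear head, as in the model printout
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-newseye/de-dbmdz/bert-base-historic-multilingual-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4",
    learning_rate=5e-05,
    mini_batch_size=4,
    max_epochs=10,   # fine_tune uses a linear schedule with 0.1 warmup by default
)
```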
"hmbench-newseye/de-dbmdz/bert-base-historic-multilingual-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4" 2023-10-15 21:09:01,349 ---------------------------------------------------------------------------------------------------- 2023-10-15 21:09:01,349 ---------------------------------------------------------------------------------------------------- 2023-10-15 21:09:26,219 epoch 1 - iter 521/5212 - loss 1.35418822 - time (sec): 24.87 - samples/sec: 1405.99 - lr: 0.000005 - momentum: 0.000000 2023-10-15 21:09:51,690 epoch 1 - iter 1042/5212 - loss 0.87173210 - time (sec): 50.34 - samples/sec: 1460.05 - lr: 0.000010 - momentum: 0.000000 2023-10-15 21:10:17,117 epoch 1 - iter 1563/5212 - loss 0.68359880 - time (sec): 75.77 - samples/sec: 1436.88 - lr: 0.000015 - momentum: 0.000000 2023-10-15 21:10:42,614 epoch 1 - iter 2084/5212 - loss 0.57980752 - time (sec): 101.26 - samples/sec: 1431.39 - lr: 0.000020 - momentum: 0.000000 2023-10-15 21:11:08,079 epoch 1 - iter 2605/5212 - loss 0.50915925 - time (sec): 126.73 - samples/sec: 1450.07 - lr: 0.000025 - momentum: 0.000000 2023-10-15 21:11:33,037 epoch 1 - iter 3126/5212 - loss 0.46695555 - time (sec): 151.69 - samples/sec: 1443.28 - lr: 0.000030 - momentum: 0.000000 2023-10-15 21:11:57,932 epoch 1 - iter 3647/5212 - loss 0.43068088 - time (sec): 176.58 - samples/sec: 1443.12 - lr: 0.000035 - momentum: 0.000000 2023-10-15 21:12:23,688 epoch 1 - iter 4168/5212 - loss 0.40031908 - time (sec): 202.34 - samples/sec: 1444.07 - lr: 0.000040 - momentum: 0.000000 2023-10-15 21:12:48,715 epoch 1 - iter 4689/5212 - loss 0.38161901 - time (sec): 227.37 - samples/sec: 1445.17 - lr: 0.000045 - momentum: 0.000000 2023-10-15 21:13:15,489 epoch 1 - iter 5210/5212 - loss 0.36415325 - time (sec): 254.14 - samples/sec: 1445.58 - lr: 0.000050 - momentum: 0.000000 2023-10-15 21:13:15,572 ---------------------------------------------------------------------------------------------------- 2023-10-15 21:13:15,573 EPOCH 1 done: loss 0.3641 - lr: 0.000050 2023-10-15 21:13:21,332 DEV : loss 0.12803316116333008 - f1-score (micro avg) 0.2579 2023-10-15 21:13:21,357 saving best model 2023-10-15 21:13:21,729 ---------------------------------------------------------------------------------------------------- 2023-10-15 21:13:47,367 epoch 2 - iter 521/5212 - loss 0.21777270 - time (sec): 25.64 - samples/sec: 1486.64 - lr: 0.000049 - momentum: 0.000000 2023-10-15 21:14:13,018 epoch 2 - iter 1042/5212 - loss 0.18749115 - time (sec): 51.29 - samples/sec: 1487.45 - lr: 0.000049 - momentum: 0.000000 2023-10-15 21:14:38,321 epoch 2 - iter 1563/5212 - loss 0.18643203 - time (sec): 76.59 - samples/sec: 1481.92 - lr: 0.000048 - momentum: 0.000000 2023-10-15 21:15:03,555 epoch 2 - iter 2084/5212 - loss 0.18814099 - time (sec): 101.82 - samples/sec: 1457.01 - lr: 0.000048 - momentum: 0.000000 2023-10-15 21:15:28,943 epoch 2 - iter 2605/5212 - loss 0.19599826 - time (sec): 127.21 - samples/sec: 1459.62 - lr: 0.000047 - momentum: 0.000000 2023-10-15 21:15:53,824 epoch 2 - iter 3126/5212 - loss 0.19471761 - time (sec): 152.09 - samples/sec: 1454.98 - lr: 0.000047 - momentum: 0.000000 2023-10-15 21:16:18,931 epoch 2 - iter 3647/5212 - loss 0.19295220 - time (sec): 177.20 - samples/sec: 1466.52 - lr: 0.000046 - momentum: 0.000000 2023-10-15 21:16:42,937 epoch 2 - iter 4168/5212 - loss 0.19677761 - time (sec): 201.21 - samples/sec: 1467.46 - lr: 0.000046 - momentum: 0.000000 2023-10-15 21:17:07,768 epoch 2 - iter 4689/5212 - loss 0.19258009 - time (sec): 226.04 - samples/sec: 
2023-10-15 21:18:06,231 epoch 3 - iter 521/5212 - loss 0.16680848 - time (sec): 25.07 - samples/sec: 1455.11 - lr: 0.000044 - momentum: 0.000000
2023-10-15 21:18:31,447 epoch 3 - iter 1042/5212 - loss 0.15227978 - time (sec): 50.28 - samples/sec: 1454.38 - lr: 0.000043 - momentum: 0.000000
2023-10-15 21:18:56,965 epoch 3 - iter 1563/5212 - loss 0.15045944 - time (sec): 75.80 - samples/sec: 1457.53 - lr: 0.000043 - momentum: 0.000000
2023-10-15 21:19:22,788 epoch 3 - iter 2084/5212 - loss 0.15461880 - time (sec): 101.63 - samples/sec: 1460.91 - lr: 0.000042 - momentum: 0.000000
2023-10-15 21:19:47,794 epoch 3 - iter 2605/5212 - loss 0.14819648 - time (sec): 126.63 - samples/sec: 1460.43 - lr: 0.000042 - momentum: 0.000000
2023-10-15 21:20:12,632 epoch 3 - iter 3126/5212 - loss 0.14802839 - time (sec): 151.47 - samples/sec: 1457.06 - lr: 0.000041 - momentum: 0.000000
2023-10-15 21:20:37,526 epoch 3 - iter 3647/5212 - loss 0.14888939 - time (sec): 176.36 - samples/sec: 1455.73 - lr: 0.000041 - momentum: 0.000000
2023-10-15 21:21:03,199 epoch 3 - iter 4168/5212 - loss 0.14477081 - time (sec): 202.04 - samples/sec: 1463.02 - lr: 0.000040 - momentum: 0.000000
2023-10-15 21:21:28,311 epoch 3 - iter 4689/5212 - loss 0.14379556 - time (sec): 227.15 - samples/sec: 1463.51 - lr: 0.000039 - momentum: 0.000000
2023-10-15 21:21:53,162 epoch 3 - iter 5210/5212 - loss 0.14317646 - time (sec): 252.00 - samples/sec: 1457.92 - lr: 0.000039 - momentum: 0.000000
2023-10-15 21:21:53,250 ----------------------------------------------------------------------------------------------------
2023-10-15 21:21:53,250 EPOCH 3 done: loss 0.1432 - lr: 0.000039
2023-10-15 21:22:01,515 DEV : loss 0.16754454374313354 - f1-score (micro avg) 0.3436
2023-10-15 21:22:01,544 saving best model
2023-10-15 21:22:02,149 ----------------------------------------------------------------------------------------------------
2023-10-15 21:22:27,751 epoch 4 - iter 521/5212 - loss 0.11278945 - time (sec): 25.60 - samples/sec: 1434.22 - lr: 0.000038 - momentum: 0.000000
2023-10-15 21:22:52,628 epoch 4 - iter 1042/5212 - loss 0.10789964 - time (sec): 50.48 - samples/sec: 1413.48 - lr: 0.000038 - momentum: 0.000000
2023-10-15 21:23:17,727 epoch 4 - iter 1563/5212 - loss 0.11007241 - time (sec): 75.58 - samples/sec: 1421.41 - lr: 0.000037 - momentum: 0.000000
2023-10-15 21:23:43,808 epoch 4 - iter 2084/5212 - loss 0.10613979 - time (sec): 101.66 - samples/sec: 1422.18 - lr: 0.000037 - momentum: 0.000000
2023-10-15 21:24:08,589 epoch 4 - iter 2605/5212 - loss 0.10636173 - time (sec): 126.44 - samples/sec: 1423.89 - lr: 0.000036 - momentum: 0.000000
2023-10-15 21:24:33,295 epoch 4 - iter 3126/5212 - loss 0.11023475 - time (sec): 151.14 - samples/sec: 1428.26 - lr: 0.000036 - momentum: 0.000000
2023-10-15 21:24:58,631 epoch 4 - iter 3647/5212 - loss 0.11185702 - time (sec): 176.48 - samples/sec: 1441.00 - lr: 0.000035 - momentum: 0.000000
2023-10-15 21:25:23,843 epoch 4 - iter 4168/5212 - loss 0.11237802 - time (sec): 201.69 - samples/sec: 1437.97 - lr: 0.000034 - momentum: 0.000000
2023-10-15 21:25:49,127 epoch 4 - iter 4689/5212 - loss 0.11101056 - time (sec): 226.98 - samples/sec: 1445.77 - lr: 0.000034 - momentum: 0.000000
2023-10-15 21:26:14,931 epoch 4 - iter 5210/5212 - loss 0.10870647 - time (sec): 252.78 - samples/sec: 1453.35 - lr: 0.000033 - momentum: 0.000000
2023-10-15 21:26:15,016 ----------------------------------------------------------------------------------------------------
2023-10-15 21:26:15,016 EPOCH 4 done: loss 0.1087 - lr: 0.000033
2023-10-15 21:26:23,320 DEV : loss 0.23736144602298737 - f1-score (micro avg) 0.3947
2023-10-15 21:26:23,351 saving best model
2023-10-15 21:26:23,983 ----------------------------------------------------------------------------------------------------
2023-10-15 21:26:48,489 epoch 5 - iter 521/5212 - loss 0.07596090 - time (sec): 24.50 - samples/sec: 1403.26 - lr: 0.000033 - momentum: 0.000000
2023-10-15 21:27:13,356 epoch 5 - iter 1042/5212 - loss 0.08242943 - time (sec): 49.37 - samples/sec: 1408.42 - lr: 0.000032 - momentum: 0.000000
2023-10-15 21:27:38,336 epoch 5 - iter 1563/5212 - loss 0.08169838 - time (sec): 74.35 - samples/sec: 1414.40 - lr: 0.000032 - momentum: 0.000000
2023-10-15 21:28:03,345 epoch 5 - iter 2084/5212 - loss 0.08443360 - time (sec): 99.36 - samples/sec: 1432.55 - lr: 0.000031 - momentum: 0.000000
2023-10-15 21:28:29,132 epoch 5 - iter 2605/5212 - loss 0.08304632 - time (sec): 125.15 - samples/sec: 1438.85 - lr: 0.000031 - momentum: 0.000000
2023-10-15 21:28:54,172 epoch 5 - iter 3126/5212 - loss 0.08296829 - time (sec): 150.18 - samples/sec: 1435.67 - lr: 0.000030 - momentum: 0.000000
2023-10-15 21:29:19,404 epoch 5 - iter 3647/5212 - loss 0.08179795 - time (sec): 175.42 - samples/sec: 1442.82 - lr: 0.000029 - momentum: 0.000000
2023-10-15 21:29:45,496 epoch 5 - iter 4168/5212 - loss 0.08011600 - time (sec): 201.51 - samples/sec: 1445.91 - lr: 0.000029 - momentum: 0.000000
2023-10-15 21:30:11,175 epoch 5 - iter 4689/5212 - loss 0.07905627 - time (sec): 227.19 - samples/sec: 1449.26 - lr: 0.000028 - momentum: 0.000000
2023-10-15 21:30:36,583 epoch 5 - iter 5210/5212 - loss 0.08040599 - time (sec): 252.60 - samples/sec: 1454.09 - lr: 0.000028 - momentum: 0.000000
2023-10-15 21:30:36,676 ----------------------------------------------------------------------------------------------------
2023-10-15 21:30:36,676 EPOCH 5 done: loss 0.0804 - lr: 0.000028
2023-10-15 21:30:45,177 DEV : loss 0.32270362973213196 - f1-score (micro avg) 0.3444
2023-10-15 21:30:45,211 ----------------------------------------------------------------------------------------------------
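Epoch 5 is the first epoch where the dev micro-F1 drops (0.3947 -> 0.3444) and no "saving best model" line follows; the same pattern continues below, so best-model.pt remains the epoch-4 checkpoint. A toy sketch of that selection rule, using the dev scores logged so far (illustrative only, not Flair's trainer code):

```python
# Illustrative best-checkpoint selection: save only when dev micro-F1 improves.
# Scores are the DEV f1 values printed above for epochs 1-5.
dev_f1_by_epoch = [0.2579, 0.3234, 0.3436, 0.3947, 0.3444]

best_f1, best_epoch = float("-inf"), None
for epoch, f1 in enumerate(dev_f1_by_epoch, start=1):
    if f1 > best_f1:
        best_f1, best_epoch = f1, epoch
        print(f"epoch {epoch}: dev F1 {f1:.4f} -> saving best model")
    else:
        print(f"epoch {epoch}: dev F1 {f1:.4f} -> keeping epoch {best_epoch}")
```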
2023-10-15 21:31:10,971 epoch 6 - iter 521/5212 - loss 0.07154848 - time (sec): 25.76 - samples/sec: 1455.64 - lr: 0.000027 - momentum: 0.000000
2023-10-15 21:31:36,495 epoch 6 - iter 1042/5212 - loss 0.08555596 - time (sec): 51.28 - samples/sec: 1482.06 - lr: 0.000027 - momentum: 0.000000
2023-10-15 21:32:01,396 epoch 6 - iter 1563/5212 - loss 0.07751863 - time (sec): 76.18 - samples/sec: 1462.55 - lr: 0.000026 - momentum: 0.000000
2023-10-15 21:32:26,938 epoch 6 - iter 2084/5212 - loss 0.07146560 - time (sec): 101.72 - samples/sec: 1481.82 - lr: 0.000026 - momentum: 0.000000
2023-10-15 21:32:51,955 epoch 6 - iter 2605/5212 - loss 0.06915037 - time (sec): 126.74 - samples/sec: 1475.93 - lr: 0.000025 - momentum: 0.000000
2023-10-15 21:33:17,101 epoch 6 - iter 3126/5212 - loss 0.06782335 - time (sec): 151.89 - samples/sec: 1469.97 - lr: 0.000024 - momentum: 0.000000
2023-10-15 21:33:41,720 epoch 6 - iter 3647/5212 - loss 0.06809506 - time (sec): 176.51 - samples/sec: 1461.46 - lr: 0.000024 - momentum: 0.000000
2023-10-15 21:34:06,826 epoch 6 - iter 4168/5212 - loss 0.06739184 - time (sec): 201.61 - samples/sec: 1455.29 - lr: 0.000023 - momentum: 0.000000
2023-10-15 21:34:31,898 epoch 6 - iter 4689/5212 - loss 0.06720584 - time (sec): 226.68 - samples/sec: 1449.42 - lr: 0.000023 - momentum: 0.000000
2023-10-15 21:34:57,524 epoch 6 - iter 5210/5212 - loss 0.06716812 - time (sec): 252.31 - samples/sec: 1454.78 - lr: 0.000022 - momentum: 0.000000
2023-10-15 21:34:57,655 ----------------------------------------------------------------------------------------------------
2023-10-15 21:34:57,655 EPOCH 6 done: loss 0.0671 - lr: 0.000022
2023-10-15 21:35:06,686 DEV : loss 0.3467041552066803 - f1-score (micro avg) 0.3468
2023-10-15 21:35:06,712 ----------------------------------------------------------------------------------------------------
2023-10-15 21:35:31,692 epoch 7 - iter 521/5212 - loss 0.04008855 - time (sec): 24.98 - samples/sec: 1510.39 - lr: 0.000022 - momentum: 0.000000
2023-10-15 21:35:57,463 epoch 7 - iter 1042/5212 - loss 0.04042192 - time (sec): 50.75 - samples/sec: 1495.14 - lr: 0.000021 - momentum: 0.000000
2023-10-15 21:36:22,413 epoch 7 - iter 1563/5212 - loss 0.04452128 - time (sec): 75.70 - samples/sec: 1475.67 - lr: 0.000021 - momentum: 0.000000
2023-10-15 21:36:47,471 epoch 7 - iter 2084/5212 - loss 0.04517209 - time (sec): 100.76 - samples/sec: 1444.65 - lr: 0.000020 - momentum: 0.000000
2023-10-15 21:37:12,770 epoch 7 - iter 2605/5212 - loss 0.04565804 - time (sec): 126.06 - samples/sec: 1454.67 - lr: 0.000019 - momentum: 0.000000
2023-10-15 21:37:37,812 epoch 7 - iter 3126/5212 - loss 0.04487907 - time (sec): 151.10 - samples/sec: 1446.65 - lr: 0.000019 - momentum: 0.000000
2023-10-15 21:38:03,557 epoch 7 - iter 3647/5212 - loss 0.04501311 - time (sec): 176.84 - samples/sec: 1459.11 - lr: 0.000018 - momentum: 0.000000
2023-10-15 21:38:28,512 epoch 7 - iter 4168/5212 - loss 0.04461942 - time (sec): 201.80 - samples/sec: 1448.64 - lr: 0.000018 - momentum: 0.000000
2023-10-15 21:38:54,620 epoch 7 - iter 4689/5212 - loss 0.04411942 - time (sec): 227.91 - samples/sec: 1452.78 - lr: 0.000017 - momentum: 0.000000
2023-10-15 21:39:19,467 epoch 7 - iter 5210/5212 - loss 0.04338822 - time (sec): 252.75 - samples/sec: 1453.50 - lr: 0.000017 - momentum: 0.000000
2023-10-15 21:39:19,553 ----------------------------------------------------------------------------------------------------
2023-10-15 21:39:19,553 EPOCH 7 done: loss 0.0434 - lr: 0.000017
2023-10-15 21:39:28,681 DEV : loss 0.3810468912124634 - f1-score (micro avg) 0.3437
2023-10-15 21:39:28,710 ----------------------------------------------------------------------------------------------------
2023-10-15 21:39:53,580 epoch 8 - iter 521/5212 - loss 0.03094489 - time (sec): 24.87 - samples/sec: 1372.29 - lr: 0.000016 - momentum: 0.000000
2023-10-15 21:40:18,965 epoch 8 - iter 1042/5212 - loss 0.03266070 - time (sec): 50.25 - samples/sec: 1449.21 - lr: 0.000016 - momentum: 0.000000
2023-10-15 21:40:44,068 epoch 8 - iter 1563/5212 - loss 0.03082353 - time (sec): 75.36 - samples/sec: 1445.56 - lr: 0.000015 - momentum: 0.000000
2023-10-15 21:41:09,236 epoch 8 - iter 2084/5212 - loss 0.03189132 - time (sec): 100.52 - samples/sec: 1443.37 - lr: 0.000014 - momentum: 0.000000
2023-10-15 21:41:34,729 epoch 8 - iter 2605/5212 - loss 0.03187351 - time (sec): 126.02 - samples/sec: 1451.00 - lr: 0.000014 - momentum: 0.000000
2023-10-15 21:42:00,318 epoch 8 - iter 3126/5212 - loss 0.03141481 - time (sec): 151.61 - samples/sec: 1456.23 - lr: 0.000013 - momentum: 0.000000
2023-10-15 21:42:25,218 epoch 8 - iter 3647/5212 - loss 0.03152484 - time (sec): 176.51 - samples/sec: 1459.90 - lr: 0.000013 - momentum: 0.000000
2023-10-15 21:42:50,184 epoch 8 - iter 4168/5212 - loss 0.03161608 - time (sec): 201.47 - samples/sec: 1459.49 - lr: 0.000012 - momentum: 0.000000
2023-10-15 21:43:15,609 epoch 8 - iter 4689/5212 - loss 0.03114092 - time (sec): 226.90 - samples/sec: 1458.94 - lr: 0.000012 - momentum: 0.000000
2023-10-15 21:43:40,617 epoch 8 - iter 5210/5212 - loss 0.03225916 - time (sec): 251.91 - samples/sec: 1457.75 - lr: 0.000011 - momentum: 0.000000
2023-10-15 21:43:40,714 ----------------------------------------------------------------------------------------------------
2023-10-15 21:43:40,715 EPOCH 8 done: loss 0.0323 - lr: 0.000011
2023-10-15 21:43:49,762 DEV : loss 0.3573097884654999 - f1-score (micro avg) 0.3737
2023-10-15 21:43:49,788 ----------------------------------------------------------------------------------------------------
2023-10-15 21:44:15,283 epoch 9 - iter 521/5212 - loss 0.02137501 - time (sec): 25.49 - samples/sec: 1557.65 - lr: 0.000011 - momentum: 0.000000
2023-10-15 21:44:40,744 epoch 9 - iter 1042/5212 - loss 0.02521327 - time (sec): 50.95 - samples/sec: 1532.21 - lr: 0.000010 - momentum: 0.000000
2023-10-15 21:45:06,007 epoch 9 - iter 1563/5212 - loss 0.02417142 - time (sec): 76.22 - samples/sec: 1518.03 - lr: 0.000009 - momentum: 0.000000
2023-10-15 21:45:30,713 epoch 9 - iter 2084/5212 - loss 0.02264293 - time (sec): 100.92 - samples/sec: 1505.23 - lr: 0.000009 - momentum: 0.000000
2023-10-15 21:45:55,673 epoch 9 - iter 2605/5212 - loss 0.02338377 - time (sec): 125.88 - samples/sec: 1495.02 - lr: 0.000008 - momentum: 0.000000
2023-10-15 21:46:20,343 epoch 9 - iter 3126/5212 - loss 0.02317935 - time (sec): 150.55 - samples/sec: 1467.58 - lr: 0.000008 - momentum: 0.000000
2023-10-15 21:46:45,572 epoch 9 - iter 3647/5212 - loss 0.02373783 - time (sec): 175.78 - samples/sec: 1469.70 - lr: 0.000007 - momentum: 0.000000
2023-10-15 21:47:10,321 epoch 9 - iter 4168/5212 - loss 0.02314043 - time (sec): 200.53 - samples/sec: 1469.15 - lr: 0.000007 - momentum: 0.000000
2023-10-15 21:47:35,423 epoch 9 - iter 4689/5212 - loss 0.02314496 - time (sec): 225.63 - samples/sec: 1467.76 - lr: 0.000006 - momentum: 0.000000
2023-10-15 21:48:00,356 epoch 9 - iter 5210/5212 - loss 0.02267322 - time (sec): 250.57 - samples/sec: 1466.11 - lr: 0.000006 - momentum: 0.000000
2023-10-15 21:48:00,446 ----------------------------------------------------------------------------------------------------
2023-10-15 21:48:00,446 EPOCH 9 done: loss 0.0227 - lr: 0.000006
2023-10-15 21:48:09,481 DEV : loss 0.42241212725639343 - f1-score (micro avg) 0.3776
2023-10-15 21:48:09,509 ----------------------------------------------------------------------------------------------------
2023-10-15 21:48:34,392 epoch 10 - iter 521/5212 - loss 0.01795075 - time (sec): 24.88 - samples/sec: 1427.01 - lr: 0.000005 - momentum: 0.000000
2023-10-15 21:48:59,088 epoch 10 - iter 1042/5212 - loss 0.01766837 - time (sec): 49.58 - samples/sec: 1399.01 - lr: 0.000004 - momentum: 0.000000
2023-10-15 21:49:24,146 epoch 10 - iter 1563/5212 - loss 0.01546036 - time (sec): 74.64 - samples/sec: 1428.03 - lr: 0.000004 - momentum: 0.000000
2023-10-15 21:49:48,972 epoch 10 - iter 2084/5212 - loss 0.01551573 - time (sec): 99.46 - samples/sec: 1433.30 - lr: 0.000003 - momentum: 0.000000
2023-10-15 21:50:13,944 epoch 10 - iter 2605/5212 - loss 0.01520813 - time (sec): 124.43 - samples/sec: 1436.94 - lr: 0.000003 - momentum: 0.000000
2023-10-15 21:50:39,014 epoch 10 - iter 3126/5212 - loss 0.01608964 - time (sec): 149.50 - samples/sec: 1440.85 - lr: 0.000002 - momentum: 0.000000
2023-10-15 21:51:04,117 epoch 10 - iter 3647/5212 - loss 0.01586596 - time (sec): 174.61 - samples/sec: 1446.98 - lr: 0.000002 - momentum: 0.000000
2023-10-15 21:51:29,596 epoch 10 - iter 4168/5212 - loss 0.01541581 - time (sec): 200.09 - samples/sec: 1456.61 - lr: 0.000001 - momentum: 0.000000
2023-10-15 21:51:54,619 epoch 10 - iter 4689/5212 - loss 0.01533530 - time (sec): 225.11 - samples/sec: 1454.47 - lr: 0.000001 - momentum: 0.000000
2023-10-15 21:52:20,313 epoch 10 - iter 5210/5212 - loss 0.01573070 - time (sec): 250.80 - samples/sec: 1464.56 - lr: 0.000000 - momentum: 0.000000
2023-10-15 21:52:20,399 ----------------------------------------------------------------------------------------------------
2023-10-15 21:52:20,400 EPOCH 10 done: loss 0.0157 - lr: 0.000000
2023-10-15 21:52:29,511 DEV : loss 0.42857325077056885 - f1-score (micro avg) 0.3698
2023-10-15 21:52:29,962 ----------------------------------------------------------------------------------------------------
2023-10-15 21:52:29,963 Loading model from best epoch ...
2023-10-15 21:52:31,489 SequenceTagger predicts: Dictionary with 17 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
2023-10-15 21:52:47,128 Results:
- F-score (micro) 0.4359
- F-score (macro) 0.2917
- Accuracy 0.283

By class:
              precision    recall  f1-score   support

         LOC     0.5601    0.5338    0.5466      1214
         PER     0.3793    0.3676    0.3734       808
         ORG     0.2152    0.2890    0.2467       353
   HumanProd     0.0000    0.0000    0.0000        15

   micro avg     0.4337    0.4381    0.4359      2390
   macro avg     0.2886    0.2976    0.2917      2390
weighted avg     0.4445    0.4381    0.4403      2390

2023-10-15 21:52:47,129 ----------------------------------------------------------------------------------------------------
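The 17-tag dictionary is a BIOES encoding (S-/B-/I-/E- prefixes plus O) over the four entity types LOC, PER, ORG and HumanProd. Note that the micro-averaged F1 (0.4359), which is also the model-selection metric, weights every entity mention equally, while the macro average (0.2917) is pulled down by the tiny HumanProd class (support 15, F1 0.0000). A minimal sketch of loading the saved checkpoint and reading predicted spans; the example sentence is a placeholder, and the path simply joins the logged base path with best-model.pt:

```python
# Sketch: load the checkpoint written as best-model.pt and decode BIOES spans.
# The example sentence is a placeholder, not taken from the dataset.
from flair.data import Sentence
from flair.models import SequenceTagger

base = "hmbench-newseye/de-dbmdz/bert-base-historic-multilingual-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4"
tagger = SequenceTagger.load(f"{base}/best-model.pt")

sentence = Sentence("Friedrich Schiller wurde in Marbach am Neckar geboren .")
tagger.predict(sentence)

# BIOES tags (e.g. B-PER E-PER, S-LOC) are merged into typed spans:
for span in sentence.get_spans("ner"):
    label = span.get_label("ner")
    print(span.text, label.value, f"{label.score:.2f}")
```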