2023-10-17 17:40:53,736 ----------------------------------------------------------------------------------------------------
2023-10-17 17:40:53,737 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
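The module printout above fixes every tensor shape, so the size of the ELECTRA encoder can be checked by hand. A minimal sketch in pure arithmetic from the shapes shown (the helper names are ours; the count covers only the ElectraModel, not the 17-way classification head):

```python
def linear_params(n_in: int, n_out: int) -> int:
    """Parameters of a Linear(n_in, n_out, bias=True): weight matrix plus bias."""
    return n_in * n_out + n_out

def electra_param_count() -> int:
    # Embedding tables and their LayerNorm, as printed in the log.
    emb = 32001 * 768 + 512 * 768 + 2 * 768  # word + position + token-type
    emb += 2 * 768                           # embedding LayerNorm (weight + bias)
    # One ElectraLayer: Q/K/V/self-output projections, FFN, two LayerNorms.
    layer = 4 * linear_params(768, 768)      # query, key, value, self-output dense
    layer += linear_params(768, 3072)        # intermediate dense
    layer += linear_params(3072, 768)        # output dense
    layer += 2 * (2 * 768)                   # attention-output and output LayerNorms
    return emb + 12 * layer

print(electra_param_count())  # 110_027_520, i.e. ~110M, a base-sized encoder
```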
2023-10-17 17:40:53,737 ----------------------------------------------------------------------------------------------------
2023-10-17 17:40:53,738 MultiCorpus: 1166 train + 165 dev + 415 test sentences
 - NER_HIPE_2022 Corpus: 1166 train + 165 dev + 415 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/fi/with_doc_seperator
2023-10-17 17:40:53,738 ----------------------------------------------------------------------------------------------------
2023-10-17 17:40:53,738 Train:  1166 sentences
2023-10-17 17:40:53,738         (train_with_dev=False, train_with_test=False)
2023-10-17 17:40:53,738 ----------------------------------------------------------------------------------------------------
2023-10-17 17:40:53,738 Training Params:
2023-10-17 17:40:53,738  - learning_rate: "3e-05"
2023-10-17 17:40:53,738  - mini_batch_size: "4"
2023-10-17 17:40:53,738  - max_epochs: "10"
2023-10-17 17:40:53,738  - shuffle: "True"
2023-10-17 17:40:53,738 ----------------------------------------------------------------------------------------------------
2023-10-17 17:40:53,738 Plugins:
2023-10-17 17:40:53,738  - TensorboardLogger
2023-10-17 17:40:53,738  - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 17:40:53,738 ----------------------------------------------------------------------------------------------------
2023-10-17 17:40:53,738 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 17:40:53,738  - metric: "('micro avg', 'f1-score')"
2023-10-17 17:40:53,738 ----------------------------------------------------------------------------------------------------
2023-10-17 17:40:53,738 Computation:
2023-10-17 17:40:53,738  - compute on device: cuda:0
2023-10-17 17:40:53,738  - embedding storage: none
2023-10-17 17:40:53,738 ----------------------------------------------------------------------------------------------------
2023-10-17 17:40:53,738 Model training base path: "hmbench-newseye/fi-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
2023-10-17 17:40:53,738 ----------------------------------------------------------------------------------------------------
2023-10-17 17:40:53,738 ----------------------------------------------------------------------------------------------------
2023-10-17 17:40:53,739 Logging anything other than scalars to TensorBoard is currently not supported.
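The LinearScheduler with warmup_fraction '0.1' explains the lr column in the iteration logs that follow: over 10 epochs of 292 iterations each, the learning rate ramps linearly from 0 to the peak of 3e-05 during the first tenth of training (roughly epoch 1), then decays linearly back to 0. A minimal reconstruction of that shape (our own sketch for illustration, not Flair's scheduler code):

```python
def linear_schedule_lr(step: int, total_steps: int,
                       peak_lr: float = 3e-05,
                       warmup_fraction: float = 0.1) -> float:
    """Learning rate after `step` optimizer steps: linear warmup, then linear decay."""
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

total = 292 * 10  # 292 iterations per epoch, 10 epochs
# Epoch 1, iter 29: still warming up, close to the logged "lr: 0.000003".
print(round(linear_schedule_lr(29, total), 6))
# Epoch 5, iter 203 (global step 4*292 + 203): close to the logged "lr: 0.000018".
print(round(linear_schedule_lr(4 * 292 + 203, total), 6))
```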
2023-10-17 17:40:55,392 epoch 1 - iter 29/292 - loss 3.68250305 - time (sec): 1.65 - samples/sec: 2700.70 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:40:57,068 epoch 1 - iter 58/292 - loss 3.14319074 - time (sec): 3.33 - samples/sec: 2759.47 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:40:58,833 epoch 1 - iter 87/292 - loss 2.40373810 - time (sec): 5.09 - samples/sec: 2632.37 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:41:00,337 epoch 1 - iter 116/292 - loss 1.99462913 - time (sec): 6.60 - samples/sec: 2597.35 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:41:01,905 epoch 1 - iter 145/292 - loss 1.71324812 - time (sec): 8.17 - samples/sec: 2605.46 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:41:03,463 epoch 1 - iter 174/292 - loss 1.49452360 - time (sec): 9.72 - samples/sec: 2614.47 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:41:05,137 epoch 1 - iter 203/292 - loss 1.32945702 - time (sec): 11.40 - samples/sec: 2608.52 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:41:06,708 epoch 1 - iter 232/292 - loss 1.21768316 - time (sec): 12.97 - samples/sec: 2597.01 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:41:08,391 epoch 1 - iter 261/292 - loss 1.10270361 - time (sec): 14.65 - samples/sec: 2622.03 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:41:10,413 epoch 1 - iter 290/292 - loss 1.00803344 - time (sec): 16.67 - samples/sec: 2653.62 - lr: 0.000030 - momentum: 0.000000
2023-10-17 17:41:10,509 ----------------------------------------------------------------------------------------------------
2023-10-17 17:41:10,509 EPOCH 1 done: loss 1.0055 - lr: 0.000030
2023-10-17 17:41:11,541 DEV : loss 0.18136057257652283 - f1-score (micro avg)  0.476
2023-10-17 17:41:11,546 saving best model
2023-10-17 17:41:11,880 ----------------------------------------------------------------------------------------------------
2023-10-17 17:41:13,724 epoch 2 - iter 29/292 - loss 0.27089229 - time (sec): 1.84 - samples/sec: 2733.99 - lr: 0.000030 - momentum: 0.000000
2023-10-17 17:41:15,282 epoch 2 - iter 58/292 - loss 0.26909781 - time (sec): 3.40 - samples/sec: 2563.95 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:41:16,787 epoch 2 - iter 87/292 - loss 0.25031740 - time (sec): 4.91 - samples/sec: 2554.96 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:41:18,259 epoch 2 - iter 116/292 - loss 0.23780745 - time (sec): 6.38 - samples/sec: 2539.34 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:41:20,070 epoch 2 - iter 145/292 - loss 0.24363146 - time (sec): 8.19 - samples/sec: 2617.92 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:41:21,741 epoch 2 - iter 174/292 - loss 0.22791139 - time (sec): 9.86 - samples/sec: 2566.47 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:41:23,404 epoch 2 - iter 203/292 - loss 0.21320925 - time (sec): 11.52 - samples/sec: 2565.02 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:41:25,029 epoch 2 - iter 232/292 - loss 0.20829472 - time (sec): 13.15 - samples/sec: 2566.07 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:41:26,841 epoch 2 - iter 261/292 - loss 0.19624855 - time (sec): 14.96 - samples/sec: 2612.33 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:41:28,696 epoch 2 - iter 290/292 - loss 0.19564262 - time (sec): 16.81 - samples/sec: 2633.34 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:41:28,790 ----------------------------------------------------------------------------------------------------
2023-10-17 17:41:28,791 EPOCH 2 done: loss 0.1950 - lr: 0.000027
2023-10-17 17:41:30,017 DEV : loss 0.11772378534078598 - f1-score (micro avg)  0.6652
2023-10-17 17:41:30,022 saving best model
2023-10-17 17:41:30,470 ----------------------------------------------------------------------------------------------------
2023-10-17 17:41:31,956 epoch 3 - iter 29/292 - loss 0.12437244 - time (sec): 1.48 - samples/sec: 2385.06 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:41:33,783 epoch 3 - iter 58/292 - loss 0.11415926 - time (sec): 3.31 - samples/sec: 2643.48 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:41:35,552 epoch 3 - iter 87/292 - loss 0.10815903 - time (sec): 5.08 - samples/sec: 2660.89 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:41:37,274 epoch 3 - iter 116/292 - loss 0.10540997 - time (sec): 6.80 - samples/sec: 2621.18 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:41:38,909 epoch 3 - iter 145/292 - loss 0.11397380 - time (sec): 8.43 - samples/sec: 2586.10 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:41:40,496 epoch 3 - iter 174/292 - loss 0.11201308 - time (sec): 10.02 - samples/sec: 2602.24 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:41:42,105 epoch 3 - iter 203/292 - loss 0.11590502 - time (sec): 11.63 - samples/sec: 2571.47 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:41:43,904 epoch 3 - iter 232/292 - loss 0.11396581 - time (sec): 13.43 - samples/sec: 2609.79 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:41:45,647 epoch 3 - iter 261/292 - loss 0.11202885 - time (sec): 15.17 - samples/sec: 2616.01 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:41:47,313 epoch 3 - iter 290/292 - loss 0.11160831 - time (sec): 16.84 - samples/sec: 2630.43 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:41:47,395 ----------------------------------------------------------------------------------------------------
2023-10-17 17:41:47,395 EPOCH 3 done: loss 0.1113 - lr: 0.000023
2023-10-17 17:41:48,677 DEV : loss 0.1223890632390976 - f1-score (micro avg)  0.7442
2023-10-17 17:41:48,682 saving best model
2023-10-17 17:41:49,156 ----------------------------------------------------------------------------------------------------
2023-10-17 17:41:50,842 epoch 4 - iter 29/292 - loss 0.06652164 - time (sec): 1.68 - samples/sec: 2930.17 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:41:52,443 epoch 4 - iter 58/292 - loss 0.07721451 - time (sec): 3.29 - samples/sec: 2823.05 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:41:54,122 epoch 4 - iter 87/292 - loss 0.09464470 - time (sec): 4.96 - samples/sec: 2769.06 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:41:55,600 epoch 4 - iter 116/292 - loss 0.08942417 - time (sec): 6.44 - samples/sec: 2674.86 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:41:57,265 epoch 4 - iter 145/292 - loss 0.08297407 - time (sec): 8.11 - samples/sec: 2674.26 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:41:59,057 epoch 4 - iter 174/292 - loss 0.08430332 - time (sec): 9.90 - samples/sec: 2704.79 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:42:00,597 epoch 4 - iter 203/292 - loss 0.08242476 - time (sec): 11.44 - samples/sec: 2670.17 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:42:02,247 epoch 4 - iter 232/292 - loss 0.08006731 - time (sec): 13.09 - samples/sec: 2644.82 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:42:03,920 epoch 4 - iter 261/292 - loss 0.07525678 - time (sec): 14.76 - samples/sec: 2651.25 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:42:05,625 epoch 4 - iter 290/292 - loss 0.07416172 - time (sec): 16.47 - samples/sec: 2691.57 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:42:05,708 ----------------------------------------------------------------------------------------------------
2023-10-17 17:42:05,708 EPOCH 4 done: loss 0.0740 - lr: 0.000020
2023-10-17 17:42:07,158 DEV : loss 0.13055771589279175 - f1-score (micro avg)  0.7602
2023-10-17 17:42:07,163 saving best model
2023-10-17 17:42:07,621 ----------------------------------------------------------------------------------------------------
2023-10-17 17:42:09,233 epoch 5 - iter 29/292 - loss 0.05571383 - time (sec): 1.61 - samples/sec: 2424.35 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:42:10,958 epoch 5 - iter 58/292 - loss 0.06187539 - time (sec): 3.33 - samples/sec: 2621.87 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:42:12,602 epoch 5 - iter 87/292 - loss 0.05527628 - time (sec): 4.98 - samples/sec: 2716.85 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:42:14,400 epoch 5 - iter 116/292 - loss 0.05682208 - time (sec): 6.77 - samples/sec: 2692.02 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:42:15,955 epoch 5 - iter 145/292 - loss 0.05460577 - time (sec): 8.33 - samples/sec: 2649.83 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:42:17,518 epoch 5 - iter 174/292 - loss 0.05178654 - time (sec): 9.89 - samples/sec: 2646.78 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:42:19,227 epoch 5 - iter 203/292 - loss 0.05084122 - time (sec): 11.60 - samples/sec: 2652.93 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:42:20,796 epoch 5 - iter 232/292 - loss 0.04985321 - time (sec): 13.17 - samples/sec: 2667.06 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:42:22,639 epoch 5 - iter 261/292 - loss 0.05230871 - time (sec): 15.01 - samples/sec: 2661.45 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:42:24,268 epoch 5 - iter 290/292 - loss 0.05658315 - time (sec): 16.64 - samples/sec: 2649.21 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:42:24,391 ----------------------------------------------------------------------------------------------------
2023-10-17 17:42:24,391 EPOCH 5 done: loss 0.0566 - lr: 0.000017
2023-10-17 17:42:25,637 DEV : loss 0.13382023572921753 - f1-score (micro avg)  0.7511
2023-10-17 17:42:25,642 ----------------------------------------------------------------------------------------------------
2023-10-17 17:42:27,192 epoch 6 - iter 29/292 - loss 0.02881444 - time (sec): 1.55 - samples/sec: 2584.46 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:42:29,029 epoch 6 - iter 58/292 - loss 0.04281959 - time (sec): 3.38 - samples/sec: 2679.47 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:42:30,557 epoch 6 - iter 87/292 - loss 0.03964120 - time (sec): 4.91 - samples/sec: 2585.43 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:42:32,359 epoch 6 - iter 116/292 - loss 0.03670832 - time (sec): 6.71 - samples/sec: 2583.43 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:42:34,249 epoch 6 - iter 145/292 - loss 0.03902949 - time (sec): 8.61 - samples/sec: 2586.32 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:42:35,947 epoch 6 - iter 174/292 - loss 0.03725768 - time (sec): 10.30 - samples/sec: 2631.61 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:42:37,562 epoch 6 - iter 203/292 - loss 0.03855395 - time (sec): 11.92 - samples/sec: 2622.30 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:42:39,209 epoch 6 - iter 232/292 - loss 0.03648586 - time (sec): 13.56 - samples/sec: 2603.71 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:42:40,834 epoch 6 - iter 261/292 - loss 0.03780658 - time (sec): 15.19 - samples/sec: 2607.88 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:42:42,380 epoch 6 - iter 290/292 - loss 0.03828100 - time (sec): 16.74 - samples/sec: 2650.48 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:42:42,461 ----------------------------------------------------------------------------------------------------
2023-10-17 17:42:42,461 EPOCH 6 done: loss 0.0382 - lr: 0.000013
2023-10-17 17:42:43,729 DEV : loss 0.1387849748134613 - f1-score (micro avg)  0.7562
2023-10-17 17:42:43,734 ----------------------------------------------------------------------------------------------------
2023-10-17 17:42:45,307 epoch 7 - iter 29/292 - loss 0.01361842 - time (sec): 1.57 - samples/sec: 2431.67 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:42:46,981 epoch 7 - iter 58/292 - loss 0.02741036 - time (sec): 3.25 - samples/sec: 2667.70 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:42:48,715 epoch 7 - iter 87/292 - loss 0.02358511 - time (sec): 4.98 - samples/sec: 2705.00 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:42:50,390 epoch 7 - iter 116/292 - loss 0.03159564 - time (sec): 6.65 - samples/sec: 2673.57 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:42:52,053 epoch 7 - iter 145/292 - loss 0.02732421 - time (sec): 8.32 - samples/sec: 2697.29 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:42:53,652 epoch 7 - iter 174/292 - loss 0.02603176 - time (sec): 9.92 - samples/sec: 2630.63 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:42:55,324 epoch 7 - iter 203/292 - loss 0.02704455 - time (sec): 11.59 - samples/sec: 2680.50 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:42:56,986 epoch 7 - iter 232/292 - loss 0.02861640 - time (sec): 13.25 - samples/sec: 2672.23 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:42:58,689 epoch 7 - iter 261/292 - loss 0.02783647 - time (sec): 14.95 - samples/sec: 2679.23 - lr: 0.000010 - momentum: 0.000000
2023-10-17 17:43:00,300 epoch 7 - iter 290/292 - loss 0.02793728 - time (sec): 16.57 - samples/sec: 2671.21 - lr: 0.000010 - momentum: 0.000000
2023-10-17 17:43:00,399 ----------------------------------------------------------------------------------------------------
2023-10-17 17:43:00,399 EPOCH 7 done: loss 0.0278 - lr: 0.000010
2023-10-17 17:43:01,670 DEV : loss 0.15122058987617493 - f1-score (micro avg)  0.7865
2023-10-17 17:43:01,676 saving best model
2023-10-17 17:43:02,074 ----------------------------------------------------------------------------------------------------
2023-10-17 17:43:03,652 epoch 8 - iter 29/292 - loss 0.03474470 - time (sec): 1.58 - samples/sec: 2548.38 - lr: 0.000010 - momentum: 0.000000
2023-10-17 17:43:05,374 epoch 8 - iter 58/292 - loss 0.03005709 - time (sec): 3.30 - samples/sec: 2550.78 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:43:07,042 epoch 8 - iter 87/292 - loss 0.02472687 - time (sec): 4.97 - samples/sec: 2531.31 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:43:08,591 epoch 8 - iter 116/292 - loss 0.02387951 - time (sec): 6.52 - samples/sec: 2547.94 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:43:10,241 epoch 8 - iter 145/292 - loss 0.02389223 - time (sec): 8.17 - samples/sec: 2591.36 - lr: 0.000008 - momentum: 0.000000
2023-10-17 17:43:11,927 epoch 8 - iter 174/292 - loss 0.02420953 - time (sec): 9.85 - samples/sec: 2622.08 - lr: 0.000008 - momentum: 0.000000
2023-10-17 17:43:13,437 epoch 8 - iter 203/292 - loss 0.02291096 - time (sec): 11.36 - samples/sec: 2610.71 - lr: 0.000008 - momentum: 0.000000
2023-10-17 17:43:15,274 epoch 8 - iter 232/292 - loss 0.02169136 - time (sec): 13.20 - samples/sec: 2650.27 - lr: 0.000007 - momentum: 0.000000
2023-10-17 17:43:16,993 epoch 8 - iter 261/292 - loss 0.02098081 - time (sec): 14.92 - samples/sec: 2627.87 - lr: 0.000007 - momentum: 0.000000
2023-10-17 17:43:18,843 epoch 8 - iter 290/292 - loss 0.02082531 - time (sec): 16.77 - samples/sec: 2642.30 - lr: 0.000007 - momentum: 0.000000
2023-10-17 17:43:18,932 ----------------------------------------------------------------------------------------------------
2023-10-17 17:43:18,932 EPOCH 8 done: loss 0.0208 - lr: 0.000007
2023-10-17 17:43:20,229 DEV : loss 0.1585906744003296 - f1-score (micro avg)  0.7679
2023-10-17 17:43:20,234 ----------------------------------------------------------------------------------------------------
2023-10-17 17:43:21,892 epoch 9 - iter 29/292 - loss 0.01213194 - time (sec): 1.66 - samples/sec: 2813.91 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:43:23,564 epoch 9 - iter 58/292 - loss 0.01599530 - time (sec): 3.33 - samples/sec: 2657.07 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:43:25,343 epoch 9 - iter 87/292 - loss 0.02053743 - time (sec): 5.11 - samples/sec: 2697.30 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:43:27,166 epoch 9 - iter 116/292 - loss 0.02114353 - time (sec): 6.93 - samples/sec: 2673.31 - lr: 0.000005 - momentum: 0.000000
2023-10-17 17:43:28,962 epoch 9 - iter 145/292 - loss 0.01954370 - time (sec): 8.73 - samples/sec: 2678.43 - lr: 0.000005 - momentum: 0.000000
2023-10-17 17:43:30,574 epoch 9 - iter 174/292 - loss 0.01808923 - time (sec): 10.34 - samples/sec: 2665.74 - lr: 0.000005 - momentum: 0.000000
2023-10-17 17:43:32,169 epoch 9 - iter 203/292 - loss 0.01813334 - time (sec): 11.93 - samples/sec: 2656.85 - lr: 0.000004 - momentum: 0.000000
2023-10-17 17:43:33,777 epoch 9 - iter 232/292 - loss 0.01754142 - time (sec): 13.54 - samples/sec: 2650.16 - lr: 0.000004 - momentum: 0.000000
2023-10-17 17:43:35,257 epoch 9 - iter 261/292 - loss 0.01772635 - time (sec): 15.02 - samples/sec: 2616.40 - lr: 0.000004 - momentum: 0.000000
2023-10-17 17:43:36,993 epoch 9 - iter 290/292 - loss 0.01726453 - time (sec): 16.76 - samples/sec: 2632.97 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:43:37,089 ----------------------------------------------------------------------------------------------------
2023-10-17 17:43:37,089 EPOCH 9 done: loss 0.0174 - lr: 0.000003
2023-10-17 17:43:38,331 DEV : loss 0.16505198180675507 - f1-score (micro avg)  0.7753
2023-10-17 17:43:38,336 ----------------------------------------------------------------------------------------------------
2023-10-17 17:43:39,887 epoch 10 - iter 29/292 - loss 0.00544103 - time (sec): 1.55 - samples/sec: 2830.15 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:43:41,491 epoch 10 - iter 58/292 - loss 0.01290303 - time (sec): 3.15 - samples/sec: 2692.45 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:43:43,394 epoch 10 - iter 87/292 - loss 0.01829305 - time (sec): 5.06 - samples/sec: 2546.21 - lr: 0.000002 - momentum: 0.000000
2023-10-17 17:43:45,078 epoch 10 - iter 116/292 - loss 0.01848904 - time (sec): 6.74 - samples/sec: 2661.73 - lr: 0.000002 - momentum: 0.000000
2023-10-17 17:43:46,711 epoch 10 - iter 145/292 - loss 0.01710130 - time (sec): 8.37 - samples/sec: 2698.38 - lr: 0.000002 - momentum: 0.000000
2023-10-17 17:43:48,242 epoch 10 - iter 174/292 - loss 0.01481307 - time (sec): 9.90 - samples/sec: 2677.18 - lr: 0.000001 - momentum: 0.000000
2023-10-17 17:43:50,174 epoch 10 - iter 203/292 - loss 0.01532186 - time (sec): 11.84 - samples/sec: 2663.93 - lr: 0.000001 - momentum: 0.000000
2023-10-17 17:43:51,882 epoch 10 - iter 232/292 - loss 0.01400894 - time (sec): 13.55 - samples/sec: 2675.56 - lr: 0.000001 - momentum: 0.000000
2023-10-17 17:43:53,457 epoch 10 - iter 261/292 - loss 0.01527717 - time (sec): 15.12 - samples/sec: 2665.48 - lr: 0.000000 - momentum: 0.000000
2023-10-17 17:43:55,002 epoch 10 - iter 290/292 - loss 0.01471316 - time (sec): 16.67 - samples/sec: 2654.31 - lr: 0.000000 - momentum: 0.000000
2023-10-17 17:43:55,093 ----------------------------------------------------------------------------------------------------
2023-10-17 17:43:55,093 EPOCH 10 done: loss 0.0146 - lr: 0.000000
2023-10-17 17:43:56,439 DEV : loss 0.16691839694976807 - f1-score (micro avg)  0.7648
2023-10-17 17:43:56,829 ----------------------------------------------------------------------------------------------------
2023-10-17 17:43:56,831 Loading model from best epoch ...
2023-10-17 17:43:58,309 SequenceTagger predicts: Dictionary with 17 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
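The 17-entry tag dictionary is exactly what a BIOES tagging scheme yields for four entity types (LOC, PER, ORG, HumanProd): one of S/B/E/I per type, plus the single O tag. A quick sanity check of that count (the tagset itself is copied from the log line above):

```python
entity_types = ["LOC", "PER", "ORG", "HumanProd"]
positions = ["S", "B", "E", "I"]  # Single, Begin, End, Inside (BIOES scheme)

tags = ["O"] + [f"{p}-{t}" for t in entity_types for p in positions]
print(len(tags))  # 4 types x 4 positions + "O" = 17, matching the dictionary size
```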
2023-10-17 17:44:01,099
Results:
- F-score (micro) 0.7634
- F-score (macro) 0.6822
- Accuracy 0.6295

By class:
              precision    recall  f1-score   support

         PER     0.8366    0.8534    0.8450       348
         LOC     0.6485    0.8199    0.7242       261
         ORG     0.4524    0.3654    0.4043        52
   HumanProd     0.7391    0.7727    0.7556        22

   micro avg     0.7293    0.8009    0.7634       683
   macro avg     0.6692    0.7029    0.6822       683
weighted avg     0.7323    0.8009    0.7624       683
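The micro and macro averages in the table follow directly from the per-class rows: macro averaging takes the unweighted mean of the class scores, while micro averaging pools true positives, predicted spans, and gold spans across classes before dividing. A small reconstruction from the rows above (TP and predicted-span counts are recovered by rounding recall x support and TP / precision, so this is a consistency check on the table, not Flair's evaluation code):

```python
# (precision, recall, support) per class, copied from the table above.
rows = {
    "PER":       (0.8366, 0.8534, 348),
    "LOC":       (0.6485, 0.8199, 261),
    "ORG":       (0.4524, 0.3654, 52),
    "HumanProd": (0.7391, 0.7727, 22),
}

def f1(p: float, r: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# Macro average: unweighted mean of the per-class F1 scores.
macro_f1 = sum(f1(p, r) for p, r, _ in rows.values()) / len(rows)

# Micro average: pool the counts across classes, then compute one P/R/F1.
tp = sum(round(r * s) for _, r, s in rows.values())               # true positives
pred = sum(round(round(r * s) / p) for p, r, s in rows.values())  # predicted spans
gold = sum(s for _, _, s in rows.values())                        # gold spans (support)
micro_p, micro_r = tp / pred, tp / gold

# Matches the table: macro F1 0.6822; micro P/R/F1 0.7293 / 0.8009 / 0.7634.
print(round(macro_f1, 4), round(micro_p, 4), round(micro_r, 4), round(f1(micro_p, micro_r), 4))
```

Because micro averaging weights every span equally, the large PER and LOC classes dominate it, while the weak ORG row (F1 0.4043 on only 52 spans) drags the macro average well below the micro one.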
2023-10-17 17:44:01,099 ----------------------------------------------------------------------------------------------------