2023-10-17 17:26:59,563 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:59,564 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-17 17:26:59,564 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:59,564 MultiCorpus: 1166 train + 165 dev + 415 test sentences
- NER_HIPE_2022 Corpus: 1166 train + 165 dev + 415 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/fi/with_doc_seperator
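A minimal sketch of how a comparable corpus could be loaded with Flair's HIPE-2022 loader. The original training script is not part of this log, so the dataset name, language, and the document-separator flag below are inferred from the dataset path above and should be treated as assumptions.

```python
from flair.datasets import NER_HIPE_2022

# Inferred from the path ".../ner_hipe_2022/v2.1/newseye/fi/with_doc_seperator":
# HIPE-2022 "newseye" dataset, Finnish split, with document separators.
corpus = NER_HIPE_2022(
    dataset_name="newseye",
    language="fi",
    add_document_separator=True,  # implied by the "with_doc_seperator" folder name
)
print(corpus)  # should report train/dev/test sentence counts similar to the line above

# The label dictionary for the "ner" layer; its size corresponds to the
# 17-way output layer shown in the model summary.
label_dictionary = corpus.make_label_dictionary(label_type="ner")
```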
2023-10-17 17:26:59,564 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:59,564 Train: 1166 sentences
2023-10-17 17:26:59,564 (train_with_dev=False, train_with_test=False)
2023-10-17 17:26:59,565 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:59,565 Training Params:
2023-10-17 17:26:59,565 - learning_rate: "3e-05"
2023-10-17 17:26:59,565 - mini_batch_size: "4"
2023-10-17 17:26:59,565 - max_epochs: "10"
2023-10-17 17:26:59,565 - shuffle: "True"
2023-10-17 17:26:59,565 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:59,565 Plugins:
2023-10-17 17:26:59,565 - TensorboardLogger
2023-10-17 17:26:59,565 - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 17:26:59,565 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:59,565 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 17:26:59,565 - metric: "('micro avg', 'f1-score')"
2023-10-17 17:26:59,565 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:59,565 Computation:
2023-10-17 17:26:59,565 - compute on device: cuda:0
2023-10-17 17:26:59,565 - embedding storage: none
2023-10-17 17:26:59,565 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:59,565 Model training base path: "hmbench-newseye/fi-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1"
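The header above fully specifies the run: batch size 4, 10 epochs, peak learning rate 3e-05 with a linear schedule and 10% warmup, first-subtoken pooling over the last transformer layer only, and no CRF. A minimal Flair sketch that would reproduce such a configuration is shown below; the embedding checkpoint id is inferred from the base path, the TensorBoard plugin wiring is omitted, and none of this is copied from the original hmBench script.

```python
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Corpus and label dictionary as in the corpus sketch above.
corpus = NER_HIPE_2022(dataset_name="newseye", language="fi", add_document_separator=True)
label_dictionary = corpus.make_label_dictionary(label_type="ner")

# Mirrors the model summary: an ELECTRA-style encoder wrapped by
# TransformerWordEmbeddings, last layer only ("layers-1"), first-subtoken
# pooling ("poolingfirst"), fine-tuned end to end.
embeddings = TransformerWordEmbeddings(
    model="hmteams/teams-base-historic-multilingual-discriminator",  # assumed checkpoint id
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# No CRF and no RNN ("crfFalse"): a single Linear(768 -> 17) head on top,
# trained with cross-entropy, as printed in the summary.
tagger = SequenceTagger(
    embeddings=embeddings,
    tag_dictionary=label_dictionary,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)

# fine_tune() uses AdamW with a linear warmup/decay schedule (hence the
# "momentum: 0.000000" in the per-iteration lines) and keeps best-model.pt
# based on the dev micro F1-score.
trainer.fine_tune(
    "hmbench-newseye/fi-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1",
    learning_rate=3e-5,
    mini_batch_size=4,
    max_epochs=10,
)
```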
2023-10-17 17:26:59,566 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:59,566 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:59,566 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-17 17:27:02,559 epoch 1 - iter 29/292 - loss 3.38067707 - time (sec): 2.99 - samples/sec: 1274.79 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:27:04,259 epoch 1 - iter 58/292 - loss 2.84271477 - time (sec): 4.69 - samples/sec: 1733.47 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:27:06,435 epoch 1 - iter 87/292 - loss 2.15025489 - time (sec): 6.87 - samples/sec: 2025.87 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:27:08,024 epoch 1 - iter 116/292 - loss 1.73892010 - time (sec): 8.46 - samples/sec: 2196.36 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:27:09,750 epoch 1 - iter 145/292 - loss 1.52932502 - time (sec): 10.18 - samples/sec: 2213.85 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:27:11,391 epoch 1 - iter 174/292 - loss 1.35737795 - time (sec): 11.82 - samples/sec: 2255.55 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:27:13,018 epoch 1 - iter 203/292 - loss 1.21956588 - time (sec): 13.45 - samples/sec: 2279.46 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:27:14,949 epoch 1 - iter 232/292 - loss 1.09424639 - time (sec): 15.38 - samples/sec: 2310.72 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:27:16,712 epoch 1 - iter 261/292 - loss 1.01634785 - time (sec): 17.14 - samples/sec: 2301.10 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:27:18,606 epoch 1 - iter 290/292 - loss 0.93625762 - time (sec): 19.04 - samples/sec: 2320.43 - lr: 0.000030 - momentum: 0.000000
2023-10-17 17:27:18,714 ----------------------------------------------------------------------------------------------------
2023-10-17 17:27:18,715 EPOCH 1 done: loss 0.9322 - lr: 0.000030
2023-10-17 17:27:19,545 DEV : loss 0.19820411503314972 - f1-score (micro avg) 0.4404
2023-10-17 17:27:19,551 saving best model
2023-10-17 17:27:19,982 ----------------------------------------------------------------------------------------------------
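The lr column follows the LinearScheduler declared above: with warmup_fraction 0.1 and 10 epochs of 292 batches (2,920 optimizer steps in total), the learning rate ramps linearly from 0 to 3e-05 over the first 292 steps, i.e. exactly epoch 1, and then decays linearly back to 0. A small sketch of that schedule (plain arithmetic, not Flair's implementation):

```python
# Linear warmup/decay as implied by the logged lr values (a sketch, not Flair's code).
PEAK_LR = 3e-5
STEPS_PER_EPOCH = 292                    # batches per epoch in this run
TOTAL_STEPS = 10 * STEPS_PER_EPOCH       # 2,920
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)    # 292, i.e. exactly epoch 1

def lr_at(step: int) -> float:
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

# Spot checks against the log:
print(lr_at(29))    # ~3.0e-06  (epoch 1, iter 29/292)
print(lr_at(292))   # 3.0e-05   (end of epoch 1)
print(lr_at(582))   # ~2.7e-05  (epoch 2, iter 290/292)
```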
2023-10-17 17:27:21,872 epoch 2 - iter 29/292 - loss 0.22906131 - time (sec): 1.89 - samples/sec: 2440.74 - lr: 0.000030 - momentum: 0.000000
2023-10-17 17:27:23,585 epoch 2 - iter 58/292 - loss 0.20380503 - time (sec): 3.60 - samples/sec: 2427.27 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:27:25,351 epoch 2 - iter 87/292 - loss 0.19601726 - time (sec): 5.37 - samples/sec: 2453.85 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:27:27,325 epoch 2 - iter 116/292 - loss 0.18101298 - time (sec): 7.34 - samples/sec: 2492.83 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:27:29,067 epoch 2 - iter 145/292 - loss 0.17978471 - time (sec): 9.08 - samples/sec: 2536.20 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:27:30,696 epoch 2 - iter 174/292 - loss 0.17845205 - time (sec): 10.71 - samples/sec: 2574.15 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:27:32,232 epoch 2 - iter 203/292 - loss 0.18417685 - time (sec): 12.25 - samples/sec: 2562.26 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:27:33,820 epoch 2 - iter 232/292 - loss 0.19378925 - time (sec): 13.84 - samples/sec: 2538.03 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:27:35,628 epoch 2 - iter 261/292 - loss 0.19190160 - time (sec): 15.64 - samples/sec: 2570.21 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:27:37,313 epoch 2 - iter 290/292 - loss 0.18637235 - time (sec): 17.33 - samples/sec: 2554.65 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:27:37,418 ----------------------------------------------------------------------------------------------------
2023-10-17 17:27:37,419 EPOCH 2 done: loss 0.1866 - lr: 0.000027
2023-10-17 17:27:38,858 DEV : loss 0.13132032752037048 - f1-score (micro avg) 0.6523
2023-10-17 17:27:38,881 saving best model
2023-10-17 17:27:39,381 ----------------------------------------------------------------------------------------------------
2023-10-17 17:27:41,149 epoch 3 - iter 29/292 - loss 0.11668019 - time (sec): 1.76 - samples/sec: 2582.00 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:27:42,789 epoch 3 - iter 58/292 - loss 0.10524911 - time (sec): 3.40 - samples/sec: 2641.65 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:27:44,415 epoch 3 - iter 87/292 - loss 0.12104987 - time (sec): 5.03 - samples/sec: 2720.98 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:27:45,968 epoch 3 - iter 116/292 - loss 0.11755234 - time (sec): 6.58 - samples/sec: 2686.04 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:27:47,599 epoch 3 - iter 145/292 - loss 0.11284499 - time (sec): 8.21 - samples/sec: 2658.66 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:27:49,217 epoch 3 - iter 174/292 - loss 0.10811762 - time (sec): 9.83 - samples/sec: 2628.27 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:27:51,020 epoch 3 - iter 203/292 - loss 0.10863850 - time (sec): 11.63 - samples/sec: 2659.81 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:27:52,752 epoch 3 - iter 232/292 - loss 0.10627802 - time (sec): 13.37 - samples/sec: 2663.02 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:27:54,311 epoch 3 - iter 261/292 - loss 0.10575975 - time (sec): 14.93 - samples/sec: 2655.88 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:27:56,137 epoch 3 - iter 290/292 - loss 0.10736363 - time (sec): 16.75 - samples/sec: 2634.68 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:27:56,257 ----------------------------------------------------------------------------------------------------
2023-10-17 17:27:56,257 EPOCH 3 done: loss 0.1078 - lr: 0.000023
2023-10-17 17:27:57,516 DEV : loss 0.1270725578069687 - f1-score (micro avg) 0.7313
2023-10-17 17:27:57,521 saving best model
2023-10-17 17:27:57,991 ----------------------------------------------------------------------------------------------------
2023-10-17 17:27:59,500 epoch 4 - iter 29/292 - loss 0.11337783 - time (sec): 1.50 - samples/sec: 2364.25 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:28:01,109 epoch 4 - iter 58/292 - loss 0.09246785 - time (sec): 3.11 - samples/sec: 2522.43 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:28:03,000 epoch 4 - iter 87/292 - loss 0.07732073 - time (sec): 5.01 - samples/sec: 2562.90 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:28:04,764 epoch 4 - iter 116/292 - loss 0.07279139 - time (sec): 6.77 - samples/sec: 2578.42 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:28:06,391 epoch 4 - iter 145/292 - loss 0.06862627 - time (sec): 8.40 - samples/sec: 2594.33 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:28:08,168 epoch 4 - iter 174/292 - loss 0.06941418 - time (sec): 10.17 - samples/sec: 2623.37 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:28:09,936 epoch 4 - iter 203/292 - loss 0.07292749 - time (sec): 11.94 - samples/sec: 2633.95 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:28:11,626 epoch 4 - iter 232/292 - loss 0.07213824 - time (sec): 13.63 - samples/sec: 2620.05 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:28:13,204 epoch 4 - iter 261/292 - loss 0.07275720 - time (sec): 15.21 - samples/sec: 2641.76 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:28:14,908 epoch 4 - iter 290/292 - loss 0.07210992 - time (sec): 16.91 - samples/sec: 2611.71 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:28:15,014 ----------------------------------------------------------------------------------------------------
2023-10-17 17:28:15,015 EPOCH 4 done: loss 0.0719 - lr: 0.000020
2023-10-17 17:28:16,266 DEV : loss 0.12534381449222565 - f1-score (micro avg) 0.7749
2023-10-17 17:28:16,271 saving best model
2023-10-17 17:28:16,794 ----------------------------------------------------------------------------------------------------
2023-10-17 17:28:18,744 epoch 5 - iter 29/292 - loss 0.05953152 - time (sec): 1.95 - samples/sec: 2665.79 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:28:20,508 epoch 5 - iter 58/292 - loss 0.04305700 - time (sec): 3.71 - samples/sec: 2662.09 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:28:22,098 epoch 5 - iter 87/292 - loss 0.05274874 - time (sec): 5.30 - samples/sec: 2689.83 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:28:23,608 epoch 5 - iter 116/292 - loss 0.05816026 - time (sec): 6.81 - samples/sec: 2652.68 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:28:25,334 epoch 5 - iter 145/292 - loss 0.05818204 - time (sec): 8.54 - samples/sec: 2630.17 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:28:27,100 epoch 5 - iter 174/292 - loss 0.05810976 - time (sec): 10.30 - samples/sec: 2649.02 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:28:28,766 epoch 5 - iter 203/292 - loss 0.05835023 - time (sec): 11.97 - samples/sec: 2652.40 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:28:30,397 epoch 5 - iter 232/292 - loss 0.05556762 - time (sec): 13.60 - samples/sec: 2658.25 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:28:32,060 epoch 5 - iter 261/292 - loss 0.05489206 - time (sec): 15.26 - samples/sec: 2619.23 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:28:33,627 epoch 5 - iter 290/292 - loss 0.05276153 - time (sec): 16.83 - samples/sec: 2620.96 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:28:33,744 ----------------------------------------------------------------------------------------------------
2023-10-17 17:28:33,744 EPOCH 5 done: loss 0.0524 - lr: 0.000017
2023-10-17 17:28:35,003 DEV : loss 0.1388275921344757 - f1-score (micro avg) 0.7404
2023-10-17 17:28:35,007 ----------------------------------------------------------------------------------------------------
2023-10-17 17:28:36,580 epoch 6 - iter 29/292 - loss 0.04318866 - time (sec): 1.57 - samples/sec: 2847.50 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:28:38,085 epoch 6 - iter 58/292 - loss 0.04282065 - time (sec): 3.08 - samples/sec: 2629.42 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:28:39,949 epoch 6 - iter 87/292 - loss 0.04016784 - time (sec): 4.94 - samples/sec: 2609.49 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:28:41,597 epoch 6 - iter 116/292 - loss 0.04140804 - time (sec): 6.59 - samples/sec: 2608.75 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:28:43,095 epoch 6 - iter 145/292 - loss 0.04001813 - time (sec): 8.09 - samples/sec: 2562.13 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:28:44,814 epoch 6 - iter 174/292 - loss 0.03934836 - time (sec): 9.80 - samples/sec: 2549.72 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:28:46,650 epoch 6 - iter 203/292 - loss 0.03998330 - time (sec): 11.64 - samples/sec: 2553.75 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:28:48,456 epoch 6 - iter 232/292 - loss 0.03957101 - time (sec): 13.45 - samples/sec: 2559.54 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:28:50,115 epoch 6 - iter 261/292 - loss 0.03892236 - time (sec): 15.11 - samples/sec: 2565.17 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:28:51,917 epoch 6 - iter 290/292 - loss 0.03724276 - time (sec): 16.91 - samples/sec: 2618.81 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:28:52,008 ----------------------------------------------------------------------------------------------------
2023-10-17 17:28:52,008 EPOCH 6 done: loss 0.0373 - lr: 0.000013
2023-10-17 17:28:53,260 DEV : loss 0.15327706933021545 - f1-score (micro avg) 0.7414
2023-10-17 17:28:53,265 ----------------------------------------------------------------------------------------------------
2023-10-17 17:28:55,107 epoch 7 - iter 29/292 - loss 0.01499409 - time (sec): 1.84 - samples/sec: 2673.73 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:28:56,892 epoch 7 - iter 58/292 - loss 0.02746269 - time (sec): 3.63 - samples/sec: 2535.72 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:28:58,571 epoch 7 - iter 87/292 - loss 0.03058176 - time (sec): 5.31 - samples/sec: 2609.09 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:29:00,399 epoch 7 - iter 116/292 - loss 0.02783010 - time (sec): 7.13 - samples/sec: 2613.54 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:29:02,208 epoch 7 - iter 145/292 - loss 0.02763787 - time (sec): 8.94 - samples/sec: 2636.93 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:29:03,848 epoch 7 - iter 174/292 - loss 0.02864891 - time (sec): 10.58 - samples/sec: 2631.66 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:29:05,535 epoch 7 - iter 203/292 - loss 0.02705060 - time (sec): 12.27 - samples/sec: 2655.11 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:29:07,098 epoch 7 - iter 232/292 - loss 0.02619522 - time (sec): 13.83 - samples/sec: 2656.91 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:29:08,606 epoch 7 - iter 261/292 - loss 0.02763773 - time (sec): 15.34 - samples/sec: 2636.76 - lr: 0.000010 - momentum: 0.000000
2023-10-17 17:29:10,173 epoch 7 - iter 290/292 - loss 0.02674452 - time (sec): 16.91 - samples/sec: 2618.73 - lr: 0.000010 - momentum: 0.000000
2023-10-17 17:29:10,271 ----------------------------------------------------------------------------------------------------
2023-10-17 17:29:10,271 EPOCH 7 done: loss 0.0267 - lr: 0.000010
2023-10-17 17:29:11,518 DEV : loss 0.16855847835540771 - f1-score (micro avg) 0.7511
2023-10-17 17:29:11,523 ----------------------------------------------------------------------------------------------------
2023-10-17 17:29:13,452 epoch 8 - iter 29/292 - loss 0.02027046 - time (sec): 1.93 - samples/sec: 2456.17 - lr: 0.000010 - momentum: 0.000000
2023-10-17 17:29:15,350 epoch 8 - iter 58/292 - loss 0.02601394 - time (sec): 3.83 - samples/sec: 2461.62 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:29:17,197 epoch 8 - iter 87/292 - loss 0.02915936 - time (sec): 5.67 - samples/sec: 2511.81 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:29:18,764 epoch 8 - iter 116/292 - loss 0.02597150 - time (sec): 7.24 - samples/sec: 2514.51 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:29:20,355 epoch 8 - iter 145/292 - loss 0.02298947 - time (sec): 8.83 - samples/sec: 2498.33 - lr: 0.000008 - momentum: 0.000000
2023-10-17 17:29:21,943 epoch 8 - iter 174/292 - loss 0.02214849 - time (sec): 10.42 - samples/sec: 2477.29 - lr: 0.000008 - momentum: 0.000000
2023-10-17 17:29:23,569 epoch 8 - iter 203/292 - loss 0.02160627 - time (sec): 12.04 - samples/sec: 2515.45 - lr: 0.000008 - momentum: 0.000000
2023-10-17 17:29:25,193 epoch 8 - iter 232/292 - loss 0.01994902 - time (sec): 13.67 - samples/sec: 2512.24 - lr: 0.000007 - momentum: 0.000000
2023-10-17 17:29:27,208 epoch 8 - iter 261/292 - loss 0.02019007 - time (sec): 15.68 - samples/sec: 2572.58 - lr: 0.000007 - momentum: 0.000000
2023-10-17 17:29:28,717 epoch 8 - iter 290/292 - loss 0.02000299 - time (sec): 17.19 - samples/sec: 2580.42 - lr: 0.000007 - momentum: 0.000000
2023-10-17 17:29:28,810 ----------------------------------------------------------------------------------------------------
2023-10-17 17:29:28,811 EPOCH 8 done: loss 0.0200 - lr: 0.000007
2023-10-17 17:29:30,046 DEV : loss 0.16279011964797974 - f1-score (micro avg) 0.7297
2023-10-17 17:29:30,050 ----------------------------------------------------------------------------------------------------
2023-10-17 17:29:31,831 epoch 9 - iter 29/292 - loss 0.01144995 - time (sec): 1.78 - samples/sec: 2396.24 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:29:33,608 epoch 9 - iter 58/292 - loss 0.01623989 - time (sec): 3.56 - samples/sec: 2291.67 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:29:35,363 epoch 9 - iter 87/292 - loss 0.01401842 - time (sec): 5.31 - samples/sec: 2234.76 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:29:37,138 epoch 9 - iter 116/292 - loss 0.01205765 - time (sec): 7.09 - samples/sec: 2251.76 - lr: 0.000005 - momentum: 0.000000
2023-10-17 17:29:38,916 epoch 9 - iter 145/292 - loss 0.01173432 - time (sec): 8.86 - samples/sec: 2348.03 - lr: 0.000005 - momentum: 0.000000
2023-10-17 17:29:40,652 epoch 9 - iter 174/292 - loss 0.01383228 - time (sec): 10.60 - samples/sec: 2400.23 - lr: 0.000005 - momentum: 0.000000
2023-10-17 17:29:42,275 epoch 9 - iter 203/292 - loss 0.01267356 - time (sec): 12.22 - samples/sec: 2423.12 - lr: 0.000004 - momentum: 0.000000
2023-10-17 17:29:43,999 epoch 9 - iter 232/292 - loss 0.01196640 - time (sec): 13.95 - samples/sec: 2479.86 - lr: 0.000004 - momentum: 0.000000
2023-10-17 17:29:45,663 epoch 9 - iter 261/292 - loss 0.01366318 - time (sec): 15.61 - samples/sec: 2524.91 - lr: 0.000004 - momentum: 0.000000
2023-10-17 17:29:47,441 epoch 9 - iter 290/292 - loss 0.01473423 - time (sec): 17.39 - samples/sec: 2527.37 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:29:47,644 ----------------------------------------------------------------------------------------------------
2023-10-17 17:29:47,644 EPOCH 9 done: loss 0.0147 - lr: 0.000003
2023-10-17 17:29:48,905 DEV : loss 0.175223708152771 - f1-score (micro avg) 0.7467
2023-10-17 17:29:48,910 ----------------------------------------------------------------------------------------------------
2023-10-17 17:29:50,670 epoch 10 - iter 29/292 - loss 0.01438218 - time (sec): 1.76 - samples/sec: 2780.63 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:29:52,357 epoch 10 - iter 58/292 - loss 0.01415184 - time (sec): 3.45 - samples/sec: 2802.79 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:29:54,032 epoch 10 - iter 87/292 - loss 0.01816413 - time (sec): 5.12 - samples/sec: 2710.02 - lr: 0.000002 - momentum: 0.000000
2023-10-17 17:29:55,669 epoch 10 - iter 116/292 - loss 0.01481988 - time (sec): 6.76 - samples/sec: 2678.17 - lr: 0.000002 - momentum: 0.000000
2023-10-17 17:29:57,326 epoch 10 - iter 145/292 - loss 0.01203336 - time (sec): 8.41 - samples/sec: 2674.61 - lr: 0.000002 - momentum: 0.000000
2023-10-17 17:29:59,003 epoch 10 - iter 174/292 - loss 0.01214143 - time (sec): 10.09 - samples/sec: 2658.32 - lr: 0.000001 - momentum: 0.000000
2023-10-17 17:30:00,852 epoch 10 - iter 203/292 - loss 0.01276323 - time (sec): 11.94 - samples/sec: 2663.21 - lr: 0.000001 - momentum: 0.000000
2023-10-17 17:30:02,439 epoch 10 - iter 232/292 - loss 0.01297237 - time (sec): 13.53 - samples/sec: 2638.23 - lr: 0.000001 - momentum: 0.000000
2023-10-17 17:30:04,230 epoch 10 - iter 261/292 - loss 0.01231085 - time (sec): 15.32 - samples/sec: 2651.34 - lr: 0.000000 - momentum: 0.000000
2023-10-17 17:30:05,785 epoch 10 - iter 290/292 - loss 0.01144895 - time (sec): 16.87 - samples/sec: 2623.09 - lr: 0.000000 - momentum: 0.000000
2023-10-17 17:30:05,893 ----------------------------------------------------------------------------------------------------
2023-10-17 17:30:05,893 EPOCH 10 done: loss 0.0114 - lr: 0.000000
2023-10-17 17:30:07,136 DEV : loss 0.17449024319648743 - f1-score (micro avg) 0.7387
2023-10-17 17:30:07,536 ----------------------------------------------------------------------------------------------------
2023-10-17 17:30:07,537 Loading model from best epoch ...
2023-10-17 17:30:09,536 SequenceTagger predicts: Dictionary with 17 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
2023-10-17 17:30:12,032
Results:
- F-score (micro) 0.7261
- F-score (macro) 0.6066
- Accuracy 0.5897
By class:
              precision    recall  f1-score   support

         PER     0.8437    0.8218    0.8326       348
         LOC     0.6073    0.7701    0.6791       261
         ORG     0.3077    0.2308    0.2637        52
   HumanProd     0.6667    0.6364    0.6512        22

   micro avg     0.7027    0.7511    0.7261       683
   macro avg     0.6063    0.6148    0.6066       683
weighted avg     0.7068    0.7511    0.7248       683
2023-10-17 17:30:12,032 ----------------------------------------------------------------------------------------------------
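For completeness, a minimal sketch of running inference with the saved checkpoint. The path is the training base path from this log (best-model.pt is written there); the example sentence is hypothetical.

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Adjust the path to wherever best-model.pt ended up after training.
tagger = SequenceTagger.load(
    "hmbench-newseye/fi-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1/best-model.pt"
)

# The tagger emits BIOES tags over the 17-tag dictionary printed above and
# groups them into LOC / PER / ORG / HumanProd spans.
sentence = Sentence("Jean Sibelius syntyi Hämeenlinnassa.")
tagger.predict(sentence)

for span in sentence.get_spans("ner"):
    print(span.text, span.tag, span.score)
```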