2023-10-13 12:48:22,060 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:22,061 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-13 12:48:22,061 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:22,062 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
- NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
2023-10-13 12:48:22,062 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:22,062 Train: 3575 sentences
2023-10-13 12:48:22,062 (train_with_dev=False, train_with_test=False)
2023-10-13 12:48:22,062 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:22,062 Training Params:
2023-10-13 12:48:22,062 - learning_rate: "5e-05"
2023-10-13 12:48:22,062 - mini_batch_size: "8"
2023-10-13 12:48:22,062 - max_epochs: "10"
2023-10-13 12:48:22,062 - shuffle: "True"
2023-10-13 12:48:22,062 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:22,062 Plugins:
2023-10-13 12:48:22,062 - LinearScheduler | warmup_fraction: '0.1'
2023-10-13 12:48:22,062 ----------------------------------------------------------------------------------------------------
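For reference, the learning-rate values in the per-iteration lines below follow the LinearScheduler plugin's warmup/decay shape: with warmup_fraction 0.1 over 447 iterations × 10 epochs = 4,470 steps, the rate ramps linearly up to 5e-05 during epoch 1, then decays linearly to zero. A minimal sketch of that schedule; the function name and step arithmetic are illustrative, not Flair's actual scheduler code:

```python
def linear_warmup_decay_lr(step, total_steps=4470, peak_lr=5e-05, warmup_fraction=0.1):
    """Linear warmup to peak_lr, then linear decay to zero (illustrative sketch)."""
    warmup_steps = int(warmup_fraction * total_steps)  # 447 steps here
    if step < warmup_steps:
        return peak_lr * step / warmup_steps           # ramp up during epoch 1
    # decay linearly from peak_lr back to 0 over the remaining steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

# e.g. step 44 of epoch 1 -> ~0.000005, matching the first logged lr value
```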
2023-10-13 12:48:22,062 Final evaluation on model from best epoch (best-model.pt)
2023-10-13 12:48:22,062 - metric: "('micro avg', 'f1-score')"
2023-10-13 12:48:22,062 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:22,062 Computation:
2023-10-13 12:48:22,062 - compute on device: cuda:0
2023-10-13 12:48:22,062 - embedding storage: none
2023-10-13 12:48:22,062 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:22,062 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3"
2023-10-13 12:48:22,062 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:22,062 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:24,736 epoch 1 - iter 44/447 - loss 2.72113629 - time (sec): 2.67 - samples/sec: 3086.23 - lr: 0.000005 - momentum: 0.000000
2023-10-13 12:48:27,748 epoch 1 - iter 88/447 - loss 1.76254453 - time (sec): 5.69 - samples/sec: 3008.78 - lr: 0.000010 - momentum: 0.000000
2023-10-13 12:48:30,428 epoch 1 - iter 132/447 - loss 1.36926797 - time (sec): 8.36 - samples/sec: 2990.42 - lr: 0.000015 - momentum: 0.000000
2023-10-13 12:48:33,398 epoch 1 - iter 176/447 - loss 1.09736044 - time (sec): 11.33 - samples/sec: 3050.34 - lr: 0.000020 - momentum: 0.000000
2023-10-13 12:48:36,384 epoch 1 - iter 220/447 - loss 0.93710869 - time (sec): 14.32 - samples/sec: 3024.41 - lr: 0.000024 - momentum: 0.000000
2023-10-13 12:48:39,333 epoch 1 - iter 264/447 - loss 0.82829129 - time (sec): 17.27 - samples/sec: 3009.72 - lr: 0.000029 - momentum: 0.000000
2023-10-13 12:48:42,028 epoch 1 - iter 308/447 - loss 0.75538064 - time (sec): 19.96 - samples/sec: 3005.19 - lr: 0.000034 - momentum: 0.000000
2023-10-13 12:48:44,956 epoch 1 - iter 352/447 - loss 0.70021192 - time (sec): 22.89 - samples/sec: 2974.25 - lr: 0.000039 - momentum: 0.000000
2023-10-13 12:48:48,054 epoch 1 - iter 396/447 - loss 0.64625200 - time (sec): 25.99 - samples/sec: 2968.22 - lr: 0.000044 - momentum: 0.000000
2023-10-13 12:48:50,791 epoch 1 - iter 440/447 - loss 0.60642690 - time (sec): 28.73 - samples/sec: 2972.97 - lr: 0.000049 - momentum: 0.000000
2023-10-13 12:48:51,196 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:51,196 EPOCH 1 done: loss 0.6027 - lr: 0.000049
2023-10-13 12:48:56,315 DEV : loss 0.18237876892089844 - f1-score (micro avg) 0.597
2023-10-13 12:48:56,364 saving best model
2023-10-13 12:48:56,829 ----------------------------------------------------------------------------------------------------
2023-10-13 12:48:59,725 epoch 2 - iter 44/447 - loss 0.19768284 - time (sec): 2.89 - samples/sec: 2808.02 - lr: 0.000049 - momentum: 0.000000
2023-10-13 12:49:02,590 epoch 2 - iter 88/447 - loss 0.18768155 - time (sec): 5.76 - samples/sec: 2855.46 - lr: 0.000049 - momentum: 0.000000
2023-10-13 12:49:05,392 epoch 2 - iter 132/447 - loss 0.17989274 - time (sec): 8.56 - samples/sec: 2902.30 - lr: 0.000048 - momentum: 0.000000
2023-10-13 12:49:08,357 epoch 2 - iter 176/447 - loss 0.16672786 - time (sec): 11.53 - samples/sec: 2885.66 - lr: 0.000048 - momentum: 0.000000
2023-10-13 12:49:11,204 epoch 2 - iter 220/447 - loss 0.16616890 - time (sec): 14.37 - samples/sec: 2884.18 - lr: 0.000047 - momentum: 0.000000
2023-10-13 12:49:14,037 epoch 2 - iter 264/447 - loss 0.16187003 - time (sec): 17.21 - samples/sec: 2896.45 - lr: 0.000047 - momentum: 0.000000
2023-10-13 12:49:16,772 epoch 2 - iter 308/447 - loss 0.16369277 - time (sec): 19.94 - samples/sec: 2908.91 - lr: 0.000046 - momentum: 0.000000
2023-10-13 12:49:19,882 epoch 2 - iter 352/447 - loss 0.15804249 - time (sec): 23.05 - samples/sec: 2909.39 - lr: 0.000046 - momentum: 0.000000
2023-10-13 12:49:22,765 epoch 2 - iter 396/447 - loss 0.15849072 - time (sec): 25.93 - samples/sec: 2958.06 - lr: 0.000045 - momentum: 0.000000
2023-10-13 12:49:25,599 epoch 2 - iter 440/447 - loss 0.15632034 - time (sec): 28.77 - samples/sec: 2964.47 - lr: 0.000045 - momentum: 0.000000
2023-10-13 12:49:26,075 ----------------------------------------------------------------------------------------------------
2023-10-13 12:49:26,075 EPOCH 2 done: loss 0.1560 - lr: 0.000045
2023-10-13 12:49:34,983 DEV : loss 0.13287977874279022 - f1-score (micro avg) 0.6972
2023-10-13 12:49:35,016 saving best model
2023-10-13 12:49:35,506 ----------------------------------------------------------------------------------------------------
2023-10-13 12:49:38,199 epoch 3 - iter 44/447 - loss 0.08731570 - time (sec): 2.68 - samples/sec: 3037.64 - lr: 0.000044 - momentum: 0.000000
2023-10-13 12:49:40,892 epoch 3 - iter 88/447 - loss 0.08438381 - time (sec): 5.38 - samples/sec: 2977.90 - lr: 0.000043 - momentum: 0.000000
2023-10-13 12:49:43,924 epoch 3 - iter 132/447 - loss 0.08570094 - time (sec): 8.41 - samples/sec: 2944.04 - lr: 0.000043 - momentum: 0.000000
2023-10-13 12:49:46,612 epoch 3 - iter 176/447 - loss 0.08756481 - time (sec): 11.10 - samples/sec: 2979.75 - lr: 0.000042 - momentum: 0.000000
2023-10-13 12:49:49,330 epoch 3 - iter 220/447 - loss 0.09082623 - time (sec): 13.82 - samples/sec: 2972.54 - lr: 0.000042 - momentum: 0.000000
2023-10-13 12:49:52,197 epoch 3 - iter 264/447 - loss 0.08856938 - time (sec): 16.68 - samples/sec: 2986.60 - lr: 0.000041 - momentum: 0.000000
2023-10-13 12:49:55,140 epoch 3 - iter 308/447 - loss 0.08687729 - time (sec): 19.63 - samples/sec: 2969.21 - lr: 0.000041 - momentum: 0.000000
2023-10-13 12:49:58,082 epoch 3 - iter 352/447 - loss 0.08587287 - time (sec): 22.57 - samples/sec: 2955.23 - lr: 0.000040 - momentum: 0.000000
2023-10-13 12:50:00,987 epoch 3 - iter 396/447 - loss 0.08320635 - time (sec): 25.47 - samples/sec: 2965.27 - lr: 0.000040 - momentum: 0.000000
2023-10-13 12:50:03,734 epoch 3 - iter 440/447 - loss 0.08286656 - time (sec): 28.22 - samples/sec: 2981.68 - lr: 0.000039 - momentum: 0.000000
2023-10-13 12:50:04,509 ----------------------------------------------------------------------------------------------------
2023-10-13 12:50:04,510 EPOCH 3 done: loss 0.0821 - lr: 0.000039
2023-10-13 12:50:13,364 DEV : loss 0.13965222239494324 - f1-score (micro avg) 0.7439
2023-10-13 12:50:13,399 saving best model
2023-10-13 12:50:13,905 ----------------------------------------------------------------------------------------------------
2023-10-13 12:50:16,806 epoch 4 - iter 44/447 - loss 0.06374565 - time (sec): 2.89 - samples/sec: 2818.47 - lr: 0.000038 - momentum: 0.000000
2023-10-13 12:50:19,552 epoch 4 - iter 88/447 - loss 0.06611011 - time (sec): 5.64 - samples/sec: 2940.44 - lr: 0.000038 - momentum: 0.000000
2023-10-13 12:50:22,225 epoch 4 - iter 132/447 - loss 0.05945489 - time (sec): 8.31 - samples/sec: 2982.35 - lr: 0.000037 - momentum: 0.000000
2023-10-13 12:50:25,405 epoch 4 - iter 176/447 - loss 0.05575885 - time (sec): 11.49 - samples/sec: 3029.54 - lr: 0.000037 - momentum: 0.000000
2023-10-13 12:50:28,204 epoch 4 - iter 220/447 - loss 0.05278981 - time (sec): 14.29 - samples/sec: 3024.95 - lr: 0.000036 - momentum: 0.000000
2023-10-13 12:50:31,045 epoch 4 - iter 264/447 - loss 0.05317609 - time (sec): 17.13 - samples/sec: 3014.77 - lr: 0.000036 - momentum: 0.000000
2023-10-13 12:50:33,781 epoch 4 - iter 308/447 - loss 0.05338658 - time (sec): 19.87 - samples/sec: 3025.23 - lr: 0.000035 - momentum: 0.000000
2023-10-13 12:50:36,538 epoch 4 - iter 352/447 - loss 0.05202556 - time (sec): 22.62 - samples/sec: 3023.70 - lr: 0.000035 - momentum: 0.000000
2023-10-13 12:50:39,597 epoch 4 - iter 396/447 - loss 0.05132416 - time (sec): 25.68 - samples/sec: 3010.67 - lr: 0.000034 - momentum: 0.000000
2023-10-13 12:50:42,247 epoch 4 - iter 440/447 - loss 0.05058772 - time (sec): 28.33 - samples/sec: 3013.07 - lr: 0.000033 - momentum: 0.000000
2023-10-13 12:50:42,652 ----------------------------------------------------------------------------------------------------
2023-10-13 12:50:42,653 EPOCH 4 done: loss 0.0507 - lr: 0.000033
2023-10-13 12:50:51,797 DEV : loss 0.15824183821678162 - f1-score (micro avg) 0.7532
2023-10-13 12:50:51,830 saving best model
2023-10-13 12:50:52,335 ----------------------------------------------------------------------------------------------------
2023-10-13 12:50:55,282 epoch 5 - iter 44/447 - loss 0.02999823 - time (sec): 2.94 - samples/sec: 2875.83 - lr: 0.000033 - momentum: 0.000000
2023-10-13 12:50:58,080 epoch 5 - iter 88/447 - loss 0.03142316 - time (sec): 5.74 - samples/sec: 2884.02 - lr: 0.000032 - momentum: 0.000000
2023-10-13 12:51:00,834 epoch 5 - iter 132/447 - loss 0.03153706 - time (sec): 8.49 - samples/sec: 2941.59 - lr: 0.000032 - momentum: 0.000000
2023-10-13 12:51:03,670 epoch 5 - iter 176/447 - loss 0.03146456 - time (sec): 11.33 - samples/sec: 2905.90 - lr: 0.000031 - momentum: 0.000000
2023-10-13 12:51:06,980 epoch 5 - iter 220/447 - loss 0.03208178 - time (sec): 14.64 - samples/sec: 2879.23 - lr: 0.000031 - momentum: 0.000000
2023-10-13 12:51:09,695 epoch 5 - iter 264/447 - loss 0.03368068 - time (sec): 17.36 - samples/sec: 2889.22 - lr: 0.000030 - momentum: 0.000000
2023-10-13 12:51:12,451 epoch 5 - iter 308/447 - loss 0.03426693 - time (sec): 20.11 - samples/sec: 2904.70 - lr: 0.000030 - momentum: 0.000000
2023-10-13 12:51:15,623 epoch 5 - iter 352/447 - loss 0.03584978 - time (sec): 23.28 - samples/sec: 2925.57 - lr: 0.000029 - momentum: 0.000000
2023-10-13 12:51:18,634 epoch 5 - iter 396/447 - loss 0.03559068 - time (sec): 26.29 - samples/sec: 2937.26 - lr: 0.000028 - momentum: 0.000000
2023-10-13 12:51:21,576 epoch 5 - iter 440/447 - loss 0.03588276 - time (sec): 29.24 - samples/sec: 2919.05 - lr: 0.000028 - momentum: 0.000000
2023-10-13 12:51:21,988 ----------------------------------------------------------------------------------------------------
2023-10-13 12:51:21,989 EPOCH 5 done: loss 0.0355 - lr: 0.000028
2023-10-13 12:51:30,616 DEV : loss 0.17583982646465302 - f1-score (micro avg) 0.7625
2023-10-13 12:51:30,650 saving best model
2023-10-13 12:51:31,135 ----------------------------------------------------------------------------------------------------
2023-10-13 12:51:34,394 epoch 6 - iter 44/447 - loss 0.01600788 - time (sec): 3.26 - samples/sec: 2954.25 - lr: 0.000027 - momentum: 0.000000
2023-10-13 12:51:37,101 epoch 6 - iter 88/447 - loss 0.01598411 - time (sec): 5.96 - samples/sec: 2988.06 - lr: 0.000027 - momentum: 0.000000
2023-10-13 12:51:40,033 epoch 6 - iter 132/447 - loss 0.01807915 - time (sec): 8.90 - samples/sec: 2994.03 - lr: 0.000026 - momentum: 0.000000
2023-10-13 12:51:42,899 epoch 6 - iter 176/447 - loss 0.01681758 - time (sec): 11.76 - samples/sec: 3010.57 - lr: 0.000026 - momentum: 0.000000
2023-10-13 12:51:45,738 epoch 6 - iter 220/447 - loss 0.01823050 - time (sec): 14.60 - samples/sec: 3040.07 - lr: 0.000025 - momentum: 0.000000
2023-10-13 12:51:48,430 epoch 6 - iter 264/447 - loss 0.02019697 - time (sec): 17.29 - samples/sec: 3025.48 - lr: 0.000025 - momentum: 0.000000
2023-10-13 12:51:51,200 epoch 6 - iter 308/447 - loss 0.01912780 - time (sec): 20.06 - samples/sec: 3004.35 - lr: 0.000024 - momentum: 0.000000
2023-10-13 12:51:53,876 epoch 6 - iter 352/447 - loss 0.02112949 - time (sec): 22.74 - samples/sec: 3001.14 - lr: 0.000023 - momentum: 0.000000
2023-10-13 12:51:56,698 epoch 6 - iter 396/447 - loss 0.02312469 - time (sec): 25.56 - samples/sec: 3010.87 - lr: 0.000023 - momentum: 0.000000
2023-10-13 12:51:59,256 epoch 6 - iter 440/447 - loss 0.02258683 - time (sec): 28.12 - samples/sec: 3017.49 - lr: 0.000022 - momentum: 0.000000
2023-10-13 12:51:59,876 ----------------------------------------------------------------------------------------------------
2023-10-13 12:51:59,877 EPOCH 6 done: loss 0.0222 - lr: 0.000022
2023-10-13 12:52:08,836 DEV : loss 0.21230502426624298 - f1-score (micro avg) 0.7578
2023-10-13 12:52:08,867 ----------------------------------------------------------------------------------------------------
2023-10-13 12:52:11,860 epoch 7 - iter 44/447 - loss 0.02437766 - time (sec): 2.99 - samples/sec: 3073.74 - lr: 0.000022 - momentum: 0.000000
2023-10-13 12:52:15,198 epoch 7 - iter 88/447 - loss 0.01924784 - time (sec): 6.33 - samples/sec: 2997.77 - lr: 0.000021 - momentum: 0.000000
2023-10-13 12:52:18,035 epoch 7 - iter 132/447 - loss 0.01736984 - time (sec): 9.17 - samples/sec: 3015.68 - lr: 0.000021 - momentum: 0.000000
2023-10-13 12:52:20,905 epoch 7 - iter 176/447 - loss 0.01662843 - time (sec): 12.04 - samples/sec: 3055.48 - lr: 0.000020 - momentum: 0.000000
2023-10-13 12:52:24,031 epoch 7 - iter 220/447 - loss 0.01461178 - time (sec): 15.16 - samples/sec: 3011.95 - lr: 0.000020 - momentum: 0.000000
2023-10-13 12:52:26,692 epoch 7 - iter 264/447 - loss 0.01314305 - time (sec): 17.82 - samples/sec: 2990.65 - lr: 0.000019 - momentum: 0.000000
2023-10-13 12:52:29,402 epoch 7 - iter 308/447 - loss 0.01265643 - time (sec): 20.53 - samples/sec: 2973.51 - lr: 0.000018 - momentum: 0.000000
2023-10-13 12:52:32,221 epoch 7 - iter 352/447 - loss 0.01273989 - time (sec): 23.35 - samples/sec: 2967.43 - lr: 0.000018 - momentum: 0.000000
2023-10-13 12:52:34,784 epoch 7 - iter 396/447 - loss 0.01331493 - time (sec): 25.92 - samples/sec: 2975.03 - lr: 0.000017 - momentum: 0.000000
2023-10-13 12:52:37,386 epoch 7 - iter 440/447 - loss 0.01337988 - time (sec): 28.52 - samples/sec: 2981.87 - lr: 0.000017 - momentum: 0.000000
2023-10-13 12:52:37,892 ----------------------------------------------------------------------------------------------------
2023-10-13 12:52:37,892 EPOCH 7 done: loss 0.0132 - lr: 0.000017
2023-10-13 12:52:46,514 DEV : loss 0.23592258989810944 - f1-score (micro avg) 0.7612
2023-10-13 12:52:46,545 ----------------------------------------------------------------------------------------------------
2023-10-13 12:52:49,368 epoch 8 - iter 44/447 - loss 0.01339854 - time (sec): 2.82 - samples/sec: 3064.25 - lr: 0.000016 - momentum: 0.000000
2023-10-13 12:52:52,244 epoch 8 - iter 88/447 - loss 0.01099649 - time (sec): 5.70 - samples/sec: 2945.77 - lr: 0.000016 - momentum: 0.000000
2023-10-13 12:52:55,576 epoch 8 - iter 132/447 - loss 0.00853314 - time (sec): 9.03 - samples/sec: 2964.34 - lr: 0.000015 - momentum: 0.000000
2023-10-13 12:52:58,656 epoch 8 - iter 176/447 - loss 0.00969601 - time (sec): 12.11 - samples/sec: 2906.80 - lr: 0.000015 - momentum: 0.000000
2023-10-13 12:53:01,479 epoch 8 - iter 220/447 - loss 0.01007474 - time (sec): 14.93 - samples/sec: 2918.87 - lr: 0.000014 - momentum: 0.000000
2023-10-13 12:53:04,582 epoch 8 - iter 264/447 - loss 0.01035185 - time (sec): 18.04 - samples/sec: 2902.77 - lr: 0.000013 - momentum: 0.000000
2023-10-13 12:53:07,412 epoch 8 - iter 308/447 - loss 0.01063796 - time (sec): 20.87 - samples/sec: 2919.76 - lr: 0.000013 - momentum: 0.000000
2023-10-13 12:53:10,111 epoch 8 - iter 352/447 - loss 0.00992196 - time (sec): 23.56 - samples/sec: 2924.33 - lr: 0.000012 - momentum: 0.000000
2023-10-13 12:53:12,894 epoch 8 - iter 396/447 - loss 0.00942271 - time (sec): 26.35 - samples/sec: 2938.99 - lr: 0.000012 - momentum: 0.000000
2023-10-13 12:53:15,454 epoch 8 - iter 440/447 - loss 0.00956868 - time (sec): 28.91 - samples/sec: 2952.81 - lr: 0.000011 - momentum: 0.000000
2023-10-13 12:53:15,838 ----------------------------------------------------------------------------------------------------
2023-10-13 12:53:15,838 EPOCH 8 done: loss 0.0095 - lr: 0.000011
2023-10-13 12:53:24,513 DEV : loss 0.2494419664144516 - f1-score (micro avg) 0.7784
2023-10-13 12:53:24,546 saving best model
2023-10-13 12:53:25,426 ----------------------------------------------------------------------------------------------------
2023-10-13 12:53:28,519 epoch 9 - iter 44/447 - loss 0.00818351 - time (sec): 3.09 - samples/sec: 2643.65 - lr: 0.000011 - momentum: 0.000000
2023-10-13 12:53:31,740 epoch 9 - iter 88/447 - loss 0.00467539 - time (sec): 6.31 - samples/sec: 2773.19 - lr: 0.000010 - momentum: 0.000000
2023-10-13 12:53:34,662 epoch 9 - iter 132/447 - loss 0.00483987 - time (sec): 9.23 - samples/sec: 2821.56 - lr: 0.000010 - momentum: 0.000000
2023-10-13 12:53:37,469 epoch 9 - iter 176/447 - loss 0.00526312 - time (sec): 12.04 - samples/sec: 2859.97 - lr: 0.000009 - momentum: 0.000000
2023-10-13 12:53:40,102 epoch 9 - iter 220/447 - loss 0.00658465 - time (sec): 14.67 - samples/sec: 2914.07 - lr: 0.000008 - momentum: 0.000000
2023-10-13 12:53:42,708 epoch 9 - iter 264/447 - loss 0.00837702 - time (sec): 17.28 - samples/sec: 2944.21 - lr: 0.000008 - momentum: 0.000000
2023-10-13 12:53:45,383 epoch 9 - iter 308/447 - loss 0.00730189 - time (sec): 19.96 - samples/sec: 2955.19 - lr: 0.000007 - momentum: 0.000000
2023-10-13 12:53:48,016 epoch 9 - iter 352/447 - loss 0.00753736 - time (sec): 22.59 - samples/sec: 2973.51 - lr: 0.000007 - momentum: 0.000000
2023-10-13 12:53:51,129 epoch 9 - iter 396/447 - loss 0.00741956 - time (sec): 25.70 - samples/sec: 2997.38 - lr: 0.000006 - momentum: 0.000000
2023-10-13 12:53:53,973 epoch 9 - iter 440/447 - loss 0.00707842 - time (sec): 28.55 - samples/sec: 2989.06 - lr: 0.000006 - momentum: 0.000000
2023-10-13 12:53:54,369 ----------------------------------------------------------------------------------------------------
2023-10-13 12:53:54,369 EPOCH 9 done: loss 0.0073 - lr: 0.000006
2023-10-13 12:54:02,736 DEV : loss 0.2321784794330597 - f1-score (micro avg) 0.7825
2023-10-13 12:54:02,767 saving best model
2023-10-13 12:54:03,165 ----------------------------------------------------------------------------------------------------
2023-10-13 12:54:06,127 epoch 10 - iter 44/447 - loss 0.00185807 - time (sec): 2.96 - samples/sec: 3045.90 - lr: 0.000005 - momentum: 0.000000
2023-10-13 12:54:08,834 epoch 10 - iter 88/447 - loss 0.00465358 - time (sec): 5.67 - samples/sec: 3052.43 - lr: 0.000005 - momentum: 0.000000
2023-10-13 12:54:12,066 epoch 10 - iter 132/447 - loss 0.00358618 - time (sec): 8.90 - samples/sec: 2892.03 - lr: 0.000004 - momentum: 0.000000
2023-10-13 12:54:14,976 epoch 10 - iter 176/447 - loss 0.00391395 - time (sec): 11.81 - samples/sec: 2932.40 - lr: 0.000003 - momentum: 0.000000
2023-10-13 12:54:17,595 epoch 10 - iter 220/447 - loss 0.00416413 - time (sec): 14.43 - samples/sec: 2956.30 - lr: 0.000003 - momentum: 0.000000
2023-10-13 12:54:20,350 epoch 10 - iter 264/447 - loss 0.00456129 - time (sec): 17.18 - samples/sec: 2956.21 - lr: 0.000002 - momentum: 0.000000
2023-10-13 12:54:23,283 epoch 10 - iter 308/447 - loss 0.00441630 - time (sec): 20.12 - samples/sec: 2953.83 - lr: 0.000002 - momentum: 0.000000
2023-10-13 12:54:26,710 epoch 10 - iter 352/447 - loss 0.00512117 - time (sec): 23.54 - samples/sec: 2958.63 - lr: 0.000001 - momentum: 0.000000
2023-10-13 12:54:29,451 epoch 10 - iter 396/447 - loss 0.00514772 - time (sec): 26.28 - samples/sec: 2948.94 - lr: 0.000001 - momentum: 0.000000
2023-10-13 12:54:32,059 epoch 10 - iter 440/447 - loss 0.00469391 - time (sec): 28.89 - samples/sec: 2951.55 - lr: 0.000000 - momentum: 0.000000
2023-10-13 12:54:32,458 ----------------------------------------------------------------------------------------------------
2023-10-13 12:54:32,458 EPOCH 10 done: loss 0.0047 - lr: 0.000000
2023-10-13 12:54:40,938 DEV : loss 0.23675945401191711 - f1-score (micro avg) 0.7819
2023-10-13 12:54:41,310 ----------------------------------------------------------------------------------------------------
2023-10-13 12:54:41,311 Loading model from best epoch ...
2023-10-13 12:54:42,795 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
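The 21 tags correspond to the BIOES scheme over the five entity types (loc, pers, org, prod, time): one O tag plus S-/B-/E-/I- variants per type, which also matches the linear layer's out_features=21 in the model summary above. A quick sketch (the helper name is illustrative):

```python
def bioes_tag_dictionary(entity_types):
    """Build a BIOES tag list: O plus S-/B-/E-/I- variants per entity type."""
    return ["O"] + [f"{prefix}-{t}" for t in entity_types for prefix in ("S", "B", "E", "I")]

# 1 + 5 * 4 = 21 tags, in the same order as the dictionary logged above
tags = bioes_tag_dictionary(["loc", "pers", "org", "prod", "time"])
```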
2023-10-13 12:54:47,671
Results:
- F-score (micro) 0.7527
- F-score (macro) 0.6738
- Accuracy 0.624
By class:
              precision    recall  f1-score   support

         loc     0.8248    0.8607    0.8424       596
        pers     0.6852    0.7387    0.7110       333
         org     0.5635    0.5379    0.5504       132
        prod     0.5962    0.4697    0.5254        66
        time     0.7255    0.7551    0.7400        49

   micro avg     0.7421    0.7636    0.7527      1176
   macro avg     0.6790    0.6724    0.6738      1176
weighted avg     0.7390    0.7636    0.7503      1176
2023-10-13 12:54:47,671 ----------------------------------------------------------------------------------------------------
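As a sanity check on the results table above: micro F1 is the harmonic mean of the micro-averaged precision and recall, and macro F1 is the unweighted mean of the per-class F1 scores. A small sketch reproducing the logged aggregates, with values copied from the table:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# per-class f1 scores from the evaluation table: loc, pers, org, prod, time
class_f1 = [0.8424, 0.7110, 0.5504, 0.5254, 0.7400]

micro_f1 = f1_score(0.7421, 0.7636)        # from micro-averaged precision/recall
macro_f1 = sum(class_f1) / len(class_f1)   # unweighted mean over classes

# round(micro_f1, 4) -> 0.7527 and round(macro_f1, 4) -> 0.6738, matching the log
```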