2023-10-18 17:45:45,111 ----------------------------------------------------------------------------------------------------
2023-10-18 17:45:45,111 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 128)
        (position_embeddings): Embedding(512, 128)
        (token_type_embeddings): Embedding(2, 128)
        (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-1): 2 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=128, out_features=128, bias=True)
                (key): Linear(in_features=128, out_features=128, bias=True)
                (value): Linear(in_features=128, out_features=128, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=128, out_features=128, bias=True)
                (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=128, out_features=512, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=512, out_features=128, bias=True)
              (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=128, out_features=128, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=128, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-18 17:45:45,111 ----------------------------------------------------------------------------------------------------
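As a sanity check on the architecture dump above, the model's parameter count can be tallied directly from the printed module shapes (2 layers, hidden size 128, intermediate size 512, vocab 32001, 21 output tags). A minimal pure-Python sketch; the helper name `bert_tiny_param_count` is mine, not part of Flair or Transformers:

```python
def bert_tiny_param_count(vocab=32001, hidden=128, max_pos=512,
                          intermediate=512, layers=2, num_tags=21):
    """Tally trainable parameters from the module shapes in the model dump."""
    linear = lambda n_in, n_out: n_in * n_out + n_out   # weight + bias
    layer_norm = 2 * hidden                             # weight + bias
    embeddings = (vocab * hidden       # word embeddings
                  + max_pos * hidden   # position embeddings
                  + 2 * hidden         # token type embeddings
                  + layer_norm)        # embedding LayerNorm
    per_layer = (4 * linear(hidden, hidden)       # query, key, value, attn output
                 + 2 * layer_norm                 # attention + output LayerNorm
                 + linear(hidden, intermediate)   # intermediate dense
                 + linear(intermediate, hidden))  # output dense
    pooler = linear(hidden, hidden)
    tagger_head = linear(hidden, num_tags)        # the SequenceTagger linear
    return embeddings + layers * per_layer + pooler + tagger_head
```

The total lands around 4.6M parameters, dominated by the 32001 x 128 word-embedding table — consistent with the "bert-tiny" label in the training base path.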
2023-10-18 17:45:45,111 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
- NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
2023-10-18 17:45:45,111 ----------------------------------------------------------------------------------------------------
2023-10-18 17:45:45,111 Train: 3575 sentences
2023-10-18 17:45:45,111 (train_with_dev=False, train_with_test=False)
2023-10-18 17:45:45,111 ----------------------------------------------------------------------------------------------------
2023-10-18 17:45:45,111 Training Params:
2023-10-18 17:45:45,111 - learning_rate: "5e-05"
2023-10-18 17:45:45,111 - mini_batch_size: "8"
2023-10-18 17:45:45,111 - max_epochs: "10"
2023-10-18 17:45:45,111 - shuffle: "True"
2023-10-18 17:45:45,111 ----------------------------------------------------------------------------------------------------
2023-10-18 17:45:45,111 Plugins:
2023-10-18 17:45:45,111 - TensorboardLogger
2023-10-18 17:45:45,111 - LinearScheduler | warmup_fraction: '0.1'
2023-10-18 17:45:45,111 ----------------------------------------------------------------------------------------------------
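The lr values in the per-iteration lines below follow the LinearScheduler plugin: linear warmup over the first 10% of all batch steps (here roughly 0.1 x 447 x 10 = 447 steps), then linear decay to zero. A minimal sketch of that shape; the function name `linear_warmup_lr` is mine, not Flair's API:

```python
def linear_warmup_lr(step, total_steps, base_lr=5e-5, warmup_fraction=0.1):
    """Linear warmup to base_lr, then linear decay to 0."""
    warmup_steps = max(1, int(total_steps * warmup_fraction))
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)
```

With `total_steps = 447 * 10`, this reproduces the logged values: step 44 gives about 0.000005 and step 440 about 0.000049, matching the epoch 1 lines, and the rate reaches 0 at the final step of epoch 10.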
2023-10-18 17:45:45,112 Final evaluation on model from best epoch (best-model.pt)
2023-10-18 17:45:45,112 - metric: "('micro avg', 'f1-score')"
2023-10-18 17:45:45,112 ----------------------------------------------------------------------------------------------------
2023-10-18 17:45:45,112 Computation:
2023-10-18 17:45:45,112 - compute on device: cuda:0
2023-10-18 17:45:45,112 - embedding storage: none
2023-10-18 17:45:45,112 ----------------------------------------------------------------------------------------------------
2023-10-18 17:45:45,112 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-tiny-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1"
2023-10-18 17:45:45,112 ----------------------------------------------------------------------------------------------------
2023-10-18 17:45:45,112 ----------------------------------------------------------------------------------------------------
2023-10-18 17:45:45,112 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-18 17:45:46,424 epoch 1 - iter 44/447 - loss 3.63023183 - time (sec): 1.31 - samples/sec: 6249.34 - lr: 0.000005 - momentum: 0.000000
2023-10-18 17:45:47,340 epoch 1 - iter 88/447 - loss 3.47670301 - time (sec): 2.23 - samples/sec: 7442.66 - lr: 0.000010 - momentum: 0.000000
2023-10-18 17:45:48,324 epoch 1 - iter 132/447 - loss 3.17492072 - time (sec): 3.21 - samples/sec: 7810.71 - lr: 0.000015 - momentum: 0.000000
2023-10-18 17:45:49,295 epoch 1 - iter 176/447 - loss 2.82555737 - time (sec): 4.18 - samples/sec: 7998.44 - lr: 0.000020 - momentum: 0.000000
2023-10-18 17:45:50,276 epoch 1 - iter 220/447 - loss 2.42797386 - time (sec): 5.16 - samples/sec: 8200.94 - lr: 0.000024 - momentum: 0.000000
2023-10-18 17:45:51,302 epoch 1 - iter 264/447 - loss 2.13076998 - time (sec): 6.19 - samples/sec: 8265.38 - lr: 0.000029 - momentum: 0.000000
2023-10-18 17:45:52,327 epoch 1 - iter 308/447 - loss 1.91676111 - time (sec): 7.21 - samples/sec: 8275.15 - lr: 0.000034 - momentum: 0.000000
2023-10-18 17:45:53,332 epoch 1 - iter 352/447 - loss 1.74561988 - time (sec): 8.22 - samples/sec: 8341.64 - lr: 0.000039 - momentum: 0.000000
2023-10-18 17:45:54,330 epoch 1 - iter 396/447 - loss 1.61932497 - time (sec): 9.22 - samples/sec: 8373.39 - lr: 0.000044 - momentum: 0.000000
2023-10-18 17:45:55,313 epoch 1 - iter 440/447 - loss 1.51895144 - time (sec): 10.20 - samples/sec: 8350.97 - lr: 0.000049 - momentum: 0.000000
2023-10-18 17:45:55,465 ----------------------------------------------------------------------------------------------------
2023-10-18 17:45:55,465 EPOCH 1 done: loss 1.5041 - lr: 0.000049
2023-10-18 17:45:57,366 DEV : loss 0.45467135310173035 - f1-score (micro avg) 0.0
2023-10-18 17:45:57,391 ----------------------------------------------------------------------------------------------------
2023-10-18 17:45:58,406 epoch 2 - iter 44/447 - loss 0.55495071 - time (sec): 1.02 - samples/sec: 9278.81 - lr: 0.000049 - momentum: 0.000000
2023-10-18 17:45:59,729 epoch 2 - iter 88/447 - loss 0.52145669 - time (sec): 2.34 - samples/sec: 7818.26 - lr: 0.000049 - momentum: 0.000000
2023-10-18 17:46:00,751 epoch 2 - iter 132/447 - loss 0.51503667 - time (sec): 3.36 - samples/sec: 8091.33 - lr: 0.000048 - momentum: 0.000000
2023-10-18 17:46:01,718 epoch 2 - iter 176/447 - loss 0.51312139 - time (sec): 4.33 - samples/sec: 8020.83 - lr: 0.000048 - momentum: 0.000000
2023-10-18 17:46:02,719 epoch 2 - iter 220/447 - loss 0.50166932 - time (sec): 5.33 - samples/sec: 8204.80 - lr: 0.000047 - momentum: 0.000000
2023-10-18 17:46:03,686 epoch 2 - iter 264/447 - loss 0.49283811 - time (sec): 6.29 - samples/sec: 8225.13 - lr: 0.000047 - momentum: 0.000000
2023-10-18 17:46:04,685 epoch 2 - iter 308/447 - loss 0.49657977 - time (sec): 7.29 - samples/sec: 8237.58 - lr: 0.000046 - momentum: 0.000000
2023-10-18 17:46:05,718 epoch 2 - iter 352/447 - loss 0.49009269 - time (sec): 8.33 - samples/sec: 8241.12 - lr: 0.000046 - momentum: 0.000000
2023-10-18 17:46:06,718 epoch 2 - iter 396/447 - loss 0.48910603 - time (sec): 9.33 - samples/sec: 8256.38 - lr: 0.000045 - momentum: 0.000000
2023-10-18 17:46:07,702 epoch 2 - iter 440/447 - loss 0.48266159 - time (sec): 10.31 - samples/sec: 8252.91 - lr: 0.000045 - momentum: 0.000000
2023-10-18 17:46:07,863 ----------------------------------------------------------------------------------------------------
2023-10-18 17:46:07,863 EPOCH 2 done: loss 0.4822 - lr: 0.000045
2023-10-18 17:46:12,738 DEV : loss 0.35227200388908386 - f1-score (micro avg) 0.1697
2023-10-18 17:46:12,763 saving best model
2023-10-18 17:46:12,800 ----------------------------------------------------------------------------------------------------
2023-10-18 17:46:13,765 epoch 3 - iter 44/447 - loss 0.39653677 - time (sec): 0.96 - samples/sec: 8094.88 - lr: 0.000044 - momentum: 0.000000
2023-10-18 17:46:14,775 epoch 3 - iter 88/447 - loss 0.41886899 - time (sec): 1.98 - samples/sec: 8353.85 - lr: 0.000043 - momentum: 0.000000
2023-10-18 17:46:15,797 epoch 3 - iter 132/447 - loss 0.41806988 - time (sec): 3.00 - samples/sec: 8102.47 - lr: 0.000043 - momentum: 0.000000
2023-10-18 17:46:16,827 epoch 3 - iter 176/447 - loss 0.40474759 - time (sec): 4.03 - samples/sec: 8248.55 - lr: 0.000042 - momentum: 0.000000
2023-10-18 17:46:17,812 epoch 3 - iter 220/447 - loss 0.40178671 - time (sec): 5.01 - samples/sec: 8364.46 - lr: 0.000042 - momentum: 0.000000
2023-10-18 17:46:18,811 epoch 3 - iter 264/447 - loss 0.39833510 - time (sec): 6.01 - samples/sec: 8467.03 - lr: 0.000041 - momentum: 0.000000
2023-10-18 17:46:19,813 epoch 3 - iter 308/447 - loss 0.39305489 - time (sec): 7.01 - samples/sec: 8379.61 - lr: 0.000041 - momentum: 0.000000
2023-10-18 17:46:20,864 epoch 3 - iter 352/447 - loss 0.39147747 - time (sec): 8.06 - samples/sec: 8325.86 - lr: 0.000040 - momentum: 0.000000
2023-10-18 17:46:21,931 epoch 3 - iter 396/447 - loss 0.39451842 - time (sec): 9.13 - samples/sec: 8281.18 - lr: 0.000040 - momentum: 0.000000
2023-10-18 17:46:23,003 epoch 3 - iter 440/447 - loss 0.39174653 - time (sec): 10.20 - samples/sec: 8357.16 - lr: 0.000039 - momentum: 0.000000
2023-10-18 17:46:23,147 ----------------------------------------------------------------------------------------------------
2023-10-18 17:46:23,147 EPOCH 3 done: loss 0.3906 - lr: 0.000039
2023-10-18 17:46:28,370 DEV : loss 0.3125365972518921 - f1-score (micro avg) 0.3066
2023-10-18 17:46:28,397 saving best model
2023-10-18 17:46:28,440 ----------------------------------------------------------------------------------------------------
2023-10-18 17:46:29,500 epoch 4 - iter 44/447 - loss 0.34658596 - time (sec): 1.06 - samples/sec: 8576.54 - lr: 0.000038 - momentum: 0.000000
2023-10-18 17:46:30,503 epoch 4 - iter 88/447 - loss 0.36650845 - time (sec): 2.06 - samples/sec: 8538.19 - lr: 0.000038 - momentum: 0.000000
2023-10-18 17:46:31,510 epoch 4 - iter 132/447 - loss 0.37690399 - time (sec): 3.07 - samples/sec: 8586.97 - lr: 0.000037 - momentum: 0.000000
2023-10-18 17:46:32,518 epoch 4 - iter 176/447 - loss 0.37315065 - time (sec): 4.08 - samples/sec: 8680.94 - lr: 0.000037 - momentum: 0.000000
2023-10-18 17:46:33,480 epoch 4 - iter 220/447 - loss 0.36813917 - time (sec): 5.04 - samples/sec: 8629.02 - lr: 0.000036 - momentum: 0.000000
2023-10-18 17:46:34,474 epoch 4 - iter 264/447 - loss 0.36318897 - time (sec): 6.03 - samples/sec: 8586.76 - lr: 0.000036 - momentum: 0.000000
2023-10-18 17:46:35,519 epoch 4 - iter 308/447 - loss 0.35678724 - time (sec): 7.08 - samples/sec: 8529.43 - lr: 0.000035 - momentum: 0.000000
2023-10-18 17:46:36,574 epoch 4 - iter 352/447 - loss 0.35194138 - time (sec): 8.13 - samples/sec: 8459.85 - lr: 0.000035 - momentum: 0.000000
2023-10-18 17:46:37,585 epoch 4 - iter 396/447 - loss 0.35494397 - time (sec): 9.14 - samples/sec: 8446.83 - lr: 0.000034 - momentum: 0.000000
2023-10-18 17:46:38,593 epoch 4 - iter 440/447 - loss 0.35274777 - time (sec): 10.15 - samples/sec: 8401.55 - lr: 0.000033 - momentum: 0.000000
2023-10-18 17:46:38,757 ----------------------------------------------------------------------------------------------------
2023-10-18 17:46:38,757 EPOCH 4 done: loss 0.3532 - lr: 0.000033
2023-10-18 17:46:44,063 DEV : loss 0.3131018579006195 - f1-score (micro avg) 0.3417
2023-10-18 17:46:44,088 saving best model
2023-10-18 17:46:44,121 ----------------------------------------------------------------------------------------------------
2023-10-18 17:46:45,122 epoch 5 - iter 44/447 - loss 0.30571568 - time (sec): 1.00 - samples/sec: 8142.12 - lr: 0.000033 - momentum: 0.000000
2023-10-18 17:46:46,142 epoch 5 - iter 88/447 - loss 0.33894497 - time (sec): 2.02 - samples/sec: 7815.16 - lr: 0.000032 - momentum: 0.000000
2023-10-18 17:46:47,211 epoch 5 - iter 132/447 - loss 0.31782972 - time (sec): 3.09 - samples/sec: 7706.70 - lr: 0.000032 - momentum: 0.000000
2023-10-18 17:46:48,269 epoch 5 - iter 176/447 - loss 0.31678493 - time (sec): 4.15 - samples/sec: 8031.04 - lr: 0.000031 - momentum: 0.000000
2023-10-18 17:46:49,274 epoch 5 - iter 220/447 - loss 0.31827885 - time (sec): 5.15 - samples/sec: 8157.81 - lr: 0.000031 - momentum: 0.000000
2023-10-18 17:46:50,285 epoch 5 - iter 264/447 - loss 0.31772496 - time (sec): 6.16 - samples/sec: 8270.72 - lr: 0.000030 - momentum: 0.000000
2023-10-18 17:46:51,298 epoch 5 - iter 308/447 - loss 0.31931617 - time (sec): 7.18 - samples/sec: 8288.28 - lr: 0.000030 - momentum: 0.000000
2023-10-18 17:46:52,318 epoch 5 - iter 352/447 - loss 0.32349790 - time (sec): 8.20 - samples/sec: 8320.37 - lr: 0.000029 - momentum: 0.000000
2023-10-18 17:46:53,315 epoch 5 - iter 396/447 - loss 0.32487172 - time (sec): 9.19 - samples/sec: 8325.07 - lr: 0.000028 - momentum: 0.000000
2023-10-18 17:46:54,306 epoch 5 - iter 440/447 - loss 0.32512626 - time (sec): 10.18 - samples/sec: 8372.89 - lr: 0.000028 - momentum: 0.000000
2023-10-18 17:46:54,471 ----------------------------------------------------------------------------------------------------
2023-10-18 17:46:54,471 EPOCH 5 done: loss 0.3284 - lr: 0.000028
2023-10-18 17:46:59,699 DEV : loss 0.2934885025024414 - f1-score (micro avg) 0.3473
2023-10-18 17:46:59,723 saving best model
2023-10-18 17:46:59,757 ----------------------------------------------------------------------------------------------------
2023-10-18 17:47:00,810 epoch 6 - iter 44/447 - loss 0.33102953 - time (sec): 1.05 - samples/sec: 7910.31 - lr: 0.000027 - momentum: 0.000000
2023-10-18 17:47:01,850 epoch 6 - iter 88/447 - loss 0.29089739 - time (sec): 2.09 - samples/sec: 8271.88 - lr: 0.000027 - momentum: 0.000000
2023-10-18 17:47:02,897 epoch 6 - iter 132/447 - loss 0.27937740 - time (sec): 3.14 - samples/sec: 8474.85 - lr: 0.000026 - momentum: 0.000000
2023-10-18 17:47:03,884 epoch 6 - iter 176/447 - loss 0.29312207 - time (sec): 4.13 - samples/sec: 8382.12 - lr: 0.000026 - momentum: 0.000000
2023-10-18 17:47:04,878 epoch 6 - iter 220/447 - loss 0.30141709 - time (sec): 5.12 - samples/sec: 8404.35 - lr: 0.000025 - momentum: 0.000000
2023-10-18 17:47:05,903 epoch 6 - iter 264/447 - loss 0.30127538 - time (sec): 6.15 - samples/sec: 8330.13 - lr: 0.000025 - momentum: 0.000000
2023-10-18 17:47:06,882 epoch 6 - iter 308/447 - loss 0.30249673 - time (sec): 7.12 - samples/sec: 8356.36 - lr: 0.000024 - momentum: 0.000000
2023-10-18 17:47:07,871 epoch 6 - iter 352/447 - loss 0.30097020 - time (sec): 8.11 - samples/sec: 8424.06 - lr: 0.000023 - momentum: 0.000000
2023-10-18 17:47:08,915 epoch 6 - iter 396/447 - loss 0.30519595 - time (sec): 9.16 - samples/sec: 8402.09 - lr: 0.000023 - momentum: 0.000000
2023-10-18 17:47:09,894 epoch 6 - iter 440/447 - loss 0.30649731 - time (sec): 10.14 - samples/sec: 8393.94 - lr: 0.000022 - momentum: 0.000000
2023-10-18 17:47:10,051 ----------------------------------------------------------------------------------------------------
2023-10-18 17:47:10,051 EPOCH 6 done: loss 0.3064 - lr: 0.000022
2023-10-18 17:47:15,383 DEV : loss 0.29135483503341675 - f1-score (micro avg) 0.368
2023-10-18 17:47:15,409 saving best model
2023-10-18 17:47:15,440 ----------------------------------------------------------------------------------------------------
2023-10-18 17:47:16,409 epoch 7 - iter 44/447 - loss 0.27124394 - time (sec): 0.97 - samples/sec: 8470.90 - lr: 0.000022 - momentum: 0.000000
2023-10-18 17:47:17,393 epoch 7 - iter 88/447 - loss 0.27751488 - time (sec): 1.95 - samples/sec: 8585.24 - lr: 0.000021 - momentum: 0.000000
2023-10-18 17:47:18,413 epoch 7 - iter 132/447 - loss 0.28270565 - time (sec): 2.97 - samples/sec: 8218.31 - lr: 0.000021 - momentum: 0.000000
2023-10-18 17:47:19,446 epoch 7 - iter 176/447 - loss 0.28697327 - time (sec): 4.01 - samples/sec: 8291.23 - lr: 0.000020 - momentum: 0.000000
2023-10-18 17:47:20,471 epoch 7 - iter 220/447 - loss 0.28533917 - time (sec): 5.03 - samples/sec: 8295.46 - lr: 0.000020 - momentum: 0.000000
2023-10-18 17:47:21,440 epoch 7 - iter 264/447 - loss 0.28421280 - time (sec): 6.00 - samples/sec: 8287.42 - lr: 0.000019 - momentum: 0.000000
2023-10-18 17:47:22,449 epoch 7 - iter 308/447 - loss 0.28908989 - time (sec): 7.01 - samples/sec: 8330.52 - lr: 0.000018 - momentum: 0.000000
2023-10-18 17:47:23,525 epoch 7 - iter 352/447 - loss 0.28831722 - time (sec): 8.08 - samples/sec: 8425.48 - lr: 0.000018 - momentum: 0.000000
2023-10-18 17:47:24,553 epoch 7 - iter 396/447 - loss 0.28733893 - time (sec): 9.11 - samples/sec: 8348.03 - lr: 0.000017 - momentum: 0.000000
2023-10-18 17:47:25,615 epoch 7 - iter 440/447 - loss 0.29081983 - time (sec): 10.17 - samples/sec: 8398.03 - lr: 0.000017 - momentum: 0.000000
2023-10-18 17:47:25,778 ----------------------------------------------------------------------------------------------------
2023-10-18 17:47:25,778 EPOCH 7 done: loss 0.2911 - lr: 0.000017
2023-10-18 17:47:30,756 DEV : loss 0.2889775335788727 - f1-score (micro avg) 0.363
2023-10-18 17:47:30,782 ----------------------------------------------------------------------------------------------------
2023-10-18 17:47:31,817 epoch 8 - iter 44/447 - loss 0.27122766 - time (sec): 1.03 - samples/sec: 7852.47 - lr: 0.000016 - momentum: 0.000000
2023-10-18 17:47:32,800 epoch 8 - iter 88/447 - loss 0.27239490 - time (sec): 2.02 - samples/sec: 8271.11 - lr: 0.000016 - momentum: 0.000000
2023-10-18 17:47:33,633 epoch 8 - iter 132/447 - loss 0.27395850 - time (sec): 2.85 - samples/sec: 8639.16 - lr: 0.000015 - momentum: 0.000000
2023-10-18 17:47:34,496 epoch 8 - iter 176/447 - loss 0.27805841 - time (sec): 3.71 - samples/sec: 8827.52 - lr: 0.000015 - momentum: 0.000000
2023-10-18 17:47:35,344 epoch 8 - iter 220/447 - loss 0.27997338 - time (sec): 4.56 - samples/sec: 9055.14 - lr: 0.000014 - momentum: 0.000000
2023-10-18 17:47:36,268 epoch 8 - iter 264/447 - loss 0.28648908 - time (sec): 5.49 - samples/sec: 9102.12 - lr: 0.000013 - momentum: 0.000000
2023-10-18 17:47:37,357 epoch 8 - iter 308/447 - loss 0.28405540 - time (sec): 6.57 - samples/sec: 9122.48 - lr: 0.000013 - momentum: 0.000000
2023-10-18 17:47:38,425 epoch 8 - iter 352/447 - loss 0.28687288 - time (sec): 7.64 - samples/sec: 9000.75 - lr: 0.000012 - momentum: 0.000000
2023-10-18 17:47:39,444 epoch 8 - iter 396/447 - loss 0.28653366 - time (sec): 8.66 - samples/sec: 8876.98 - lr: 0.000012 - momentum: 0.000000
2023-10-18 17:47:40,456 epoch 8 - iter 440/447 - loss 0.28373991 - time (sec): 9.67 - samples/sec: 8828.62 - lr: 0.000011 - momentum: 0.000000
2023-10-18 17:47:40,605 ----------------------------------------------------------------------------------------------------
2023-10-18 17:47:40,605 EPOCH 8 done: loss 0.2825 - lr: 0.000011
2023-10-18 17:47:45,952 DEV : loss 0.2879358232021332 - f1-score (micro avg) 0.3677
2023-10-18 17:47:45,979 ----------------------------------------------------------------------------------------------------
2023-10-18 17:47:47,006 epoch 9 - iter 44/447 - loss 0.27183602 - time (sec): 1.03 - samples/sec: 8000.53 - lr: 0.000011 - momentum: 0.000000
2023-10-18 17:47:48,025 epoch 9 - iter 88/447 - loss 0.26259942 - time (sec): 2.05 - samples/sec: 8257.05 - lr: 0.000010 - momentum: 0.000000
2023-10-18 17:47:49,009 epoch 9 - iter 132/447 - loss 0.26992750 - time (sec): 3.03 - samples/sec: 8293.21 - lr: 0.000010 - momentum: 0.000000
2023-10-18 17:47:49,989 epoch 9 - iter 176/447 - loss 0.26493230 - time (sec): 4.01 - samples/sec: 8324.07 - lr: 0.000009 - momentum: 0.000000
2023-10-18 17:47:50,977 epoch 9 - iter 220/447 - loss 0.26644403 - time (sec): 5.00 - samples/sec: 8318.87 - lr: 0.000008 - momentum: 0.000000
2023-10-18 17:47:51,984 epoch 9 - iter 264/447 - loss 0.26207838 - time (sec): 6.01 - samples/sec: 8424.72 - lr: 0.000008 - momentum: 0.000000
2023-10-18 17:47:53,030 epoch 9 - iter 308/447 - loss 0.26828243 - time (sec): 7.05 - samples/sec: 8473.53 - lr: 0.000007 - momentum: 0.000000
2023-10-18 17:47:53,988 epoch 9 - iter 352/447 - loss 0.26953453 - time (sec): 8.01 - samples/sec: 8451.29 - lr: 0.000007 - momentum: 0.000000
2023-10-18 17:47:54,994 epoch 9 - iter 396/447 - loss 0.27140936 - time (sec): 9.01 - samples/sec: 8428.25 - lr: 0.000006 - momentum: 0.000000
2023-10-18 17:47:56,012 epoch 9 - iter 440/447 - loss 0.26981883 - time (sec): 10.03 - samples/sec: 8518.48 - lr: 0.000006 - momentum: 0.000000
2023-10-18 17:47:56,171 ----------------------------------------------------------------------------------------------------
2023-10-18 17:47:56,171 EPOCH 9 done: loss 0.2702 - lr: 0.000006
2023-10-18 17:48:01,497 DEV : loss 0.2887320816516876 - f1-score (micro avg) 0.3744
2023-10-18 17:48:01,523 saving best model
2023-10-18 17:48:01,562 ----------------------------------------------------------------------------------------------------
2023-10-18 17:48:02,553 epoch 10 - iter 44/447 - loss 0.25613707 - time (sec): 0.99 - samples/sec: 8003.44 - lr: 0.000005 - momentum: 0.000000
2023-10-18 17:48:03,577 epoch 10 - iter 88/447 - loss 0.24160136 - time (sec): 2.01 - samples/sec: 8521.32 - lr: 0.000005 - momentum: 0.000000
2023-10-18 17:48:04,552 epoch 10 - iter 132/447 - loss 0.25434692 - time (sec): 2.99 - samples/sec: 8567.48 - lr: 0.000004 - momentum: 0.000000
2023-10-18 17:48:05,521 epoch 10 - iter 176/447 - loss 0.26304286 - time (sec): 3.96 - samples/sec: 8549.81 - lr: 0.000003 - momentum: 0.000000
2023-10-18 17:48:06,578 epoch 10 - iter 220/447 - loss 0.26755417 - time (sec): 5.02 - samples/sec: 8613.78 - lr: 0.000003 - momentum: 0.000000
2023-10-18 17:48:07,539 epoch 10 - iter 264/447 - loss 0.26551062 - time (sec): 5.98 - samples/sec: 8631.16 - lr: 0.000002 - momentum: 0.000000
2023-10-18 17:48:08,548 epoch 10 - iter 308/447 - loss 0.27058725 - time (sec): 6.99 - samples/sec: 8586.48 - lr: 0.000002 - momentum: 0.000000
2023-10-18 17:48:09,548 epoch 10 - iter 352/447 - loss 0.27199708 - time (sec): 7.99 - samples/sec: 8560.23 - lr: 0.000001 - momentum: 0.000000
2023-10-18 17:48:10,564 epoch 10 - iter 396/447 - loss 0.27115365 - time (sec): 9.00 - samples/sec: 8546.42 - lr: 0.000001 - momentum: 0.000000
2023-10-18 17:48:11,555 epoch 10 - iter 440/447 - loss 0.26841919 - time (sec): 9.99 - samples/sec: 8532.39 - lr: 0.000000 - momentum: 0.000000
2023-10-18 17:48:11,713 ----------------------------------------------------------------------------------------------------
2023-10-18 17:48:11,713 EPOCH 10 done: loss 0.2675 - lr: 0.000000
2023-10-18 17:48:17,028 DEV : loss 0.2889607846736908 - f1-score (micro avg) 0.3718
2023-10-18 17:48:17,087 ----------------------------------------------------------------------------------------------------
2023-10-18 17:48:17,088 Loading model from best epoch ...
2023-10-18 17:48:17,167 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
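The 21 tags above are the BIOES encoding of five entity types (loc, pers, org, prod, time) plus O: each type gets S (single-token), B (begin), I (inside), and E (end) variants. A minimal decoding sketch, assuming well-formed tag sequences; the helper name `bioes_to_spans` is mine:

```python
def bioes_to_spans(tags):
    """Convert a BIOES tag sequence into (label, start, end) entity spans."""
    spans, start = [], None
    for i, tag in enumerate(tags):
        if tag == "O":
            start = None
            continue
        prefix, label = tag.split("-")
        if prefix == "S":            # single-token entity
            spans.append((label, i, i + 1))
            start = None
        elif prefix == "B":          # entity begins
            start = i
        elif prefix == "E" and start is not None:  # entity ends
            spans.append((label, start, i + 1))
            start = None
        # "I" tags just continue the open span
    return spans
```

For example, `["O", "S-loc", "B-pers", "E-pers"]` decodes to a one-token loc span and a two-token pers span.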
2023-10-18 17:48:19,105
Results:
- F-score (micro) 0.3781
- F-score (macro) 0.1626
- Accuracy 0.2465
By class:
              precision    recall  f1-score   support

         loc     0.5388    0.5822    0.5597       596
        pers     0.1865    0.2072    0.1963       333
         org     0.0000    0.0000    0.0000       132
        time     0.0952    0.0408    0.0571        49
        prod     0.0000    0.0000    0.0000        66

   micro avg     0.4039    0.3554    0.3781      1176
   macro avg     0.1641    0.1660    0.1626      1176
weighted avg     0.3298    0.3554    0.3416      1176
2023-10-18 17:48:19,105 ----------------------------------------------------------------------------------------------------
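The micro F-score reported above is the harmonic mean of the micro-averaged precision and recall. A one-line sketch (function name `f_score` is mine):

```python
def f_score(precision, recall):
    """F1: harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

Plugging in the micro avg row, `f_score(0.4039, 0.3554)` rounds to 0.3781, matching the reported "F-score (micro)". Note the macro F-score does not follow this formula from the macro avg row: it is the plain mean of the per-class f1 values.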