2023-10-23 22:11:00,819 ----------------------------------------------------------------------------------------------------
2023-10-23 22:11:00,820 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (1): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (2): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (3): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (4): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (5): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (6): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (7): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (8): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (9): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (10): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
          (11): BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-23 22:11:00,820 ----------------------------------------------------------------------------------------------------
2023-10-23 22:11:00,820 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
- NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /home/ubuntu/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
2023-10-23 22:11:00,820 ----------------------------------------------------------------------------------------------------
2023-10-23 22:11:00,820 Train: 3575 sentences
2023-10-23 22:11:00,820 (train_with_dev=False, train_with_test=False)
2023-10-23 22:11:00,820 ----------------------------------------------------------------------------------------------------
2023-10-23 22:11:00,820 Training Params:
2023-10-23 22:11:00,820 - learning_rate: "5e-05"
2023-10-23 22:11:00,820 - mini_batch_size: "4"
2023-10-23 22:11:00,820 - max_epochs: "10"
2023-10-23 22:11:00,820 - shuffle: "True"
2023-10-23 22:11:00,820 ----------------------------------------------------------------------------------------------------
2023-10-23 22:11:00,820 Plugins:
2023-10-23 22:11:00,820 - TensorboardLogger
2023-10-23 22:11:00,820 - LinearScheduler | warmup_fraction: '0.1'
2023-10-23 22:11:00,821 ----------------------------------------------------------------------------------------------------
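
The configuration above corresponds to a standard Flair fine-tuning run. A minimal sketch of how such a run is assembled (Flair >= 0.12 API; the corpus-loader arguments and exact fine_tune options are assumptions reconstructed from this log, not the precise script that produced it):

from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# HIPE-2022 hipe2020/de corpus, as listed in the MultiCorpus summary above
# (loader argument names are assumptions)
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="de")
label_dict = corpus.make_label_dictionary(label_type="ner")

# hmBERT 64k checkpoint named in the model base path below; "poolingfirst"
# and "layers-1" in that path map to these two arguments
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# use_crf=False matches "crfFalse" in the base path; no RNN on top, just the
# LockedDropout + Linear(768 -> 21) head shown in the model dump above
tagger = SequenceTagger(
    hidden_size=256,  # ignored when use_rnn=False
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

# fine_tune() applies a linear schedule with warmup by default, matching the
# LinearScheduler plugin (warmup_fraction 0.1) logged above
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4",
    learning_rate=5e-5,
    mini_batch_size=4,
    max_epochs=10,
)
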
2023-10-23 22:11:00,821 Final evaluation on model from best epoch (best-model.pt)
2023-10-23 22:11:00,821 - metric: "('micro avg', 'f1-score')"
2023-10-23 22:11:00,821 ----------------------------------------------------------------------------------------------------
2023-10-23 22:11:00,821 Computation:
2023-10-23 22:11:00,821 - compute on device: cuda:0
2023-10-23 22:11:00,821 - embedding storage: none
2023-10-23 22:11:00,821 ----------------------------------------------------------------------------------------------------
2023-10-23 22:11:00,821 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4"
2023-10-23 22:11:00,821 ----------------------------------------------------------------------------------------------------
2023-10-23 22:11:00,821 ----------------------------------------------------------------------------------------------------
2023-10-23 22:11:00,821 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-23 22:11:06,471 epoch 1 - iter 89/894 - loss 2.24767612 - time (sec): 5.65 - samples/sec: 1465.04 - lr: 0.000005 - momentum: 0.000000
2023-10-23 22:11:12,248 epoch 1 - iter 178/894 - loss 1.32989836 - time (sec): 11.43 - samples/sec: 1509.50 - lr: 0.000010 - momentum: 0.000000
2023-10-23 22:11:17,833 epoch 1 - iter 267/894 - loss 1.01594059 - time (sec): 17.01 - samples/sec: 1500.72 - lr: 0.000015 - momentum: 0.000000
2023-10-23 22:11:23,383 epoch 1 - iter 356/894 - loss 0.85782838 - time (sec): 22.56 - samples/sec: 1499.78 - lr: 0.000020 - momentum: 0.000000
2023-10-23 22:11:28,969 epoch 1 - iter 445/894 - loss 0.73177723 - time (sec): 28.15 - samples/sec: 1508.85 - lr: 0.000025 - momentum: 0.000000
2023-10-23 22:11:34,459 epoch 1 - iter 534/894 - loss 0.65249591 - time (sec): 33.64 - samples/sec: 1506.93 - lr: 0.000030 - momentum: 0.000000
2023-10-23 22:11:40,179 epoch 1 - iter 623/894 - loss 0.58897814 - time (sec): 39.36 - samples/sec: 1514.74 - lr: 0.000035 - momentum: 0.000000
2023-10-23 22:11:45,756 epoch 1 - iter 712/894 - loss 0.54026067 - time (sec): 44.93 - samples/sec: 1517.85 - lr: 0.000040 - momentum: 0.000000
2023-10-23 22:11:51,314 epoch 1 - iter 801/894 - loss 0.50352442 - time (sec): 50.49 - samples/sec: 1515.38 - lr: 0.000045 - momentum: 0.000000
2023-10-23 22:11:57,287 epoch 1 - iter 890/894 - loss 0.47177208 - time (sec): 56.47 - samples/sec: 1522.81 - lr: 0.000050 - momentum: 0.000000
2023-10-23 22:11:57,598 ----------------------------------------------------------------------------------------------------
2023-10-23 22:11:57,598 EPOCH 1 done: loss 0.4717 - lr: 0.000050
2023-10-23 22:12:02,458 DEV : loss 0.18034544587135315 - f1-score (micro avg) 0.5217
2023-10-23 22:12:02,478 saving best model
2023-10-23 22:12:02,946 ----------------------------------------------------------------------------------------------------
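
A note on the lr column: it is consistent with the LinearScheduler above. With warmup_fraction 0.1 over 10 × 894 = 8,940 total steps, the learning rate ramps linearly from 0 to the peak 5e-05 during the first 894 steps, then decays linearly to 0: lr(t) = 5e-05 × t/894 for t <= 894, else 5e-05 × (8940 − t)/(8940 − 894). At epoch 1, iter 89 this gives about 0.000005, and at epoch 2, iter 89 (global step 983) about 0.000049, matching the logged values.
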
2023-10-23 22:12:08,443 epoch 2 - iter 89/894 - loss 0.16827676 - time (sec): 5.50 - samples/sec: 1470.62 - lr: 0.000049 - momentum: 0.000000
2023-10-23 22:12:14,038 epoch 2 - iter 178/894 - loss 0.15766325 - time (sec): 11.09 - samples/sec: 1535.26 - lr: 0.000049 - momentum: 0.000000
2023-10-23 22:12:19,915 epoch 2 - iter 267/894 - loss 0.15542630 - time (sec): 16.97 - samples/sec: 1549.29 - lr: 0.000048 - momentum: 0.000000
2023-10-23 22:12:25,515 epoch 2 - iter 356/894 - loss 0.15957913 - time (sec): 22.57 - samples/sec: 1535.54 - lr: 0.000048 - momentum: 0.000000
2023-10-23 22:12:31,204 epoch 2 - iter 445/894 - loss 0.16341156 - time (sec): 28.26 - samples/sec: 1532.91 - lr: 0.000047 - momentum: 0.000000
2023-10-23 22:12:36,902 epoch 2 - iter 534/894 - loss 0.15864282 - time (sec): 33.96 - samples/sec: 1519.51 - lr: 0.000047 - momentum: 0.000000
2023-10-23 22:12:42,475 epoch 2 - iter 623/894 - loss 0.16171748 - time (sec): 39.53 - samples/sec: 1514.59 - lr: 0.000046 - momentum: 0.000000
2023-10-23 22:12:47,959 epoch 2 - iter 712/894 - loss 0.15483930 - time (sec): 45.01 - samples/sec: 1501.41 - lr: 0.000046 - momentum: 0.000000
2023-10-23 22:12:53,845 epoch 2 - iter 801/894 - loss 0.15163502 - time (sec): 50.90 - samples/sec: 1514.04 - lr: 0.000045 - momentum: 0.000000
2023-10-23 22:12:59,561 epoch 2 - iter 890/894 - loss 0.14974204 - time (sec): 56.61 - samples/sec: 1521.89 - lr: 0.000044 - momentum: 0.000000
2023-10-23 22:12:59,808 ----------------------------------------------------------------------------------------------------
2023-10-23 22:12:59,809 EPOCH 2 done: loss 0.1492 - lr: 0.000044
2023-10-23 22:13:06,298 DEV : loss 0.1743343621492386 - f1-score (micro avg) 0.6821
2023-10-23 22:13:06,318 saving best model
2023-10-23 22:13:06,906 ----------------------------------------------------------------------------------------------------
2023-10-23 22:13:12,429 epoch 3 - iter 89/894 - loss 0.09273514 - time (sec): 5.52 - samples/sec: 1420.41 - lr: 0.000044 - momentum: 0.000000
2023-10-23 22:13:18,207 epoch 3 - iter 178/894 - loss 0.09432610 - time (sec): 11.30 - samples/sec: 1444.51 - lr: 0.000043 - momentum: 0.000000
2023-10-23 22:13:23,762 epoch 3 - iter 267/894 - loss 0.09201036 - time (sec): 16.86 - samples/sec: 1470.53 - lr: 0.000043 - momentum: 0.000000
2023-10-23 22:13:29,556 epoch 3 - iter 356/894 - loss 0.09877382 - time (sec): 22.65 - samples/sec: 1500.80 - lr: 0.000042 - momentum: 0.000000
2023-10-23 22:13:35,072 epoch 3 - iter 445/894 - loss 0.09586860 - time (sec): 28.17 - samples/sec: 1476.33 - lr: 0.000042 - momentum: 0.000000
2023-10-23 22:13:40,966 epoch 3 - iter 534/894 - loss 0.10865567 - time (sec): 34.06 - samples/sec: 1492.01 - lr: 0.000041 - momentum: 0.000000
2023-10-23 22:13:46,850 epoch 3 - iter 623/894 - loss 0.10472267 - time (sec): 39.94 - samples/sec: 1505.52 - lr: 0.000041 - momentum: 0.000000
2023-10-23 22:13:52,432 epoch 3 - iter 712/894 - loss 0.10079658 - time (sec): 45.53 - samples/sec: 1518.39 - lr: 0.000040 - momentum: 0.000000
2023-10-23 22:13:57,983 epoch 3 - iter 801/894 - loss 0.10204516 - time (sec): 51.08 - samples/sec: 1515.78 - lr: 0.000039 - momentum: 0.000000
2023-10-23 22:14:03,641 epoch 3 - iter 890/894 - loss 0.10044707 - time (sec): 56.73 - samples/sec: 1519.00 - lr: 0.000039 - momentum: 0.000000
2023-10-23 22:14:03,886 ----------------------------------------------------------------------------------------------------
2023-10-23 22:14:03,886 EPOCH 3 done: loss 0.1009 - lr: 0.000039
2023-10-23 22:14:10,389 DEV : loss 0.17964236438274384 - f1-score (micro avg) 0.7016
2023-10-23 22:14:10,409 saving best model
2023-10-23 22:14:11,002 ----------------------------------------------------------------------------------------------------
2023-10-23 22:14:16,611 epoch 4 - iter 89/894 - loss 0.06929115 - time (sec): 5.61 - samples/sec: 1511.93 - lr: 0.000038 - momentum: 0.000000
2023-10-23 22:14:22,139 epoch 4 - iter 178/894 - loss 0.07558077 - time (sec): 11.14 - samples/sec: 1487.05 - lr: 0.000038 - momentum: 0.000000
2023-10-23 22:14:27,727 epoch 4 - iter 267/894 - loss 0.06465839 - time (sec): 16.72 - samples/sec: 1507.05 - lr: 0.000037 - momentum: 0.000000
2023-10-23 22:14:33,623 epoch 4 - iter 356/894 - loss 0.06402311 - time (sec): 22.62 - samples/sec: 1527.66 - lr: 0.000037 - momentum: 0.000000
2023-10-23 22:14:39,396 epoch 4 - iter 445/894 - loss 0.06469627 - time (sec): 28.39 - samples/sec: 1527.99 - lr: 0.000036 - momentum: 0.000000
2023-10-23 22:14:45,018 epoch 4 - iter 534/894 - loss 0.06445300 - time (sec): 34.02 - samples/sec: 1522.05 - lr: 0.000036 - momentum: 0.000000
2023-10-23 22:14:50,505 epoch 4 - iter 623/894 - loss 0.06592729 - time (sec): 39.50 - samples/sec: 1523.69 - lr: 0.000035 - momentum: 0.000000
2023-10-23 22:14:56,137 epoch 4 - iter 712/894 - loss 0.06504216 - time (sec): 45.13 - samples/sec: 1523.15 - lr: 0.000034 - momentum: 0.000000
2023-10-23 22:15:01,901 epoch 4 - iter 801/894 - loss 0.06508479 - time (sec): 50.90 - samples/sec: 1521.92 - lr: 0.000034 - momentum: 0.000000
2023-10-23 22:15:07,561 epoch 4 - iter 890/894 - loss 0.06338887 - time (sec): 56.56 - samples/sec: 1524.30 - lr: 0.000033 - momentum: 0.000000
2023-10-23 22:15:07,809 ----------------------------------------------------------------------------------------------------
2023-10-23 22:15:07,809 EPOCH 4 done: loss 0.0636 - lr: 0.000033
2023-10-23 22:15:14,301 DEV : loss 0.2551104426383972 - f1-score (micro avg) 0.7203
2023-10-23 22:15:14,321 saving best model
2023-10-23 22:15:14,914 ----------------------------------------------------------------------------------------------------
2023-10-23 22:15:20,699 epoch 5 - iter 89/894 - loss 0.03423092 - time (sec): 5.78 - samples/sec: 1555.39 - lr: 0.000033 - momentum: 0.000000
2023-10-23 22:15:26,368 epoch 5 - iter 178/894 - loss 0.03919786 - time (sec): 11.45 - samples/sec: 1529.70 - lr: 0.000032 - momentum: 0.000000
2023-10-23 22:15:31,920 epoch 5 - iter 267/894 - loss 0.03831350 - time (sec): 17.00 - samples/sec: 1519.52 - lr: 0.000032 - momentum: 0.000000
2023-10-23 22:15:37,645 epoch 5 - iter 356/894 - loss 0.03789332 - time (sec): 22.73 - samples/sec: 1535.60 - lr: 0.000031 - momentum: 0.000000
2023-10-23 22:15:43,563 epoch 5 - iter 445/894 - loss 0.03759650 - time (sec): 28.65 - samples/sec: 1560.25 - lr: 0.000031 - momentum: 0.000000
2023-10-23 22:15:49,030 epoch 5 - iter 534/894 - loss 0.03829895 - time (sec): 34.11 - samples/sec: 1540.36 - lr: 0.000030 - momentum: 0.000000
2023-10-23 22:15:54,744 epoch 5 - iter 623/894 - loss 0.03927776 - time (sec): 39.83 - samples/sec: 1531.49 - lr: 0.000029 - momentum: 0.000000
2023-10-23 22:16:00,300 epoch 5 - iter 712/894 - loss 0.03884967 - time (sec): 45.39 - samples/sec: 1533.98 - lr: 0.000029 - momentum: 0.000000
2023-10-23 22:16:05,845 epoch 5 - iter 801/894 - loss 0.03960733 - time (sec): 50.93 - samples/sec: 1521.58 - lr: 0.000028 - momentum: 0.000000
2023-10-23 22:16:11,463 epoch 5 - iter 890/894 - loss 0.03948272 - time (sec): 56.55 - samples/sec: 1519.82 - lr: 0.000028 - momentum: 0.000000
2023-10-23 22:16:11,769 ----------------------------------------------------------------------------------------------------
2023-10-23 22:16:11,769 EPOCH 5 done: loss 0.0394 - lr: 0.000028
2023-10-23 22:16:18,259 DEV : loss 0.2507097125053406 - f1-score (micro avg) 0.7541
2023-10-23 22:16:18,279 saving best model
2023-10-23 22:16:18,871 ----------------------------------------------------------------------------------------------------
2023-10-23 22:16:24,253 epoch 6 - iter 89/894 - loss 0.03427358 - time (sec): 5.38 - samples/sec: 1390.08 - lr: 0.000027 - momentum: 0.000000
2023-10-23 22:16:29,876 epoch 6 - iter 178/894 - loss 0.03451237 - time (sec): 11.00 - samples/sec: 1458.98 - lr: 0.000027 - momentum: 0.000000
2023-10-23 22:16:35,622 epoch 6 - iter 267/894 - loss 0.03090456 - time (sec): 16.75 - samples/sec: 1510.50 - lr: 0.000026 - momentum: 0.000000
2023-10-23 22:16:41,315 epoch 6 - iter 356/894 - loss 0.03413662 - time (sec): 22.44 - samples/sec: 1515.71 - lr: 0.000026 - momentum: 0.000000
2023-10-23 22:16:47,226 epoch 6 - iter 445/894 - loss 0.03121911 - time (sec): 28.35 - samples/sec: 1538.17 - lr: 0.000025 - momentum: 0.000000
2023-10-23 22:16:52,738 epoch 6 - iter 534/894 - loss 0.03023548 - time (sec): 33.87 - samples/sec: 1526.90 - lr: 0.000024 - momentum: 0.000000
2023-10-23 22:16:58,470 epoch 6 - iter 623/894 - loss 0.03072382 - time (sec): 39.60 - samples/sec: 1531.66 - lr: 0.000024 - momentum: 0.000000
2023-10-23 22:17:04,206 epoch 6 - iter 712/894 - loss 0.02973244 - time (sec): 45.33 - samples/sec: 1523.66 - lr: 0.000023 - momentum: 0.000000
2023-10-23 22:17:09,881 epoch 6 - iter 801/894 - loss 0.02952191 - time (sec): 51.01 - samples/sec: 1523.58 - lr: 0.000023 - momentum: 0.000000
2023-10-23 22:17:15,533 epoch 6 - iter 890/894 - loss 0.02936006 - time (sec): 56.66 - samples/sec: 1521.80 - lr: 0.000022 - momentum: 0.000000
2023-10-23 22:17:15,773 ----------------------------------------------------------------------------------------------------
2023-10-23 22:17:15,773 EPOCH 6 done: loss 0.0294 - lr: 0.000022
2023-10-23 22:17:22,244 DEV : loss 0.2560969591140747 - f1-score (micro avg) 0.7591
2023-10-23 22:17:22,265 saving best model
2023-10-23 22:17:22,856 ----------------------------------------------------------------------------------------------------
2023-10-23 22:17:28,785 epoch 7 - iter 89/894 - loss 0.01473967 - time (sec): 5.93 - samples/sec: 1607.98 - lr: 0.000022 - momentum: 0.000000
2023-10-23 22:17:34,368 epoch 7 - iter 178/894 - loss 0.01596532 - time (sec): 11.51 - samples/sec: 1544.52 - lr: 0.000021 - momentum: 0.000000
2023-10-23 22:17:39,859 epoch 7 - iter 267/894 - loss 0.01847908 - time (sec): 17.00 - samples/sec: 1507.85 - lr: 0.000021 - momentum: 0.000000
2023-10-23 22:17:45,320 epoch 7 - iter 356/894 - loss 0.01735099 - time (sec): 22.46 - samples/sec: 1486.05 - lr: 0.000020 - momentum: 0.000000
2023-10-23 22:17:51,061 epoch 7 - iter 445/894 - loss 0.01666464 - time (sec): 28.20 - samples/sec: 1491.62 - lr: 0.000019 - momentum: 0.000000
2023-10-23 22:17:56,641 epoch 7 - iter 534/894 - loss 0.01833477 - time (sec): 33.78 - samples/sec: 1494.69 - lr: 0.000019 - momentum: 0.000000
2023-10-23 22:18:02,433 epoch 7 - iter 623/894 - loss 0.01813457 - time (sec): 39.58 - samples/sec: 1508.52 - lr: 0.000018 - momentum: 0.000000
2023-10-23 22:18:08,296 epoch 7 - iter 712/894 - loss 0.01766190 - time (sec): 45.44 - samples/sec: 1531.09 - lr: 0.000018 - momentum: 0.000000
2023-10-23 22:18:13,858 epoch 7 - iter 801/894 - loss 0.01731978 - time (sec): 51.00 - samples/sec: 1525.70 - lr: 0.000017 - momentum: 0.000000
2023-10-23 22:18:19,421 epoch 7 - iter 890/894 - loss 0.01885196 - time (sec): 56.56 - samples/sec: 1523.52 - lr: 0.000017 - momentum: 0.000000
2023-10-23 22:18:19,663 ----------------------------------------------------------------------------------------------------
2023-10-23 22:18:19,664 EPOCH 7 done: loss 0.0188 - lr: 0.000017
2023-10-23 22:18:26,167 DEV : loss 0.25700417160987854 - f1-score (micro avg) 0.7673
2023-10-23 22:18:26,188 saving best model
2023-10-23 22:18:26,776 ----------------------------------------------------------------------------------------------------
2023-10-23 22:18:32,428 epoch 8 - iter 89/894 - loss 0.00831319 - time (sec): 5.65 - samples/sec: 1519.15 - lr: 0.000016 - momentum: 0.000000
2023-10-23 22:18:38,212 epoch 8 - iter 178/894 - loss 0.00983876 - time (sec): 11.44 - samples/sec: 1510.57 - lr: 0.000016 - momentum: 0.000000
2023-10-23 22:18:43,974 epoch 8 - iter 267/894 - loss 0.01317277 - time (sec): 17.20 - samples/sec: 1532.68 - lr: 0.000015 - momentum: 0.000000
2023-10-23 22:18:49,909 epoch 8 - iter 356/894 - loss 0.01232841 - time (sec): 23.13 - samples/sec: 1547.32 - lr: 0.000014 - momentum: 0.000000
2023-10-23 22:18:55,332 epoch 8 - iter 445/894 - loss 0.01188126 - time (sec): 28.55 - samples/sec: 1522.90 - lr: 0.000014 - momentum: 0.000000
2023-10-23 22:19:00,901 epoch 8 - iter 534/894 - loss 0.01188323 - time (sec): 34.12 - samples/sec: 1525.83 - lr: 0.000013 - momentum: 0.000000
2023-10-23 22:19:06,449 epoch 8 - iter 623/894 - loss 0.01189307 - time (sec): 39.67 - samples/sec: 1523.80 - lr: 0.000013 - momentum: 0.000000
2023-10-23 22:19:11,886 epoch 8 - iter 712/894 - loss 0.01156878 - time (sec): 45.11 - samples/sec: 1508.18 - lr: 0.000012 - momentum: 0.000000
2023-10-23 22:19:17,744 epoch 8 - iter 801/894 - loss 0.01124655 - time (sec): 50.97 - samples/sec: 1518.04 - lr: 0.000012 - momentum: 0.000000
2023-10-23 22:19:23,437 epoch 8 - iter 890/894 - loss 0.01105335 - time (sec): 56.66 - samples/sec: 1522.10 - lr: 0.000011 - momentum: 0.000000
2023-10-23 22:19:23,684 ----------------------------------------------------------------------------------------------------
2023-10-23 22:19:23,685 EPOCH 8 done: loss 0.0110 - lr: 0.000011
2023-10-23 22:19:30,172 DEV : loss 0.2922624945640564 - f1-score (micro avg) 0.7633
2023-10-23 22:19:30,192 ----------------------------------------------------------------------------------------------------
2023-10-23 22:19:35,679 epoch 9 - iter 89/894 - loss 0.00212849 - time (sec): 5.49 - samples/sec: 1488.33 - lr: 0.000011 - momentum: 0.000000
2023-10-23 22:19:41,242 epoch 9 - iter 178/894 - loss 0.00466232 - time (sec): 11.05 - samples/sec: 1484.55 - lr: 0.000010 - momentum: 0.000000
2023-10-23 22:19:46,792 epoch 9 - iter 267/894 - loss 0.00624334 - time (sec): 16.60 - samples/sec: 1508.26 - lr: 0.000009 - momentum: 0.000000
2023-10-23 22:19:52,261 epoch 9 - iter 356/894 - loss 0.00529657 - time (sec): 22.07 - samples/sec: 1503.39 - lr: 0.000009 - momentum: 0.000000
2023-10-23 22:19:57,886 epoch 9 - iter 445/894 - loss 0.00482331 - time (sec): 27.69 - samples/sec: 1508.08 - lr: 0.000008 - momentum: 0.000000
2023-10-23 22:20:03,869 epoch 9 - iter 534/894 - loss 0.00634234 - time (sec): 33.68 - samples/sec: 1536.08 - lr: 0.000008 - momentum: 0.000000
2023-10-23 22:20:09,597 epoch 9 - iter 623/894 - loss 0.00603974 - time (sec): 39.40 - samples/sec: 1530.42 - lr: 0.000007 - momentum: 0.000000
2023-10-23 22:20:15,602 epoch 9 - iter 712/894 - loss 0.00635560 - time (sec): 45.41 - samples/sec: 1537.72 - lr: 0.000007 - momentum: 0.000000
2023-10-23 22:20:21,078 epoch 9 - iter 801/894 - loss 0.00602059 - time (sec): 50.88 - samples/sec: 1524.88 - lr: 0.000006 - momentum: 0.000000
2023-10-23 22:20:26,769 epoch 9 - iter 890/894 - loss 0.00592333 - time (sec): 56.58 - samples/sec: 1525.81 - lr: 0.000006 - momentum: 0.000000
2023-10-23 22:20:27,004 ----------------------------------------------------------------------------------------------------
2023-10-23 22:20:27,005 EPOCH 9 done: loss 0.0060 - lr: 0.000006
2023-10-23 22:20:33,496 DEV : loss 0.29060593247413635 - f1-score (micro avg) 0.7681
2023-10-23 22:20:33,516 saving best model
2023-10-23 22:20:34,105 ----------------------------------------------------------------------------------------------------
2023-10-23 22:20:39,963 epoch 10 - iter 89/894 - loss 0.00440159 - time (sec): 5.86 - samples/sec: 1526.28 - lr: 0.000005 - momentum: 0.000000
2023-10-23 22:20:45,656 epoch 10 - iter 178/894 - loss 0.00306763 - time (sec): 11.55 - samples/sec: 1499.98 - lr: 0.000004 - momentum: 0.000000
2023-10-23 22:20:51,138 epoch 10 - iter 267/894 - loss 0.00293745 - time (sec): 17.03 - samples/sec: 1528.11 - lr: 0.000004 - momentum: 0.000000
2023-10-23 22:20:56,938 epoch 10 - iter 356/894 - loss 0.00220417 - time (sec): 22.83 - samples/sec: 1548.00 - lr: 0.000003 - momentum: 0.000000
2023-10-23 22:21:02,446 epoch 10 - iter 445/894 - loss 0.00225457 - time (sec): 28.34 - samples/sec: 1522.61 - lr: 0.000003 - momentum: 0.000000
2023-10-23 22:21:07,950 epoch 10 - iter 534/894 - loss 0.00208242 - time (sec): 33.84 - samples/sec: 1517.45 - lr: 0.000002 - momentum: 0.000000
2023-10-23 22:21:13,674 epoch 10 - iter 623/894 - loss 0.00254697 - time (sec): 39.57 - samples/sec: 1519.31 - lr: 0.000002 - momentum: 0.000000
2023-10-23 22:21:19,162 epoch 10 - iter 712/894 - loss 0.00249384 - time (sec): 45.06 - samples/sec: 1512.52 - lr: 0.000001 - momentum: 0.000000
2023-10-23 22:21:24,848 epoch 10 - iter 801/894 - loss 0.00296117 - time (sec): 50.74 - samples/sec: 1514.47 - lr: 0.000001 - momentum: 0.000000
2023-10-23 22:21:30,534 epoch 10 - iter 890/894 - loss 0.00283140 - time (sec): 56.43 - samples/sec: 1514.86 - lr: 0.000000 - momentum: 0.000000
2023-10-23 22:21:31,030 ----------------------------------------------------------------------------------------------------
2023-10-23 22:21:31,030 EPOCH 10 done: loss 0.0028 - lr: 0.000000
2023-10-23 22:21:37,226 DEV : loss 0.291725754737854 - f1-score (micro avg) 0.7739
2023-10-23 22:21:37,246 saving best model
2023-10-23 22:21:38,317 ----------------------------------------------------------------------------------------------------
2023-10-23 22:21:38,317 Loading model from best epoch ...
2023-10-23 22:21:40,333 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
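
The 21 tags above follow the BIOES scheme over five entity types (loc, pers, org, prod, time). A minimal inference sketch against the saved checkpoint (the path is the logged base path plus best-model.pt; the example sentence is purely illustrative):

from flair.data import Sentence
from flair.models import SequenceTagger

# best model written during training; path assembled from the logged base path
tagger = SequenceTagger.load(
    "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4/best-model.pt"
)

# illustrative historic-German input (hypothetical example)
sentence = Sentence("Theodor Mommsen sprach 1874 in Berlin .")
tagger.predict(sentence)

# print each predicted entity span with its tag and confidence
for span in sentence.get_spans("ner"):
    print(span.text, span.tag, round(span.score, 4))
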
2023-10-23 22:21:44,886
Results:
- F-score (micro) 0.7501
- F-score (macro) 0.6739
- Accuracy 0.6174
By class:
              precision    recall  f1-score   support

         loc     0.8147    0.8557    0.8347       596
        pers     0.6868    0.7508    0.7174       333
         org     0.5537    0.5076    0.5296       132
        prod     0.6491    0.5606    0.6016        66
        time     0.6604    0.7143    0.6863        49

   micro avg     0.7363    0.7645    0.7501      1176
   macro avg     0.6729    0.6778    0.6739      1176
weighted avg     0.7335    0.7645    0.7480      1176
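
As a quick consistency check on the table: micro F1 = 2 × 0.7363 × 0.7645 / (0.7363 + 0.7645) ≈ 0.7501, and macro F1 = (0.8347 + 0.7174 + 0.5296 + 0.6016 + 0.6863) / 5 ≈ 0.6739, both matching the summary scores; the gap between them reflects the dominant loc class (596 of 1176 gold spans).
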
2023-10-23 22:21:44,886 ----------------------------------------------------------------------------------------------------