2023-10-08 21:14:28,584 ----------------------------------------------------------------------------------------------------
2023-10-08 21:14:28,585 Model: "SequenceTagger(
  (embeddings): ByT5Embeddings(
    (model): T5EncoderModel(
      (shared): Embedding(384, 1472)
      (encoder): T5Stack(
        (embed_tokens): Embedding(384, 1472)
        (block): ModuleList(
          (0): T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                  (relative_attention_bias): Embedding(32, 6)
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
          (1-11): 11 x T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
        )
        (final_layer_norm): T5LayerNorm()
        (dropout): Dropout(p=0.1, inplace=False)
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=1472, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
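A rough parameter count can be read straight off the module shapes printed above. This is a sketch under two assumptions not stated in the log: `shared` and `embed_tokens` are tied (counted once), and `T5LayerNorm` carries a weight vector only (RMSNorm-style, no bias).

```python
# Rough parameter count for the printed T5 encoder, read off the module
# shapes above. Assumes tied shared/embed_tokens and weight-only layer norms.
d_model, d_attn, d_ff, vocab = 1472, 384, 3584, 384

attn = 4 * d_model * d_attn      # q, k, v, o projections (no bias)
ff = 3 * d_model * d_ff          # wi_0, wi_1, wo (no bias)
norms = 2 * d_model              # two T5LayerNorms per block
block = attn + ff + norms        # 18,090,880 per T5Block

total = (
    vocab * d_model              # shared embedding (tied with embed_tokens)
    + 12 * block                 # 12 T5Blocks
    + 32 * 6                     # relative_attention_bias in block 0 only
    + d_model                    # final_layer_norm
)
print(f"{total / 1e6:.1f}M encoder parameters")
```

The byte-level vocabulary (384 entries) keeps the embedding table tiny; nearly all capacity sits in the 12 transformer blocks.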
2023-10-08 21:14:28,586 ----------------------------------------------------------------------------------------------------
2023-10-08 21:14:28,586 MultiCorpus: 966 train + 219 dev + 204 test sentences
- NER_HIPE_2022 Corpus: 966 train + 219 dev + 204 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/ajmc/fr/with_doc_seperator
2023-10-08 21:14:28,586 ----------------------------------------------------------------------------------------------------
2023-10-08 21:14:28,586 Train: 966 sentences
2023-10-08 21:14:28,586 (train_with_dev=False, train_with_test=False)
2023-10-08 21:14:28,586 ----------------------------------------------------------------------------------------------------
2023-10-08 21:14:28,586 Training Params:
2023-10-08 21:14:28,586 - learning_rate: "0.00015"
2023-10-08 21:14:28,586 - mini_batch_size: "8"
2023-10-08 21:14:28,586 - max_epochs: "10"
2023-10-08 21:14:28,586 - shuffle: "True"
2023-10-08 21:14:28,586 ----------------------------------------------------------------------------------------------------
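The `iter .../121` counter in the epoch lines below follows directly from the corpus size and batch size above; a minimal sanity check:

```python
import math

train_sentences = 966   # from the MultiCorpus line in the log
mini_batch_size = 8     # from Training Params

# Number of mini-batches per epoch: the last, partial batch still counts.
batches_per_epoch = math.ceil(train_sentences / mini_batch_size)
print(batches_per_epoch)  # 121, matching "iter .../121" in the epoch lines
```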
2023-10-08 21:14:28,586 Plugins:
2023-10-08 21:14:28,586 - TensorboardLogger
2023-10-08 21:14:28,587 - LinearScheduler | warmup_fraction: '0.1'
2023-10-08 21:14:28,587 ----------------------------------------------------------------------------------------------------
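The per-iteration `lr:` values below are consistent with a linear warmup over the first 10% of all optimizer steps (warmup_fraction 0.1, i.e. 121 of the 1210 steps in 10 epochs x 121 batches), followed by linear decay to zero. The step counts come from the log; the formula itself is an assumption that approximately reproduces the printed values:

```python
# Sketch of the lr curve implied by the LinearScheduler plugin: linear
# warmup to the peak lr over the first 10% of steps, then linear decay to 0.
PEAK_LR = 0.00015
STEPS_PER_EPOCH = 121
TOTAL_STEPS = 10 * STEPS_PER_EPOCH          # 1210
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)       # warmup_fraction: 0.1 -> 121

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer updates."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

# Approximately matches the log: ~0.000015 at epoch 1 iter 12, the peak
# ~0.000148 near the end of epoch 1, ~0.000134 at the end of epoch 2.
print(lr_at(12), lr_at(120), lr_at(2 * STEPS_PER_EPOCH))
```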
2023-10-08 21:14:28,587 Final evaluation on model from best epoch (best-model.pt)
2023-10-08 21:14:28,587 - metric: "('micro avg', 'f1-score')"
2023-10-08 21:14:28,587 ----------------------------------------------------------------------------------------------------
2023-10-08 21:14:28,587 Computation:
2023-10-08 21:14:28,587 - compute on device: cuda:0
2023-10-08 21:14:28,587 - embedding storage: none
2023-10-08 21:14:28,587 ----------------------------------------------------------------------------------------------------
2023-10-08 21:14:28,587 Model training base path: "hmbench-ajmc/fr-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-3"
2023-10-08 21:14:28,587 ----------------------------------------------------------------------------------------------------
2023-10-08 21:14:28,587 ----------------------------------------------------------------------------------------------------
2023-10-08 21:14:28,587 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-08 21:14:37,139 epoch 1 - iter 12/121 - loss 3.21177106 - time (sec): 8.55 - samples/sec: 259.52 - lr: 0.000014 - momentum: 0.000000
2023-10-08 21:14:46,605 epoch 1 - iter 24/121 - loss 3.20608951 - time (sec): 18.02 - samples/sec: 264.31 - lr: 0.000029 - momentum: 0.000000
2023-10-08 21:14:56,209 epoch 1 - iter 36/121 - loss 3.19628445 - time (sec): 27.62 - samples/sec: 265.67 - lr: 0.000043 - momentum: 0.000000
2023-10-08 21:15:04,700 epoch 1 - iter 48/121 - loss 3.18243139 - time (sec): 36.11 - samples/sec: 265.59 - lr: 0.000058 - momentum: 0.000000
2023-10-08 21:15:13,465 epoch 1 - iter 60/121 - loss 3.15357514 - time (sec): 44.88 - samples/sec: 269.18 - lr: 0.000073 - momentum: 0.000000
2023-10-08 21:15:22,820 epoch 1 - iter 72/121 - loss 3.10278430 - time (sec): 54.23 - samples/sec: 272.44 - lr: 0.000088 - momentum: 0.000000
2023-10-08 21:15:31,596 epoch 1 - iter 84/121 - loss 3.04483509 - time (sec): 63.01 - samples/sec: 272.67 - lr: 0.000103 - momentum: 0.000000
2023-10-08 21:15:40,547 epoch 1 - iter 96/121 - loss 2.97369813 - time (sec): 71.96 - samples/sec: 273.29 - lr: 0.000118 - momentum: 0.000000
2023-10-08 21:15:49,388 epoch 1 - iter 108/121 - loss 2.89339132 - time (sec): 80.80 - samples/sec: 274.83 - lr: 0.000133 - momentum: 0.000000
2023-10-08 21:15:58,012 epoch 1 - iter 120/121 - loss 2.81592254 - time (sec): 89.42 - samples/sec: 274.20 - lr: 0.000148 - momentum: 0.000000
2023-10-08 21:15:58,788 ----------------------------------------------------------------------------------------------------
2023-10-08 21:15:58,788 EPOCH 1 done: loss 2.8075 - lr: 0.000148
2023-10-08 21:16:04,557 DEV : loss 1.8723174333572388 - f1-score (micro avg) 0.0
2023-10-08 21:16:04,562 ----------------------------------------------------------------------------------------------------
2023-10-08 21:16:13,342 epoch 2 - iter 12/121 - loss 1.85964569 - time (sec): 8.78 - samples/sec: 294.93 - lr: 0.000148 - momentum: 0.000000
2023-10-08 21:16:22,480 epoch 2 - iter 24/121 - loss 1.74836342 - time (sec): 17.92 - samples/sec: 295.59 - lr: 0.000147 - momentum: 0.000000
2023-10-08 21:16:30,927 epoch 2 - iter 36/121 - loss 1.64529788 - time (sec): 26.36 - samples/sec: 289.15 - lr: 0.000145 - momentum: 0.000000
2023-10-08 21:16:39,590 epoch 2 - iter 48/121 - loss 1.55227964 - time (sec): 35.03 - samples/sec: 285.18 - lr: 0.000144 - momentum: 0.000000
2023-10-08 21:16:48,227 epoch 2 - iter 60/121 - loss 1.45974271 - time (sec): 43.66 - samples/sec: 284.26 - lr: 0.000142 - momentum: 0.000000
2023-10-08 21:16:57,233 epoch 2 - iter 72/121 - loss 1.38015943 - time (sec): 52.67 - samples/sec: 282.33 - lr: 0.000140 - momentum: 0.000000
2023-10-08 21:17:06,122 epoch 2 - iter 84/121 - loss 1.31158140 - time (sec): 61.56 - samples/sec: 283.15 - lr: 0.000139 - momentum: 0.000000
2023-10-08 21:17:14,381 epoch 2 - iter 96/121 - loss 1.25439110 - time (sec): 69.82 - samples/sec: 282.16 - lr: 0.000137 - momentum: 0.000000
2023-10-08 21:17:22,870 epoch 2 - iter 108/121 - loss 1.18707751 - time (sec): 78.31 - samples/sec: 281.56 - lr: 0.000135 - momentum: 0.000000
2023-10-08 21:17:31,498 epoch 2 - iter 120/121 - loss 1.12894224 - time (sec): 86.93 - samples/sec: 282.13 - lr: 0.000134 - momentum: 0.000000
2023-10-08 21:17:32,144 ----------------------------------------------------------------------------------------------------
2023-10-08 21:17:32,144 EPOCH 2 done: loss 1.1236 - lr: 0.000134
2023-10-08 21:17:38,011 DEV : loss 0.6600139141082764 - f1-score (micro avg) 0.0
2023-10-08 21:17:38,020 ----------------------------------------------------------------------------------------------------
2023-10-08 21:17:46,355 epoch 3 - iter 12/121 - loss 0.60715833 - time (sec): 8.33 - samples/sec: 273.47 - lr: 0.000132 - momentum: 0.000000
2023-10-08 21:17:55,344 epoch 3 - iter 24/121 - loss 0.61049381 - time (sec): 17.32 - samples/sec: 281.48 - lr: 0.000130 - momentum: 0.000000
2023-10-08 21:18:04,107 epoch 3 - iter 36/121 - loss 0.60850163 - time (sec): 26.09 - samples/sec: 280.35 - lr: 0.000129 - momentum: 0.000000
2023-10-08 21:18:12,834 epoch 3 - iter 48/121 - loss 0.61432985 - time (sec): 34.81 - samples/sec: 281.13 - lr: 0.000127 - momentum: 0.000000
2023-10-08 21:18:21,843 epoch 3 - iter 60/121 - loss 0.60257379 - time (sec): 43.82 - samples/sec: 282.00 - lr: 0.000125 - momentum: 0.000000
2023-10-08 21:18:30,090 epoch 3 - iter 72/121 - loss 0.57636685 - time (sec): 52.07 - samples/sec: 280.61 - lr: 0.000124 - momentum: 0.000000
2023-10-08 21:18:38,656 epoch 3 - iter 84/121 - loss 0.55990581 - time (sec): 60.63 - samples/sec: 281.62 - lr: 0.000122 - momentum: 0.000000
2023-10-08 21:18:47,502 epoch 3 - iter 96/121 - loss 0.54183359 - time (sec): 69.48 - samples/sec: 281.26 - lr: 0.000120 - momentum: 0.000000
2023-10-08 21:18:56,502 epoch 3 - iter 108/121 - loss 0.51680309 - time (sec): 78.48 - samples/sec: 283.46 - lr: 0.000119 - momentum: 0.000000
2023-10-08 21:19:04,947 epoch 3 - iter 120/121 - loss 0.49824164 - time (sec): 86.93 - samples/sec: 282.56 - lr: 0.000117 - momentum: 0.000000
2023-10-08 21:19:05,507 ----------------------------------------------------------------------------------------------------
2023-10-08 21:19:05,508 EPOCH 3 done: loss 0.4988 - lr: 0.000117
2023-10-08 21:19:11,310 DEV : loss 0.38485732674598694 - f1-score (micro avg) 0.3971
2023-10-08 21:19:11,316 saving best model
2023-10-08 21:19:12,166 ----------------------------------------------------------------------------------------------------
2023-10-08 21:19:20,539 epoch 4 - iter 12/121 - loss 0.32266392 - time (sec): 8.37 - samples/sec: 268.19 - lr: 0.000115 - momentum: 0.000000
2023-10-08 21:19:29,154 epoch 4 - iter 24/121 - loss 0.33259984 - time (sec): 16.99 - samples/sec: 277.70 - lr: 0.000114 - momentum: 0.000000
2023-10-08 21:19:37,032 epoch 4 - iter 36/121 - loss 0.33446517 - time (sec): 24.86 - samples/sec: 274.89 - lr: 0.000112 - momentum: 0.000000
2023-10-08 21:19:45,423 epoch 4 - iter 48/121 - loss 0.33427205 - time (sec): 33.26 - samples/sec: 277.73 - lr: 0.000110 - momentum: 0.000000
2023-10-08 21:19:54,419 epoch 4 - iter 60/121 - loss 0.31503095 - time (sec): 42.25 - samples/sec: 281.93 - lr: 0.000109 - momentum: 0.000000
2023-10-08 21:20:03,554 epoch 4 - iter 72/121 - loss 0.31653198 - time (sec): 51.39 - samples/sec: 283.11 - lr: 0.000107 - momentum: 0.000000
2023-10-08 21:20:12,098 epoch 4 - iter 84/121 - loss 0.31878239 - time (sec): 59.93 - samples/sec: 282.23 - lr: 0.000105 - momentum: 0.000000
2023-10-08 21:20:21,412 epoch 4 - iter 96/121 - loss 0.31299844 - time (sec): 69.24 - samples/sec: 284.35 - lr: 0.000104 - momentum: 0.000000
2023-10-08 21:20:29,854 epoch 4 - iter 108/121 - loss 0.30657518 - time (sec): 77.69 - samples/sec: 284.62 - lr: 0.000102 - momentum: 0.000000
2023-10-08 21:20:38,710 epoch 4 - iter 120/121 - loss 0.30017694 - time (sec): 86.54 - samples/sec: 284.20 - lr: 0.000101 - momentum: 0.000000
2023-10-08 21:20:39,251 ----------------------------------------------------------------------------------------------------
2023-10-08 21:20:39,252 EPOCH 4 done: loss 0.3006 - lr: 0.000101
2023-10-08 21:20:45,148 DEV : loss 0.2667107880115509 - f1-score (micro avg) 0.5158
2023-10-08 21:20:45,159 saving best model
2023-10-08 21:20:49,581 ----------------------------------------------------------------------------------------------------
2023-10-08 21:20:57,877 epoch 5 - iter 12/121 - loss 0.23624634 - time (sec): 8.29 - samples/sec: 275.01 - lr: 0.000099 - momentum: 0.000000
2023-10-08 21:21:07,071 epoch 5 - iter 24/121 - loss 0.23945972 - time (sec): 17.49 - samples/sec: 276.63 - lr: 0.000097 - momentum: 0.000000
2023-10-08 21:21:16,065 epoch 5 - iter 36/121 - loss 0.24579163 - time (sec): 26.48 - samples/sec: 284.14 - lr: 0.000095 - momentum: 0.000000
2023-10-08 21:21:24,536 epoch 5 - iter 48/121 - loss 0.24922744 - time (sec): 34.95 - samples/sec: 283.72 - lr: 0.000094 - momentum: 0.000000
2023-10-08 21:21:33,632 epoch 5 - iter 60/121 - loss 0.25020133 - time (sec): 44.05 - samples/sec: 282.95 - lr: 0.000092 - momentum: 0.000000
2023-10-08 21:21:42,843 epoch 5 - iter 72/121 - loss 0.24065315 - time (sec): 53.26 - samples/sec: 281.05 - lr: 0.000091 - momentum: 0.000000
2023-10-08 21:21:51,432 epoch 5 - iter 84/121 - loss 0.23365645 - time (sec): 61.85 - samples/sec: 279.86 - lr: 0.000089 - momentum: 0.000000
2023-10-08 21:22:00,601 epoch 5 - iter 96/121 - loss 0.22580979 - time (sec): 71.02 - samples/sec: 282.42 - lr: 0.000087 - momentum: 0.000000
2023-10-08 21:22:08,903 epoch 5 - iter 108/121 - loss 0.22106582 - time (sec): 79.32 - samples/sec: 280.94 - lr: 0.000086 - momentum: 0.000000
2023-10-08 21:22:17,260 epoch 5 - iter 120/121 - loss 0.21835210 - time (sec): 87.68 - samples/sec: 280.07 - lr: 0.000084 - momentum: 0.000000
2023-10-08 21:22:17,850 ----------------------------------------------------------------------------------------------------
2023-10-08 21:22:17,851 EPOCH 5 done: loss 0.2188 - lr: 0.000084
2023-10-08 21:22:23,824 DEV : loss 0.20774191617965698 - f1-score (micro avg) 0.645
2023-10-08 21:22:23,830 saving best model
2023-10-08 21:22:28,190 ----------------------------------------------------------------------------------------------------
2023-10-08 21:22:37,342 epoch 6 - iter 12/121 - loss 0.19377178 - time (sec): 9.15 - samples/sec: 288.17 - lr: 0.000082 - momentum: 0.000000
2023-10-08 21:22:46,275 epoch 6 - iter 24/121 - loss 0.18524721 - time (sec): 18.08 - samples/sec: 283.07 - lr: 0.000081 - momentum: 0.000000
2023-10-08 21:22:54,823 epoch 6 - iter 36/121 - loss 0.17998391 - time (sec): 26.63 - samples/sec: 279.40 - lr: 0.000079 - momentum: 0.000000
2023-10-08 21:23:03,463 epoch 6 - iter 48/121 - loss 0.18781017 - time (sec): 35.27 - samples/sec: 271.89 - lr: 0.000077 - momentum: 0.000000
2023-10-08 21:23:12,141 epoch 6 - iter 60/121 - loss 0.17951063 - time (sec): 43.95 - samples/sec: 270.10 - lr: 0.000076 - momentum: 0.000000
2023-10-08 21:23:21,125 epoch 6 - iter 72/121 - loss 0.17167492 - time (sec): 52.93 - samples/sec: 270.83 - lr: 0.000074 - momentum: 0.000000
2023-10-08 21:23:30,276 epoch 6 - iter 84/121 - loss 0.17030840 - time (sec): 62.08 - samples/sec: 272.19 - lr: 0.000072 - momentum: 0.000000
2023-10-08 21:23:39,921 epoch 6 - iter 96/121 - loss 0.16588264 - time (sec): 71.73 - samples/sec: 272.79 - lr: 0.000071 - momentum: 0.000000
2023-10-08 21:23:49,153 epoch 6 - iter 108/121 - loss 0.16686461 - time (sec): 80.96 - samples/sec: 272.58 - lr: 0.000069 - momentum: 0.000000
2023-10-08 21:23:58,663 epoch 6 - iter 120/121 - loss 0.16535264 - time (sec): 90.47 - samples/sec: 271.62 - lr: 0.000067 - momentum: 0.000000
2023-10-08 21:23:59,329 ----------------------------------------------------------------------------------------------------
2023-10-08 21:23:59,330 EPOCH 6 done: loss 0.1649 - lr: 0.000067
2023-10-08 21:24:05,766 DEV : loss 0.17141008377075195 - f1-score (micro avg) 0.8123
2023-10-08 21:24:05,771 saving best model
2023-10-08 21:24:10,147 ----------------------------------------------------------------------------------------------------
2023-10-08 21:24:19,044 epoch 7 - iter 12/121 - loss 0.14292119 - time (sec): 8.90 - samples/sec: 261.59 - lr: 0.000066 - momentum: 0.000000
2023-10-08 21:24:28,178 epoch 7 - iter 24/121 - loss 0.14369390 - time (sec): 18.03 - samples/sec: 257.69 - lr: 0.000064 - momentum: 0.000000
2023-10-08 21:24:37,657 epoch 7 - iter 36/121 - loss 0.13170235 - time (sec): 27.51 - samples/sec: 262.17 - lr: 0.000062 - momentum: 0.000000
2023-10-08 21:24:46,735 epoch 7 - iter 48/121 - loss 0.13068039 - time (sec): 36.59 - samples/sec: 261.93 - lr: 0.000061 - momentum: 0.000000
2023-10-08 21:24:55,952 epoch 7 - iter 60/121 - loss 0.13139294 - time (sec): 45.80 - samples/sec: 261.40 - lr: 0.000059 - momentum: 0.000000
2023-10-08 21:25:05,541 epoch 7 - iter 72/121 - loss 0.13454519 - time (sec): 55.39 - samples/sec: 262.94 - lr: 0.000057 - momentum: 0.000000
2023-10-08 21:25:15,400 epoch 7 - iter 84/121 - loss 0.13115971 - time (sec): 65.25 - samples/sec: 262.81 - lr: 0.000056 - momentum: 0.000000
2023-10-08 21:25:25,303 epoch 7 - iter 96/121 - loss 0.13228543 - time (sec): 75.15 - samples/sec: 263.11 - lr: 0.000054 - momentum: 0.000000
2023-10-08 21:25:34,615 epoch 7 - iter 108/121 - loss 0.12906064 - time (sec): 84.47 - samples/sec: 263.96 - lr: 0.000052 - momentum: 0.000000
2023-10-08 21:25:43,524 epoch 7 - iter 120/121 - loss 0.12735861 - time (sec): 93.38 - samples/sec: 263.13 - lr: 0.000051 - momentum: 0.000000
2023-10-08 21:25:44,140 ----------------------------------------------------------------------------------------------------
2023-10-08 21:25:44,141 EPOCH 7 done: loss 0.1281 - lr: 0.000051
2023-10-08 21:25:50,522 DEV : loss 0.15452586114406586 - f1-score (micro avg) 0.8079
2023-10-08 21:25:50,528 ----------------------------------------------------------------------------------------------------
2023-10-08 21:25:59,490 epoch 8 - iter 12/121 - loss 0.10644231 - time (sec): 8.96 - samples/sec: 260.25 - lr: 0.000049 - momentum: 0.000000
2023-10-08 21:26:08,674 epoch 8 - iter 24/121 - loss 0.10453440 - time (sec): 18.14 - samples/sec: 268.84 - lr: 0.000047 - momentum: 0.000000
2023-10-08 21:26:17,971 epoch 8 - iter 36/121 - loss 0.09280974 - time (sec): 27.44 - samples/sec: 268.54 - lr: 0.000046 - momentum: 0.000000
2023-10-08 21:26:27,500 epoch 8 - iter 48/121 - loss 0.09757054 - time (sec): 36.97 - samples/sec: 268.70 - lr: 0.000044 - momentum: 0.000000
2023-10-08 21:26:37,625 epoch 8 - iter 60/121 - loss 0.09697149 - time (sec): 47.10 - samples/sec: 267.56 - lr: 0.000042 - momentum: 0.000000
2023-10-08 21:26:47,112 epoch 8 - iter 72/121 - loss 0.10027961 - time (sec): 56.58 - samples/sec: 265.45 - lr: 0.000041 - momentum: 0.000000
2023-10-08 21:26:56,144 epoch 8 - iter 84/121 - loss 0.09918830 - time (sec): 65.61 - samples/sec: 263.27 - lr: 0.000039 - momentum: 0.000000
2023-10-08 21:27:05,596 epoch 8 - iter 96/121 - loss 0.10574486 - time (sec): 75.07 - samples/sec: 262.19 - lr: 0.000038 - momentum: 0.000000
2023-10-08 21:27:15,179 epoch 8 - iter 108/121 - loss 0.10563529 - time (sec): 84.65 - samples/sec: 263.84 - lr: 0.000036 - momentum: 0.000000
2023-10-08 21:27:24,085 epoch 8 - iter 120/121 - loss 0.10734517 - time (sec): 93.56 - samples/sec: 263.34 - lr: 0.000034 - momentum: 0.000000
2023-10-08 21:27:24,561 ----------------------------------------------------------------------------------------------------
2023-10-08 21:27:24,561 EPOCH 8 done: loss 0.1071 - lr: 0.000034
2023-10-08 21:27:30,959 DEV : loss 0.14953818917274475 - f1-score (micro avg) 0.8148
2023-10-08 21:27:30,965 saving best model
2023-10-08 21:27:35,293 ----------------------------------------------------------------------------------------------------
2023-10-08 21:27:44,750 epoch 9 - iter 12/121 - loss 0.07776325 - time (sec): 9.46 - samples/sec: 255.84 - lr: 0.000032 - momentum: 0.000000
2023-10-08 21:27:53,878 epoch 9 - iter 24/121 - loss 0.08630200 - time (sec): 18.58 - samples/sec: 263.24 - lr: 0.000031 - momentum: 0.000000
2023-10-08 21:28:02,751 epoch 9 - iter 36/121 - loss 0.08651275 - time (sec): 27.46 - samples/sec: 266.21 - lr: 0.000029 - momentum: 0.000000
2023-10-08 21:28:11,967 epoch 9 - iter 48/121 - loss 0.08783448 - time (sec): 36.67 - samples/sec: 272.82 - lr: 0.000028 - momentum: 0.000000
2023-10-08 21:28:20,418 epoch 9 - iter 60/121 - loss 0.08812733 - time (sec): 45.12 - samples/sec: 273.60 - lr: 0.000026 - momentum: 0.000000
2023-10-08 21:28:29,325 epoch 9 - iter 72/121 - loss 0.09382060 - time (sec): 54.03 - samples/sec: 275.62 - lr: 0.000024 - momentum: 0.000000
2023-10-08 21:28:38,084 epoch 9 - iter 84/121 - loss 0.09450348 - time (sec): 62.79 - samples/sec: 277.88 - lr: 0.000023 - momentum: 0.000000
2023-10-08 21:28:46,527 epoch 9 - iter 96/121 - loss 0.09439803 - time (sec): 71.23 - samples/sec: 276.32 - lr: 0.000021 - momentum: 0.000000
2023-10-08 21:28:55,215 epoch 9 - iter 108/121 - loss 0.09399403 - time (sec): 79.92 - samples/sec: 276.71 - lr: 0.000019 - momentum: 0.000000
2023-10-08 21:29:03,927 epoch 9 - iter 120/121 - loss 0.09469954 - time (sec): 88.63 - samples/sec: 277.84 - lr: 0.000018 - momentum: 0.000000
2023-10-08 21:29:04,401 ----------------------------------------------------------------------------------------------------
2023-10-08 21:29:04,402 EPOCH 9 done: loss 0.0944 - lr: 0.000018
2023-10-08 21:29:10,229 DEV : loss 0.14289319515228271 - f1-score (micro avg) 0.8154
2023-10-08 21:29:10,235 saving best model
2023-10-08 21:29:14,602 ----------------------------------------------------------------------------------------------------
2023-10-08 21:29:24,207 epoch 10 - iter 12/121 - loss 0.09192582 - time (sec): 9.60 - samples/sec: 296.46 - lr: 0.000016 - momentum: 0.000000
2023-10-08 21:29:32,733 epoch 10 - iter 24/121 - loss 0.09309779 - time (sec): 18.13 - samples/sec: 293.56 - lr: 0.000014 - momentum: 0.000000
2023-10-08 21:29:41,266 epoch 10 - iter 36/121 - loss 0.09595932 - time (sec): 26.66 - samples/sec: 293.03 - lr: 0.000013 - momentum: 0.000000
2023-10-08 21:29:49,278 epoch 10 - iter 48/121 - loss 0.08904050 - time (sec): 34.67 - samples/sec: 288.66 - lr: 0.000011 - momentum: 0.000000
2023-10-08 21:29:58,385 epoch 10 - iter 60/121 - loss 0.08767127 - time (sec): 43.78 - samples/sec: 288.45 - lr: 0.000009 - momentum: 0.000000
2023-10-08 21:30:07,445 epoch 10 - iter 72/121 - loss 0.09059443 - time (sec): 52.84 - samples/sec: 288.13 - lr: 0.000008 - momentum: 0.000000
2023-10-08 21:30:15,455 epoch 10 - iter 84/121 - loss 0.08929389 - time (sec): 60.85 - samples/sec: 286.48 - lr: 0.000006 - momentum: 0.000000
2023-10-08 21:30:23,873 epoch 10 - iter 96/121 - loss 0.09063585 - time (sec): 69.27 - samples/sec: 287.22 - lr: 0.000004 - momentum: 0.000000
2023-10-08 21:30:32,554 epoch 10 - iter 108/121 - loss 0.08964034 - time (sec): 77.95 - samples/sec: 287.17 - lr: 0.000003 - momentum: 0.000000
2023-10-08 21:30:40,374 epoch 10 - iter 120/121 - loss 0.08896025 - time (sec): 85.77 - samples/sec: 285.60 - lr: 0.000001 - momentum: 0.000000
2023-10-08 21:30:41,070 ----------------------------------------------------------------------------------------------------
2023-10-08 21:30:41,071 EPOCH 10 done: loss 0.0885 - lr: 0.000001
2023-10-08 21:30:46,900 DEV : loss 0.13837310671806335 - f1-score (micro avg) 0.8248
2023-10-08 21:30:46,906 saving best model
2023-10-08 21:30:52,142 ----------------------------------------------------------------------------------------------------
2023-10-08 21:30:52,144 Loading model from best epoch ...
2023-10-08 21:30:56,088 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
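The 25-tag dictionary printed above is the BIOES encoding of the six AJMC entity types plus the outside tag: 6 x 4 + 1 = 25. A small sketch reconstructing it in the log's own order:

```python
# Rebuild the BIOES tag dictionary: O plus S/B/E/I per entity type,
# in the order the log prints them.
entity_types = ["scope", "pers", "work", "loc", "object", "date"]
prefixes = ["S", "B", "E", "I"]

tags = ["O"] + [f"{p}-{t}" for t in entity_types for p in prefixes]
print(len(tags))  # 25
```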
2023-10-08 21:31:01,818
Results:
- F-score (micro) 0.7766
- F-score (macro) 0.4651
- Accuracy 0.6651
By class:
              precision    recall  f1-score   support

        pers     0.7987    0.8561    0.8264       139
       scope     0.7793    0.8760    0.8248       129
        work     0.6122    0.7500    0.6742        80
         loc     0.0000    0.0000    0.0000         9
        date     0.0000    0.0000    0.0000         3

   micro avg     0.7449    0.8111    0.7766       360
   macro avg     0.4380    0.4964    0.4651       360
weighted avg     0.7237    0.8111    0.7645       360
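The averaged rows follow from the per-class rows under the usual classification-report semantics; a check recomputed from the table above (micro precision/recall taken from the micro avg row):

```python
# Recompute the averaged F1 rows from the per-class table.
per_class = {            # class: (precision, recall, f1, support)
    "pers":  (0.7987, 0.8561, 0.8264, 139),
    "scope": (0.7793, 0.8760, 0.8248, 129),
    "work":  (0.6122, 0.7500, 0.6742, 80),
    "loc":   (0.0000, 0.0000, 0.0000, 9),
    "date":  (0.0000, 0.0000, 0.0000, 3),
}
total = sum(s for *_, s in per_class.values())   # 360 gold spans

# Macro F1: unweighted mean of per-class F1 (dragged down by loc and date,
# which the model never predicts correctly with only 9 and 3 gold spans).
macro_f1 = sum(f1 for _, _, f1, _ in per_class.values()) / len(per_class)

# Weighted F1: support-weighted mean of per-class F1.
weighted_f1 = sum(f1 * s for _, _, f1, s in per_class.values()) / total

# Micro F1: harmonic mean of the micro-averaged precision and recall.
micro_p, micro_r = 0.7449, 0.8111
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)

print(round(macro_f1, 4), round(weighted_f1, 4), round(micro_f1, 4))
```

The gap between micro (0.7766) and macro (0.4651) F1 is entirely due to the two rare classes with zero scores.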
2023-10-08 21:31:01,819 ----------------------------------------------------------------------------------------------------