2023-10-17 23:20:48,607 ----------------------------------------------------------------------------------------------------
2023-10-17 23:20:48,608 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-17 23:20:48,608 ----------------------------------------------------------------------------------------------------
2023-10-17 23:20:48,609 MultiCorpus: 5901 train + 1287 dev + 1505 test sentences
- NER_HIPE_2022 Corpus: 5901 train + 1287 dev + 1505 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/fr/with_doc_seperator
2023-10-17 23:20:48,609 ----------------------------------------------------------------------------------------------------
2023-10-17 23:20:48,609 Train: 5901 sentences
2023-10-17 23:20:48,609 (train_with_dev=False, train_with_test=False)
2023-10-17 23:20:48,609 ----------------------------------------------------------------------------------------------------
2023-10-17 23:20:48,609 Training Params:
2023-10-17 23:20:48,609 - learning_rate: "5e-05"
2023-10-17 23:20:48,609 - mini_batch_size: "8"
2023-10-17 23:20:48,609 - max_epochs: "10"
2023-10-17 23:20:48,609 - shuffle: "True"
2023-10-17 23:20:48,609 ----------------------------------------------------------------------------------------------------
2023-10-17 23:20:48,609 Plugins:
2023-10-17 23:20:48,609 - TensorboardLogger
2023-10-17 23:20:48,609 - LinearScheduler | warmup_fraction: '0.1'
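
For reference, a fine-tuning call that would reproduce the parameters and plugins listed above could look as follows, continuing the construction sketch higher up. Flair's fine_tune routine defaults to AdamW with a linear warmup/decay schedule, which is consistent with the LinearScheduler plugin and the zero momentum logged per batch; the exact TensorBoard wiring is left out because it is version-dependent.

# Hedged sketch: fine-tuning with the hyperparameters logged above,
# reusing `corpus` and `tagger` from the construction sketch.
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-hipe2020/fr-hmteams/teams-base-historic-multilingual-discriminator-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5",
    learning_rate=5e-05,
    mini_batch_size=8,
    max_epochs=10,
    shuffle=True,
    # warmup_fraction defaults to 0.1, matching the LinearScheduler entry above;
    # the TensorboardLogger plugin would be attached separately.
)
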
2023-10-17 23:20:48,609 ----------------------------------------------------------------------------------------------------
2023-10-17 23:20:48,609 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 23:20:48,609 - metric: "('micro avg', 'f1-score')"
2023-10-17 23:20:48,609 ----------------------------------------------------------------------------------------------------
2023-10-17 23:20:48,609 Computation:
2023-10-17 23:20:48,610 - compute on device: cuda:0
2023-10-17 23:20:48,610 - embedding storage: none
2023-10-17 23:20:48,610 ----------------------------------------------------------------------------------------------------
2023-10-17 23:20:48,610 Model training base path: "hmbench-hipe2020/fr-hmteams/teams-base-historic-multilingual-discriminator-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5"
2023-10-17 23:20:48,610 ----------------------------------------------------------------------------------------------------
2023-10-17 23:20:48,610 ----------------------------------------------------------------------------------------------------
2023-10-17 23:20:48,610 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-17 23:20:53,860 epoch 1 - iter 73/738 - loss 2.85474195 - time (sec): 5.25 - samples/sec: 3208.60 - lr: 0.000005 - momentum: 0.000000
2023-10-17 23:20:59,150 epoch 1 - iter 146/738 - loss 1.78283901 - time (sec): 10.54 - samples/sec: 3224.58 - lr: 0.000010 - momentum: 0.000000
2023-10-17 23:21:04,724 epoch 1 - iter 219/738 - loss 1.30841453 - time (sec): 16.11 - samples/sec: 3226.08 - lr: 0.000015 - momentum: 0.000000
2023-10-17 23:21:09,914 epoch 1 - iter 292/738 - loss 1.06896886 - time (sec): 21.30 - samples/sec: 3231.26 - lr: 0.000020 - momentum: 0.000000
2023-10-17 23:21:15,039 epoch 1 - iter 365/738 - loss 0.91894998 - time (sec): 26.43 - samples/sec: 3222.58 - lr: 0.000025 - momentum: 0.000000
2023-10-17 23:21:19,675 epoch 1 - iter 438/738 - loss 0.81887551 - time (sec): 31.06 - samples/sec: 3219.41 - lr: 0.000030 - momentum: 0.000000
2023-10-17 23:21:24,221 epoch 1 - iter 511/738 - loss 0.73797567 - time (sec): 35.61 - samples/sec: 3237.14 - lr: 0.000035 - momentum: 0.000000
2023-10-17 23:21:29,755 epoch 1 - iter 584/738 - loss 0.66529868 - time (sec): 41.14 - samples/sec: 3255.58 - lr: 0.000039 - momentum: 0.000000
2023-10-17 23:21:34,453 epoch 1 - iter 657/738 - loss 0.61811551 - time (sec): 45.84 - samples/sec: 3244.08 - lr: 0.000044 - momentum: 0.000000
2023-10-17 23:21:39,299 epoch 1 - iter 730/738 - loss 0.57233422 - time (sec): 50.69 - samples/sec: 3252.08 - lr: 0.000049 - momentum: 0.000000
2023-10-17 23:21:39,786 ----------------------------------------------------------------------------------------------------
2023-10-17 23:21:39,786 EPOCH 1 done: loss 0.5687 - lr: 0.000049
2023-10-17 23:21:46,270 DEV : loss 0.10680218786001205 - f1-score (micro avg) 0.7621
2023-10-17 23:21:46,307 saving best model
2023-10-17 23:21:46,722 ----------------------------------------------------------------------------------------------------
2023-10-17 23:21:52,044 epoch 2 - iter 73/738 - loss 0.12994419 - time (sec): 5.32 - samples/sec: 3181.55 - lr: 0.000049 - momentum: 0.000000
2023-10-17 23:21:57,252 epoch 2 - iter 146/738 - loss 0.11822578 - time (sec): 10.53 - samples/sec: 3134.72 - lr: 0.000049 - momentum: 0.000000
2023-10-17 23:22:02,033 epoch 2 - iter 219/738 - loss 0.11707730 - time (sec): 15.31 - samples/sec: 3223.08 - lr: 0.000048 - momentum: 0.000000
2023-10-17 23:22:06,854 epoch 2 - iter 292/738 - loss 0.12130436 - time (sec): 20.13 - samples/sec: 3238.62 - lr: 0.000048 - momentum: 0.000000
2023-10-17 23:22:11,419 epoch 2 - iter 365/738 - loss 0.12325461 - time (sec): 24.70 - samples/sec: 3254.74 - lr: 0.000047 - momentum: 0.000000
2023-10-17 23:22:16,413 epoch 2 - iter 438/738 - loss 0.12007044 - time (sec): 29.69 - samples/sec: 3274.88 - lr: 0.000047 - momentum: 0.000000
2023-10-17 23:22:21,264 epoch 2 - iter 511/738 - loss 0.11944053 - time (sec): 34.54 - samples/sec: 3296.37 - lr: 0.000046 - momentum: 0.000000
2023-10-17 23:22:27,327 epoch 2 - iter 584/738 - loss 0.11685937 - time (sec): 40.60 - samples/sec: 3287.09 - lr: 0.000046 - momentum: 0.000000
2023-10-17 23:22:32,387 epoch 2 - iter 657/738 - loss 0.11755883 - time (sec): 45.66 - samples/sec: 3277.59 - lr: 0.000045 - momentum: 0.000000
2023-10-17 23:22:36,985 epoch 2 - iter 730/738 - loss 0.11821850 - time (sec): 50.26 - samples/sec: 3280.65 - lr: 0.000045 - momentum: 0.000000
2023-10-17 23:22:37,406 ----------------------------------------------------------------------------------------------------
2023-10-17 23:22:37,406 EPOCH 2 done: loss 0.1178 - lr: 0.000045
2023-10-17 23:22:49,212 DEV : loss 0.13957878947257996 - f1-score (micro avg) 0.7821
2023-10-17 23:22:49,247 saving best model
2023-10-17 23:22:49,771 ----------------------------------------------------------------------------------------------------
2023-10-17 23:22:54,738 epoch 3 - iter 73/738 - loss 0.06730250 - time (sec): 4.96 - samples/sec: 3249.69 - lr: 0.000044 - momentum: 0.000000
2023-10-17 23:22:59,596 epoch 3 - iter 146/738 - loss 0.06639160 - time (sec): 9.82 - samples/sec: 3249.10 - lr: 0.000043 - momentum: 0.000000
2023-10-17 23:23:04,665 epoch 3 - iter 219/738 - loss 0.06940105 - time (sec): 14.89 - samples/sec: 3211.31 - lr: 0.000043 - momentum: 0.000000
2023-10-17 23:23:09,789 epoch 3 - iter 292/738 - loss 0.07344307 - time (sec): 20.02 - samples/sec: 3212.38 - lr: 0.000042 - momentum: 0.000000
2023-10-17 23:23:15,097 epoch 3 - iter 365/738 - loss 0.07364186 - time (sec): 25.32 - samples/sec: 3235.78 - lr: 0.000042 - momentum: 0.000000
2023-10-17 23:23:19,723 epoch 3 - iter 438/738 - loss 0.07290238 - time (sec): 29.95 - samples/sec: 3255.49 - lr: 0.000041 - momentum: 0.000000
2023-10-17 23:23:25,283 epoch 3 - iter 511/738 - loss 0.07401442 - time (sec): 35.51 - samples/sec: 3264.76 - lr: 0.000041 - momentum: 0.000000
2023-10-17 23:23:30,236 epoch 3 - iter 584/738 - loss 0.07353939 - time (sec): 40.46 - samples/sec: 3251.95 - lr: 0.000040 - momentum: 0.000000
2023-10-17 23:23:35,032 epoch 3 - iter 657/738 - loss 0.07127111 - time (sec): 45.26 - samples/sec: 3266.17 - lr: 0.000040 - momentum: 0.000000
2023-10-17 23:23:40,375 epoch 3 - iter 730/738 - loss 0.07266252 - time (sec): 50.60 - samples/sec: 3260.30 - lr: 0.000039 - momentum: 0.000000
2023-10-17 23:23:40,800 ----------------------------------------------------------------------------------------------------
2023-10-17 23:23:40,800 EPOCH 3 done: loss 0.0737 - lr: 0.000039
2023-10-17 23:23:52,463 DEV : loss 0.12460250407457352 - f1-score (micro avg) 0.8136
2023-10-17 23:23:52,497 saving best model
2023-10-17 23:23:53,038 ----------------------------------------------------------------------------------------------------
2023-10-17 23:23:58,038 epoch 4 - iter 73/738 - loss 0.03370323 - time (sec): 5.00 - samples/sec: 3418.52 - lr: 0.000038 - momentum: 0.000000
2023-10-17 23:24:02,660 epoch 4 - iter 146/738 - loss 0.04562321 - time (sec): 9.62 - samples/sec: 3364.30 - lr: 0.000038 - momentum: 0.000000
2023-10-17 23:24:08,381 epoch 4 - iter 219/738 - loss 0.04852262 - time (sec): 15.34 - samples/sec: 3309.97 - lr: 0.000037 - momentum: 0.000000
2023-10-17 23:24:13,478 epoch 4 - iter 292/738 - loss 0.05147819 - time (sec): 20.44 - samples/sec: 3325.35 - lr: 0.000037 - momentum: 0.000000
2023-10-17 23:24:18,159 epoch 4 - iter 365/738 - loss 0.04853046 - time (sec): 25.12 - samples/sec: 3315.02 - lr: 0.000036 - momentum: 0.000000
2023-10-17 23:24:22,760 epoch 4 - iter 438/738 - loss 0.04928914 - time (sec): 29.72 - samples/sec: 3294.20 - lr: 0.000036 - momentum: 0.000000
2023-10-17 23:24:27,929 epoch 4 - iter 511/738 - loss 0.04904117 - time (sec): 34.89 - samples/sec: 3290.30 - lr: 0.000035 - momentum: 0.000000
2023-10-17 23:24:33,404 epoch 4 - iter 584/738 - loss 0.04995549 - time (sec): 40.36 - samples/sec: 3262.97 - lr: 0.000035 - momentum: 0.000000
2023-10-17 23:24:38,303 epoch 4 - iter 657/738 - loss 0.04972125 - time (sec): 45.26 - samples/sec: 3252.44 - lr: 0.000034 - momentum: 0.000000
2023-10-17 23:24:43,566 epoch 4 - iter 730/738 - loss 0.05043296 - time (sec): 50.53 - samples/sec: 3250.06 - lr: 0.000033 - momentum: 0.000000
2023-10-17 23:24:44,260 ----------------------------------------------------------------------------------------------------
2023-10-17 23:24:44,260 EPOCH 4 done: loss 0.0506 - lr: 0.000033
2023-10-17 23:24:55,980 DEV : loss 0.15104417502880096 - f1-score (micro avg) 0.8221
2023-10-17 23:24:56,015 saving best model
2023-10-17 23:24:56,592 ----------------------------------------------------------------------------------------------------
2023-10-17 23:25:01,650 epoch 5 - iter 73/738 - loss 0.03956590 - time (sec): 5.06 - samples/sec: 3136.97 - lr: 0.000033 - momentum: 0.000000
2023-10-17 23:25:06,581 epoch 5 - iter 146/738 - loss 0.03562984 - time (sec): 9.99 - samples/sec: 3167.84 - lr: 0.000032 - momentum: 0.000000
2023-10-17 23:25:11,371 epoch 5 - iter 219/738 - loss 0.03556496 - time (sec): 14.78 - samples/sec: 3244.94 - lr: 0.000032 - momentum: 0.000000
2023-10-17 23:25:16,291 epoch 5 - iter 292/738 - loss 0.03548706 - time (sec): 19.70 - samples/sec: 3287.46 - lr: 0.000031 - momentum: 0.000000
2023-10-17 23:25:21,008 epoch 5 - iter 365/738 - loss 0.03604989 - time (sec): 24.41 - samples/sec: 3289.09 - lr: 0.000031 - momentum: 0.000000
2023-10-17 23:25:27,506 epoch 5 - iter 438/738 - loss 0.03712954 - time (sec): 30.91 - samples/sec: 3245.75 - lr: 0.000030 - momentum: 0.000000
2023-10-17 23:25:32,944 epoch 5 - iter 511/738 - loss 0.03554041 - time (sec): 36.35 - samples/sec: 3245.90 - lr: 0.000030 - momentum: 0.000000
2023-10-17 23:25:37,745 epoch 5 - iter 584/738 - loss 0.03576331 - time (sec): 41.15 - samples/sec: 3231.39 - lr: 0.000029 - momentum: 0.000000
2023-10-17 23:25:42,074 epoch 5 - iter 657/738 - loss 0.03519494 - time (sec): 45.48 - samples/sec: 3234.17 - lr: 0.000028 - momentum: 0.000000
2023-10-17 23:25:47,319 epoch 5 - iter 730/738 - loss 0.03557902 - time (sec): 50.73 - samples/sec: 3247.21 - lr: 0.000028 - momentum: 0.000000
2023-10-17 23:25:47,884 ----------------------------------------------------------------------------------------------------
2023-10-17 23:25:47,884 EPOCH 5 done: loss 0.0357 - lr: 0.000028
2023-10-17 23:25:59,681 DEV : loss 0.16061857342720032 - f1-score (micro avg) 0.8491
2023-10-17 23:25:59,713 saving best model
2023-10-17 23:26:00,283 ----------------------------------------------------------------------------------------------------
2023-10-17 23:26:05,830 epoch 6 - iter 73/738 - loss 0.03118785 - time (sec): 5.55 - samples/sec: 3271.74 - lr: 0.000027 - momentum: 0.000000
2023-10-17 23:26:11,069 epoch 6 - iter 146/738 - loss 0.02616507 - time (sec): 10.78 - samples/sec: 3173.32 - lr: 0.000027 - momentum: 0.000000
2023-10-17 23:26:15,848 epoch 6 - iter 219/738 - loss 0.02339505 - time (sec): 15.56 - samples/sec: 3211.29 - lr: 0.000026 - momentum: 0.000000
2023-10-17 23:26:21,924 epoch 6 - iter 292/738 - loss 0.02435022 - time (sec): 21.64 - samples/sec: 3228.36 - lr: 0.000026 - momentum: 0.000000
2023-10-17 23:26:26,608 epoch 6 - iter 365/738 - loss 0.02549636 - time (sec): 26.32 - samples/sec: 3239.19 - lr: 0.000025 - momentum: 0.000000
2023-10-17 23:26:31,540 epoch 6 - iter 438/738 - loss 0.02512432 - time (sec): 31.26 - samples/sec: 3223.33 - lr: 0.000025 - momentum: 0.000000
2023-10-17 23:26:36,807 epoch 6 - iter 511/738 - loss 0.02472183 - time (sec): 36.52 - samples/sec: 3223.08 - lr: 0.000024 - momentum: 0.000000
2023-10-17 23:26:41,722 epoch 6 - iter 584/738 - loss 0.02355624 - time (sec): 41.44 - samples/sec: 3212.31 - lr: 0.000023 - momentum: 0.000000
2023-10-17 23:26:46,324 epoch 6 - iter 657/738 - loss 0.02348594 - time (sec): 46.04 - samples/sec: 3228.99 - lr: 0.000023 - momentum: 0.000000
2023-10-17 23:26:51,137 epoch 6 - iter 730/738 - loss 0.02428917 - time (sec): 50.85 - samples/sec: 3239.02 - lr: 0.000022 - momentum: 0.000000
2023-10-17 23:26:51,631 ----------------------------------------------------------------------------------------------------
2023-10-17 23:26:51,631 EPOCH 6 done: loss 0.0247 - lr: 0.000022
2023-10-17 23:27:03,285 DEV : loss 0.1864573210477829 - f1-score (micro avg) 0.8343
2023-10-17 23:27:03,320 ----------------------------------------------------------------------------------------------------
2023-10-17 23:27:07,808 epoch 7 - iter 73/738 - loss 0.01908871 - time (sec): 4.49 - samples/sec: 3418.41 - lr: 0.000022 - momentum: 0.000000
2023-10-17 23:27:12,781 epoch 7 - iter 146/738 - loss 0.01731582 - time (sec): 9.46 - samples/sec: 3340.36 - lr: 0.000021 - momentum: 0.000000
2023-10-17 23:27:17,560 epoch 7 - iter 219/738 - loss 0.01576750 - time (sec): 14.24 - samples/sec: 3300.59 - lr: 0.000021 - momentum: 0.000000
2023-10-17 23:27:22,548 epoch 7 - iter 292/738 - loss 0.01585694 - time (sec): 19.23 - samples/sec: 3243.91 - lr: 0.000020 - momentum: 0.000000
2023-10-17 23:27:28,246 epoch 7 - iter 365/738 - loss 0.01552307 - time (sec): 24.93 - samples/sec: 3245.65 - lr: 0.000020 - momentum: 0.000000
2023-10-17 23:27:32,869 epoch 7 - iter 438/738 - loss 0.01447850 - time (sec): 29.55 - samples/sec: 3246.80 - lr: 0.000019 - momentum: 0.000000
2023-10-17 23:27:38,122 epoch 7 - iter 511/738 - loss 0.01650421 - time (sec): 34.80 - samples/sec: 3231.68 - lr: 0.000018 - momentum: 0.000000
2023-10-17 23:27:43,410 epoch 7 - iter 584/738 - loss 0.01625634 - time (sec): 40.09 - samples/sec: 3243.21 - lr: 0.000018 - momentum: 0.000000
2023-10-17 23:27:48,588 epoch 7 - iter 657/738 - loss 0.01672479 - time (sec): 45.27 - samples/sec: 3244.29 - lr: 0.000017 - momentum: 0.000000
2023-10-17 23:27:53,986 epoch 7 - iter 730/738 - loss 0.01610283 - time (sec): 50.67 - samples/sec: 3246.76 - lr: 0.000017 - momentum: 0.000000
2023-10-17 23:27:54,602 ----------------------------------------------------------------------------------------------------
2023-10-17 23:27:54,602 EPOCH 7 done: loss 0.0161 - lr: 0.000017
2023-10-17 23:28:06,282 DEV : loss 0.1937481015920639 - f1-score (micro avg) 0.8363
2023-10-17 23:28:06,316 ----------------------------------------------------------------------------------------------------
2023-10-17 23:28:11,106 epoch 8 - iter 73/738 - loss 0.00650832 - time (sec): 4.79 - samples/sec: 3384.54 - lr: 0.000016 - momentum: 0.000000
2023-10-17 23:28:15,525 epoch 8 - iter 146/738 - loss 0.00609167 - time (sec): 9.21 - samples/sec: 3324.21 - lr: 0.000016 - momentum: 0.000000
2023-10-17 23:28:21,030 epoch 8 - iter 219/738 - loss 0.00649774 - time (sec): 14.71 - samples/sec: 3312.05 - lr: 0.000015 - momentum: 0.000000
2023-10-17 23:28:25,843 epoch 8 - iter 292/738 - loss 0.00738941 - time (sec): 19.53 - samples/sec: 3261.54 - lr: 0.000015 - momentum: 0.000000
2023-10-17 23:28:31,461 epoch 8 - iter 365/738 - loss 0.00922834 - time (sec): 25.14 - samples/sec: 3260.92 - lr: 0.000014 - momentum: 0.000000
2023-10-17 23:28:36,767 epoch 8 - iter 438/738 - loss 0.01058127 - time (sec): 30.45 - samples/sec: 3229.20 - lr: 0.000013 - momentum: 0.000000
2023-10-17 23:28:41,311 epoch 8 - iter 511/738 - loss 0.01038329 - time (sec): 34.99 - samples/sec: 3244.01 - lr: 0.000013 - momentum: 0.000000
2023-10-17 23:28:45,794 epoch 8 - iter 584/738 - loss 0.01003015 - time (sec): 39.48 - samples/sec: 3265.71 - lr: 0.000012 - momentum: 0.000000
2023-10-17 23:28:50,264 epoch 8 - iter 657/738 - loss 0.01016964 - time (sec): 43.95 - samples/sec: 3276.86 - lr: 0.000012 - momentum: 0.000000
2023-10-17 23:28:55,761 epoch 8 - iter 730/738 - loss 0.00979342 - time (sec): 49.44 - samples/sec: 3287.76 - lr: 0.000011 - momentum: 0.000000
2023-10-17 23:28:56,757 ----------------------------------------------------------------------------------------------------
2023-10-17 23:28:56,757 EPOCH 8 done: loss 0.0099 - lr: 0.000011
2023-10-17 23:29:08,393 DEV : loss 0.21192434430122375 - f1-score (micro avg) 0.845
2023-10-17 23:29:08,437 ----------------------------------------------------------------------------------------------------
2023-10-17 23:29:13,398 epoch 9 - iter 73/738 - loss 0.00572354 - time (sec): 4.96 - samples/sec: 3245.85 - lr: 0.000011 - momentum: 0.000000
2023-10-17 23:29:18,282 epoch 9 - iter 146/738 - loss 0.00771689 - time (sec): 9.84 - samples/sec: 3246.31 - lr: 0.000010 - momentum: 0.000000
2023-10-17 23:29:23,619 epoch 9 - iter 219/738 - loss 0.00686114 - time (sec): 15.18 - samples/sec: 3276.24 - lr: 0.000010 - momentum: 0.000000
2023-10-17 23:29:28,492 epoch 9 - iter 292/738 - loss 0.00626330 - time (sec): 20.05 - samples/sec: 3267.08 - lr: 0.000009 - momentum: 0.000000
2023-10-17 23:29:33,568 epoch 9 - iter 365/738 - loss 0.00726506 - time (sec): 25.13 - samples/sec: 3305.24 - lr: 0.000008 - momentum: 0.000000
2023-10-17 23:29:38,439 epoch 9 - iter 438/738 - loss 0.00887684 - time (sec): 30.00 - samples/sec: 3289.80 - lr: 0.000008 - momentum: 0.000000
2023-10-17 23:29:43,087 epoch 9 - iter 511/738 - loss 0.00852118 - time (sec): 34.65 - samples/sec: 3285.43 - lr: 0.000007 - momentum: 0.000000
2023-10-17 23:29:48,690 epoch 9 - iter 584/738 - loss 0.00774901 - time (sec): 40.25 - samples/sec: 3261.36 - lr: 0.000007 - momentum: 0.000000
2023-10-17 23:29:53,636 epoch 9 - iter 657/738 - loss 0.00716142 - time (sec): 45.20 - samples/sec: 3264.88 - lr: 0.000006 - momentum: 0.000000
2023-10-17 23:29:58,790 epoch 9 - iter 730/738 - loss 0.00709577 - time (sec): 50.35 - samples/sec: 3260.45 - lr: 0.000006 - momentum: 0.000000
2023-10-17 23:29:59,499 ----------------------------------------------------------------------------------------------------
2023-10-17 23:29:59,499 EPOCH 9 done: loss 0.0072 - lr: 0.000006
2023-10-17 23:30:11,248 DEV : loss 0.20770353078842163 - f1-score (micro avg) 0.847
2023-10-17 23:30:11,286 ----------------------------------------------------------------------------------------------------
2023-10-17 23:30:16,877 epoch 10 - iter 73/738 - loss 0.00364011 - time (sec): 5.59 - samples/sec: 3144.56 - lr: 0.000005 - momentum: 0.000000
2023-10-17 23:30:22,277 epoch 10 - iter 146/738 - loss 0.00694055 - time (sec): 10.99 - samples/sec: 3114.55 - lr: 0.000004 - momentum: 0.000000
2023-10-17 23:30:27,427 epoch 10 - iter 219/738 - loss 0.00565523 - time (sec): 16.14 - samples/sec: 3115.09 - lr: 0.000004 - momentum: 0.000000
2023-10-17 23:30:32,962 epoch 10 - iter 292/738 - loss 0.00642647 - time (sec): 21.67 - samples/sec: 3158.25 - lr: 0.000003 - momentum: 0.000000
2023-10-17 23:30:37,900 epoch 10 - iter 365/738 - loss 0.00548506 - time (sec): 26.61 - samples/sec: 3171.74 - lr: 0.000003 - momentum: 0.000000
2023-10-17 23:30:42,656 epoch 10 - iter 438/738 - loss 0.00523018 - time (sec): 31.37 - samples/sec: 3217.05 - lr: 0.000002 - momentum: 0.000000
2023-10-17 23:30:47,193 epoch 10 - iter 511/738 - loss 0.00494489 - time (sec): 35.91 - samples/sec: 3233.06 - lr: 0.000002 - momentum: 0.000000
2023-10-17 23:30:52,200 epoch 10 - iter 584/738 - loss 0.00451099 - time (sec): 40.91 - samples/sec: 3227.12 - lr: 0.000001 - momentum: 0.000000
2023-10-17 23:30:57,205 epoch 10 - iter 657/738 - loss 0.00453998 - time (sec): 45.92 - samples/sec: 3232.95 - lr: 0.000001 - momentum: 0.000000
2023-10-17 23:31:02,184 epoch 10 - iter 730/738 - loss 0.00471051 - time (sec): 50.90 - samples/sec: 3232.14 - lr: 0.000000 - momentum: 0.000000
2023-10-17 23:31:02,735 ----------------------------------------------------------------------------------------------------
2023-10-17 23:31:02,736 EPOCH 10 done: loss 0.0047 - lr: 0.000000
2023-10-17 23:31:14,954 DEV : loss 0.21875163912773132 - f1-score (micro avg) 0.8505
2023-10-17 23:31:14,994 saving best model
2023-10-17 23:31:15,959 ----------------------------------------------------------------------------------------------------
2023-10-17 23:31:15,960 Loading model from best epoch ...
2023-10-17 23:31:17,507 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-time, B-time, E-time, I-time, S-prod, B-prod, E-prod, I-prod
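
A minimal inference sketch with the saved checkpoint follows; the load path is the best-model.pt under the base path logged at the start, and the example sentence is arbitrary.

# Hedged sketch: tagging a sentence with the trained model.
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load(
    "hmbench-hipe2020/fr-hmteams/teams-base-historic-multilingual-discriminator-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5/best-model.pt"
)

sentence = Sentence("Le Conseil fédéral s'est réuni à Berne .")
tagger.predict(sentence)

for span in sentence.get_spans("ner"):
    # each span carries one of the labels loc, pers, org, prod, time plus a confidence score
    print(span.text, span.get_label("ner").value, span.get_label("ner").score)
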
2023-10-17 23:31:24,230
Results:
- F-score (micro) 0.8068
- F-score (macro) 0.7197
- Accuracy 0.6964
By class:
              precision    recall  f1-score   support

         loc     0.8642    0.8753    0.8697       858
        pers     0.7603    0.8212    0.7896       537
         org     0.5846    0.5758    0.5802       132
        prod     0.7213    0.7213    0.7213        61
        time     0.5968    0.6852    0.6379        54

   micro avg     0.7926    0.8216    0.8068      1642
   macro avg     0.7055    0.7358    0.7197      1642
weighted avg     0.7937    0.8216    0.8071      1642
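
As a quick sanity check, the micro-averaged F-score above is the harmonic mean of the micro precision and recall from the same row:

# Micro avg F1 recomputed from the "micro avg" row of the table above.
precision, recall = 0.7926, 0.8216
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # -> 0.8068, matching "F-score (micro)" reported above
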
2023-10-17 23:31:24,230 ----------------------------------------------------------------------------------------------------