Upload folder using huggingface_hub
c7235be
2023-10-13 08:37:47,859 ----------------------------------------------------------------------------------------------------
2023-10-13 08:37:47,860 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
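As a back-of-the-envelope sanity check, the layer shapes printed above are enough to estimate the model's parameter count; the sketch below sums them up (helper and constant names are mine, not from the log):

```python
# Rough parameter count for the printed SequenceTagger, derived only from
# the layer shapes shown above (32001-token vocab, 768 hidden, 12 layers,
# 3072 intermediate, 25-tag output head).

HIDDEN, INTER, VOCAB, MAX_POS, TYPES, LAYERS, TAGS = 768, 3072, 32001, 512, 2, 12, 25

def linear(n_in, n_out):
    """Weights plus bias of a Linear(n_in, n_out)."""
    return n_in * n_out + n_out

# three embedding tables plus their LayerNorm (weight + bias)
embeddings = (VOCAB + MAX_POS + TYPES) * HIDDEN + 2 * HIDDEN

per_layer = (
    3 * linear(HIDDEN, HIDDEN)   # query / key / value
    + linear(HIDDEN, HIDDEN)     # attention output dense
    + 2 * HIDDEN                 # attention output LayerNorm
    + linear(HIDDEN, INTER)      # intermediate dense
    + linear(INTER, HIDDEN)      # output dense
    + 2 * HIDDEN                 # output LayerNorm
)

pooler = linear(HIDDEN, HIDDEN)
tag_head = linear(HIDDEN, TAGS)  # the (linear) classification layer

total = embeddings + LAYERS * per_layer + pooler + tag_head
print(f"{total:,}")  # 110,637,337 — roughly 110.6M parameters
```

LockedDropout and Dropout add no parameters, so the count is dominated by the BERT encoder, as expected for a fine-tuning setup.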
2023-10-13 08:37:47,860 ----------------------------------------------------------------------------------------------------
2023-10-13 08:37:47,861 MultiCorpus: 1100 train + 206 dev + 240 test sentences
- NER_HIPE_2022 Corpus: 1100 train + 206 dev + 240 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/ajmc/de/with_doc_seperator
2023-10-13 08:37:47,861 ----------------------------------------------------------------------------------------------------
2023-10-13 08:37:47,861 Train: 1100 sentences
2023-10-13 08:37:47,861 (train_with_dev=False, train_with_test=False)
2023-10-13 08:37:47,861 ----------------------------------------------------------------------------------------------------
2023-10-13 08:37:47,861 Training Params:
2023-10-13 08:37:47,861 - learning_rate: "5e-05"
2023-10-13 08:37:47,861 - mini_batch_size: "8"
2023-10-13 08:37:47,861 - max_epochs: "10"
2023-10-13 08:37:47,861 - shuffle: "True"
2023-10-13 08:37:47,861 ----------------------------------------------------------------------------------------------------
2023-10-13 08:37:47,861 Plugins:
2023-10-13 08:37:47,861 - LinearScheduler | warmup_fraction: '0.1'
2023-10-13 08:37:47,861 ----------------------------------------------------------------------------------------------------
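The `LinearScheduler` plugin with `warmup_fraction: 0.1` explains the per-iteration `lr` values in the log below: the rate climbs linearly to the peak 5e-05 over the first 10% of steps, then decays linearly to zero. A minimal sketch of that shape (Flair's internal formula may differ in off-by-one details; this reproduces the logged values):

```python
def linear_warmup_lr(step, total_steps, peak_lr=5e-5, warmup_fraction=0.1):
    """Linear warmup to peak_lr, then linear decay to 0 by the final step."""
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

total = 138 * 10  # 138 mini-batches per epoch x 10 epochs = 1380 steps
print(linear_warmup_lr(13, total))     # ~4.7e-06, matching the first logged lr
print(linear_warmup_lr(138, total))    # 5e-05: peak reached at the end of epoch 1
print(linear_warmup_lr(total, total))  # 0.0 at the very last step
```

This matches the log: lr rises throughout epoch 1, peaks at `0.000050` early in epoch 2, and reaches `0.000000` on the final iterations of epoch 10.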
2023-10-13 08:37:47,861 Final evaluation on model from best epoch (best-model.pt)
2023-10-13 08:37:47,861 - metric: "('micro avg', 'f1-score')"
2023-10-13 08:37:47,861 ----------------------------------------------------------------------------------------------------
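The "saving best model" lines in the epochs below follow from this selection metric: after each epoch the dev micro-F1 is compared against the best so far, and `best-model.pt` is only overwritten on improvement. A minimal sketch, using the dev F1 values from this very log (whether the comparison is strict is an implementation detail; strict matches the log, which skips saving after epochs 7 and 9):

```python
def select_best(dev_scores):
    """Return (epoch, f1) of the checkpoint that would end up as best-model.pt."""
    best_epoch, best_f1 = None, float("-inf")
    for epoch, f1 in enumerate(dev_scores, start=1):
        if f1 > best_f1:  # strict improvement -> overwrite best-model.pt
            best_epoch, best_f1 = epoch, f1
    return best_epoch, best_f1

dev_f1 = [0.7367, 0.8146, 0.8325, 0.8455, 0.8551,
          0.8680, 0.8673, 0.8747, 0.8746, 0.8766]
print(select_best(dev_f1))  # (10, 0.8766): the final epoch wins this run
```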
2023-10-13 08:37:47,861 Computation:
2023-10-13 08:37:47,861 - compute on device: cuda:0
2023-10-13 08:37:47,861 - embedding storage: none
2023-10-13 08:37:47,861 ----------------------------------------------------------------------------------------------------
2023-10-13 08:37:47,861 Model training base path: "hmbench-ajmc/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3"
2023-10-13 08:37:47,861 ----------------------------------------------------------------------------------------------------
2023-10-13 08:37:47,861 ----------------------------------------------------------------------------------------------------
2023-10-13 08:37:48,585 epoch 1 - iter 13/138 - loss 3.12191784 - time (sec): 0.72 - samples/sec: 3245.74 - lr: 0.000004 - momentum: 0.000000
2023-10-13 08:37:49,285 epoch 1 - iter 26/138 - loss 2.79477586 - time (sec): 1.42 - samples/sec: 3137.58 - lr: 0.000009 - momentum: 0.000000
2023-10-13 08:37:50,019 epoch 1 - iter 39/138 - loss 2.26601948 - time (sec): 2.16 - samples/sec: 3019.38 - lr: 0.000014 - momentum: 0.000000
2023-10-13 08:37:50,785 epoch 1 - iter 52/138 - loss 1.86085985 - time (sec): 2.92 - samples/sec: 2988.47 - lr: 0.000018 - momentum: 0.000000
2023-10-13 08:37:51,485 epoch 1 - iter 65/138 - loss 1.65674856 - time (sec): 3.62 - samples/sec: 2991.82 - lr: 0.000023 - momentum: 0.000000
2023-10-13 08:37:52,175 epoch 1 - iter 78/138 - loss 1.51626490 - time (sec): 4.31 - samples/sec: 2951.59 - lr: 0.000028 - momentum: 0.000000
2023-10-13 08:37:52,968 epoch 1 - iter 91/138 - loss 1.35987571 - time (sec): 5.11 - samples/sec: 2969.91 - lr: 0.000033 - momentum: 0.000000
2023-10-13 08:37:53,717 epoch 1 - iter 104/138 - loss 1.23683846 - time (sec): 5.85 - samples/sec: 2932.96 - lr: 0.000037 - momentum: 0.000000
2023-10-13 08:37:54,479 epoch 1 - iter 117/138 - loss 1.14000975 - time (sec): 6.62 - samples/sec: 2924.45 - lr: 0.000042 - momentum: 0.000000
2023-10-13 08:37:55,234 epoch 1 - iter 130/138 - loss 1.05571133 - time (sec): 7.37 - samples/sec: 2929.02 - lr: 0.000047 - momentum: 0.000000
2023-10-13 08:37:55,661 ----------------------------------------------------------------------------------------------------
2023-10-13 08:37:55,661 EPOCH 1 done: loss 1.0217 - lr: 0.000047
2023-10-13 08:37:56,188 DEV : loss 0.23606571555137634 - f1-score (micro avg) 0.7367
2023-10-13 08:37:56,194 saving best model
2023-10-13 08:37:56,622 ----------------------------------------------------------------------------------------------------
2023-10-13 08:37:57,330 epoch 2 - iter 13/138 - loss 0.18739718 - time (sec): 0.71 - samples/sec: 2757.31 - lr: 0.000050 - momentum: 0.000000
2023-10-13 08:37:58,041 epoch 2 - iter 26/138 - loss 0.22339167 - time (sec): 1.42 - samples/sec: 2998.02 - lr: 0.000049 - momentum: 0.000000
2023-10-13 08:37:58,802 epoch 2 - iter 39/138 - loss 0.21884062 - time (sec): 2.18 - samples/sec: 3011.89 - lr: 0.000048 - momentum: 0.000000
2023-10-13 08:37:59,527 epoch 2 - iter 52/138 - loss 0.20984621 - time (sec): 2.90 - samples/sec: 3053.06 - lr: 0.000048 - momentum: 0.000000
2023-10-13 08:38:00,329 epoch 2 - iter 65/138 - loss 0.20749653 - time (sec): 3.71 - samples/sec: 3046.67 - lr: 0.000047 - momentum: 0.000000
2023-10-13 08:38:01,006 epoch 2 - iter 78/138 - loss 0.19761837 - time (sec): 4.38 - samples/sec: 3029.22 - lr: 0.000047 - momentum: 0.000000
2023-10-13 08:38:01,706 epoch 2 - iter 91/138 - loss 0.19300577 - time (sec): 5.08 - samples/sec: 3030.76 - lr: 0.000046 - momentum: 0.000000
2023-10-13 08:38:02,421 epoch 2 - iter 104/138 - loss 0.18925954 - time (sec): 5.80 - samples/sec: 3036.05 - lr: 0.000046 - momentum: 0.000000
2023-10-13 08:38:03,165 epoch 2 - iter 117/138 - loss 0.18405167 - time (sec): 6.54 - samples/sec: 3025.01 - lr: 0.000045 - momentum: 0.000000
2023-10-13 08:38:03,862 epoch 2 - iter 130/138 - loss 0.17839893 - time (sec): 7.24 - samples/sec: 3001.71 - lr: 0.000045 - momentum: 0.000000
2023-10-13 08:38:04,282 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:04,282 EPOCH 2 done: loss 0.1803 - lr: 0.000045
2023-10-13 08:38:05,081 DEV : loss 0.13411737978458405 - f1-score (micro avg) 0.8146
2023-10-13 08:38:05,086 saving best model
2023-10-13 08:38:05,652 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:06,395 epoch 3 - iter 13/138 - loss 0.09444506 - time (sec): 0.74 - samples/sec: 2887.38 - lr: 0.000044 - momentum: 0.000000
2023-10-13 08:38:07,137 epoch 3 - iter 26/138 - loss 0.11150540 - time (sec): 1.48 - samples/sec: 2964.13 - lr: 0.000043 - momentum: 0.000000
2023-10-13 08:38:07,908 epoch 3 - iter 39/138 - loss 0.11884629 - time (sec): 2.25 - samples/sec: 2984.08 - lr: 0.000043 - momentum: 0.000000
2023-10-13 08:38:08,627 epoch 3 - iter 52/138 - loss 0.10508658 - time (sec): 2.97 - samples/sec: 2958.62 - lr: 0.000042 - momentum: 0.000000
2023-10-13 08:38:09,403 epoch 3 - iter 65/138 - loss 0.10598495 - time (sec): 3.75 - samples/sec: 2967.10 - lr: 0.000042 - momentum: 0.000000
2023-10-13 08:38:10,128 epoch 3 - iter 78/138 - loss 0.11012567 - time (sec): 4.47 - samples/sec: 2932.82 - lr: 0.000041 - momentum: 0.000000
2023-10-13 08:38:10,873 epoch 3 - iter 91/138 - loss 0.10740409 - time (sec): 5.22 - samples/sec: 2909.11 - lr: 0.000041 - momentum: 0.000000
2023-10-13 08:38:11,667 epoch 3 - iter 104/138 - loss 0.10020216 - time (sec): 6.01 - samples/sec: 2889.42 - lr: 0.000040 - momentum: 0.000000
2023-10-13 08:38:12,399 epoch 3 - iter 117/138 - loss 0.10639550 - time (sec): 6.74 - samples/sec: 2904.68 - lr: 0.000040 - momentum: 0.000000
2023-10-13 08:38:13,133 epoch 3 - iter 130/138 - loss 0.10364137 - time (sec): 7.48 - samples/sec: 2877.96 - lr: 0.000039 - momentum: 0.000000
2023-10-13 08:38:13,582 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:13,582 EPOCH 3 done: loss 0.1027 - lr: 0.000039
2023-10-13 08:38:14,254 DEV : loss 0.10886522382497787 - f1-score (micro avg) 0.8325
2023-10-13 08:38:14,259 saving best model
2023-10-13 08:38:14,783 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:15,495 epoch 4 - iter 13/138 - loss 0.05419081 - time (sec): 0.71 - samples/sec: 3037.27 - lr: 0.000038 - momentum: 0.000000
2023-10-13 08:38:16,192 epoch 4 - iter 26/138 - loss 0.06872623 - time (sec): 1.41 - samples/sec: 3049.86 - lr: 0.000038 - momentum: 0.000000
2023-10-13 08:38:16,892 epoch 4 - iter 39/138 - loss 0.05617233 - time (sec): 2.11 - samples/sec: 3124.55 - lr: 0.000037 - momentum: 0.000000
2023-10-13 08:38:17,578 epoch 4 - iter 52/138 - loss 0.06640055 - time (sec): 2.79 - samples/sec: 3090.82 - lr: 0.000037 - momentum: 0.000000
2023-10-13 08:38:18,301 epoch 4 - iter 65/138 - loss 0.06925633 - time (sec): 3.52 - samples/sec: 3046.98 - lr: 0.000036 - momentum: 0.000000
2023-10-13 08:38:18,991 epoch 4 - iter 78/138 - loss 0.06827384 - time (sec): 4.21 - samples/sec: 3063.35 - lr: 0.000036 - momentum: 0.000000
2023-10-13 08:38:19,698 epoch 4 - iter 91/138 - loss 0.07238295 - time (sec): 4.91 - samples/sec: 3053.97 - lr: 0.000035 - momentum: 0.000000
2023-10-13 08:38:20,461 epoch 4 - iter 104/138 - loss 0.07205906 - time (sec): 5.68 - samples/sec: 3028.65 - lr: 0.000035 - momentum: 0.000000
2023-10-13 08:38:21,140 epoch 4 - iter 117/138 - loss 0.06979008 - time (sec): 6.36 - samples/sec: 3023.58 - lr: 0.000034 - momentum: 0.000000
2023-10-13 08:38:21,846 epoch 4 - iter 130/138 - loss 0.06928812 - time (sec): 7.06 - samples/sec: 3023.34 - lr: 0.000034 - momentum: 0.000000
2023-10-13 08:38:22,319 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:22,319 EPOCH 4 done: loss 0.0669 - lr: 0.000034
2023-10-13 08:38:22,973 DEV : loss 0.14741753041744232 - f1-score (micro avg) 0.8455
2023-10-13 08:38:22,978 saving best model
2023-10-13 08:38:23,491 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:24,282 epoch 5 - iter 13/138 - loss 0.05064424 - time (sec): 0.79 - samples/sec: 3019.28 - lr: 0.000033 - momentum: 0.000000
2023-10-13 08:38:24,978 epoch 5 - iter 26/138 - loss 0.04863752 - time (sec): 1.48 - samples/sec: 3068.74 - lr: 0.000032 - momentum: 0.000000
2023-10-13 08:38:25,673 epoch 5 - iter 39/138 - loss 0.04896499 - time (sec): 2.18 - samples/sec: 3050.90 - lr: 0.000032 - momentum: 0.000000
2023-10-13 08:38:26,448 epoch 5 - iter 52/138 - loss 0.04623801 - time (sec): 2.95 - samples/sec: 2978.58 - lr: 0.000031 - momentum: 0.000000
2023-10-13 08:38:27,161 epoch 5 - iter 65/138 - loss 0.04641899 - time (sec): 3.67 - samples/sec: 2979.16 - lr: 0.000031 - momentum: 0.000000
2023-10-13 08:38:27,854 epoch 5 - iter 78/138 - loss 0.04323703 - time (sec): 4.36 - samples/sec: 2935.85 - lr: 0.000030 - momentum: 0.000000
2023-10-13 08:38:28,625 epoch 5 - iter 91/138 - loss 0.04707400 - time (sec): 5.13 - samples/sec: 2932.59 - lr: 0.000030 - momentum: 0.000000
2023-10-13 08:38:29,353 epoch 5 - iter 104/138 - loss 0.04876560 - time (sec): 5.86 - samples/sec: 2931.69 - lr: 0.000029 - momentum: 0.000000
2023-10-13 08:38:30,100 epoch 5 - iter 117/138 - loss 0.04641900 - time (sec): 6.60 - samples/sec: 2939.58 - lr: 0.000029 - momentum: 0.000000
2023-10-13 08:38:30,779 epoch 5 - iter 130/138 - loss 0.04582599 - time (sec): 7.28 - samples/sec: 2964.92 - lr: 0.000028 - momentum: 0.000000
2023-10-13 08:38:31,180 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:31,180 EPOCH 5 done: loss 0.0475 - lr: 0.000028
2023-10-13 08:38:31,834 DEV : loss 0.13973049819469452 - f1-score (micro avg) 0.8551
2023-10-13 08:38:31,839 saving best model
2023-10-13 08:38:32,354 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:33,138 epoch 6 - iter 13/138 - loss 0.02421077 - time (sec): 0.78 - samples/sec: 2843.24 - lr: 0.000027 - momentum: 0.000000
2023-10-13 08:38:33,868 epoch 6 - iter 26/138 - loss 0.02946353 - time (sec): 1.51 - samples/sec: 3027.71 - lr: 0.000027 - momentum: 0.000000
2023-10-13 08:38:34,542 epoch 6 - iter 39/138 - loss 0.03593275 - time (sec): 2.18 - samples/sec: 2971.50 - lr: 0.000026 - momentum: 0.000000
2023-10-13 08:38:35,268 epoch 6 - iter 52/138 - loss 0.03498252 - time (sec): 2.91 - samples/sec: 2945.05 - lr: 0.000026 - momentum: 0.000000
2023-10-13 08:38:35,998 epoch 6 - iter 65/138 - loss 0.03172762 - time (sec): 3.64 - samples/sec: 2923.52 - lr: 0.000025 - momentum: 0.000000
2023-10-13 08:38:36,729 epoch 6 - iter 78/138 - loss 0.03234017 - time (sec): 4.37 - samples/sec: 2925.80 - lr: 0.000025 - momentum: 0.000000
2023-10-13 08:38:37,438 epoch 6 - iter 91/138 - loss 0.02934412 - time (sec): 5.08 - samples/sec: 2935.96 - lr: 0.000024 - momentum: 0.000000
2023-10-13 08:38:38,157 epoch 6 - iter 104/138 - loss 0.03350809 - time (sec): 5.80 - samples/sec: 2934.06 - lr: 0.000024 - momentum: 0.000000
2023-10-13 08:38:38,919 epoch 6 - iter 117/138 - loss 0.03355378 - time (sec): 6.56 - samples/sec: 2955.49 - lr: 0.000023 - momentum: 0.000000
2023-10-13 08:38:39,622 epoch 6 - iter 130/138 - loss 0.03472346 - time (sec): 7.26 - samples/sec: 2971.30 - lr: 0.000023 - momentum: 0.000000
2023-10-13 08:38:40,051 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:40,051 EPOCH 6 done: loss 0.0344 - lr: 0.000023
2023-10-13 08:38:40,687 DEV : loss 0.1507454365491867 - f1-score (micro avg) 0.868
2023-10-13 08:38:40,692 saving best model
2023-10-13 08:38:41,198 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:41,937 epoch 7 - iter 13/138 - loss 0.03663415 - time (sec): 0.74 - samples/sec: 3000.60 - lr: 0.000022 - momentum: 0.000000
2023-10-13 08:38:42,639 epoch 7 - iter 26/138 - loss 0.02703902 - time (sec): 1.44 - samples/sec: 2872.21 - lr: 0.000021 - momentum: 0.000000
2023-10-13 08:38:43,355 epoch 7 - iter 39/138 - loss 0.03269859 - time (sec): 2.16 - samples/sec: 2918.80 - lr: 0.000021 - momentum: 0.000000
2023-10-13 08:38:44,080 epoch 7 - iter 52/138 - loss 0.02902945 - time (sec): 2.88 - samples/sec: 2979.77 - lr: 0.000020 - momentum: 0.000000
2023-10-13 08:38:44,807 epoch 7 - iter 65/138 - loss 0.03443512 - time (sec): 3.61 - samples/sec: 2982.06 - lr: 0.000020 - momentum: 0.000000
2023-10-13 08:38:45,571 epoch 7 - iter 78/138 - loss 0.03014541 - time (sec): 4.37 - samples/sec: 2972.88 - lr: 0.000019 - momentum: 0.000000
2023-10-13 08:38:46,300 epoch 7 - iter 91/138 - loss 0.02898238 - time (sec): 5.10 - samples/sec: 2976.73 - lr: 0.000019 - momentum: 0.000000
2023-10-13 08:38:47,065 epoch 7 - iter 104/138 - loss 0.03177851 - time (sec): 5.87 - samples/sec: 2935.26 - lr: 0.000018 - momentum: 0.000000
2023-10-13 08:38:47,790 epoch 7 - iter 117/138 - loss 0.02907918 - time (sec): 6.59 - samples/sec: 2939.45 - lr: 0.000018 - momentum: 0.000000
2023-10-13 08:38:48,493 epoch 7 - iter 130/138 - loss 0.02906726 - time (sec): 7.29 - samples/sec: 2956.21 - lr: 0.000017 - momentum: 0.000000
2023-10-13 08:38:48,925 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:48,925 EPOCH 7 done: loss 0.0304 - lr: 0.000017
2023-10-13 08:38:49,592 DEV : loss 0.15140853822231293 - f1-score (micro avg) 0.8673
2023-10-13 08:38:49,598 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:50,300 epoch 8 - iter 13/138 - loss 0.00802457 - time (sec): 0.70 - samples/sec: 3188.69 - lr: 0.000016 - momentum: 0.000000
2023-10-13 08:38:51,036 epoch 8 - iter 26/138 - loss 0.01416937 - time (sec): 1.44 - samples/sec: 2994.34 - lr: 0.000016 - momentum: 0.000000
2023-10-13 08:38:51,742 epoch 8 - iter 39/138 - loss 0.01226446 - time (sec): 2.14 - samples/sec: 2977.14 - lr: 0.000015 - momentum: 0.000000
2023-10-13 08:38:52,472 epoch 8 - iter 52/138 - loss 0.01634313 - time (sec): 2.87 - samples/sec: 3027.34 - lr: 0.000015 - momentum: 0.000000
2023-10-13 08:38:53,266 epoch 8 - iter 65/138 - loss 0.01710077 - time (sec): 3.67 - samples/sec: 2985.89 - lr: 0.000014 - momentum: 0.000000
2023-10-13 08:38:53,987 epoch 8 - iter 78/138 - loss 0.01875374 - time (sec): 4.39 - samples/sec: 2978.11 - lr: 0.000014 - momentum: 0.000000
2023-10-13 08:38:54,715 epoch 8 - iter 91/138 - loss 0.02285244 - time (sec): 5.12 - samples/sec: 2994.21 - lr: 0.000013 - momentum: 0.000000
2023-10-13 08:38:55,385 epoch 8 - iter 104/138 - loss 0.02117597 - time (sec): 5.79 - samples/sec: 2944.52 - lr: 0.000013 - momentum: 0.000000
2023-10-13 08:38:56,127 epoch 8 - iter 117/138 - loss 0.01979320 - time (sec): 6.53 - samples/sec: 2926.75 - lr: 0.000012 - momentum: 0.000000
2023-10-13 08:38:56,873 epoch 8 - iter 130/138 - loss 0.01944512 - time (sec): 7.27 - samples/sec: 2956.67 - lr: 0.000012 - momentum: 0.000000
2023-10-13 08:38:57,285 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:57,285 EPOCH 8 done: loss 0.0208 - lr: 0.000012
2023-10-13 08:38:57,927 DEV : loss 0.17069277167320251 - f1-score (micro avg) 0.8747
2023-10-13 08:38:57,932 saving best model
2023-10-13 08:38:58,409 ----------------------------------------------------------------------------------------------------
2023-10-13 08:38:59,141 epoch 9 - iter 13/138 - loss 0.03033239 - time (sec): 0.72 - samples/sec: 3027.73 - lr: 0.000011 - momentum: 0.000000
2023-10-13 08:38:59,883 epoch 9 - iter 26/138 - loss 0.01590142 - time (sec): 1.46 - samples/sec: 2889.28 - lr: 0.000010 - momentum: 0.000000
2023-10-13 08:39:00,571 epoch 9 - iter 39/138 - loss 0.01144992 - time (sec): 2.15 - samples/sec: 2927.63 - lr: 0.000010 - momentum: 0.000000
2023-10-13 08:39:01,270 epoch 9 - iter 52/138 - loss 0.01252603 - time (sec): 2.85 - samples/sec: 2859.47 - lr: 0.000009 - momentum: 0.000000
2023-10-13 08:39:02,050 epoch 9 - iter 65/138 - loss 0.01865718 - time (sec): 3.63 - samples/sec: 2889.80 - lr: 0.000009 - momentum: 0.000000
2023-10-13 08:39:02,781 epoch 9 - iter 78/138 - loss 0.01772222 - time (sec): 4.36 - samples/sec: 2989.41 - lr: 0.000008 - momentum: 0.000000
2023-10-13 08:39:03,505 epoch 9 - iter 91/138 - loss 0.01632623 - time (sec): 5.09 - samples/sec: 2986.01 - lr: 0.000008 - momentum: 0.000000
2023-10-13 08:39:04,205 epoch 9 - iter 104/138 - loss 0.01679270 - time (sec): 5.79 - samples/sec: 3009.82 - lr: 0.000007 - momentum: 0.000000
2023-10-13 08:39:04,911 epoch 9 - iter 117/138 - loss 0.01525853 - time (sec): 6.49 - samples/sec: 2995.66 - lr: 0.000007 - momentum: 0.000000
2023-10-13 08:39:05,626 epoch 9 - iter 130/138 - loss 0.01457529 - time (sec): 7.21 - samples/sec: 2990.06 - lr: 0.000006 - momentum: 0.000000
2023-10-13 08:39:06,084 ----------------------------------------------------------------------------------------------------
2023-10-13 08:39:06,084 EPOCH 9 done: loss 0.0164 - lr: 0.000006
2023-10-13 08:39:06,715 DEV : loss 0.16805338859558105 - f1-score (micro avg) 0.8746
2023-10-13 08:39:06,720 ----------------------------------------------------------------------------------------------------
2023-10-13 08:39:07,409 epoch 10 - iter 13/138 - loss 0.03520316 - time (sec): 0.69 - samples/sec: 3121.77 - lr: 0.000005 - momentum: 0.000000
2023-10-13 08:39:08,172 epoch 10 - iter 26/138 - loss 0.04008353 - time (sec): 1.45 - samples/sec: 3070.07 - lr: 0.000005 - momentum: 0.000000
2023-10-13 08:39:08,947 epoch 10 - iter 39/138 - loss 0.02721871 - time (sec): 2.23 - samples/sec: 3013.72 - lr: 0.000004 - momentum: 0.000000
2023-10-13 08:39:09,644 epoch 10 - iter 52/138 - loss 0.02543276 - time (sec): 2.92 - samples/sec: 2991.84 - lr: 0.000004 - momentum: 0.000000
2023-10-13 08:39:10,341 epoch 10 - iter 65/138 - loss 0.02081193 - time (sec): 3.62 - samples/sec: 2995.16 - lr: 0.000003 - momentum: 0.000000
2023-10-13 08:39:11,073 epoch 10 - iter 78/138 - loss 0.01824013 - time (sec): 4.35 - samples/sec: 3032.91 - lr: 0.000003 - momentum: 0.000000
2023-10-13 08:39:11,766 epoch 10 - iter 91/138 - loss 0.01637250 - time (sec): 5.05 - samples/sec: 3043.69 - lr: 0.000002 - momentum: 0.000000
2023-10-13 08:39:12,502 epoch 10 - iter 104/138 - loss 0.01505165 - time (sec): 5.78 - samples/sec: 3001.02 - lr: 0.000002 - momentum: 0.000000
2023-10-13 08:39:13,293 epoch 10 - iter 117/138 - loss 0.01521768 - time (sec): 6.57 - samples/sec: 2955.72 - lr: 0.000001 - momentum: 0.000000
2023-10-13 08:39:14,014 epoch 10 - iter 130/138 - loss 0.01418094 - time (sec): 7.29 - samples/sec: 2932.16 - lr: 0.000000 - momentum: 0.000000
2023-10-13 08:39:14,501 ----------------------------------------------------------------------------------------------------
2023-10-13 08:39:14,501 EPOCH 10 done: loss 0.0137 - lr: 0.000000
2023-10-13 08:39:15,151 DEV : loss 0.16549541056156158 - f1-score (micro avg) 0.8766
2023-10-13 08:39:15,156 saving best model
2023-10-13 08:39:16,129 ----------------------------------------------------------------------------------------------------
2023-10-13 08:39:16,130 Loading model from best epoch ...
2023-10-13 08:39:17,874 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
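The 25-tag dictionary is simply `O` plus a BIOES (S/B/E/I) variant of each of the six entity types in this corpus; a quick reconstruction:

```python
# Rebuild the 25-tag BIOES dictionary printed above: one O tag plus
# S-/B-/E-/I- variants of each of the six entity types.
ENTITY_TYPES = ["scope", "pers", "work", "loc", "object", "date"]

tags = ["O"] + [f"{prefix}-{etype}"
                for etype in ENTITY_TYPES
                for prefix in ("S", "B", "E", "I")]

print(len(tags))  # 25, matching the dictionary size in the log
```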
2023-10-13 08:39:18,787
Results:
- F-score (micro) 0.9129
- F-score (macro) 0.6832
- Accuracy 0.8586
By class:
              precision    recall  f1-score   support

       scope     0.9040    0.9091    0.9065       176
        pers     0.9677    0.9375    0.9524       128
        work     0.9028    0.8784    0.8904        74
         loc     0.0000    0.0000    0.0000         2
      object     1.0000    0.5000    0.6667         2

   micro avg     0.9202    0.9058    0.9129       382
   macro avg     0.7549    0.6450    0.6832       382
weighted avg     0.9209    0.9058    0.9128       382
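The gap between the micro and macro rows comes from the averaging order: micro pools all true/false positives before scoring (so the two-instance `loc` and `object` classes barely matter), while macro averages per-class F1 unweighted (so `loc`'s 0.0 drags it down). A sketch of both computations; the per-class (tp, fp, fn) counts here are my own reconstruction chosen to be consistent with the reported table, not values taken from the log:

```python
counts = {  # class: (tp, fp, fn), inferred from the precision/recall/support table
    "scope":  (160, 17, 16),
    "pers":   (120,  4,  8),
    "work":   ( 65,  7,  9),
    "loc":    (  0,  2,  2),
    "object": (  1,  0,  1),
}

def f1(tp, fp, fn):
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

# micro: pool the counts across classes, then score once
TP = sum(tp for tp, _, _ in counts.values())
FP = sum(fp for _, fp, _ in counts.values())
FN = sum(fn for _, _, fn in counts.values())
micro_f1 = f1(TP, FP, FN)

# macro: score each class separately, then average unweighted
macro_f1 = sum(f1(*c) for c in counts.values()) / len(counts)

print(round(micro_f1, 4), round(macro_f1, 4))  # 0.9129 0.6832
```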
2023-10-13 08:39:18,788 ----------------------------------------------------------------------------------------------------