Upload folder using huggingface_hub

- best-model.pt  +3 -0
- dev.tsv        +0 -0
- loss.tsv       +11 -0
- test.tsv       +0 -0
- training.log   +240 -0
best-model.pt
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ca38981bae04f9c3def628dae190b227b2c8f69d167c1837e7ab1ccb0856fc87
+size 443323527
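The model weights are stored with Git LFS, so the repository itself holds only the small three-line pointer shown above. A minimal sketch of reading such a pointer (the `parse_lfs_pointer` helper is ours, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a git-lfs pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    fields["size"] = int(fields["size"])  # size is the payload in bytes
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:ca38981bae04f9c3def628dae190b227b2c8f69d167c1837e7ab1ccb0856fc87
size 443323527
"""
info = parse_lfs_pointer(pointer)  # oid identifies the blob, size is ~443 MB
```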
dev.tsv
ADDED
The diff for this file is too large to render.
loss.tsv
ADDED
@@ -0,0 +1,11 @@
+EPOCH  TIMESTAMP  LEARNING_RATE  TRAIN_LOSS  DEV_LOSS  DEV_PRECISION  DEV_RECALL  DEV_F1  DEV_ACCURACY
+1      22:30:33   0.0000         0.3646      0.1284    0.2186         0.5701      0.3160  0.1880
+2      22:33:49   0.0000         0.1525      0.1527    0.2196         0.6496      0.3282  0.1978
+3      22:37:07   0.0000         0.1089      0.1974    0.2543         0.5360      0.3449  0.2093
+4      22:40:26   0.0000         0.0764      0.2757    0.2264         0.6231      0.3322  0.2002
+5      22:43:47   0.0000         0.0558      0.2588    0.2979         0.5568      0.3881  0.2416
+6      22:47:07   0.0000         0.0404      0.3224    0.2730         0.5833      0.3720  0.2299
+7      22:50:27   0.0000         0.0308      0.4393    0.2475         0.6117      0.3524  0.2155
+8      22:53:47   0.0000         0.0228      0.4547    0.2629         0.6080      0.3671  0.2256
+9      22:57:06   0.0000         0.0140      0.4650    0.2498         0.6080      0.3541  0.2160
+10     23:00:25   0.0000         0.0103      0.4552    0.2659         0.5947      0.3675  0.2259
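The best checkpoint is selected on dev F1, which peaks at epoch 5 in the table above. A short sketch of reading such a file and picking the best epoch (columns abridged here to the two that matter; the real file is tab-separated):

```python
import csv
import io

# Abridged loss.tsv content: EPOCH and DEV_F1 columns from the table above.
rows = """EPOCH\tDEV_F1
1\t0.3160
2\t0.3282
3\t0.3449
4\t0.3322
5\t0.3881
6\t0.3720
7\t0.3524
8\t0.3671
9\t0.3541
10\t0.3675
"""

reader = csv.DictReader(io.StringIO(rows), delimiter="\t")
best = max(reader, key=lambda r: float(r["DEV_F1"]))
# Epoch 5 has the highest dev F1 (0.3881), matching the last epoch
# after which the training log reports "saving best model".
```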
test.tsv
ADDED
The diff for this file is too large to render.
training.log
ADDED
@@ -0,0 +1,240 @@
+2023-10-15 22:27:17,165 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:27:17,166 Model: "SequenceTagger(
+  (embeddings): TransformerWordEmbeddings(
+    (model): BertModel(
+      (embeddings): BertEmbeddings(
+        (word_embeddings): Embedding(32001, 768)
+        (position_embeddings): Embedding(512, 768)
+        (token_type_embeddings): Embedding(2, 768)
+        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+        (dropout): Dropout(p=0.1, inplace=False)
+      )
+      (encoder): BertEncoder(
+        (layer): ModuleList(
+          (0-11): 12 x BertLayer(
+            (attention): BertAttention(
+              (self): BertSelfAttention(
+                (query): Linear(in_features=768, out_features=768, bias=True)
+                (key): Linear(in_features=768, out_features=768, bias=True)
+                (value): Linear(in_features=768, out_features=768, bias=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (output): BertSelfOutput(
+                (dense): Linear(in_features=768, out_features=768, bias=True)
+                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+            (intermediate): BertIntermediate(
+              (dense): Linear(in_features=768, out_features=3072, bias=True)
+              (intermediate_act_fn): GELUActivation()
+            )
+            (output): BertOutput(
+              (dense): Linear(in_features=3072, out_features=768, bias=True)
+              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+              (dropout): Dropout(p=0.1, inplace=False)
+            )
+          )
+        )
+      )
+      (pooler): BertPooler(
+        (dense): Linear(in_features=768, out_features=768, bias=True)
+        (activation): Tanh()
+      )
+    )
+  )
+  (locked_dropout): LockedDropout(p=0.5)
+  (linear): Linear(in_features=768, out_features=17, bias=True)
+  (loss_function): CrossEntropyLoss()
+)"
+2023-10-15 22:27:17,166 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:27:17,167 MultiCorpus: 20847 train + 1123 dev + 3350 test sentences
+ - NER_HIPE_2022 Corpus: 20847 train + 1123 dev + 3350 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/de/with_doc_seperator
+2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:27:17,167 Train:  20847 sentences
+2023-10-15 22:27:17,167         (train_with_dev=False, train_with_test=False)
+2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:27:17,167 Training Params:
+2023-10-15 22:27:17,167  - learning_rate: "5e-05"
+2023-10-15 22:27:17,167  - mini_batch_size: "8"
+2023-10-15 22:27:17,167  - max_epochs: "10"
+2023-10-15 22:27:17,167  - shuffle: "True"
+2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:27:17,167 Plugins:
+2023-10-15 22:27:17,167  - LinearScheduler | warmup_fraction: '0.1'
+2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:27:17,167 Final evaluation on model from best epoch (best-model.pt)
+2023-10-15 22:27:17,167  - metric: "('micro avg', 'f1-score')"
+2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:27:17,167 Computation:
+2023-10-15 22:27:17,167  - compute on device: cuda:0
+2023-10-15 22:27:17,167  - embedding storage: none
+2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:27:17,167 Model training base path: "hmbench-newseye/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5"
+2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
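The LinearScheduler with warmup_fraction '0.1' ramps the learning rate linearly from 0 to 5e-05 over the first 10% of all batch steps (10 epochs x 2606 iterations), then decays it linearly back to 0; the per-iteration lr values printed in the training log are consistent with this. A sketch of that schedule (our reconstruction from the logged values, not Flair's actual code):

```python
def linear_schedule_lr(step, total_steps, base_lr=5e-05, warmup_fraction=0.1):
    """Linear warmup to base_lr, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

total = 2606 * 10                            # 10 epochs x 2606 iterations
lr_early = linear_schedule_lr(260, total)    # epoch 1, iter 260: ~0.000005
lr_peak = linear_schedule_lr(2606, total)    # end of warmup: 0.000050
```

The log's epoch-1 values (lr rising 0.000005 -> 0.000050) and the decay to 0.000000 by the final iteration match this shape.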
+2023-10-15 22:27:35,255 epoch 1 - iter 260/2606 - loss 1.55162857 - time (sec): 18.09 - samples/sec: 1967.99 - lr: 0.000005 - momentum: 0.000000
+2023-10-15 22:27:54,274 epoch 1 - iter 520/2606 - loss 0.96412232 - time (sec): 37.11 - samples/sec: 1964.22 - lr: 0.000010 - momentum: 0.000000
+2023-10-15 22:28:12,959 epoch 1 - iter 780/2606 - loss 0.73562218 - time (sec): 55.79 - samples/sec: 1943.27 - lr: 0.000015 - momentum: 0.000000
+2023-10-15 22:28:31,659 epoch 1 - iter 1040/2606 - loss 0.61663296 - time (sec): 74.49 - samples/sec: 1932.82 - lr: 0.000020 - momentum: 0.000000
+2023-10-15 22:28:50,994 epoch 1 - iter 1300/2606 - loss 0.53611543 - time (sec): 93.83 - samples/sec: 1924.35 - lr: 0.000025 - momentum: 0.000000
+2023-10-15 22:29:09,993 epoch 1 - iter 1560/2606 - loss 0.47717725 - time (sec): 112.83 - samples/sec: 1932.36 - lr: 0.000030 - momentum: 0.000000
+2023-10-15 22:29:28,212 epoch 1 - iter 1820/2606 - loss 0.43977687 - time (sec): 131.04 - samples/sec: 1946.37 - lr: 0.000035 - momentum: 0.000000
+2023-10-15 22:29:47,039 epoch 1 - iter 2080/2606 - loss 0.40953012 - time (sec): 149.87 - samples/sec: 1941.48 - lr: 0.000040 - momentum: 0.000000
+2023-10-15 22:30:05,821 epoch 1 - iter 2340/2606 - loss 0.38758796 - time (sec): 168.65 - samples/sec: 1935.49 - lr: 0.000045 - momentum: 0.000000
+2023-10-15 22:30:25,851 epoch 1 - iter 2600/2606 - loss 0.36513327 - time (sec): 188.68 - samples/sec: 1941.95 - lr: 0.000050 - momentum: 0.000000
+2023-10-15 22:30:26,350 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:30:26,350 EPOCH 1 done: loss 0.3646 - lr: 0.000050
+2023-10-15 22:30:33,057 DEV : loss 0.12843316793441772 - f1-score (micro avg) 0.316
+2023-10-15 22:30:33,086 saving best model
+2023-10-15 22:30:33,466 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:30:53,080 epoch 2 - iter 260/2606 - loss 0.16328381 - time (sec): 19.61 - samples/sec: 1977.33 - lr: 0.000049 - momentum: 0.000000
+2023-10-15 22:31:11,726 epoch 2 - iter 520/2606 - loss 0.15786652 - time (sec): 38.26 - samples/sec: 1960.14 - lr: 0.000049 - momentum: 0.000000
+2023-10-15 22:31:30,523 epoch 2 - iter 780/2606 - loss 0.15096452 - time (sec): 57.06 - samples/sec: 1953.36 - lr: 0.000048 - momentum: 0.000000
+2023-10-15 22:31:49,690 epoch 2 - iter 1040/2606 - loss 0.15162785 - time (sec): 76.22 - samples/sec: 1951.95 - lr: 0.000048 - momentum: 0.000000
+2023-10-15 22:32:08,501 epoch 2 - iter 1300/2606 - loss 0.15521001 - time (sec): 95.03 - samples/sec: 1945.47 - lr: 0.000047 - momentum: 0.000000
+2023-10-15 22:32:27,507 epoch 2 - iter 1560/2606 - loss 0.15206366 - time (sec): 114.04 - samples/sec: 1949.84 - lr: 0.000047 - momentum: 0.000000
+2023-10-15 22:32:45,650 epoch 2 - iter 1820/2606 - loss 0.15321996 - time (sec): 132.18 - samples/sec: 1952.09 - lr: 0.000046 - momentum: 0.000000
+2023-10-15 22:33:05,338 epoch 2 - iter 2080/2606 - loss 0.15196432 - time (sec): 151.87 - samples/sec: 1955.77 - lr: 0.000046 - momentum: 0.000000
+2023-10-15 22:33:22,997 epoch 2 - iter 2340/2606 - loss 0.15199041 - time (sec): 169.53 - samples/sec: 1948.27 - lr: 0.000045 - momentum: 0.000000
+2023-10-15 22:33:41,251 epoch 2 - iter 2600/2606 - loss 0.15242040 - time (sec): 187.78 - samples/sec: 1953.00 - lr: 0.000044 - momentum: 0.000000
+2023-10-15 22:33:41,598 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:33:41,598 EPOCH 2 done: loss 0.1525 - lr: 0.000044
+2023-10-15 22:33:49,877 DEV : loss 0.15268105268478394 - f1-score (micro avg) 0.3282
+2023-10-15 22:33:49,905 saving best model
+2023-10-15 22:33:51,198 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:34:09,869 epoch 3 - iter 260/2606 - loss 0.13841022 - time (sec): 18.67 - samples/sec: 1935.44 - lr: 0.000044 - momentum: 0.000000
+2023-10-15 22:34:27,325 epoch 3 - iter 520/2606 - loss 0.12007742 - time (sec): 36.12 - samples/sec: 1904.26 - lr: 0.000043 - momentum: 0.000000
+2023-10-15 22:34:45,146 epoch 3 - iter 780/2606 - loss 0.11917281 - time (sec): 53.94 - samples/sec: 1908.77 - lr: 0.000043 - momentum: 0.000000
+2023-10-15 22:35:03,249 epoch 3 - iter 1040/2606 - loss 0.11929376 - time (sec): 72.05 - samples/sec: 1927.33 - lr: 0.000042 - momentum: 0.000000
+2023-10-15 22:35:22,165 epoch 3 - iter 1300/2606 - loss 0.11370851 - time (sec): 90.96 - samples/sec: 1935.39 - lr: 0.000042 - momentum: 0.000000
+2023-10-15 22:35:41,375 epoch 3 - iter 1560/2606 - loss 0.11243711 - time (sec): 110.17 - samples/sec: 1934.73 - lr: 0.000041 - momentum: 0.000000
+2023-10-15 22:36:00,749 epoch 3 - iter 1820/2606 - loss 0.11154229 - time (sec): 129.55 - samples/sec: 1938.83 - lr: 0.000041 - momentum: 0.000000
+2023-10-15 22:36:19,809 epoch 3 - iter 2080/2606 - loss 0.11088653 - time (sec): 148.61 - samples/sec: 1934.20 - lr: 0.000040 - momentum: 0.000000
+2023-10-15 22:36:39,165 epoch 3 - iter 2340/2606 - loss 0.11020741 - time (sec): 167.96 - samples/sec: 1942.87 - lr: 0.000039 - momentum: 0.000000
+2023-10-15 22:36:59,231 epoch 3 - iter 2600/2606 - loss 0.10875412 - time (sec): 188.03 - samples/sec: 1951.27 - lr: 0.000039 - momentum: 0.000000
+2023-10-15 22:36:59,592 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:36:59,592 EPOCH 3 done: loss 0.1089 - lr: 0.000039
+2023-10-15 22:37:07,822 DEV : loss 0.1973780244588852 - f1-score (micro avg) 0.3449
+2023-10-15 22:37:07,851 saving best model
+2023-10-15 22:37:08,459 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:37:26,324 epoch 4 - iter 260/2606 - loss 0.06966027 - time (sec): 17.86 - samples/sec: 1961.16 - lr: 0.000038 - momentum: 0.000000
+2023-10-15 22:37:44,366 epoch 4 - iter 520/2606 - loss 0.07834214 - time (sec): 35.91 - samples/sec: 1997.72 - lr: 0.000038 - momentum: 0.000000
+2023-10-15 22:38:04,320 epoch 4 - iter 780/2606 - loss 0.07562032 - time (sec): 55.86 - samples/sec: 1973.69 - lr: 0.000037 - momentum: 0.000000
+2023-10-15 22:38:24,283 epoch 4 - iter 1040/2606 - loss 0.07734676 - time (sec): 75.82 - samples/sec: 1973.03 - lr: 0.000037 - momentum: 0.000000
+2023-10-15 22:38:43,354 epoch 4 - iter 1300/2606 - loss 0.07949064 - time (sec): 94.89 - samples/sec: 1971.18 - lr: 0.000036 - momentum: 0.000000
+2023-10-15 22:39:01,902 epoch 4 - iter 1560/2606 - loss 0.08030660 - time (sec): 113.44 - samples/sec: 1963.18 - lr: 0.000036 - momentum: 0.000000
+2023-10-15 22:39:21,242 epoch 4 - iter 1820/2606 - loss 0.07936732 - time (sec): 132.78 - samples/sec: 1949.28 - lr: 0.000035 - momentum: 0.000000
+2023-10-15 22:39:40,653 epoch 4 - iter 2080/2606 - loss 0.07794236 - time (sec): 152.19 - samples/sec: 1948.31 - lr: 0.000034 - momentum: 0.000000
+2023-10-15 22:39:58,840 epoch 4 - iter 2340/2606 - loss 0.07686992 - time (sec): 170.38 - samples/sec: 1942.92 - lr: 0.000034 - momentum: 0.000000
+2023-10-15 22:40:17,812 epoch 4 - iter 2600/2606 - loss 0.07640796 - time (sec): 189.35 - samples/sec: 1936.72 - lr: 0.000033 - momentum: 0.000000
+2023-10-15 22:40:18,202 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:40:18,203 EPOCH 4 done: loss 0.0764 - lr: 0.000033
+2023-10-15 22:40:26,731 DEV : loss 0.2757483124732971 - f1-score (micro avg) 0.3322
+2023-10-15 22:40:26,762 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:40:46,807 epoch 5 - iter 260/2606 - loss 0.05178851 - time (sec): 20.04 - samples/sec: 1921.49 - lr: 0.000033 - momentum: 0.000000
+2023-10-15 22:41:05,169 epoch 5 - iter 520/2606 - loss 0.05717423 - time (sec): 38.41 - samples/sec: 1904.14 - lr: 0.000032 - momentum: 0.000000
+2023-10-15 22:41:23,526 epoch 5 - iter 780/2606 - loss 0.05742065 - time (sec): 56.76 - samples/sec: 1913.71 - lr: 0.000032 - momentum: 0.000000
+2023-10-15 22:41:43,051 epoch 5 - iter 1040/2606 - loss 0.05696528 - time (sec): 76.29 - samples/sec: 1931.89 - lr: 0.000031 - momentum: 0.000000
+2023-10-15 22:42:03,807 epoch 5 - iter 1300/2606 - loss 0.05634791 - time (sec): 97.04 - samples/sec: 1912.72 - lr: 0.000031 - momentum: 0.000000
+2023-10-15 22:42:22,326 epoch 5 - iter 1560/2606 - loss 0.05592339 - time (sec): 115.56 - samples/sec: 1914.46 - lr: 0.000030 - momentum: 0.000000
+2023-10-15 22:42:41,680 epoch 5 - iter 1820/2606 - loss 0.05667760 - time (sec): 134.92 - samples/sec: 1899.98 - lr: 0.000029 - momentum: 0.000000
+2023-10-15 22:43:01,527 epoch 5 - iter 2080/2606 - loss 0.05618848 - time (sec): 154.76 - samples/sec: 1902.76 - lr: 0.000029 - momentum: 0.000000
+2023-10-15 22:43:19,937 epoch 5 - iter 2340/2606 - loss 0.05550520 - time (sec): 173.17 - samples/sec: 1908.88 - lr: 0.000028 - momentum: 0.000000
+2023-10-15 22:43:38,771 epoch 5 - iter 2600/2606 - loss 0.05582007 - time (sec): 192.01 - samples/sec: 1909.87 - lr: 0.000028 - momentum: 0.000000
+2023-10-15 22:43:39,185 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:43:39,185 EPOCH 5 done: loss 0.0558 - lr: 0.000028
+2023-10-15 22:43:47,656 DEV : loss 0.2588236331939697 - f1-score (micro avg) 0.3881
+2023-10-15 22:43:47,689 saving best model
+2023-10-15 22:43:48,321 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:44:07,112 epoch 6 - iter 260/2606 - loss 0.03581917 - time (sec): 18.79 - samples/sec: 1939.80 - lr: 0.000027 - momentum: 0.000000
+2023-10-15 22:44:25,901 epoch 6 - iter 520/2606 - loss 0.03857227 - time (sec): 37.58 - samples/sec: 1955.95 - lr: 0.000027 - momentum: 0.000000
+2023-10-15 22:44:45,197 epoch 6 - iter 780/2606 - loss 0.03718849 - time (sec): 56.87 - samples/sec: 1938.69 - lr: 0.000026 - momentum: 0.000000
+2023-10-15 22:45:04,355 epoch 6 - iter 1040/2606 - loss 0.03795267 - time (sec): 76.03 - samples/sec: 1941.79 - lr: 0.000026 - momentum: 0.000000
+2023-10-15 22:45:22,754 epoch 6 - iter 1300/2606 - loss 0.03927938 - time (sec): 94.43 - samples/sec: 1939.52 - lr: 0.000025 - momentum: 0.000000
+2023-10-15 22:45:41,969 epoch 6 - iter 1560/2606 - loss 0.04094059 - time (sec): 113.64 - samples/sec: 1940.09 - lr: 0.000024 - momentum: 0.000000
+2023-10-15 22:46:02,357 epoch 6 - iter 1820/2606 - loss 0.04042786 - time (sec): 134.03 - samples/sec: 1930.67 - lr: 0.000024 - momentum: 0.000000
+2023-10-15 22:46:21,777 epoch 6 - iter 2080/2606 - loss 0.04023301 - time (sec): 153.45 - samples/sec: 1932.81 - lr: 0.000023 - momentum: 0.000000
+2023-10-15 22:46:41,120 epoch 6 - iter 2340/2606 - loss 0.04049483 - time (sec): 172.80 - samples/sec: 1926.06 - lr: 0.000023 - momentum: 0.000000
+2023-10-15 22:46:59,172 epoch 6 - iter 2600/2606 - loss 0.04034539 - time (sec): 190.85 - samples/sec: 1918.45 - lr: 0.000022 - momentum: 0.000000
+2023-10-15 22:46:59,704 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:46:59,704 EPOCH 6 done: loss 0.0404 - lr: 0.000022
+2023-10-15 22:47:07,922 DEV : loss 0.32239729166030884 - f1-score (micro avg) 0.372
+2023-10-15 22:47:07,949 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:47:25,980 epoch 7 - iter 260/2606 - loss 0.03480989 - time (sec): 18.03 - samples/sec: 1933.48 - lr: 0.000022 - momentum: 0.000000
+2023-10-15 22:47:44,510 epoch 7 - iter 520/2606 - loss 0.02997614 - time (sec): 36.56 - samples/sec: 1954.61 - lr: 0.000021 - momentum: 0.000000
+2023-10-15 22:48:04,165 epoch 7 - iter 780/2606 - loss 0.03233103 - time (sec): 56.21 - samples/sec: 1925.02 - lr: 0.000021 - momentum: 0.000000
+2023-10-15 22:48:24,145 epoch 7 - iter 1040/2606 - loss 0.03295900 - time (sec): 76.19 - samples/sec: 1934.66 - lr: 0.000020 - momentum: 0.000000
+2023-10-15 22:48:43,108 epoch 7 - iter 1300/2606 - loss 0.03274939 - time (sec): 95.16 - samples/sec: 1926.80 - lr: 0.000019 - momentum: 0.000000
+2023-10-15 22:49:02,094 epoch 7 - iter 1560/2606 - loss 0.03398439 - time (sec): 114.14 - samples/sec: 1930.83 - lr: 0.000019 - momentum: 0.000000
+2023-10-15 22:49:21,082 epoch 7 - iter 1820/2606 - loss 0.03293724 - time (sec): 133.13 - samples/sec: 1940.00 - lr: 0.000018 - momentum: 0.000000
+2023-10-15 22:49:40,117 epoch 7 - iter 2080/2606 - loss 0.03190338 - time (sec): 152.17 - samples/sec: 1943.14 - lr: 0.000018 - momentum: 0.000000
+2023-10-15 22:49:58,402 epoch 7 - iter 2340/2606 - loss 0.03087084 - time (sec): 170.45 - samples/sec: 1932.03 - lr: 0.000017 - momentum: 0.000000
+2023-10-15 22:50:18,500 epoch 7 - iter 2600/2606 - loss 0.03081740 - time (sec): 190.55 - samples/sec: 1922.45 - lr: 0.000017 - momentum: 0.000000
+2023-10-15 22:50:19,063 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:50:19,063 EPOCH 7 done: loss 0.0308 - lr: 0.000017
+2023-10-15 22:50:27,425 DEV : loss 0.4393313229084015 - f1-score (micro avg) 0.3524
+2023-10-15 22:50:27,461 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:50:47,488 epoch 8 - iter 260/2606 - loss 0.02062903 - time (sec): 20.03 - samples/sec: 1948.84 - lr: 0.000016 - momentum: 0.000000
+2023-10-15 22:51:06,776 epoch 8 - iter 520/2606 - loss 0.02059864 - time (sec): 39.31 - samples/sec: 1957.39 - lr: 0.000016 - momentum: 0.000000
+2023-10-15 22:51:25,601 epoch 8 - iter 780/2606 - loss 0.02243684 - time (sec): 58.14 - samples/sec: 1942.91 - lr: 0.000015 - momentum: 0.000000
+2023-10-15 22:51:44,843 epoch 8 - iter 1040/2606 - loss 0.02218227 - time (sec): 77.38 - samples/sec: 1932.78 - lr: 0.000014 - momentum: 0.000000
+2023-10-15 22:52:04,003 epoch 8 - iter 1300/2606 - loss 0.02168277 - time (sec): 96.54 - samples/sec: 1928.59 - lr: 0.000014 - momentum: 0.000000
+2023-10-15 22:52:23,397 epoch 8 - iter 1560/2606 - loss 0.02361754 - time (sec): 115.93 - samples/sec: 1929.53 - lr: 0.000013 - momentum: 0.000000
+2023-10-15 22:52:41,555 epoch 8 - iter 1820/2606 - loss 0.02366463 - time (sec): 134.09 - samples/sec: 1934.57 - lr: 0.000013 - momentum: 0.000000
+2023-10-15 22:53:00,142 epoch 8 - iter 2080/2606 - loss 0.02356939 - time (sec): 152.68 - samples/sec: 1934.05 - lr: 0.000012 - momentum: 0.000000
+2023-10-15 22:53:18,836 epoch 8 - iter 2340/2606 - loss 0.02340726 - time (sec): 171.37 - samples/sec: 1924.57 - lr: 0.000012 - momentum: 0.000000
+2023-10-15 22:53:37,933 epoch 8 - iter 2600/2606 - loss 0.02280304 - time (sec): 190.47 - samples/sec: 1926.00 - lr: 0.000011 - momentum: 0.000000
+2023-10-15 22:53:38,313 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:53:38,313 EPOCH 8 done: loss 0.0228 - lr: 0.000011
+2023-10-15 22:53:47,439 DEV : loss 0.4546719491481781 - f1-score (micro avg) 0.3671
+2023-10-15 22:53:47,466 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:54:06,558 epoch 9 - iter 260/2606 - loss 0.01469418 - time (sec): 19.09 - samples/sec: 1971.09 - lr: 0.000011 - momentum: 0.000000
+2023-10-15 22:54:24,684 epoch 9 - iter 520/2606 - loss 0.01564107 - time (sec): 37.22 - samples/sec: 1948.48 - lr: 0.000010 - momentum: 0.000000
+2023-10-15 22:54:43,192 epoch 9 - iter 780/2606 - loss 0.01644586 - time (sec): 55.72 - samples/sec: 1925.64 - lr: 0.000009 - momentum: 0.000000
+2023-10-15 22:55:01,624 epoch 9 - iter 1040/2606 - loss 0.01628689 - time (sec): 74.16 - samples/sec: 1925.25 - lr: 0.000009 - momentum: 0.000000
+2023-10-15 22:55:21,440 epoch 9 - iter 1300/2606 - loss 0.01534667 - time (sec): 93.97 - samples/sec: 1929.47 - lr: 0.000008 - momentum: 0.000000
+2023-10-15 22:55:40,355 epoch 9 - iter 1560/2606 - loss 0.01439162 - time (sec): 112.89 - samples/sec: 1935.81 - lr: 0.000008 - momentum: 0.000000
+2023-10-15 22:55:59,468 epoch 9 - iter 1820/2606 - loss 0.01446621 - time (sec): 132.00 - samples/sec: 1933.75 - lr: 0.000007 - momentum: 0.000000
+2023-10-15 22:56:18,466 epoch 9 - iter 2080/2606 - loss 0.01388150 - time (sec): 151.00 - samples/sec: 1938.34 - lr: 0.000007 - momentum: 0.000000
+2023-10-15 22:56:38,019 epoch 9 - iter 2340/2606 - loss 0.01387827 - time (sec): 170.55 - samples/sec: 1938.36 - lr: 0.000006 - momentum: 0.000000
+2023-10-15 22:56:56,829 epoch 9 - iter 2600/2606 - loss 0.01405118 - time (sec): 189.36 - samples/sec: 1935.71 - lr: 0.000006 - momentum: 0.000000
+2023-10-15 22:56:57,290 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:56:57,290 EPOCH 9 done: loss 0.0140 - lr: 0.000006
+2023-10-15 22:57:06,397 DEV : loss 0.46497806906700134 - f1-score (micro avg) 0.3541
+2023-10-15 22:57:06,426 ----------------------------------------------------------------------------------------------------
+2023-10-15 22:57:24,867 epoch 10 - iter 260/2606 - loss 0.01004813 - time (sec): 18.44 - samples/sec: 1931.32 - lr: 0.000005 - momentum: 0.000000
+2023-10-15 22:57:43,792 epoch 10 - iter 520/2606 - loss 0.01164960 - time (sec): 37.36 - samples/sec: 1912.24 - lr: 0.000004 - momentum: 0.000000
+2023-10-15 22:58:02,149 epoch 10 - iter 780/2606 - loss 0.01022217 - time (sec): 55.72 - samples/sec: 1914.91 - lr: 0.000004 - momentum: 0.000000
+2023-10-15 22:58:20,621 epoch 10 - iter 1040/2606 - loss 0.01089894 - time (sec): 74.19 - samples/sec: 1922.29 - lr: 0.000003 - momentum: 0.000000
+2023-10-15 22:58:39,283 epoch 10 - iter 1300/2606 - loss 0.01043785 - time (sec): 92.86 - samples/sec: 1917.68 - lr: 0.000003 - momentum: 0.000000
+2023-10-15 22:58:58,972 epoch 10 - iter 1560/2606 - loss 0.01058729 - time (sec): 112.55 - samples/sec: 1924.74 - lr: 0.000002 - momentum: 0.000000
+2023-10-15 22:59:18,245 epoch 10 - iter 1820/2606 - loss 0.01038180 - time (sec): 131.82 - samples/sec: 1931.33 - lr: 0.000002 - momentum: 0.000000
+2023-10-15 22:59:38,495 epoch 10 - iter 2080/2606 - loss 0.00998178 - time (sec): 152.07 - samples/sec: 1930.07 - lr: 0.000001 - momentum: 0.000000
+2023-10-15 22:59:57,836 epoch 10 - iter 2340/2606 - loss 0.01023802 - time (sec): 171.41 - samples/sec: 1931.92 - lr: 0.000001 - momentum: 0.000000
+2023-10-15 23:00:16,128 epoch 10 - iter 2600/2606 - loss 0.01035035 - time (sec): 189.70 - samples/sec: 1933.34 - lr: 0.000000 - momentum: 0.000000
+2023-10-15 23:00:16,510 ----------------------------------------------------------------------------------------------------
+2023-10-15 23:00:16,510 EPOCH 10 done: loss 0.0103 - lr: 0.000000
+2023-10-15 23:00:25,551 DEV : loss 0.4552249312400818 - f1-score (micro avg) 0.3675
+2023-10-15 23:00:25,955 ----------------------------------------------------------------------------------------------------
+2023-10-15 23:00:25,956 Loading model from best epoch ...
+2023-10-15 23:00:27,429 SequenceTagger predicts: Dictionary with 17 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
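The 17-tag dictionary above is a BIOES encoding of four entity types (LOC, PER, ORG, HumanProd) plus the O tag: 4 types x 4 positional prefixes + O = 17. A minimal decoder sketch for turning such a tag sequence into entity spans (our illustration, not Flair's implementation):

```python
def bioes_to_spans(tags):
    """Decode a BIOES tag sequence into (label, start, end) token spans."""
    spans, start = [], None
    for i, tag in enumerate(tags):
        prefix, _, label = tag.partition("-")
        if prefix == "S":                          # single-token entity
            spans.append((label, i, i + 1))
        elif prefix == "B":                        # entity begins
            start = i
        elif prefix == "E" and start is not None:  # entity ends
            spans.append((label, start, i + 1))
            start = None
    return spans

spans = bioes_to_spans(["B-LOC", "E-LOC", "O", "S-PER"])
```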
+2023-10-15 23:00:42,906
+Results:
+- F-score (micro) 0.4196
+- F-score (macro) 0.2764
+- Accuracy 0.2689
+
+By class:
+              precision    recall  f1-score   support
+
+         LOC     0.4960    0.4547    0.4744      1214
+         PER     0.3994    0.4567    0.4261       808
+         ORG     0.2405    0.1785    0.2049       353
+   HumanProd     0.0000    0.0000    0.0000        15
+
+   micro avg     0.4278    0.4117    0.4196      2390
+   macro avg     0.2839    0.2725    0.2764      2390
+weighted avg     0.4224    0.4117    0.4153      2390
+
+2023-10-15 23:00:42,906 ----------------------------------------------------------------------------------------------------
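As a sanity check on the final table: the micro F-score is the harmonic mean of micro precision and recall, which reproduces the reported 0.4196.

```python
# Micro-average precision and recall from the test-set results above.
p, r = 0.4278, 0.4117
f1 = 2 * p * r / (p + r)   # harmonic mean of precision and recall
# rounds to 0.4196, the reported micro F-score
```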