stefan-it committed on
Commit
b913789
1 Parent(s): e39d8de

Upload ./training.log with huggingface_hub

Files changed (1)
  1. training.log +512 -0
training.log ADDED
@@ -0,0 +1,512 @@
+ 2023-10-23 22:21:59,802 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:21:59,803 Model: "SequenceTagger(
+   (embeddings): TransformerWordEmbeddings(
+     (model): BertModel(
+       (embeddings): BertEmbeddings(
+         (word_embeddings): Embedding(64001, 768)
+         (position_embeddings): Embedding(512, 768)
+         (token_type_embeddings): Embedding(2, 768)
+         (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+         (dropout): Dropout(p=0.1, inplace=False)
+       )
+       (encoder): BertEncoder(
+         (layer): ModuleList(
+           (0): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (1): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (2): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (3): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (4): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (5): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (6): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (7): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (8): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (9): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (10): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (11): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+         )
+       )
+       (pooler): BertPooler(
+         (dense): Linear(in_features=768, out_features=768, bias=True)
+         (activation): Tanh()
+       )
+     )
+   )
+   (locked_dropout): LockedDropout(p=0.5)
+   (linear): Linear(in_features=768, out_features=21, bias=True)
+   (loss_function): CrossEntropyLoss()
+ )"
+ 2023-10-23 22:21:59,803 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:21:59,804 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
+  - NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /home/ubuntu/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
+ 2023-10-23 22:21:59,804 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:21:59,804 Train: 3575 sentences
+ 2023-10-23 22:21:59,804 (train_with_dev=False, train_with_test=False)
+ 2023-10-23 22:21:59,804 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:21:59,804 Training Params:
+ 2023-10-23 22:21:59,804 - learning_rate: "3e-05"
+ 2023-10-23 22:21:59,804 - mini_batch_size: "8"
+ 2023-10-23 22:21:59,804 - max_epochs: "10"
+ 2023-10-23 22:21:59,804 - shuffle: "True"
+ 2023-10-23 22:21:59,804 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:21:59,804 Plugins:
+ 2023-10-23 22:21:59,804 - TensorboardLogger
+ 2023-10-23 22:21:59,804 - LinearScheduler | warmup_fraction: '0.1'
+ 2023-10-23 22:21:59,804 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:21:59,804 Final evaluation on model from best epoch (best-model.pt)
+ 2023-10-23 22:21:59,804 - metric: "('micro avg', 'f1-score')"
+ 2023-10-23 22:21:59,804 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:21:59,804 Computation:
+ 2023-10-23 22:21:59,804 - compute on device: cuda:0
+ 2023-10-23 22:21:59,804 - embedding storage: none
+ 2023-10-23 22:21:59,804 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:21:59,804 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4"
+ 2023-10-23 22:21:59,804 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:21:59,804 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:21:59,804 Logging anything other than scalars to TensorBoard is currently not supported.
+ 2023-10-23 22:22:03,847 epoch 1 - iter 44/447 - loss 2.92018008 - time (sec): 4.04 - samples/sec: 2032.64 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-23 22:22:07,968 epoch 1 - iter 88/447 - loss 1.92758216 - time (sec): 8.16 - samples/sec: 2089.91 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-23 22:22:11,886 epoch 1 - iter 132/447 - loss 1.46261625 - time (sec): 12.08 - samples/sec: 2086.10 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-23 22:22:15,679 epoch 1 - iter 176/447 - loss 1.22672961 - time (sec): 15.87 - samples/sec: 2103.25 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-23 22:22:19,568 epoch 1 - iter 220/447 - loss 1.04272271 - time (sec): 19.76 - samples/sec: 2125.02 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 22:22:23,281 epoch 1 - iter 264/447 - loss 0.91930567 - time (sec): 23.48 - samples/sec: 2138.47 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-23 22:22:27,217 epoch 1 - iter 308/447 - loss 0.82267150 - time (sec): 27.41 - samples/sec: 2146.19 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-23 22:22:31,115 epoch 1 - iter 352/447 - loss 0.74697299 - time (sec): 31.31 - samples/sec: 2150.51 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-23 22:22:34,954 epoch 1 - iter 396/447 - loss 0.69198290 - time (sec): 35.15 - samples/sec: 2151.70 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-23 22:22:39,219 epoch 1 - iter 440/447 - loss 0.63800201 - time (sec): 39.41 - samples/sec: 2156.09 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-23 22:22:39,972 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:22:39,972 EPOCH 1 done: loss 0.6318 - lr: 0.000029
+ 2023-10-23 22:22:44,793 DEV : loss 0.14997614920139313 - f1-score (micro avg) 0.6314
+ 2023-10-23 22:22:44,814 saving best model
+ 2023-10-23 22:22:45,370 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:22:49,085 epoch 2 - iter 44/447 - loss 0.14886525 - time (sec): 3.71 - samples/sec: 2151.01 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-23 22:22:52,913 epoch 2 - iter 88/447 - loss 0.14909506 - time (sec): 7.54 - samples/sec: 2231.06 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-23 22:22:57,328 epoch 2 - iter 132/447 - loss 0.14851561 - time (sec): 11.96 - samples/sec: 2173.51 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-23 22:23:01,211 epoch 2 - iter 176/447 - loss 0.15023382 - time (sec): 15.84 - samples/sec: 2163.41 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-23 22:23:05,281 epoch 2 - iter 220/447 - loss 0.15236516 - time (sec): 19.91 - samples/sec: 2154.22 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-23 22:23:09,340 epoch 2 - iter 264/447 - loss 0.14349368 - time (sec): 23.97 - samples/sec: 2128.94 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-23 22:23:13,120 epoch 2 - iter 308/447 - loss 0.14583098 - time (sec): 27.75 - samples/sec: 2131.92 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-23 22:23:16,865 epoch 2 - iter 352/447 - loss 0.14036361 - time (sec): 31.49 - samples/sec: 2124.32 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-23 22:23:21,191 epoch 2 - iter 396/447 - loss 0.13709503 - time (sec): 35.82 - samples/sec: 2125.42 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-23 22:23:25,203 epoch 2 - iter 440/447 - loss 0.13162957 - time (sec): 39.83 - samples/sec: 2136.40 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-23 22:23:25,803 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:23:25,803 EPOCH 2 done: loss 0.1312 - lr: 0.000027
+ 2023-10-23 22:23:32,296 DEV : loss 0.12065546214580536 - f1-score (micro avg) 0.7139
+ 2023-10-23 22:23:32,316 saving best model
+ 2023-10-23 22:23:33,038 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:23:36,912 epoch 3 - iter 44/447 - loss 0.07762606 - time (sec): 3.87 - samples/sec: 2018.49 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-23 22:23:41,076 epoch 3 - iter 88/447 - loss 0.07919298 - time (sec): 8.04 - samples/sec: 2015.72 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-23 22:23:44,839 epoch 3 - iter 132/447 - loss 0.07541745 - time (sec): 11.80 - samples/sec: 2075.74 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-23 22:23:48,984 epoch 3 - iter 176/447 - loss 0.07757789 - time (sec): 15.95 - samples/sec: 2110.02 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-23 22:23:52,699 epoch 3 - iter 220/447 - loss 0.07526481 - time (sec): 19.66 - samples/sec: 2089.85 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-23 22:23:57,165 epoch 3 - iter 264/447 - loss 0.07614139 - time (sec): 24.13 - samples/sec: 2084.77 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-23 22:24:01,492 epoch 3 - iter 308/447 - loss 0.07545581 - time (sec): 28.45 - samples/sec: 2092.33 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-23 22:24:05,292 epoch 3 - iter 352/447 - loss 0.07211717 - time (sec): 32.25 - samples/sec: 2111.02 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-23 22:24:09,139 epoch 3 - iter 396/447 - loss 0.07279091 - time (sec): 36.10 - samples/sec: 2120.24 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-23 22:24:13,184 epoch 3 - iter 440/447 - loss 0.07369125 - time (sec): 40.15 - samples/sec: 2123.09 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-23 22:24:13,765 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:24:13,765 EPOCH 3 done: loss 0.0737 - lr: 0.000023
+ 2023-10-23 22:24:20,280 DEV : loss 0.12288995832204819 - f1-score (micro avg) 0.7436
+ 2023-10-23 22:24:20,300 saving best model
+ 2023-10-23 22:24:21,019 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:24:24,930 epoch 4 - iter 44/447 - loss 0.04523597 - time (sec): 3.91 - samples/sec: 2145.55 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-23 22:24:28,688 epoch 4 - iter 88/447 - loss 0.04395548 - time (sec): 7.67 - samples/sec: 2133.79 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-23 22:24:32,564 epoch 4 - iter 132/447 - loss 0.04343610 - time (sec): 11.54 - samples/sec: 2165.17 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-23 22:24:36,880 epoch 4 - iter 176/447 - loss 0.04360098 - time (sec): 15.86 - samples/sec: 2160.67 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-23 22:24:41,124 epoch 4 - iter 220/447 - loss 0.04351512 - time (sec): 20.10 - samples/sec: 2135.58 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-23 22:24:44,974 epoch 4 - iter 264/447 - loss 0.04663272 - time (sec): 23.95 - samples/sec: 2137.05 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-23 22:24:48,692 epoch 4 - iter 308/447 - loss 0.04463606 - time (sec): 27.67 - samples/sec: 2145.58 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-23 22:24:52,690 epoch 4 - iter 352/447 - loss 0.04338000 - time (sec): 31.67 - samples/sec: 2142.49 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-23 22:24:56,574 epoch 4 - iter 396/447 - loss 0.04260646 - time (sec): 35.55 - samples/sec: 2139.33 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-23 22:25:00,808 epoch 4 - iter 440/447 - loss 0.04252168 - time (sec): 39.79 - samples/sec: 2139.56 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-23 22:25:01,476 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:25:01,477 EPOCH 4 done: loss 0.0423 - lr: 0.000020
+ 2023-10-23 22:25:07,998 DEV : loss 0.16860494017601013 - f1-score (micro avg) 0.7442
+ 2023-10-23 22:25:08,018 saving best model
+ 2023-10-23 22:25:08,736 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:25:12,979 epoch 5 - iter 44/447 - loss 0.02802474 - time (sec): 4.24 - samples/sec: 2110.69 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-23 22:25:17,051 epoch 5 - iter 88/447 - loss 0.02729669 - time (sec): 8.31 - samples/sec: 2091.32 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-23 22:25:20,797 epoch 5 - iter 132/447 - loss 0.02848820 - time (sec): 12.06 - samples/sec: 2114.88 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-23 22:25:24,924 epoch 5 - iter 176/447 - loss 0.03037370 - time (sec): 16.19 - samples/sec: 2132.25 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-23 22:25:29,219 epoch 5 - iter 220/447 - loss 0.03160364 - time (sec): 20.48 - samples/sec: 2157.95 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-23 22:25:32,933 epoch 5 - iter 264/447 - loss 0.03281760 - time (sec): 24.20 - samples/sec: 2147.02 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-23 22:25:37,105 epoch 5 - iter 308/447 - loss 0.03205699 - time (sec): 28.37 - samples/sec: 2135.24 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-23 22:25:40,844 epoch 5 - iter 352/447 - loss 0.03311176 - time (sec): 32.11 - samples/sec: 2138.65 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-23 22:25:44,732 epoch 5 - iter 396/447 - loss 0.03260248 - time (sec): 36.00 - samples/sec: 2130.28 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-23 22:25:48,605 epoch 5 - iter 440/447 - loss 0.03136514 - time (sec): 39.87 - samples/sec: 2134.27 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-23 22:25:49,291 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:25:49,292 EPOCH 5 done: loss 0.0313 - lr: 0.000017
+ 2023-10-23 22:25:55,808 DEV : loss 0.200847327709198 - f1-score (micro avg) 0.7667
+ 2023-10-23 22:25:55,829 saving best model
+ 2023-10-23 22:25:56,552 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:26:00,120 epoch 6 - iter 44/447 - loss 0.02177514 - time (sec): 3.57 - samples/sec: 2084.94 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-23 22:26:04,046 epoch 6 - iter 88/447 - loss 0.02190717 - time (sec): 7.49 - samples/sec: 2121.33 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-23 22:26:08,182 epoch 6 - iter 132/447 - loss 0.01946378 - time (sec): 11.63 - samples/sec: 2156.61 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-23 22:26:12,249 epoch 6 - iter 176/447 - loss 0.01968006 - time (sec): 15.70 - samples/sec: 2144.33 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 22:26:16,657 epoch 6 - iter 220/447 - loss 0.02027486 - time (sec): 20.10 - samples/sec: 2146.14 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 22:26:20,363 epoch 6 - iter 264/447 - loss 0.02058604 - time (sec): 23.81 - samples/sec: 2150.25 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 22:26:24,533 epoch 6 - iter 308/447 - loss 0.01936116 - time (sec): 27.98 - samples/sec: 2143.75 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-23 22:26:28,696 epoch 6 - iter 352/447 - loss 0.01879624 - time (sec): 32.14 - samples/sec: 2131.02 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-23 22:26:32,687 epoch 6 - iter 396/447 - loss 0.01900730 - time (sec): 36.13 - samples/sec: 2124.58 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-23 22:26:36,558 epoch 6 - iter 440/447 - loss 0.01897263 - time (sec): 40.01 - samples/sec: 2125.86 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-23 22:26:37,262 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:26:37,262 EPOCH 6 done: loss 0.0192 - lr: 0.000013
+ 2023-10-23 22:26:43,765 DEV : loss 0.1983201950788498 - f1-score (micro avg) 0.7736
+ 2023-10-23 22:26:43,785 saving best model
+ 2023-10-23 22:26:44,461 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:26:48,789 epoch 7 - iter 44/447 - loss 0.01146703 - time (sec): 4.33 - samples/sec: 2167.42 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-23 22:26:52,693 epoch 7 - iter 88/447 - loss 0.00983839 - time (sec): 8.23 - samples/sec: 2132.95 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-23 22:26:56,465 epoch 7 - iter 132/447 - loss 0.01147917 - time (sec): 12.00 - samples/sec: 2112.49 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-23 22:27:00,181 epoch 7 - iter 176/447 - loss 0.01213536 - time (sec): 15.72 - samples/sec: 2103.43 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-23 22:27:04,208 epoch 7 - iter 220/447 - loss 0.01145021 - time (sec): 19.75 - samples/sec: 2114.57 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-23 22:27:08,092 epoch 7 - iter 264/447 - loss 0.01226204 - time (sec): 23.63 - samples/sec: 2104.33 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-23 22:27:12,363 epoch 7 - iter 308/447 - loss 0.01179778 - time (sec): 27.90 - samples/sec: 2117.04 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-23 22:27:16,673 epoch 7 - iter 352/447 - loss 0.01203243 - time (sec): 32.21 - samples/sec: 2136.52 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-23 22:27:20,542 epoch 7 - iter 396/447 - loss 0.01222001 - time (sec): 36.08 - samples/sec: 2133.49 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-23 22:27:24,318 epoch 7 - iter 440/447 - loss 0.01181947 - time (sec): 39.86 - samples/sec: 2132.84 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-23 22:27:24,945 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:27:24,945 EPOCH 7 done: loss 0.0120 - lr: 0.000010
+ 2023-10-23 22:27:31,441 DEV : loss 0.21346606314182281 - f1-score (micro avg) 0.7828
+ 2023-10-23 22:27:31,462 saving best model
+ 2023-10-23 22:27:32,110 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:27:36,118 epoch 8 - iter 44/447 - loss 0.00685455 - time (sec): 4.01 - samples/sec: 2125.06 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-23 22:27:40,362 epoch 8 - iter 88/447 - loss 0.01104525 - time (sec): 8.25 - samples/sec: 2070.56 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-23 22:27:44,233 epoch 8 - iter 132/447 - loss 0.00915699 - time (sec): 12.12 - samples/sec: 2113.11 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-23 22:27:48,854 epoch 8 - iter 176/447 - loss 0.00762872 - time (sec): 16.74 - samples/sec: 2106.65 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-23 22:27:52,512 epoch 8 - iter 220/447 - loss 0.00683083 - time (sec): 20.40 - samples/sec: 2110.20 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-23 22:27:56,394 epoch 8 - iter 264/447 - loss 0.00802052 - time (sec): 24.28 - samples/sec: 2126.46 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-23 22:28:00,106 epoch 8 - iter 308/447 - loss 0.00865904 - time (sec): 27.99 - samples/sec: 2135.97 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-23 22:28:03,741 epoch 8 - iter 352/447 - loss 0.00841205 - time (sec): 31.63 - samples/sec: 2128.04 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-23 22:28:07,701 epoch 8 - iter 396/447 - loss 0.00783808 - time (sec): 35.59 - samples/sec: 2131.82 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-23 22:28:12,081 epoch 8 - iter 440/447 - loss 0.00814555 - time (sec): 39.97 - samples/sec: 2130.93 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-23 22:28:12,772 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:28:12,772 EPOCH 8 done: loss 0.0080 - lr: 0.000007
+ 2023-10-23 22:28:19,278 DEV : loss 0.2258313149213791 - f1-score (micro avg) 0.7854
+ 2023-10-23 22:28:19,299 saving best model
+ 2023-10-23 22:28:19,998 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:28:23,671 epoch 9 - iter 44/447 - loss 0.00629803 - time (sec): 3.67 - samples/sec: 2175.80 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-23 22:28:27,528 epoch 9 - iter 88/447 - loss 0.00804701 - time (sec): 7.53 - samples/sec: 2141.78 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-23 22:28:31,356 epoch 9 - iter 132/447 - loss 0.00634854 - time (sec): 11.36 - samples/sec: 2175.71 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-23 22:28:35,001 epoch 9 - iter 176/447 - loss 0.00589061 - time (sec): 15.00 - samples/sec: 2187.76 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-23 22:28:38,990 epoch 9 - iter 220/447 - loss 0.00579974 - time (sec): 18.99 - samples/sec: 2180.18 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-23 22:28:43,354 epoch 9 - iter 264/447 - loss 0.00610104 - time (sec): 23.35 - samples/sec: 2182.46 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-23 22:28:47,454 epoch 9 - iter 308/447 - loss 0.00636396 - time (sec): 27.45 - samples/sec: 2171.15 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-23 22:28:51,842 epoch 9 - iter 352/447 - loss 0.00587760 - time (sec): 31.84 - samples/sec: 2164.71 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-23 22:28:55,760 epoch 9 - iter 396/447 - loss 0.00548106 - time (sec): 35.76 - samples/sec: 2152.27 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-23 22:28:59,716 epoch 9 - iter 440/447 - loss 0.00551406 - time (sec): 39.72 - samples/sec: 2148.37 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-23 22:29:00,334 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:29:00,334 EPOCH 9 done: loss 0.0055 - lr: 0.000003
+ 2023-10-23 22:29:06,821 DEV : loss 0.2376076877117157 - f1-score (micro avg) 0.779
+ 2023-10-23 22:29:06,841 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:29:11,130 epoch 10 - iter 44/447 - loss 0.00018174 - time (sec): 4.29 - samples/sec: 2068.38 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-23 22:29:15,127 epoch 10 - iter 88/447 - loss 0.00104477 - time (sec): 8.29 - samples/sec: 2068.77 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-23 22:29:18,817 epoch 10 - iter 132/447 - loss 0.00073255 - time (sec): 11.98 - samples/sec: 2148.11 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-23 22:29:22,963 epoch 10 - iter 176/447 - loss 0.00142077 - time (sec): 16.12 - samples/sec: 2160.08 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-23 22:29:26,746 epoch 10 - iter 220/447 - loss 0.00213151 - time (sec): 19.90 - samples/sec: 2152.25 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-23 22:29:30,414 epoch 10 - iter 264/447 - loss 0.00292525 - time (sec): 23.57 - samples/sec: 2153.18 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-23 22:29:34,566 epoch 10 - iter 308/447 - loss 0.00317358 - time (sec): 27.72 - samples/sec: 2149.54 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-23 22:29:38,286 epoch 10 - iter 352/447 - loss 0.00305912 - time (sec): 31.44 - samples/sec: 2136.75 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-23 22:29:42,353 epoch 10 - iter 396/447 - loss 0.00332032 - time (sec): 35.51 - samples/sec: 2142.38 - lr: 0.000000 - momentum: 0.000000
+ 2023-10-23 22:29:46,365 epoch 10 - iter 440/447 - loss 0.00332155 - time (sec): 39.52 - samples/sec: 2131.35 - lr: 0.000000 - momentum: 0.000000
+ 2023-10-23 22:29:47,401 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:29:47,401 EPOCH 10 done: loss 0.0033 - lr: 0.000000
+ 2023-10-23 22:29:53,628 DEV : loss 0.23625218868255615 - f1-score (micro avg) 0.7868
+ 2023-10-23 22:29:53,649 saving best model
+ 2023-10-23 22:29:54,860 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 22:29:54,861 Loading model from best epoch ...
+ 2023-10-23 22:29:56,906 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
+ 2023-10-23 22:30:01,450
+ Results:
+ - F-score (micro) 0.7506
+ - F-score (macro) 0.6633
+ - Accuracy 0.6181
+
+ By class:
+               precision    recall  f1-score   support
+
+          loc     0.8218    0.8591    0.8400       596
+         pers     0.6789    0.7808    0.7263       333
+          org     0.5161    0.4848    0.5000       132
+         prod     0.6600    0.5000    0.5690        66
+         time     0.7381    0.6327    0.6813        49
+
+    micro avg     0.7365    0.7653    0.7506      1176
+    macro avg     0.6830    0.6515    0.6633      1176
+ weighted avg     0.7345    0.7653    0.7478      1176
+
+ 2023-10-23 22:30:01,450 ----------------------------------------------------------------------------------------------------
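Note: the hyperparameters and model printout in this log correspond to a standard Flair fine-tuning setup (hmBERT-64k backbone, first-subtoken pooling of the last transformer layer, no CRF, lr 3e-05, mini-batch size 8, 10 epochs, linear scheduler with 10% warmup). The snippet below is a minimal sketch of how such a run could be launched with Flair's public API; it is reconstructed from the logged values only, not the original hmbench training script, and the corpus flag add_document_separator is an assumption inferred from the "with_doc_seperator" dataset cache path.

# Minimal sketch (assumes Flair >= 0.12); reconstructed from the logged
# hyperparameters, NOT the original training script.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# German HIPE-2020 split of the HIPE-2022 data (3575 train / 1235 dev /
# 1266 test sentences in the log). The document-separator flag is an
# assumption inferred from the ".../with_doc_seperator" cache path.
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="de",
                       add_document_separator=True)
label_dict = corpus.make_label_dictionary(label_type="ner")

# hmBERT 64k checkpoint; "poolingfirst-layers-1" in the base path maps to
# first-subtoken pooling over the last transformer layer only.
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# "crfFalse" in the base path: plain linear projection + cross-entropy,
# matching the (linear) and (loss_function) modules in the printout above.
tagger = SequenceTagger(
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4",
    learning_rate=3e-5,
    mini_batch_size=8,
    max_epochs=10,
    # fine_tune() attaches a linear scheduler with warmup_fraction=0.1 by
    # default, matching "LinearScheduler | warmup_fraction: '0.1'" above.
)

After training, best-model.pt is reloaded and evaluated on the test split, which is what produces the final "Results" block at the end of the log.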