stefan-it committed
Commit 8065a26
1 Parent(s): 665d2a2

Upload ./training.log with huggingface_hub
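For context, the upload referenced in this commit message follows the standard huggingface_hub file-upload path. A minimal, hedged sketch of that call; the target repo id is a placeholder (it is not stated in the log itself) and the token is assumed to come from a prior login:

```python
from huggingface_hub import HfApi

api = HfApi()  # assumes a valid token via `huggingface-cli login` or the HF_TOKEN env var
api.upload_file(
    path_or_fileobj="./training.log",            # local training log produced by the run
    path_in_repo="training.log",                 # destination path inside the model repo
    repo_id="<namespace>/<model-repo>",          # placeholder, not taken from the log
    commit_message="Upload ./training.log with huggingface_hub",
)
```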

Files changed (1)
  1. training.log +511 -0
training.log ADDED
@@ -0,0 +1,511 @@
1
+ 2023-10-23 21:04:58,060 ----------------------------------------------------------------------------------------------------
2
+ 2023-10-23 21:04:58,061 Model: "SequenceTagger(
3
+ (embeddings): TransformerWordEmbeddings(
4
+ (model): BertModel(
5
+ (embeddings): BertEmbeddings(
6
+ (word_embeddings): Embedding(64001, 768)
7
+ (position_embeddings): Embedding(512, 768)
8
+ (token_type_embeddings): Embedding(2, 768)
9
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
10
+ (dropout): Dropout(p=0.1, inplace=False)
11
+ )
12
+ (encoder): BertEncoder(
13
+ (layer): ModuleList(
14
+ (0): BertLayer(
15
+ (attention): BertAttention(
16
+ (self): BertSelfAttention(
17
+ (query): Linear(in_features=768, out_features=768, bias=True)
18
+ (key): Linear(in_features=768, out_features=768, bias=True)
19
+ (value): Linear(in_features=768, out_features=768, bias=True)
20
+ (dropout): Dropout(p=0.1, inplace=False)
21
+ )
22
+ (output): BertSelfOutput(
23
+ (dense): Linear(in_features=768, out_features=768, bias=True)
24
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
25
+ (dropout): Dropout(p=0.1, inplace=False)
26
+ )
27
+ )
28
+ (intermediate): BertIntermediate(
29
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
30
+ (intermediate_act_fn): GELUActivation()
31
+ )
32
+ (output): BertOutput(
33
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
34
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
35
+ (dropout): Dropout(p=0.1, inplace=False)
36
+ )
37
+ )
38
+ (1): BertLayer(
39
+ (attention): BertAttention(
40
+ (self): BertSelfAttention(
41
+ (query): Linear(in_features=768, out_features=768, bias=True)
42
+ (key): Linear(in_features=768, out_features=768, bias=True)
43
+ (value): Linear(in_features=768, out_features=768, bias=True)
44
+ (dropout): Dropout(p=0.1, inplace=False)
45
+ )
46
+ (output): BertSelfOutput(
47
+ (dense): Linear(in_features=768, out_features=768, bias=True)
48
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
49
+ (dropout): Dropout(p=0.1, inplace=False)
50
+ )
51
+ )
52
+ (intermediate): BertIntermediate(
53
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
54
+ (intermediate_act_fn): GELUActivation()
55
+ )
56
+ (output): BertOutput(
57
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
58
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
59
+ (dropout): Dropout(p=0.1, inplace=False)
60
+ )
61
+ )
62
+ (2): BertLayer(
63
+ (attention): BertAttention(
64
+ (self): BertSelfAttention(
65
+ (query): Linear(in_features=768, out_features=768, bias=True)
66
+ (key): Linear(in_features=768, out_features=768, bias=True)
67
+ (value): Linear(in_features=768, out_features=768, bias=True)
68
+ (dropout): Dropout(p=0.1, inplace=False)
69
+ )
70
+ (output): BertSelfOutput(
71
+ (dense): Linear(in_features=768, out_features=768, bias=True)
72
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
73
+ (dropout): Dropout(p=0.1, inplace=False)
74
+ )
75
+ )
76
+ (intermediate): BertIntermediate(
77
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
78
+ (intermediate_act_fn): GELUActivation()
79
+ )
80
+ (output): BertOutput(
81
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
82
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
83
+ (dropout): Dropout(p=0.1, inplace=False)
84
+ )
85
+ )
86
+ (3): BertLayer(
87
+ (attention): BertAttention(
88
+ (self): BertSelfAttention(
89
+ (query): Linear(in_features=768, out_features=768, bias=True)
90
+ (key): Linear(in_features=768, out_features=768, bias=True)
91
+ (value): Linear(in_features=768, out_features=768, bias=True)
92
+ (dropout): Dropout(p=0.1, inplace=False)
93
+ )
94
+ (output): BertSelfOutput(
95
+ (dense): Linear(in_features=768, out_features=768, bias=True)
96
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
97
+ (dropout): Dropout(p=0.1, inplace=False)
98
+ )
99
+ )
100
+ (intermediate): BertIntermediate(
101
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
102
+ (intermediate_act_fn): GELUActivation()
103
+ )
104
+ (output): BertOutput(
105
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
106
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
107
+ (dropout): Dropout(p=0.1, inplace=False)
108
+ )
109
+ )
110
+ (4): BertLayer(
111
+ (attention): BertAttention(
112
+ (self): BertSelfAttention(
113
+ (query): Linear(in_features=768, out_features=768, bias=True)
114
+ (key): Linear(in_features=768, out_features=768, bias=True)
115
+ (value): Linear(in_features=768, out_features=768, bias=True)
116
+ (dropout): Dropout(p=0.1, inplace=False)
117
+ )
118
+ (output): BertSelfOutput(
119
+ (dense): Linear(in_features=768, out_features=768, bias=True)
120
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
121
+ (dropout): Dropout(p=0.1, inplace=False)
122
+ )
123
+ )
124
+ (intermediate): BertIntermediate(
125
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
126
+ (intermediate_act_fn): GELUActivation()
127
+ )
128
+ (output): BertOutput(
129
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
130
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
131
+ (dropout): Dropout(p=0.1, inplace=False)
132
+ )
133
+ )
134
+ (5): BertLayer(
135
+ (attention): BertAttention(
136
+ (self): BertSelfAttention(
137
+ (query): Linear(in_features=768, out_features=768, bias=True)
138
+ (key): Linear(in_features=768, out_features=768, bias=True)
139
+ (value): Linear(in_features=768, out_features=768, bias=True)
140
+ (dropout): Dropout(p=0.1, inplace=False)
141
+ )
142
+ (output): BertSelfOutput(
143
+ (dense): Linear(in_features=768, out_features=768, bias=True)
144
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
145
+ (dropout): Dropout(p=0.1, inplace=False)
146
+ )
147
+ )
148
+ (intermediate): BertIntermediate(
149
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
150
+ (intermediate_act_fn): GELUActivation()
151
+ )
152
+ (output): BertOutput(
153
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
154
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
155
+ (dropout): Dropout(p=0.1, inplace=False)
156
+ )
157
+ )
158
+ (6): BertLayer(
159
+ (attention): BertAttention(
160
+ (self): BertSelfAttention(
161
+ (query): Linear(in_features=768, out_features=768, bias=True)
162
+ (key): Linear(in_features=768, out_features=768, bias=True)
163
+ (value): Linear(in_features=768, out_features=768, bias=True)
164
+ (dropout): Dropout(p=0.1, inplace=False)
165
+ )
166
+ (output): BertSelfOutput(
167
+ (dense): Linear(in_features=768, out_features=768, bias=True)
168
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
169
+ (dropout): Dropout(p=0.1, inplace=False)
170
+ )
171
+ )
172
+ (intermediate): BertIntermediate(
173
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
174
+ (intermediate_act_fn): GELUActivation()
175
+ )
176
+ (output): BertOutput(
177
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
178
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
179
+ (dropout): Dropout(p=0.1, inplace=False)
180
+ )
181
+ )
182
+ (7): BertLayer(
183
+ (attention): BertAttention(
184
+ (self): BertSelfAttention(
185
+ (query): Linear(in_features=768, out_features=768, bias=True)
186
+ (key): Linear(in_features=768, out_features=768, bias=True)
187
+ (value): Linear(in_features=768, out_features=768, bias=True)
188
+ (dropout): Dropout(p=0.1, inplace=False)
189
+ )
190
+ (output): BertSelfOutput(
191
+ (dense): Linear(in_features=768, out_features=768, bias=True)
192
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
193
+ (dropout): Dropout(p=0.1, inplace=False)
194
+ )
195
+ )
196
+ (intermediate): BertIntermediate(
197
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
198
+ (intermediate_act_fn): GELUActivation()
199
+ )
200
+ (output): BertOutput(
201
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
202
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
203
+ (dropout): Dropout(p=0.1, inplace=False)
204
+ )
205
+ )
206
+ (8): BertLayer(
207
+ (attention): BertAttention(
208
+ (self): BertSelfAttention(
209
+ (query): Linear(in_features=768, out_features=768, bias=True)
210
+ (key): Linear(in_features=768, out_features=768, bias=True)
211
+ (value): Linear(in_features=768, out_features=768, bias=True)
212
+ (dropout): Dropout(p=0.1, inplace=False)
213
+ )
214
+ (output): BertSelfOutput(
215
+ (dense): Linear(in_features=768, out_features=768, bias=True)
216
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
217
+ (dropout): Dropout(p=0.1, inplace=False)
218
+ )
219
+ )
220
+ (intermediate): BertIntermediate(
221
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
222
+ (intermediate_act_fn): GELUActivation()
223
+ )
224
+ (output): BertOutput(
225
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
226
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
227
+ (dropout): Dropout(p=0.1, inplace=False)
228
+ )
229
+ )
230
+ (9): BertLayer(
231
+ (attention): BertAttention(
232
+ (self): BertSelfAttention(
233
+ (query): Linear(in_features=768, out_features=768, bias=True)
234
+ (key): Linear(in_features=768, out_features=768, bias=True)
235
+ (value): Linear(in_features=768, out_features=768, bias=True)
236
+ (dropout): Dropout(p=0.1, inplace=False)
237
+ )
238
+ (output): BertSelfOutput(
239
+ (dense): Linear(in_features=768, out_features=768, bias=True)
240
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
241
+ (dropout): Dropout(p=0.1, inplace=False)
242
+ )
243
+ )
244
+ (intermediate): BertIntermediate(
245
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
246
+ (intermediate_act_fn): GELUActivation()
247
+ )
248
+ (output): BertOutput(
249
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
250
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
251
+ (dropout): Dropout(p=0.1, inplace=False)
252
+ )
253
+ )
254
+ (10): BertLayer(
255
+ (attention): BertAttention(
256
+ (self): BertSelfAttention(
257
+ (query): Linear(in_features=768, out_features=768, bias=True)
258
+ (key): Linear(in_features=768, out_features=768, bias=True)
259
+ (value): Linear(in_features=768, out_features=768, bias=True)
260
+ (dropout): Dropout(p=0.1, inplace=False)
261
+ )
262
+ (output): BertSelfOutput(
263
+ (dense): Linear(in_features=768, out_features=768, bias=True)
264
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
265
+ (dropout): Dropout(p=0.1, inplace=False)
266
+ )
267
+ )
268
+ (intermediate): BertIntermediate(
269
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
270
+ (intermediate_act_fn): GELUActivation()
271
+ )
272
+ (output): BertOutput(
273
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
274
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
275
+ (dropout): Dropout(p=0.1, inplace=False)
276
+ )
277
+ )
278
+ (11): BertLayer(
279
+ (attention): BertAttention(
280
+ (self): BertSelfAttention(
281
+ (query): Linear(in_features=768, out_features=768, bias=True)
282
+ (key): Linear(in_features=768, out_features=768, bias=True)
283
+ (value): Linear(in_features=768, out_features=768, bias=True)
284
+ (dropout): Dropout(p=0.1, inplace=False)
285
+ )
286
+ (output): BertSelfOutput(
287
+ (dense): Linear(in_features=768, out_features=768, bias=True)
288
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
289
+ (dropout): Dropout(p=0.1, inplace=False)
290
+ )
291
+ )
292
+ (intermediate): BertIntermediate(
293
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
294
+ (intermediate_act_fn): GELUActivation()
295
+ )
296
+ (output): BertOutput(
297
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
298
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
299
+ (dropout): Dropout(p=0.1, inplace=False)
300
+ )
301
+ )
302
+ )
303
+ )
304
+ (pooler): BertPooler(
305
+ (dense): Linear(in_features=768, out_features=768, bias=True)
306
+ (activation): Tanh()
307
+ )
308
+ )
309
+ )
310
+ (locked_dropout): LockedDropout(p=0.5)
311
+ (linear): Linear(in_features=768, out_features=21, bias=True)
312
+ (loss_function): CrossEntropyLoss()
313
+ )"
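The repr above describes a Flair SequenceTagger without CRF or RNN: transformer word embeddings pass through LockedDropout(p=0.5) and a single Linear(768, 21) layer, trained with CrossEntropyLoss. The following is a hedged construction sketch only; the checkpoint name, layer selection and subtoken pooling are inferred from the base path reported further below ("...64k-td-cased...poolingfirst-layers-1-crfFalse..."), not stated verbatim in this repr:

```python
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

# Corpus as named in the log; extra constructor options (e.g. document separators) are omitted.
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="de")
label_dict = corpus.make_label_dictionary(label_type="ner", add_unk=False)  # 21 tags incl. O

embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-64k-td-cased",  # assumed checkpoint
    layers="-1",                 # last transformer layer only ("layers-1" in the base path)
    subtoken_pooling="first",    # "poolingfirst" in the base path
    fine_tune=True,
)

tagger = SequenceTagger(
    hidden_size=256,             # ignored with use_rnn=False; kept for the constructor
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,               # matches "crfFalse" and the CrossEntropyLoss in the repr
    use_rnn=False,
    reproject_embeddings=False,  # Linear(768 -> 21) applied directly, as shown above
)
```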
+ 2023-10-23 21:04:58,061 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 21:04:58,061 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
+ - NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /home/ubuntu/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
+ 2023-10-23 21:04:58,061 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 21:04:58,061 Train: 3575 sentences
+ 2023-10-23 21:04:58,061 (train_with_dev=False, train_with_test=False)
+ 2023-10-23 21:04:58,061 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 21:04:58,061 Training Params:
+ 2023-10-23 21:04:58,061 - learning_rate: "3e-05"
+ 2023-10-23 21:04:58,061 - mini_batch_size: "8"
+ 2023-10-23 21:04:58,061 - max_epochs: "10"
+ 2023-10-23 21:04:58,061 - shuffle: "True"
+ 2023-10-23 21:04:58,061 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 21:04:58,061 Plugins:
+ 2023-10-23 21:04:58,061 - TensorboardLogger
+ 2023-10-23 21:04:58,061 - LinearScheduler | warmup_fraction: '0.1'
+ 2023-10-23 21:04:58,061 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 21:04:58,061 Final evaluation on model from best epoch (best-model.pt)
+ 2023-10-23 21:04:58,061 - metric: "('micro avg', 'f1-score')"
+ 2023-10-23 21:04:58,062 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 21:04:58,062 Computation:
+ 2023-10-23 21:04:58,062 - compute on device: cuda:0
+ 2023-10-23 21:04:58,062 - embedding storage: none
+ 2023-10-23 21:04:58,062 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 21:04:58,062 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
+ 2023-10-23 21:04:58,062 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 21:04:58,062 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 21:04:58,062 Logging anything other than scalars to TensorBoard is currently not supported.
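The parameters above (learning rate 3e-05, mini-batch size 8, 10 epochs, linear schedule with 0.1 warmup fraction) match Flair's fine-tuning entry point. A hedged sketch that reuses the corpus and tagger from the earlier snippet; the warmup_fraction keyword and the omission of the TensorBoard plugin wiring are assumptions about the exact Flair version used here:

```python
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# fine_tune() defaults to AdamW with a linear warmup/decay schedule; 0.1 mirrors the
# "LinearScheduler | warmup_fraction: '0.1'" plugin line above. The TensorboardLogger
# plugin is omitted because its setup differs across Flair versions.
trainer.fine_tune(
    "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2",
    learning_rate=3e-05,
    mini_batch_size=8,
    max_epochs=10,
    warmup_fraction=0.1,
)
```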
342
+ 2023-10-23 21:05:01,858 epoch 1 - iter 44/447 - loss 2.77674307 - time (sec): 3.80 - samples/sec: 2068.88 - lr: 0.000003 - momentum: 0.000000
343
+ 2023-10-23 21:05:05,973 epoch 1 - iter 88/447 - loss 1.75956664 - time (sec): 7.91 - samples/sec: 2088.59 - lr: 0.000006 - momentum: 0.000000
344
+ 2023-10-23 21:05:10,065 epoch 1 - iter 132/447 - loss 1.30302502 - time (sec): 12.00 - samples/sec: 2082.97 - lr: 0.000009 - momentum: 0.000000
345
+ 2023-10-23 21:05:14,086 epoch 1 - iter 176/447 - loss 1.08268102 - time (sec): 16.02 - samples/sec: 2078.90 - lr: 0.000012 - momentum: 0.000000
346
+ 2023-10-23 21:05:17,972 epoch 1 - iter 220/447 - loss 0.93560897 - time (sec): 19.91 - samples/sec: 2103.90 - lr: 0.000015 - momentum: 0.000000
347
+ 2023-10-23 21:05:21,763 epoch 1 - iter 264/447 - loss 0.83394592 - time (sec): 23.70 - samples/sec: 2104.11 - lr: 0.000018 - momentum: 0.000000
348
+ 2023-10-23 21:05:25,684 epoch 1 - iter 308/447 - loss 0.75226450 - time (sec): 27.62 - samples/sec: 2106.94 - lr: 0.000021 - momentum: 0.000000
349
+ 2023-10-23 21:05:29,651 epoch 1 - iter 352/447 - loss 0.68133343 - time (sec): 31.59 - samples/sec: 2108.95 - lr: 0.000024 - momentum: 0.000000
350
+ 2023-10-23 21:05:34,085 epoch 1 - iter 396/447 - loss 0.62684172 - time (sec): 36.02 - samples/sec: 2124.23 - lr: 0.000027 - momentum: 0.000000
351
+ 2023-10-23 21:05:37,891 epoch 1 - iter 440/447 - loss 0.58441638 - time (sec): 39.83 - samples/sec: 2137.74 - lr: 0.000029 - momentum: 0.000000
352
+ 2023-10-23 21:05:38,506 ----------------------------------------------------------------------------------------------------
353
+ 2023-10-23 21:05:38,506 EPOCH 1 done: loss 0.5780 - lr: 0.000029
354
+ 2023-10-23 21:05:43,315 DEV : loss 0.15914756059646606 - f1-score (micro avg) 0.5805
355
+ 2023-10-23 21:05:43,335 saving best model
356
+ 2023-10-23 21:05:43,804 ----------------------------------------------------------------------------------------------------
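The "lr" column above ramps from about 3e-06 up to the 3e-05 peak during epoch 1 and then decays linearly toward zero by the end of epoch 10, which is the LinearScheduler with a 0.1 warmup fraction listed under Plugins. A small, self-contained sketch of that shape; the step counts (447 iterations per epoch, 10 epochs) are read off the log, and the exact rounding of Flair's scheduler may differ:

```python
def linear_warmup_decay_lr(step: int, total_steps: int = 4470,
                           peak_lr: float = 3e-05, warmup_fraction: float = 0.1) -> float:
    """Linear warmup to peak_lr over the first warmup_fraction of steps, then linear decay
    to zero: the shape visible in the per-iteration 'lr' values of this log."""
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    return peak_lr * (total_steps - step) / max(1, total_steps - warmup_steps)

# Roughly 3e-06 near iter 44 of epoch 1, ~3e-05 at the end of warmup, 0.0 at the final step.
print(linear_warmup_decay_lr(44), linear_warmup_decay_lr(447), linear_warmup_decay_lr(4470))
```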
357
+ 2023-10-23 21:05:47,532 epoch 2 - iter 44/447 - loss 0.17247700 - time (sec): 3.73 - samples/sec: 2206.20 - lr: 0.000030 - momentum: 0.000000
358
+ 2023-10-23 21:05:51,556 epoch 2 - iter 88/447 - loss 0.15210658 - time (sec): 7.75 - samples/sec: 2170.56 - lr: 0.000029 - momentum: 0.000000
359
+ 2023-10-23 21:05:55,638 epoch 2 - iter 132/447 - loss 0.14415904 - time (sec): 11.83 - samples/sec: 2166.91 - lr: 0.000029 - momentum: 0.000000
360
+ 2023-10-23 21:05:59,766 epoch 2 - iter 176/447 - loss 0.14348377 - time (sec): 15.96 - samples/sec: 2153.80 - lr: 0.000029 - momentum: 0.000000
361
+ 2023-10-23 21:06:03,542 epoch 2 - iter 220/447 - loss 0.13706169 - time (sec): 19.74 - samples/sec: 2132.36 - lr: 0.000028 - momentum: 0.000000
362
+ 2023-10-23 21:06:07,684 epoch 2 - iter 264/447 - loss 0.13827372 - time (sec): 23.88 - samples/sec: 2135.49 - lr: 0.000028 - momentum: 0.000000
363
+ 2023-10-23 21:06:11,700 epoch 2 - iter 308/447 - loss 0.13580790 - time (sec): 27.90 - samples/sec: 2140.60 - lr: 0.000028 - momentum: 0.000000
364
+ 2023-10-23 21:06:15,332 epoch 2 - iter 352/447 - loss 0.13583544 - time (sec): 31.53 - samples/sec: 2145.24 - lr: 0.000027 - momentum: 0.000000
365
+ 2023-10-23 21:06:19,814 epoch 2 - iter 396/447 - loss 0.13680668 - time (sec): 36.01 - samples/sec: 2139.14 - lr: 0.000027 - momentum: 0.000000
366
+ 2023-10-23 21:06:23,630 epoch 2 - iter 440/447 - loss 0.13356993 - time (sec): 39.83 - samples/sec: 2137.55 - lr: 0.000027 - momentum: 0.000000
367
+ 2023-10-23 21:06:24,229 ----------------------------------------------------------------------------------------------------
368
+ 2023-10-23 21:06:24,230 EPOCH 2 done: loss 0.1328 - lr: 0.000027
369
+ 2023-10-23 21:06:30,708 DEV : loss 0.12941311299800873 - f1-score (micro avg) 0.7109
370
+ 2023-10-23 21:06:30,728 saving best model
371
+ 2023-10-23 21:06:31,322 ----------------------------------------------------------------------------------------------------
372
+ 2023-10-23 21:06:35,398 epoch 3 - iter 44/447 - loss 0.05897943 - time (sec): 4.07 - samples/sec: 2144.85 - lr: 0.000026 - momentum: 0.000000
373
+ 2023-10-23 21:06:39,480 epoch 3 - iter 88/447 - loss 0.07267667 - time (sec): 8.16 - samples/sec: 2139.98 - lr: 0.000026 - momentum: 0.000000
374
+ 2023-10-23 21:06:43,614 epoch 3 - iter 132/447 - loss 0.07124643 - time (sec): 12.29 - samples/sec: 2163.33 - lr: 0.000026 - momentum: 0.000000
375
+ 2023-10-23 21:06:47,536 epoch 3 - iter 176/447 - loss 0.06874515 - time (sec): 16.21 - samples/sec: 2126.51 - lr: 0.000025 - momentum: 0.000000
376
+ 2023-10-23 21:06:51,418 epoch 3 - iter 220/447 - loss 0.06845317 - time (sec): 20.10 - samples/sec: 2144.00 - lr: 0.000025 - momentum: 0.000000
377
+ 2023-10-23 21:06:55,177 epoch 3 - iter 264/447 - loss 0.06741457 - time (sec): 23.85 - samples/sec: 2150.15 - lr: 0.000025 - momentum: 0.000000
378
+ 2023-10-23 21:06:59,046 epoch 3 - iter 308/447 - loss 0.06763183 - time (sec): 27.72 - samples/sec: 2142.44 - lr: 0.000024 - momentum: 0.000000
379
+ 2023-10-23 21:07:03,214 epoch 3 - iter 352/447 - loss 0.06611837 - time (sec): 31.89 - samples/sec: 2146.71 - lr: 0.000024 - momentum: 0.000000
380
+ 2023-10-23 21:07:07,031 epoch 3 - iter 396/447 - loss 0.06604370 - time (sec): 35.71 - samples/sec: 2149.25 - lr: 0.000024 - momentum: 0.000000
381
+ 2023-10-23 21:07:11,156 epoch 3 - iter 440/447 - loss 0.06746680 - time (sec): 39.83 - samples/sec: 2134.13 - lr: 0.000023 - momentum: 0.000000
382
+ 2023-10-23 21:07:11,816 ----------------------------------------------------------------------------------------------------
383
+ 2023-10-23 21:07:11,816 EPOCH 3 done: loss 0.0677 - lr: 0.000023
384
+ 2023-10-23 21:07:18,319 DEV : loss 0.13155309855937958 - f1-score (micro avg) 0.7518
385
+ 2023-10-23 21:07:18,339 saving best model
386
+ 2023-10-23 21:07:18,912 ----------------------------------------------------------------------------------------------------
387
+ 2023-10-23 21:07:22,645 epoch 4 - iter 44/447 - loss 0.04956695 - time (sec): 3.73 - samples/sec: 2133.43 - lr: 0.000023 - momentum: 0.000000
388
+ 2023-10-23 21:07:26,708 epoch 4 - iter 88/447 - loss 0.04093136 - time (sec): 7.79 - samples/sec: 2113.74 - lr: 0.000023 - momentum: 0.000000
389
+ 2023-10-23 21:07:30,759 epoch 4 - iter 132/447 - loss 0.03912973 - time (sec): 11.85 - samples/sec: 2133.26 - lr: 0.000022 - momentum: 0.000000
390
+ 2023-10-23 21:07:34,940 epoch 4 - iter 176/447 - loss 0.03954112 - time (sec): 16.03 - samples/sec: 2116.49 - lr: 0.000022 - momentum: 0.000000
391
+ 2023-10-23 21:07:39,131 epoch 4 - iter 220/447 - loss 0.03912372 - time (sec): 20.22 - samples/sec: 2109.00 - lr: 0.000022 - momentum: 0.000000
392
+ 2023-10-23 21:07:43,162 epoch 4 - iter 264/447 - loss 0.04029854 - time (sec): 24.25 - samples/sec: 2116.55 - lr: 0.000021 - momentum: 0.000000
393
+ 2023-10-23 21:07:47,407 epoch 4 - iter 308/447 - loss 0.04017848 - time (sec): 28.49 - samples/sec: 2117.43 - lr: 0.000021 - momentum: 0.000000
394
+ 2023-10-23 21:07:51,304 epoch 4 - iter 352/447 - loss 0.03995350 - time (sec): 32.39 - samples/sec: 2121.72 - lr: 0.000021 - momentum: 0.000000
395
+ 2023-10-23 21:07:55,232 epoch 4 - iter 396/447 - loss 0.04119580 - time (sec): 36.32 - samples/sec: 2122.18 - lr: 0.000020 - momentum: 0.000000
396
+ 2023-10-23 21:07:59,018 epoch 4 - iter 440/447 - loss 0.04224669 - time (sec): 40.10 - samples/sec: 2126.79 - lr: 0.000020 - momentum: 0.000000
397
+ 2023-10-23 21:07:59,609 ----------------------------------------------------------------------------------------------------
398
+ 2023-10-23 21:07:59,609 EPOCH 4 done: loss 0.0431 - lr: 0.000020
399
+ 2023-10-23 21:08:06,099 DEV : loss 0.17578744888305664 - f1-score (micro avg) 0.764
400
+ 2023-10-23 21:08:06,120 saving best model
401
+ 2023-10-23 21:08:06,714 ----------------------------------------------------------------------------------------------------
402
+ 2023-10-23 21:08:10,724 epoch 5 - iter 44/447 - loss 0.03173418 - time (sec): 4.01 - samples/sec: 2167.42 - lr: 0.000020 - momentum: 0.000000
403
+ 2023-10-23 21:08:14,826 epoch 5 - iter 88/447 - loss 0.03285878 - time (sec): 8.11 - samples/sec: 2075.18 - lr: 0.000019 - momentum: 0.000000
404
+ 2023-10-23 21:08:18,587 epoch 5 - iter 132/447 - loss 0.03247897 - time (sec): 11.87 - samples/sec: 2091.33 - lr: 0.000019 - momentum: 0.000000
405
+ 2023-10-23 21:08:22,960 epoch 5 - iter 176/447 - loss 0.03016416 - time (sec): 16.24 - samples/sec: 2100.92 - lr: 0.000019 - momentum: 0.000000
406
+ 2023-10-23 21:08:26,816 epoch 5 - iter 220/447 - loss 0.02778420 - time (sec): 20.10 - samples/sec: 2102.65 - lr: 0.000018 - momentum: 0.000000
407
+ 2023-10-23 21:08:30,624 epoch 5 - iter 264/447 - loss 0.02814833 - time (sec): 23.91 - samples/sec: 2103.17 - lr: 0.000018 - momentum: 0.000000
408
+ 2023-10-23 21:08:35,077 epoch 5 - iter 308/447 - loss 0.02586079 - time (sec): 28.36 - samples/sec: 2110.52 - lr: 0.000018 - momentum: 0.000000
409
+ 2023-10-23 21:08:39,022 epoch 5 - iter 352/447 - loss 0.02544490 - time (sec): 32.31 - samples/sec: 2113.41 - lr: 0.000017 - momentum: 0.000000
410
+ 2023-10-23 21:08:42,951 epoch 5 - iter 396/447 - loss 0.02670970 - time (sec): 36.24 - samples/sec: 2126.29 - lr: 0.000017 - momentum: 0.000000
411
+ 2023-10-23 21:08:46,739 epoch 5 - iter 440/447 - loss 0.02604997 - time (sec): 40.02 - samples/sec: 2134.99 - lr: 0.000017 - momentum: 0.000000
412
+ 2023-10-23 21:08:47,301 ----------------------------------------------------------------------------------------------------
413
+ 2023-10-23 21:08:47,301 EPOCH 5 done: loss 0.0260 - lr: 0.000017
414
+ 2023-10-23 21:08:53,795 DEV : loss 0.19835765659809113 - f1-score (micro avg) 0.7738
415
+ 2023-10-23 21:08:53,815 saving best model
416
+ 2023-10-23 21:08:54,418 ----------------------------------------------------------------------------------------------------
417
+ 2023-10-23 21:08:58,311 epoch 6 - iter 44/447 - loss 0.02184566 - time (sec): 3.89 - samples/sec: 2032.43 - lr: 0.000016 - momentum: 0.000000
418
+ 2023-10-23 21:09:02,255 epoch 6 - iter 88/447 - loss 0.02189035 - time (sec): 7.84 - samples/sec: 2042.74 - lr: 0.000016 - momentum: 0.000000
419
+ 2023-10-23 21:09:06,400 epoch 6 - iter 132/447 - loss 0.01858513 - time (sec): 11.98 - samples/sec: 2069.63 - lr: 0.000016 - momentum: 0.000000
420
+ 2023-10-23 21:09:10,451 epoch 6 - iter 176/447 - loss 0.01839336 - time (sec): 16.03 - samples/sec: 2120.22 - lr: 0.000015 - momentum: 0.000000
421
+ 2023-10-23 21:09:14,439 epoch 6 - iter 220/447 - loss 0.01779606 - time (sec): 20.02 - samples/sec: 2132.02 - lr: 0.000015 - momentum: 0.000000
422
+ 2023-10-23 21:09:18,507 epoch 6 - iter 264/447 - loss 0.01809152 - time (sec): 24.09 - samples/sec: 2112.60 - lr: 0.000015 - momentum: 0.000000
423
+ 2023-10-23 21:09:22,335 epoch 6 - iter 308/447 - loss 0.01799876 - time (sec): 27.92 - samples/sec: 2123.56 - lr: 0.000014 - momentum: 0.000000
424
+ 2023-10-23 21:09:26,257 epoch 6 - iter 352/447 - loss 0.01963981 - time (sec): 31.84 - samples/sec: 2130.81 - lr: 0.000014 - momentum: 0.000000
425
+ 2023-10-23 21:09:30,544 epoch 6 - iter 396/447 - loss 0.01948104 - time (sec): 36.13 - samples/sec: 2122.38 - lr: 0.000014 - momentum: 0.000000
426
+ 2023-10-23 21:09:34,423 epoch 6 - iter 440/447 - loss 0.01934598 - time (sec): 40.00 - samples/sec: 2135.33 - lr: 0.000013 - momentum: 0.000000
427
+ 2023-10-23 21:09:34,989 ----------------------------------------------------------------------------------------------------
428
+ 2023-10-23 21:09:34,990 EPOCH 6 done: loss 0.0195 - lr: 0.000013
429
+ 2023-10-23 21:09:41,480 DEV : loss 0.2243068516254425 - f1-score (micro avg) 0.7653
430
+ 2023-10-23 21:09:41,500 ----------------------------------------------------------------------------------------------------
431
+ 2023-10-23 21:09:45,224 epoch 7 - iter 44/447 - loss 0.00920856 - time (sec): 3.72 - samples/sec: 2231.28 - lr: 0.000013 - momentum: 0.000000
432
+ 2023-10-23 21:09:49,298 epoch 7 - iter 88/447 - loss 0.00662161 - time (sec): 7.80 - samples/sec: 2168.79 - lr: 0.000013 - momentum: 0.000000
433
+ 2023-10-23 21:09:53,791 epoch 7 - iter 132/447 - loss 0.00770613 - time (sec): 12.29 - samples/sec: 2141.83 - lr: 0.000012 - momentum: 0.000000
434
+ 2023-10-23 21:09:57,731 epoch 7 - iter 176/447 - loss 0.00876968 - time (sec): 16.23 - samples/sec: 2139.89 - lr: 0.000012 - momentum: 0.000000
435
+ 2023-10-23 21:10:01,668 epoch 7 - iter 220/447 - loss 0.01105371 - time (sec): 20.17 - samples/sec: 2133.73 - lr: 0.000012 - momentum: 0.000000
436
+ 2023-10-23 21:10:05,750 epoch 7 - iter 264/447 - loss 0.01232071 - time (sec): 24.25 - samples/sec: 2134.11 - lr: 0.000011 - momentum: 0.000000
437
+ 2023-10-23 21:10:09,797 epoch 7 - iter 308/447 - loss 0.01229655 - time (sec): 28.30 - samples/sec: 2129.39 - lr: 0.000011 - momentum: 0.000000
438
+ 2023-10-23 21:10:13,591 epoch 7 - iter 352/447 - loss 0.01195873 - time (sec): 32.09 - samples/sec: 2131.21 - lr: 0.000011 - momentum: 0.000000
439
+ 2023-10-23 21:10:17,520 epoch 7 - iter 396/447 - loss 0.01218580 - time (sec): 36.02 - samples/sec: 2138.12 - lr: 0.000010 - momentum: 0.000000
440
+ 2023-10-23 21:10:21,420 epoch 7 - iter 440/447 - loss 0.01243524 - time (sec): 39.92 - samples/sec: 2141.02 - lr: 0.000010 - momentum: 0.000000
441
+ 2023-10-23 21:10:21,970 ----------------------------------------------------------------------------------------------------
442
+ 2023-10-23 21:10:21,971 EPOCH 7 done: loss 0.0123 - lr: 0.000010
443
+ 2023-10-23 21:10:28,450 DEV : loss 0.23942111432552338 - f1-score (micro avg) 0.7782
444
+ 2023-10-23 21:10:28,471 saving best model
445
+ 2023-10-23 21:10:29,061 ----------------------------------------------------------------------------------------------------
446
+ 2023-10-23 21:10:32,915 epoch 8 - iter 44/447 - loss 0.01227698 - time (sec): 3.85 - samples/sec: 2174.91 - lr: 0.000010 - momentum: 0.000000
447
+ 2023-10-23 21:10:36,826 epoch 8 - iter 88/447 - loss 0.01241903 - time (sec): 7.76 - samples/sec: 2170.56 - lr: 0.000009 - momentum: 0.000000
448
+ 2023-10-23 21:10:40,673 epoch 8 - iter 132/447 - loss 0.01170822 - time (sec): 11.61 - samples/sec: 2125.15 - lr: 0.000009 - momentum: 0.000000
449
+ 2023-10-23 21:10:45,301 epoch 8 - iter 176/447 - loss 0.00886420 - time (sec): 16.24 - samples/sec: 2140.47 - lr: 0.000009 - momentum: 0.000000
450
+ 2023-10-23 21:10:49,255 epoch 8 - iter 220/447 - loss 0.00834151 - time (sec): 20.19 - samples/sec: 2146.67 - lr: 0.000008 - momentum: 0.000000
451
+ 2023-10-23 21:10:52,900 epoch 8 - iter 264/447 - loss 0.00734419 - time (sec): 23.84 - samples/sec: 2125.14 - lr: 0.000008 - momentum: 0.000000
452
+ 2023-10-23 21:10:57,066 epoch 8 - iter 308/447 - loss 0.00699004 - time (sec): 28.00 - samples/sec: 2125.54 - lr: 0.000008 - momentum: 0.000000
453
+ 2023-10-23 21:11:01,039 epoch 8 - iter 352/447 - loss 0.00726347 - time (sec): 31.98 - samples/sec: 2127.42 - lr: 0.000007 - momentum: 0.000000
454
+ 2023-10-23 21:11:05,458 epoch 8 - iter 396/447 - loss 0.00762904 - time (sec): 36.40 - samples/sec: 2122.16 - lr: 0.000007 - momentum: 0.000000
455
+ 2023-10-23 21:11:09,201 epoch 8 - iter 440/447 - loss 0.00722230 - time (sec): 40.14 - samples/sec: 2121.45 - lr: 0.000007 - momentum: 0.000000
456
+ 2023-10-23 21:11:09,820 ----------------------------------------------------------------------------------------------------
457
+ 2023-10-23 21:11:09,820 EPOCH 8 done: loss 0.0072 - lr: 0.000007
458
+ 2023-10-23 21:11:16,024 DEV : loss 0.24051210284233093 - f1-score (micro avg) 0.7845
459
+ 2023-10-23 21:11:16,045 saving best model
460
+ 2023-10-23 21:11:16,944 ----------------------------------------------------------------------------------------------------
461
+ 2023-10-23 21:11:20,570 epoch 9 - iter 44/447 - loss 0.00666362 - time (sec): 3.62 - samples/sec: 2217.26 - lr: 0.000006 - momentum: 0.000000
462
+ 2023-10-23 21:11:24,590 epoch 9 - iter 88/447 - loss 0.00577915 - time (sec): 7.65 - samples/sec: 2129.85 - lr: 0.000006 - momentum: 0.000000
463
+ 2023-10-23 21:11:28,832 epoch 9 - iter 132/447 - loss 0.00485046 - time (sec): 11.89 - samples/sec: 2118.85 - lr: 0.000006 - momentum: 0.000000
464
+ 2023-10-23 21:11:32,648 epoch 9 - iter 176/447 - loss 0.00491996 - time (sec): 15.70 - samples/sec: 2142.21 - lr: 0.000005 - momentum: 0.000000
465
+ 2023-10-23 21:11:36,618 epoch 9 - iter 220/447 - loss 0.00513214 - time (sec): 19.67 - samples/sec: 2151.50 - lr: 0.000005 - momentum: 0.000000
466
+ 2023-10-23 21:11:40,891 epoch 9 - iter 264/447 - loss 0.00479915 - time (sec): 23.95 - samples/sec: 2148.31 - lr: 0.000005 - momentum: 0.000000
467
+ 2023-10-23 21:11:45,144 epoch 9 - iter 308/447 - loss 0.00461094 - time (sec): 28.20 - samples/sec: 2147.40 - lr: 0.000004 - momentum: 0.000000
468
+ 2023-10-23 21:11:48,902 epoch 9 - iter 352/447 - loss 0.00508793 - time (sec): 31.96 - samples/sec: 2144.23 - lr: 0.000004 - momentum: 0.000000
469
+ 2023-10-23 21:11:52,648 epoch 9 - iter 396/447 - loss 0.00553986 - time (sec): 35.70 - samples/sec: 2147.82 - lr: 0.000004 - momentum: 0.000000
470
+ 2023-10-23 21:11:56,680 epoch 9 - iter 440/447 - loss 0.00517049 - time (sec): 39.74 - samples/sec: 2149.08 - lr: 0.000003 - momentum: 0.000000
471
+ 2023-10-23 21:11:57,308 ----------------------------------------------------------------------------------------------------
472
+ 2023-10-23 21:11:57,309 EPOCH 9 done: loss 0.0051 - lr: 0.000003
473
+ 2023-10-23 21:12:03,519 DEV : loss 0.2497478574514389 - f1-score (micro avg) 0.7909
474
+ 2023-10-23 21:12:03,540 saving best model
475
+ 2023-10-23 21:12:04,111 ----------------------------------------------------------------------------------------------------
476
+ 2023-10-23 21:12:08,002 epoch 10 - iter 44/447 - loss 0.00252055 - time (sec): 3.89 - samples/sec: 2207.00 - lr: 0.000003 - momentum: 0.000000
477
+ 2023-10-23 21:12:12,159 epoch 10 - iter 88/447 - loss 0.00205105 - time (sec): 8.05 - samples/sec: 2163.44 - lr: 0.000003 - momentum: 0.000000
478
+ 2023-10-23 21:12:16,370 epoch 10 - iter 132/447 - loss 0.00162416 - time (sec): 12.26 - samples/sec: 2100.99 - lr: 0.000002 - momentum: 0.000000
479
+ 2023-10-23 21:12:20,040 epoch 10 - iter 176/447 - loss 0.00314123 - time (sec): 15.93 - samples/sec: 2135.69 - lr: 0.000002 - momentum: 0.000000
480
+ 2023-10-23 21:12:24,341 epoch 10 - iter 220/447 - loss 0.00360926 - time (sec): 20.23 - samples/sec: 2144.50 - lr: 0.000002 - momentum: 0.000000
481
+ 2023-10-23 21:12:28,106 epoch 10 - iter 264/447 - loss 0.00340675 - time (sec): 23.99 - samples/sec: 2135.76 - lr: 0.000001 - momentum: 0.000000
482
+ 2023-10-23 21:12:31,919 epoch 10 - iter 308/447 - loss 0.00353423 - time (sec): 27.81 - samples/sec: 2147.17 - lr: 0.000001 - momentum: 0.000000
483
+ 2023-10-23 21:12:36,013 epoch 10 - iter 352/447 - loss 0.00314415 - time (sec): 31.90 - samples/sec: 2139.65 - lr: 0.000001 - momentum: 0.000000
484
+ 2023-10-23 21:12:40,324 epoch 10 - iter 396/447 - loss 0.00312445 - time (sec): 36.21 - samples/sec: 2122.16 - lr: 0.000000 - momentum: 0.000000
485
+ 2023-10-23 21:12:44,382 epoch 10 - iter 440/447 - loss 0.00304078 - time (sec): 40.27 - samples/sec: 2116.62 - lr: 0.000000 - momentum: 0.000000
486
+ 2023-10-23 21:12:45,004 ----------------------------------------------------------------------------------------------------
487
+ 2023-10-23 21:12:45,004 EPOCH 10 done: loss 0.0030 - lr: 0.000000
488
+ 2023-10-23 21:12:51,224 DEV : loss 0.25497499108314514 - f1-score (micro avg) 0.7901
+ 2023-10-23 21:12:51,722 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 21:12:51,723 Loading model from best epoch ...
+ 2023-10-23 21:12:53,466 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
+ 2023-10-23 21:12:58,280
+ Results:
+ - F-score (micro) 0.7524
+ - F-score (macro) 0.665
+ - Accuracy 0.6214
+
+ By class:
+               precision    recall  f1-score   support
+
+          loc     0.8280    0.8641    0.8456       596
+         pers     0.7064    0.7658    0.7349       333
+          org     0.4706    0.4848    0.4776       132
+         prod     0.6071    0.5152    0.5574        66
+         time     0.7500    0.6735    0.7097        49
+
+    micro avg     0.7391    0.7662    0.7524      1176
+    macro avg     0.6724    0.6607    0.6650      1176
+ weighted avg     0.7378    0.7662    0.7511      1176
+
+ 2023-10-23 21:12:58,280 ----------------------------------------------------------------------------------------------------
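As a sanity check, the micro-averaged F1 in the table follows directly from the micro precision and recall: 2 * 0.7391 * 0.7662 / (0.7391 + 0.7662) ~ 0.7524. The best-model.pt selected above (dev micro-F1 0.7909 at epoch 9) can then be loaded with Flair's standard API; a hedged usage sketch, where the path reuses the base path reported earlier and the example sentence is made up for illustration:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Checkpoint path: best-model.pt under the training base path reported above.
tagger = SequenceTagger.load(
    "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2/best-model.pt"
)

sentence = Sentence("Der Landbote erschien 1863 zum ersten Mal in Winterthur .")  # hypothetical example
tagger.predict(sentence)

for entity in sentence.get_spans("ner"):
    print(entity)  # span text plus predicted tag (loc / pers / org / prod / time) and confidence
```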