
rubert-base-cased_neg

This model is a fine-tuned version of DeepPavlov/rubert-base-cased on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6290
  • Precision: 0.5977
  • Recall: 0.6106
  • F1: 0.6041
  • Accuracy: 0.8995
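The card does not state the intended task, but the token-level precision/recall/F1 metrics and the "_neg" suffix suggest a token-classification model (e.g. negation cue/scope tagging). A minimal usage sketch under that assumption is shown below; the repo id is a placeholder for wherever this checkpoint is hosted.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Placeholder repo id; substitute the actual path of this checkpoint.
model_id = "rubert-base-cased_neg"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges sub-word tokens into word-level spans.
tagger = pipeline("token-classification", model=model, tokenizer=tokenizer,
                  aggregation_strategy="simple")
print(tagger("Пациент не отмечает боли в груди."))  # "The patient reports no chest pain."
```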

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
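
For reference, the sketch below shows how these settings map onto transformers.TrainingArguments. The output_dir is a placeholder, and the evaluation and logging cadence (every 50 and 500 steps, respectively) is inferred from the results table below rather than stated in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rubert-base-cased_neg",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Cadence inferred from the results table: evaluation every 50 steps,
    # training loss logged every 500 steps.
    evaluation_strategy="steps",
    eval_steps=50,
    logging_steps=500,
)
```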

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.11  | 50   | 0.6820          | 0.0       | 0.0    | 0.0    | 0.7747   |
| No log        | 2.22  | 100  | 0.5532          | 0.0489    | 0.0212 | 0.0296 | 0.8040   |
| No log        | 3.33  | 150  | 0.4231          | 0.1320    | 0.1004 | 0.1140 | 0.8397   |
| No log        | 4.44  | 200  | 0.3736          | 0.2200    | 0.1873 | 0.2023 | 0.8576   |
| No log        | 5.56  | 250  | 0.3511          | 0.3096    | 0.2606 | 0.2830 | 0.8713   |
| No log        | 6.67  | 300  | 0.2976          | 0.3635    | 0.4498 | 0.4021 | 0.8833   |
| No log        | 7.78  | 350  | 0.2713          | 0.3793    | 0.4942 | 0.4292 | 0.8925   |
| No log        | 8.89  | 400  | 0.2569          | 0.4174    | 0.5753 | 0.4838 | 0.9016   |
| No log        | 10.0  | 450  | 0.2599          | 0.4687    | 0.5772 | 0.5173 | 0.9031   |
| 0.389         | 11.11 | 500  | 0.2775          | 0.5008    | 0.5849 | 0.5396 | 0.9084   |
| 0.389         | 12.22 | 550  | 0.3123          | 0.4257    | 0.6641 | 0.5189 | 0.8834   |
| 0.389         | 13.33 | 600  | 0.3271          | 0.5036    | 0.5425 | 0.5223 | 0.9064   |
| 0.389         | 14.44 | 650  | 0.3390          | 0.5328    | 0.5174 | 0.5250 | 0.9016   |
| 0.389         | 15.56 | 700  | 0.3688          | 0.4788    | 0.6313 | 0.5445 | 0.8988   |
| 0.389         | 16.67 | 750  | 0.4260          | 0.4942    | 0.6583 | 0.5646 | 0.8997   |
| 0.389         | 17.78 | 800  | 0.3622          | 0.5217    | 0.6042 | 0.5599 | 0.9106   |
| 0.389         | 18.89 | 850  | 0.4054          | 0.5266    | 0.6120 | 0.5661 | 0.9105   |
| 0.389         | 20.0  | 900  | 0.3988          | 0.5070    | 0.6255 | 0.5601 | 0.9049   |
| 0.389         | 21.11 | 950  | 0.4422          | 0.5281    | 0.5444 | 0.5361 | 0.9093   |
| 0.0777        | 22.22 | 1000 | 0.4207          | 0.5635    | 0.5656 | 0.5645 | 0.9117   |
| 0.0777        | 23.33 | 1050 | 0.4721          | 0.5505    | 0.5792 | 0.5644 | 0.9130   |
| 0.0777        | 24.44 | 1100 | 0.4261          | 0.5379    | 0.6158 | 0.5743 | 0.9114   |
| 0.0777        | 25.56 | 1150 | 0.5339          | 0.6157    | 0.5290 | 0.5691 | 0.9      |
| 0.0777        | 26.67 | 1200 | 0.3761          | 0.4949    | 0.6544 | 0.5636 | 0.9036   |
| 0.0777        | 27.78 | 1250 | 0.4250          | 0.5650    | 0.5792 | 0.5720 | 0.9110   |
| 0.0777        | 28.89 | 1300 | 0.3790          | 0.5731    | 0.6429 | 0.6060 | 0.9193   |
| 0.0777        | 30.0  | 1350 | 0.5330          | 0.5942    | 0.5907 | 0.5924 | 0.9076   |
| 0.0777        | 31.11 | 1400 | 0.4419          | 0.5957    | 0.5888 | 0.5922 | 0.9180   |
| 0.0777        | 32.22 | 1450 | 0.5531          | 0.6008    | 0.5695 | 0.5847 | 0.9088   |
| 0.0362        | 33.33 | 1500 | 0.4544          | 0.5231    | 0.6564 | 0.5822 | 0.9135   |
| 0.0362        | 34.44 | 1550 | 0.4990          | 0.5695    | 0.5772 | 0.5733 | 0.9082   |
| 0.0362        | 35.56 | 1600 | 0.4040          | 0.5709    | 0.5753 | 0.5731 | 0.9086   |
| 0.0362        | 36.67 | 1650 | 0.3807          | 0.5989    | 0.6255 | 0.6119 | 0.9123   |
| 0.0362        | 37.78 | 1700 | 0.5088          | 0.5996    | 0.6100 | 0.6048 | 0.9139   |
| 0.0362        | 38.89 | 1750 | 0.4525          | 0.6151    | 0.5985 | 0.6067 | 0.9189   |
| 0.0362        | 40.0  | 1800 | 0.3787          | 0.6184    | 0.6100 | 0.6142 | 0.9211   |
| 0.0362        | 41.11 | 1850 | 0.3974          | 0.6097    | 0.5849 | 0.5970 | 0.9162   |
| 0.0362        | 42.22 | 1900 | 0.3944          | 0.5762    | 0.6274 | 0.6007 | 0.9148   |
| 0.0362        | 43.33 | 1950 | 0.3865          | 0.5124    | 0.6795 | 0.5842 | 0.9022   |
| 0.0264        | 44.44 | 2000 | 0.4583          | 0.5462    | 0.6274 | 0.5840 | 0.9169   |
| 0.0264        | 45.56 | 2050 | 0.4640          | 0.5635    | 0.6429 | 0.6005 | 0.9105   |
| 0.0264        | 46.67 | 2100 | 0.5028          | 0.5945    | 0.5830 | 0.5887 | 0.9128   |
| 0.0264        | 47.78 | 2150 | 0.3917          | 0.6267    | 0.6255 | 0.6261 | 0.9221   |
| 0.0264        | 48.89 | 2200 | 0.4833          | 0.6214    | 0.6274 | 0.6244 | 0.9138   |
| 0.0264        | 50.0  | 2250 | 0.4147          | 0.6130    | 0.6390 | 0.6257 | 0.9190   |
| 0.0264        | 51.11 | 2300 | 0.4455          | 0.6546    | 0.5927 | 0.6221 | 0.9185   |
| 0.0264        | 52.22 | 2350 | 0.4575          | 0.6138    | 0.6351 | 0.6243 | 0.9180   |
| 0.0264        | 53.33 | 2400 | 0.7707          | 0.3732    | 0.6815 | 0.4822 | 0.8354   |
| 0.0264        | 54.44 | 2450 | 0.4440          | 0.6015    | 0.6236 | 0.6123 | 0.9130   |
| 0.0248        | 55.56 | 2500 | 0.4815          | 0.5739    | 0.6448 | 0.6073 | 0.9124   |
| 0.0248        | 56.67 | 2550 | 0.3971          | 0.6204    | 0.6467 | 0.6333 | 0.9227   |
| 0.0248        | 57.78 | 2600 | 0.4770          | 0.6208    | 0.6100 | 0.6154 | 0.9193   |
| 0.0248        | 58.89 | 2650 | 0.5450          | 0.6699    | 0.5367 | 0.5959 | 0.9109   |
| 0.0248        | 60.0  | 2700 | 0.5033          | 0.5439    | 0.6332 | 0.5852 | 0.9019   |
| 0.0248        | 61.11 | 2750 | 0.5185          | 0.5187    | 0.6699 | 0.5847 | 0.9010   |
| 0.0248        | 62.22 | 2800 | 0.4277          | 0.6627    | 0.6371 | 0.6496 | 0.9194   |
| 0.0248        | 63.33 | 2850 | 0.4688          | 0.4869    | 0.6467 | 0.5556 | 0.9066   |
| 0.0248        | 64.44 | 2900 | 0.4779          | 0.6135    | 0.6313 | 0.6223 | 0.9153   |
| 0.0248        | 65.56 | 2950 | 0.5012          | 0.5852    | 0.6100 | 0.5974 | 0.9079   |
| 0.0232        | 66.67 | 3000 | 0.4788          | 0.5259    | 0.6081 | 0.5640 | 0.9052   |
| 0.0232        | 67.78 | 3050 | 0.4556          | 0.5726    | 0.6544 | 0.6108 | 0.9099   |
| 0.0232        | 68.89 | 3100 | 0.5026          | 0.608     | 0.5869 | 0.5972 | 0.9091   |
| 0.0232        | 70.0  | 3150 | 0.8153          | 0.3071    | 0.7143 | 0.4295 | 0.7567   |
| 0.0232        | 71.11 | 3200 | 0.4670          | 0.6169    | 0.6062 | 0.6115 | 0.9113   |
| 0.0232        | 72.22 | 3250 | 0.5249          | 0.5727    | 0.6313 | 0.6006 | 0.9068   |
| 0.0232        | 73.33 | 3300 | 0.4343          | 0.6085    | 0.6390 | 0.6234 | 0.9162   |
| 0.0232        | 74.44 | 3350 | 0.5067          | 0.6364    | 0.5811 | 0.6075 | 0.9135   |
| 0.0232        | 75.56 | 3400 | 0.4415          | 0.5812    | 0.6429 | 0.6104 | 0.9149   |
| 0.0232        | 76.67 | 3450 | 0.4052          | 0.5757    | 0.6313 | 0.6022 | 0.9137   |
| 0.0266        | 77.78 | 3500 | 0.5119          | 0.5233    | 0.5425 | 0.5327 | 0.9038   |
| 0.0266        | 78.89 | 3550 | 0.4689          | 0.5945    | 0.5888 | 0.5917 | 0.9145   |
| 0.0266        | 80.0  | 3600 | 0.3973          | 0.5609    | 0.6313 | 0.5940 | 0.9154   |
| 0.0266        | 81.11 | 3650 | 0.4848          | 0.5947    | 0.6486 | 0.6205 | 0.9181   |
| 0.0266        | 82.22 | 3700 | 0.4825          | 0.5877    | 0.6274 | 0.6069 | 0.9160   |
| 0.0266        | 83.33 | 3750 | 0.5193          | 0.5138    | 0.6467 | 0.5726 | 0.9      |
| 0.0266        | 84.44 | 3800 | 0.5344          | 0.5777    | 0.5811 | 0.5794 | 0.9107   |
| 0.0266        | 85.56 | 3850 | 0.5227          | 0.6591    | 0.5637 | 0.6077 | 0.9107   |
| 0.0266        | 86.67 | 3900 | 0.4490          | 0.5176    | 0.6255 | 0.5664 | 0.9097   |
| 0.0266        | 87.78 | 3950 | 0.6307          | 0.6464    | 0.5541 | 0.5967 | 0.9068   |
| 0.029         | 88.89 | 4000 | 0.4432          | 0.5667    | 0.5985 | 0.5822 | 0.9099   |
| 0.029         | 90.0  | 4050 | 0.4822          | 0.5148    | 0.6371 | 0.5695 | 0.9018   |
| 0.029         | 91.11 | 4100 | 0.4706          | 0.5966    | 0.6023 | 0.5994 | 0.9128   |
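
The Precision, Recall, F1, and Accuracy columns above match what a seqeval-based compute_metrics callback for token classification would report. Assuming that setup (neither the metrics library nor the label set is stated in this card, so both are illustrative), the callback might look like:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

# Hypothetical label map; the actual label set is not listed in this card.
id2label = {0: "O", 1: "B-NEG", 2: "I-NEG"}

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Ignore padded/special positions, which carry the label -100.
    true_labels = [[id2label[l] for l in row if l != -100] for row in labels]
    true_preds = [
        [id2label[p] for p, l in zip(p_row, l_row) if l != -100]
        for p_row, l_row in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```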

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.1.2
  • Datasets 2.1.0
  • Tokenizers 0.15.2