# medbert-512-finetuned-grasc
This model is a fine-tuned version of [GerMedBERT/medbert-512](https://huggingface.co/GerMedBERT/medbert-512) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.0060
- Accuracy: 1.0
- F1-weighted: 1.0
- F1-micro: 1.0
- F1-macro: 1.0
- Precision-weighted: 1.0
- Precision-micro: 1.0
- Precision-macro: 1.0
- Recall-weighted: 1.0
- Recall-micro: 1.0
- Recall-macro: 1.0
- Balanced-accuracy: 1.0
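
The snippet below is a minimal usage sketch, not code from this repository: it assumes the checkpoint ships a standard sequence-classification head (the reported accuracy/F1/precision/recall metrics suggest a classification task), and the example sentence is made up for illustration. The label names are whatever is stored in the model config.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint as a text-classification
# pipeline. Assumes a standard sequence-classification head; the label set
# is not documented in this card.
classifier = pipeline(
    "text-classification",
    model="Luggi/medbert-512-finetuned-grasc",
)

# Hypothetical German clinical sentence, purely for illustration.
print(classifier("Der Patient klagt über starke Kopfschmerzen."))
```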
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
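
For reference, the hyperparameters above map onto `TrainingArguments` roughly as in the following sketch. The output directory is a placeholder, and settings not listed in this card (e.g. evaluation or save strategy) are omitted rather than guessed.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments mirroring the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="medbert-512-finetuned-grasc",  # placeholder path
    learning_rate=3e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```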
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1-weighted | F1-micro | F1-macro | Precision-weighted | Precision-micro | Precision-macro | Recall-weighted | Recall-micro | Recall-macro | Balanced-accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 3.7258 | 1.0 | 16 | 2.4052 | 0.4355 | 0.2642 | 0.4355 | 0.0758 | 0.1896 | 0.4355 | 0.0544 | 0.4355 | 0.4355 | 0.125 | 0.125 |
| 2.2185 | 2.0 | 32 | 1.8621 | 0.6452 | 0.5636 | 0.6452 | 0.1873 | 0.5625 | 0.6452 | 0.1939 | 0.6452 | 0.6452 | 0.2062 | 0.2062 |
| 1.8431 | 3.0 | 48 | 1.5623 | 0.5323 | 0.4322 | 0.5323 | 0.1390 | 0.5325 | 0.5323 | 0.1853 | 0.5323 | 0.5323 | 0.1625 | 0.1625 |
| 1.6836 | 4.0 | 64 | 1.1638 | 0.7742 | 0.7250 | 0.7742 | 0.2820 | 0.6922 | 0.7742 | 0.2812 | 0.7742 | 0.7742 | 0.2917 | 0.2917 |
| 1.1477 | 5.0 | 80 | 1.0214 | 0.7258 | 0.6300 | 0.7258 | 0.2110 | 0.5597 | 0.7258 | 0.1888 | 0.7258 | 0.7258 | 0.2407 | 0.2407 |
| 1.3781 | 6.0 | 96 | 0.6523 | 0.8710 | 0.8281 | 0.8710 | 0.3360 | 0.7902 | 0.8710 | 0.3183 | 0.8710 | 0.8710 | 0.3565 | 0.3565 |
| 0.5876 | 7.0 | 112 | 0.5180 | 0.9194 | 0.8907 | 0.9194 | 0.4745 | 0.8677 | 0.9194 | 0.4582 | 0.9194 | 0.9194 | 0.4954 | 0.4954 |
| 0.8002 | 8.0 | 128 | 0.3721 | 0.9194 | 0.8900 | 0.9194 | 0.4545 | 0.8651 | 0.9194 | 0.4319 | 0.9194 | 0.9194 | 0.4861 | 0.4861 |
| 0.3814 | 9.0 | 144 | 0.2805 | 0.9516 | 0.9292 | 0.9516 | 0.6102 | 0.9097 | 0.9516 | 0.5978 | 0.9516 | 0.9516 | 0.625 | 0.625 |
| 0.3762 | 10.0 | 160 | 0.2354 | 0.9516 | 0.9306 | 0.9516 | 0.5875 | 0.9145 | 0.9516 | 0.5606 | 0.9516 | 0.9516 | 0.625 | 0.625 |
| 0.2225 | 11.0 | 176 | 0.2019 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.236 | 12.0 | 192 | 0.1627 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.2791 | 13.0 | 208 | 0.1400 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.0915 | 14.0 | 224 | 0.1219 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.1426 | 15.0 | 240 | 0.1118 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.1103 | 16.0 | 256 | 0.1005 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.1324 | 17.0 | 272 | 0.0916 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.034 | 18.0 | 288 | 0.0880 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.1798 | 19.0 | 304 | 0.0592 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.0274 | 20.0 | 320 | 0.0627 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.0768 | 21.0 | 336 | 0.0726 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.1617 | 22.0 | 352 | 0.0624 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.0706 | 23.0 | 368 | 0.0618 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.0198 | 24.0 | 384 | 0.0651 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.0762 | 25.0 | 400 | 0.0443 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.0641 | 26.0 | 416 | 0.0319 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.0468 | 27.0 | 432 | 0.0334 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0357 | 28.0 | 448 | 0.0207 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0386 | 29.0 | 464 | 0.0347 | 0.9839 | 0.9762 | 0.9839 | 0.8684 | 0.9694 | 0.9839 | 0.8625 | 0.9839 | 0.9839 | 0.875 | 0.875 |
| 0.0361 | 30.0 | 480 | 0.0214 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0207 | 31.0 | 496 | 0.0152 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0197 | 32.0 | 512 | 0.0126 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0153 | 33.0 | 528 | 0.0116 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0101 | 34.0 | 544 | 0.0108 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0185 | 35.0 | 560 | 0.0093 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0101 | 36.0 | 576 | 0.0088 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0069 | 37.0 | 592 | 0.0084 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0135 | 38.0 | 608 | 0.0077 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0068 | 39.0 | 624 | 0.0074 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0103 | 40.0 | 640 | 0.0072 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0112 | 41.0 | 656 | 0.0069 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0096 | 42.0 | 672 | 0.0067 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0098 | 43.0 | 688 | 0.0065 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0104 | 44.0 | 704 | 0.0063 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.008 | 45.0 | 720 | 0.0062 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0081 | 46.0 | 736 | 0.0061 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0079 | 47.0 | 752 | 0.0061 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0081 | 48.0 | 768 | 0.0061 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0068 | 49.0 | 784 | 0.0060 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0089 | 50.0 | 800 | 0.0060 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 |
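
The metric columns above can be reproduced from predictions and labels with scikit-learn. The sketch below is a plausible reconstruction of such a `compute_metrics` function based on the column names, not the exact code used for this run.

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    balanced_accuracy_score,
    f1_score,
    precision_score,
    recall_score,
)

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes in.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    metrics = {
        "accuracy": accuracy_score(labels, preds),
        "balanced_accuracy": balanced_accuracy_score(labels, preds),
    }
    # Weighted, micro, and macro averages for F1, precision, and recall,
    # matching the columns reported in the table above.
    for avg in ("weighted", "micro", "macro"):
        metrics[f"f1_{avg}"] = f1_score(labels, preds, average=avg)
        metrics[f"precision_{avg}"] = precision_score(
            labels, preds, average=avg, zero_division=0
        )
        metrics[f"recall_{avg}"] = recall_score(
            labels, preds, average=avg, zero_division=0
        )
    return metrics
```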
### Framework versions
- Transformers 4.56.1
- Pytorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.0