parlange committed on
Commit 76f48c7 · verified · 1 parent: 736abfb

Upload CaiT model from experiment b3

This view is limited to 50 files because the commit contains too many changes.

Files changed (50)
  1. .gitattributes +2 -0
  2. README.md +165 -0
  3. cait-gravit-b3.pth +3 -0
  4. config.json +76 -0
  5. confusion_matrices/CaiT_Confusion_Matrix_a.png +0 -0
  6. confusion_matrices/CaiT_Confusion_Matrix_b.png +0 -0
  7. confusion_matrices/CaiT_Confusion_Matrix_c.png +0 -0
  8. confusion_matrices/CaiT_Confusion_Matrix_d.png +0 -0
  9. confusion_matrices/CaiT_Confusion_Matrix_e.png +0 -0
  10. confusion_matrices/CaiT_Confusion_Matrix_f.png +0 -0
  11. confusion_matrices/CaiT_Confusion_Matrix_g.png +0 -0
  12. confusion_matrices/CaiT_Confusion_Matrix_h.png +0 -0
  13. confusion_matrices/CaiT_Confusion_Matrix_i.png +0 -0
  14. confusion_matrices/CaiT_Confusion_Matrix_j.png +0 -0
  15. confusion_matrices/CaiT_Confusion_Matrix_k.png +0 -0
  16. confusion_matrices/CaiT_Confusion_Matrix_l.png +0 -0
  17. evaluation_results.csv +133 -0
  18. model.safetensors +3 -0
  19. pytorch_model.bin +3 -0
  20. roc_confusion_matrix/CaiT_roc_confusion_matrix_a.png +0 -0
  21. roc_confusion_matrix/CaiT_roc_confusion_matrix_b.png +0 -0
  22. roc_confusion_matrix/CaiT_roc_confusion_matrix_c.png +0 -0
  23. roc_confusion_matrix/CaiT_roc_confusion_matrix_d.png +0 -0
  24. roc_confusion_matrix/CaiT_roc_confusion_matrix_e.png +0 -0
  25. roc_confusion_matrix/CaiT_roc_confusion_matrix_f.png +0 -0
  26. roc_confusion_matrix/CaiT_roc_confusion_matrix_g.png +0 -0
  27. roc_confusion_matrix/CaiT_roc_confusion_matrix_h.png +0 -0
  28. roc_confusion_matrix/CaiT_roc_confusion_matrix_i.png +0 -0
  29. roc_confusion_matrix/CaiT_roc_confusion_matrix_j.png +0 -0
  30. roc_confusion_matrix/CaiT_roc_confusion_matrix_k.png +0 -0
  31. roc_confusion_matrix/CaiT_roc_confusion_matrix_l.png +0 -0
  32. roc_curves/CaiT_ROC_a.png +0 -0
  33. roc_curves/CaiT_ROC_b.png +0 -0
  34. roc_curves/CaiT_ROC_c.png +0 -0
  35. roc_curves/CaiT_ROC_d.png +0 -0
  36. roc_curves/CaiT_ROC_e.png +0 -0
  37. roc_curves/CaiT_ROC_f.png +0 -0
  38. roc_curves/CaiT_ROC_g.png +0 -0
  39. roc_curves/CaiT_ROC_h.png +0 -0
  40. roc_curves/CaiT_ROC_i.png +0 -0
  41. roc_curves/CaiT_ROC_j.png +0 -0
  42. roc_curves/CaiT_ROC_k.png +0 -0
  43. roc_curves/CaiT_ROC_l.png +0 -0
  44. training_curves/CaiT_accuracy.png +0 -0
  45. training_curves/CaiT_auc.png +0 -0
  46. training_curves/CaiT_combined_metrics.png +3 -0
  47. training_curves/CaiT_f1.png +0 -0
  48. training_curves/CaiT_loss.png +0 -0
  49. training_curves/CaiT_metrics.csv +26 -0
  50. training_metrics.csv +26 -0
.gitattributes CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+training_curves/CaiT_combined_metrics.png filter=lfs diff=lfs merge=lfs -text
+training_notebook_b3.ipynb filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,165 @@
---
license: apache-2.0
tags:
- vision-transformer
- image-classification
- pytorch
- timm
- cait
- gravitational-lensing
- strong-lensing
- astronomy
- astrophysics
datasets:
- parlange/gravit-j24
metrics:
- accuracy
- auc
- f1
paper:
- title: "GraViT: A Gravitational Lens Discovery Toolkit with Vision Transformers"
  url: "https://arxiv.org/abs/2509.00226"
  authors: "Parlange et al."
model-index:
- name: CaiT-b3
  results:
  - task:
      type: image-classification
      name: Strong Gravitational Lens Discovery
    dataset:
      type: common-test-sample
      name: Common Test Sample (More et al. 2024)
    metrics:
    - type: accuracy
      value: 0.8430
      name: Average Accuracy
    - type: auc
      value: 0.8265
      name: Average AUC-ROC
    - type: f1
      value: 0.5613
      name: Average F1-Score
---

# 🌌 cait-gravit-b3

🔭 This model is part of **GraViT**: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery.

🔗 **GitHub Repository**: [https://github.com/parlange/gravit](https://github.com/parlange/gravit)

## 🛰️ Model Details

- **🤖 Model Type**: CaiT
- **🧪 Experiment**: B3 - J24-all-blocks
- **🌌 Dataset**: J24
- **🪐 Fine-tuning Strategy**: all-blocks

## 💻 Quick Start

```python
import torch
import timm

# Load the model directly from the Hugging Face Hub
model = timm.create_model(
    'hf-hub:parlange/cait-gravit-b3',
    pretrained=True
)
model.eval()

# Example inference
dummy_input = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    output = model(dummy_input)
    predictions = torch.softmax(output, dim=1)

print(f"Lens probability: {predictions[0][1]:.4f}")
```
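For a binary lens/non-lens head, the two logits become class probabilities via softmax, and the snippet above reads index 1 as the lens class. A dependency-free sketch of that final step (the logit values here are invented for illustration):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one cutout; index 1 is read as the
# lens probability, matching predictions[0][1] above.
logits = [-1.2, 2.3]
probs = softmax(logits)
print(f"Lens probability: {probs[1]:.4f}")
```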

## ⚡️ Training Configuration

**Training Dataset:** J24 (Jaelani et al. 2024)
**Fine-tuning Strategy:** all-blocks

| 🔧 Parameter | 📝 Value |
|--------------|----------|
| Batch Size | 192 |
| Learning Rate Schedule | AdamW with ReduceLROnPlateau |
| Epochs | 100 |
| Patience | 10 |
| Optimizer | AdamW |
| Scheduler | ReduceLROnPlateau |
| Image Size | 224x224 |
| Fine Tune Mode | all_blocks |
| Stochastic Depth Probability | 0.1 |
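The patience entry interacts with the scheduler: training is capped at 100 epochs, but stops once the validation metric has not improved for 10 epochs, while ReduceLROnPlateau cuts the learning rate on shorter plateaus. A rough sketch of that interplay in plain Python (the loss values, the 1e-4 starting rate, and the scheduler's own patience of 5 are made up for illustration):

```python
def train_with_patience(val_losses, lr=1e-4, patience=10, sched_patience=5, factor=0.1):
    """Stop when the val loss hasn't improved for `patience` epochs;
    cut the learning rate every `sched_patience` stale epochs
    (a simplified stand-in for ReduceLROnPlateau)."""
    best, best_epoch, stale = float("inf"), 0, 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, best_epoch, stale = loss, epoch, 0
        else:
            stale += 1
            if stale % sched_patience == 0:
                lr *= factor          # plateau: reduce the learning rate
            if stale >= patience:
                break                 # early stopping
    return best_epoch, epoch, lr

# Made-up validation losses: improvement stops after epoch 5.
losses = [0.9, 0.7, 0.6, 0.55, 0.5] + [0.52] * 12
best_epoch, last_epoch, final_lr = train_with_patience(losses)
print(best_epoch, last_epoch, final_lr)  # stops at epoch 15, best was epoch 5
```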

## 📈 Training Curves

![Combined Training Metrics](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/training_curves/CaiT_combined_metrics.png)

## 🏁 Final Epoch Training Metrics

| Metric | Training | Validation |
|:---------:|:-----------:|:-------------:|
| 📉 Loss | 0.0188 | 0.0494 |
| 🎯 Accuracy | 0.9935 | 0.9882 |
| 📊 AUC-ROC | 0.9997 | 0.9981 |
| ⚖️ F1 Score | 0.9934 | 0.9882 |

## ☑️ Evaluation Results

### ROC Curves and Confusion Matrices

Performance across all test datasets (a through l) in the Common Test Sample (More et al. 2024):

![ROC + Confusion Matrix - Dataset A](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_a.png)
![ROC + Confusion Matrix - Dataset B](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_b.png)
![ROC + Confusion Matrix - Dataset C](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_c.png)
![ROC + Confusion Matrix - Dataset D](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_d.png)
![ROC + Confusion Matrix - Dataset E](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_e.png)
![ROC + Confusion Matrix - Dataset F](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_f.png)
![ROC + Confusion Matrix - Dataset G](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_g.png)
![ROC + Confusion Matrix - Dataset H](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_h.png)
![ROC + Confusion Matrix - Dataset I](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_i.png)
![ROC + Confusion Matrix - Dataset J](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_j.png)
![ROC + Confusion Matrix - Dataset K](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_k.png)
![ROC + Confusion Matrix - Dataset L](https://huggingface.co/parlange/cait-gravit-b3/resolve/main/roc_confusion_matrix/CaiT_roc_confusion_matrix_l.png)

### 📋 Performance Summary

Average performance across 12 test datasets from the Common Test Sample (More et al. 2024):

| Metric | Value |
|-----------|----------|
| 🎯 Average Accuracy | 0.8430 |
| 📈 Average AUC-ROC | 0.8265 |
| ⚖️ Average F1-Score | 0.5613 |
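These averages are simple means over the twelve CaiT rows (datasets a–l) of `evaluation_results.csv` in this commit. For example, the accuracy figure can be reproduced with the standard library:

```python
from statistics import mean

# CaiT accuracy on test datasets a-l, copied from evaluation_results.csv
cait_accuracy = [
    0.9374410562716127, 0.9104055328513047, 0.9374410562716127,
    0.9173215969820812, 0.8682766190998902, 0.9409805592130741,
    0.6265, 0.6408333333333334, 0.6301666666666667,
    0.9306666666666666, 0.9343333333333333, 0.8413092908889006,
]
print(f"Average Accuracy: {mean(cait_accuracy):.4f}")  # -> 0.8430
```

The AUC-ROC and F1 averages follow the same way from the `AUCROC` and `F1` columns.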

## 📘 Citation

If you use this model in your research, please cite:

```bibtex
@misc{parlange2025gravit,
      title={GraViT: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery},
      author={René Parlange and Juan C. Cuevas-Tello and Octavio Valenzuela and Omar de J. Cabrera-Rosas and Tomás Verdugo and Anupreeta More and Anton T. Jaelani},
      year={2025},
      eprint={2509.00226},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2509.00226},
}
```

---

## Model Card Contact

For questions about this model, please contact the author through: https://github.com/parlange/
cait-gravit-b3.pth ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:58a11eeed0bbeecd6477d03f9a0f51c3b0a86777dafc794489c2bbd77f68b10b
+size 186296378
config.json ADDED
@@ -0,0 +1,76 @@
{
    "architecture": "cait_s24_224",
    "num_classes": 2,
    "num_features": 1000,
    "global_pool": "avg",
    "crop_pct": 0.875,
    "interpolation": "bicubic",
    "mean": [0.485, 0.456, 0.406],
    "std": [0.229, 0.224, 0.225],
    "first_conv": "conv1",
    "classifier": "fc",
    "input_size": [3, 224, 224],
    "pool_size": [7, 7],
    "pretrained_cfg": {
        "tag": "gravit_b3",
        "custom_load": false,
        "input_size": [3, 224, 224],
        "fixed_input_size": true,
        "interpolation": "bicubic",
        "crop_pct": 0.875,
        "crop_mode": "center",
        "mean": [0.485, 0.456, 0.406],
        "std": [0.229, 0.224, 0.225],
        "num_classes": 2,
        "pool_size": [7, 7],
        "first_conv": "conv1",
        "classifier": "fc"
    },
    "model_name": "cait_gravit_b3",
    "experiment": "b3",
    "training_strategy": "all-blocks",
    "dataset": "J24",
    "hyperparameters": {
        "batch_size": "192",
        "learning_rate": "AdamW with ReduceLROnPlateau",
        "epochs": "100",
        "patience": "10",
        "optimizer": "AdamW",
        "scheduler": "ReduceLROnPlateau",
        "image_size": "224x224",
        "fine_tune_mode": "all_blocks",
        "stochastic_depth_probability": "0.1"
    },
    "hf_hub_id": "parlange/cait-gravit-b3",
    "license": "apache-2.0"
}
confusion_matrices/CaiT_Confusion_Matrix_a.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_b.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_c.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_d.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_e.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_f.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_g.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_h.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_i.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_j.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_k.png ADDED
confusion_matrices/CaiT_Confusion_Matrix_l.png ADDED
evaluation_results.csv ADDED
@@ -0,0 +1,133 @@
Model,Dataset,Loss,Accuracy,AUCROC,F1
ViT,a,0.3790108698956886,0.9188934297390757,0.7895534069981583,0.40825688073394495
ViT,b,0.6474953413815365,0.8736246463376297,0.7928941068139963,0.30689655172413793
ViT,c,0.3887653840258975,0.9154353976736875,0.7815948434622468,0.3982102908277405
ViT,d,0.4339898591969456,0.9006601697579377,0.7802946593001843,0.3603238866396761
ViT,e,0.7275202240136531,0.8770581778265643,0.8333762203890108,0.6137931034482759
ViT,f,0.32068420921553115,0.923398652311982,0.7887962050752345,0.15252784918594686
ViT,g,2.4499126282334327,0.5935,0.7337697777777779,0.41664673523080603
ViT,h,2.3127426176071166,0.6156666666666667,0.7213267777777778,0.43033596837944665
ViT,i,2.3367191193699837,0.6078333333333333,0.7155961666666667,0.4253968253968254
ViT,j,0.43574089199304583,0.9035,0.9639603333333334,0.9041549412348949
ViT,k,0.3225473812222481,0.9178333333333333,0.9721483333333332,0.9172124265323258
ViT,l,0.9802426720999076,0.8208978901168632,0.848824705820602,0.685486117559662
MLP-Mixer,a,0.24821871188368222,0.9342973907576234,0.7274953959484346,0.40114613180515757
MLP-Mixer,b,0.3468351167483511,0.8972021376925495,0.7498066298342543,0.29978586723768735
MLP-Mixer,c,0.22863765820715645,0.9415278214397989,0.7362136279926336,0.4294478527607362
MLP-Mixer,d,0.22197297275188171,0.9402703552342031,0.7448802946593002,0.42424242424242425
MLP-Mixer,e,0.5210685612128804,0.8682766190998902,0.8295996367214108,0.5384615384615384
MLP-Mixer,f,0.1573308274558462,0.9544574393927658,0.7447600612812645,0.19230769230769232
MLP-Mixer,g,2.0364777975082395,0.5281666666666667,0.5996590555555555,0.21382949180783115
MLP-Mixer,h,1.9738134279251098,0.5516666666666666,0.5622524444444443,0.22254335260115607
MLP-Mixer,i,1.9702800275236367,0.551,0.5762447222222222,0.2222863741339492
MLP-Mixer,j,0.35843076133728025,0.875,0.9505481111111111,0.8680042238648363
MLP-Mixer,k,0.2922329999357462,0.8978333333333334,0.9618148888888888,0.8894499549143372
MLP-Mixer,l,0.7973672104744812,0.8023901433028396,0.7688593430466856,0.6098757699133521
CvT,a,0.5576080598665536,0.8827412763281987,0.7585920810313076,0.33511586452762926
CvT,b,0.8406257543388458,0.839987425337944,0.7651546961325967,0.2697274031563845
CvT,c,0.45941355157408165,0.9053756680289218,0.7661510128913444,0.38445807770961143
CvT,d,0.4890136360203984,0.899402703552342,0.8040036832412524,0.3700787401574803
CvT,e,0.7525440884670233,0.8616904500548848,0.7940134715810185,0.5987261146496815
CvT,f,0.4638477216021927,0.900782278677097,0.7746531228706713,0.1279782164737917
CvT,g,2.635408263206482,0.5545,0.6485876666666667,0.359146487652841
CvT,h,2.433302253484726,0.5891666666666666,0.6446558888888889,0.3779964673227353
CvT,i,2.4489952618479727,0.586,0.699406611111111,0.3761928679055751
CvT,j,0.6779288778305054,0.8571666666666666,0.929701777777778,0.8568565224653416
CvT,k,0.4915158650279045,0.8886666666666667,0.9551427222222222,0.8847878578820283
CvT,l,1.142270628382002,0.7902279096821956,0.8003466836323919,0.6321060929240471
Swin,a,0.4143876506730619,0.8641936497956617,0.7043434622467771,0.2627986348122867
Swin,b,0.6756683163993653,0.8022634391700723,0.720195211786372,0.1966794380587484
Swin,c,0.3583531161967556,0.8934297390757623,0.7160773480662983,0.31237322515212984
Swin,d,0.29349016994284743,0.9135491983652939,0.7763278084714549,0.358974358974359
Swin,e,0.5968914167254215,0.845225027442371,0.7776356618481798,0.5220338983050847
Swin,f,0.3336647423351405,0.8915653318875377,0.7320114316466519,0.0990990990990991
Swin,g,2.182571418762207,0.49166666666666664,0.5955143333333334,0.2375
Swin,h,2.0143414651155473,0.54,0.5570215555555555,0.2560646900269542
Swin,i,1.9799532911777495,0.5506666666666666,0.6583314444444445,0.2605595172792101
Swin,j,0.3886156997680664,0.8788333333333334,0.964886111111111,0.8850229321524593
Swin,k,0.1859975779056549,0.9378333333333333,0.9823477777777778,0.9375104707656223
Swin,l,0.8632472551865445,0.7817672254243562,0.7818820389829563,0.618823312090145
CaiT,a,0.23817305107586642,0.9374410562716127,0.8123416206261509,0.5134474327628362
CaiT,b,0.3217430385037512,0.9104055328513047,0.8156022099447514,0.42424242424242425
CaiT,c,0.23880257207580574,0.9374410562716127,0.8067292817679558,0.5134474327628362
CaiT,d,0.28198474085357195,0.9173215969820812,0.8376279926335175,0.4439746300211416
CaiT,e,0.5301304184284744,0.8682766190998902,0.8315673957466131,0.6363636363636364
CaiT,f,0.19037221234553495,0.9409805592130741,0.8188489798752676,0.21604938271604937
CaiT,g,1.6843535647392274,0.6265,0.7207983333333333,0.46349054345223845
CaiT,h,1.6403812870979309,0.6408333333333334,0.7088211111111111,0.4732339281349303
CaiT,i,1.663275026500225,0.6301666666666667,0.7565758888888889,0.46594464500601684
CaiT,j,0.23444856452941895,0.9306666666666666,0.9765742777777777,0.9306897700766411
CaiT,k,0.21337003868818283,0.9343333333333333,0.9810072777777779,0.934113712374582
CaiT,l,0.6691959589620489,0.8413092908889006,0.851808223589687,0.7203950433243268
DeiT,a,0.27532369791711886,0.9355548569632192,0.7400883977900553,0.4383561643835616
DeiT,b,0.37732297870225406,0.9009745363093367,0.7632817679558012,0.3368421052631579
DeiT,c,0.27752348167909013,0.9393272555800063,0.7630893186003682,0.45325779036827196
DeiT,d,0.27014286825489003,0.9364979566174159,0.795318600368324,0.4419889502762431
DeiT,e,0.6625746366344024,0.8726673984632273,0.80245213047756,0.5797101449275363
DeiT,f,0.17108403316157025,0.9514367593524902,0.767566717155716,0.20330368487928843
DeiT,g,2.1656926336288453,0.59,0.6878379444444445,0.3800403225806452
DeiT,h,2.11278225171566,0.6103333333333333,0.6916527777777778,0.3920956838273531
DeiT,i,2.108869301110506,0.6088333333333333,0.7280663333333334,0.39118028534370947
DeiT,j,0.31411294651031496,0.9131666666666667,0.9682927222222223,0.9117995598442525
DeiT,k,0.25728961780667303,0.932,0.9767967777777777,0.9295823265447014
DeiT,l,0.83515618283903,0.8318439003754429,0.831689744806227,0.6892710572601134
DeiT3,a,0.27852537169354546,0.9038038352719271,0.7915755064456723,0.39763779527559057
DeiT3,b,0.33796260499590713,0.8667085822068532,0.8198545119705342,0.3226837060702875
DeiT3,c,0.26950191666367146,0.9075762338887142,0.800132596685083,0.40725806451612906
DeiT3,d,0.2598652228155589,0.908519333542911,0.8328038674033148,0.40973630831643004
DeiT3,e,0.3859063482278003,0.8792535675082327,0.8585105577840006,0.6474358974358975
DeiT3,f,0.2226914082389882,0.9144140655255208,0.8138108526862634,0.1545524100994644
DeiT3,g,1.2969025439023971,0.6038333333333333,0.732013,0.4486198097889121
DeiT3,h,1.2606069605350494,0.6255,0.7032497222222223,0.4625687634537192
DeiT3,i,1.2554979050159454,0.626,0.750082888888889,0.46290090952608903
DeiT3,j,0.2535483182668686,0.9056666666666666,0.9677222222222222,0.907546553413917
DeiT3,k,0.2121436755657196,0.9278333333333333,0.9726957777777777,0.9277007847720822
DeiT3,l,0.559908740118049,0.8223256305853736,0.8453933340814723,0.6959826275787188
Twins_SVT,a,0.32229545287125066,0.9047469349261239,0.7872716390423573,0.40471512770137524
Twins_SVT,b,0.4744058540550578,0.858220685319082,0.799073664825046,0.3135464231354642
Twins_SVT,c,0.23853482503686377,0.9327255580006287,0.8031399631675874,0.49047619047619045
Twins_SVT,d,0.20696176561834076,0.9386985224772084,0.8572504604051567,0.513715710723192
Twins_SVT,e,0.4441066685037608,0.8792535675082327,0.828918489366533,0.6518987341772152
Twins_SVT,f,0.24040106920338317,0.9255673456742313,0.8126722450556175,0.17652099400171378
Twins_SVT,g,1.6869367780685425,0.5768333333333333,0.6634996111111111,0.3964820537199905
Twins_SVT,h,1.5618858572244645,0.6163333333333333,0.6534029444444445,0.42015113350125943
Twins_SVT,i,1.5451468470990657,0.6195,0.7416233888888889,0.42217160212604404
Twins_SVT,j,0.4541445074081421,0.8555,0.9362851666666666,0.8525259397856778
Twins_SVT,k,0.3123545649945736,0.8981666666666667,0.9654593888888889,0.8913391428063311
Twins_SVT,l,0.7169495407413323,0.8085241393897732,0.811499193039308,0.6553726087370324
Twins_PCPVT,a,0.3407551253865028,0.9157497642250865,0.7543489871086556,0.36792452830188677
Twins_PCPVT,b,0.4568046205300874,0.8827412763281987,0.7560791896869244,0.2948960302457467
Twins_PCPVT,c,0.3403567170729543,0.9163784973278843,0.7330865561694292,0.3696682464454976
Twins_PCPVT,d,0.2872482358388127,0.9276956931782459,0.7866114180478823,0.40414507772020725
Twins_PCPVT,e,0.6489756450539559,0.8704720087815587,0.8124347233784909,0.5693430656934306
Twins_PCPVT,f,0.22982485620439833,0.9347068391294245,0.7606799529540434,0.15615615615615616
Twins_PCPVT,g,2.0454217346906662,0.5676666666666667,0.6946129999999999,0.3426254434870755
Twins_PCPVT,h,1.9836849460601806,0.5855,0.667161,0.3521750455847877
Twins_PCPVT,i,1.9555286951363087,0.5915,0.7357337222222222,0.35550880883513014
Twins_PCPVT,j,0.3498825296163559,0.9013333333333333,0.9614410555555556,0.9004707464694015
Twins_PCPVT,k,0.2599894929230213,0.9251666666666667,0.9737956666666667,0.9226528854435831
Twins_PCPVT,l,0.8202423188098834,0.8155042039024906,0.8326626157015399,0.662996232975949
PiT,a,0.43440851872726566,0.9195221628418736,0.6457163904235728,0.37254901960784315
PiT,b,0.6821634257150798,0.8792832442628105,0.7135303867403315,0.2835820895522388
PiT,c,0.4489740155864459,0.9166928638792833,0.6253001841620626,0.3645083932853717
PiT,d,0.36322150241931406,0.9352404904118202,0.7464861878453039,0.4245810055865922
PiT,e,0.8982287239688157,0.8726673984632273,0.8314765760992961,0.5671641791044776
PiT,f,0.29423939931023113,0.9374951591666021,0.6912865159517909,0.15849843587069865
PiT,g,3.0194950482845306,0.562,0.6416187777777778,0.33129770992366414
PiT,h,2.895865800142288,0.5818333333333333,0.522822,0.3416426134872737
PiT,i,2.8504026728868483,0.5916666666666667,0.6812351666666667,0.34701492537313433
PiT,j,0.4774704883098602,0.8945,0.9592785555555556,0.8931645569620253
PiT,k,0.30837811005115506,0.9241666666666667,0.975773277777778,0.9208282582216809
PiT,l,1.1643054382698763,0.8143937390936492,0.7846461923134944,0.6577613104524181
Ensemble,a,,0.9289531593838416,0.7939346224677717,0.44607843137254904
Ensemble,b,,0.8783401446086136,0.8044364640883978,0.31985940246045697
Ensemble,c,,0.9383841559258095,0.7978895027624309,0.48148148148148145
Ensemble,d,,0.9383841559258095,0.8286721915285451,0.48148148148148145
Ensemble,e,,0.8880351262349067,0.850836297585711,0.6408450704225352
Ensemble,f,,0.9421423592285648,0.8087909536354286,0.19590958019375673
Ensemble,g,,0.5633333333333334,0.7028778888888889,0.34071464519375944
Ensemble,h,,0.5951666666666666,0.6861525555555557,0.35791699709225483
Ensemble,i,,0.5951666666666666,0.73121,0.35791699709225483
Ensemble,j,,0.9088333333333334,0.9700168888888888,0.9095419216140235
Ensemble,k,,0.9406666666666667,0.9819625555555556,0.9392076502732241
Ensemble,l,,0.8244408016498335,0.8396341104616648,0.6794129007338741
model.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f3548177142e4f259672d92d9a72c7ca701da833d2f99582b686a7256b326775
+size 186172152
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:58a11eeed0bbeecd6477d03f9a0f51c3b0a86777dafc794489c2bbd77f68b10b
+size 186296378
roc_confusion_matrix/CaiT_roc_confusion_matrix_a.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_b.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_c.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_d.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_e.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_f.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_g.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_h.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_i.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_j.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_k.png ADDED
roc_confusion_matrix/CaiT_roc_confusion_matrix_l.png ADDED
roc_curves/CaiT_ROC_a.png ADDED
roc_curves/CaiT_ROC_b.png ADDED
roc_curves/CaiT_ROC_c.png ADDED
roc_curves/CaiT_ROC_d.png ADDED
roc_curves/CaiT_ROC_e.png ADDED
roc_curves/CaiT_ROC_f.png ADDED
roc_curves/CaiT_ROC_g.png ADDED
roc_curves/CaiT_ROC_h.png ADDED
roc_curves/CaiT_ROC_i.png ADDED
roc_curves/CaiT_ROC_j.png ADDED
roc_curves/CaiT_ROC_k.png ADDED
roc_curves/CaiT_ROC_l.png ADDED
training_curves/CaiT_accuracy.png ADDED
training_curves/CaiT_auc.png ADDED
training_curves/CaiT_combined_metrics.png ADDED

Git LFS Details

  • SHA256: 9e48ef28eb8359ce2ddcfc9d683e74b692d6ba342680354987421b12a1325b21
  • Pointer size: 131 Bytes
  • Size of remote file: 138 kB
training_curves/CaiT_f1.png ADDED
training_curves/CaiT_loss.png ADDED
training_curves/CaiT_metrics.csv ADDED
@@ -0,0 +1,26 @@
epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
1,0.16319053076783863,0.06426756953598986,0.9319399785637728,0.9796355841371919,0.9828557740410238,0.9966926635488789,0.9308655416439847,0.9794372294372294
2,0.06834075130984997,0.06178499017976871,0.9768432334856434,0.9812433011789925,0.9958335609810347,0.9977633261304852,0.9766860712764447,0.9809056192034915
3,0.05956324981360049,0.05794612078708851,0.9795509674507813,0.9801714898177921,0.9968159075513607,0.9970418925454544,0.979419195503449,0.9799675148890092
4,0.05225409181792662,0.05390293830316933,0.9824561403508771,0.9817792068595927,0.9973848875829588,0.9979971027778639,0.9823405825904264,0.9817007534983854
5,0.05099076973386254,0.044286027682460004,0.9829920460314774,0.984994640943194,0.9976094008026417,0.9985341572380582,0.9828873059568067,0.9849785407725322
6,0.043280831175793406,0.0475954504689603,0.9854741354995205,0.9844587352625938,0.9982134536162212,0.9986375474015181,0.9853979415350591,0.9842988630211154
7,0.040034028190352185,0.055318317145492485,0.987504935973374,0.9839228295819936,0.998398373717598,0.9986318035035481,0.9874465130777296,0.9839400428265525
8,0.040712364526100824,0.053482549414757365,0.986545946860721,0.9823151125401929,0.9984449422510566,0.9981389770577227,0.9864914615842089,0.9823245848955544
9,0.03604965847263604,0.05389839756958354,0.9878998138432898,0.9839228295819936,0.998734180131549,0.9978104260938392,0.9878501231980514,0.983957219251337
10,0.03513301415206575,0.04746750639182579,0.9883793083996164,0.9866023579849946,0.9988209925462693,0.9983147403356045,0.9883299342850669,0.9865951742627346
11,0.03577863203289445,0.05294743896584802,0.9877587860326056,0.984994640943194,0.9988438790361309,0.9976857835078904,0.9877074718178214,0.9848812095032398
12,0.02777121727852717,0.04766687935284096,0.9910306312404806,0.9871382636655949,0.9992824264416935,0.9984755694787641,0.9909986413043478,0.9870967741935484
13,0.023669739699384892,0.050936045873778424,0.9921588537259548,0.9866023579849946,0.9994208718849524,0.9984836109359222,0.9921353400475275,0.9865663621708759
14,0.022428250333988077,0.05070090880827122,0.9932024595250183,0.9876741693461951,0.9994492954288129,0.9980034210656309,0.9931811108281696,0.9876543209876543
15,0.022516270096097465,0.047786054575749914,0.992892198341513,0.9871382636655949,0.9994642072880302,0.9986082535218711,0.9928708837840896,0.9870967741935484
16,0.021324889257141714,0.049530307028646255,0.9930050205900603,0.9876741693461951,0.9995353584319144,0.9986961351608119,0.9929888047042859,0.9876277568585261
17,0.01956209415003448,0.047512864640096374,0.9937101596434817,0.9887459807073955,0.9996440883520467,0.9986806266362929,0.9936943305528064,0.9887157442235357
18,0.019650975176927418,0.050015772845584096,0.9933998984599763,0.9882100750267953,0.9996132797380874,0.9981251917025947,0.993382727221311,0.9881847475832438
19,0.01757498224456425,0.050443994078988814,0.9938229818920291,0.9882100750267953,0.9997128751102291,0.9981188734148277,0.9938084871787622,0.9881847475832438
20,0.018919436069639462,0.049963348643956074,0.993653748519208,0.9876741693461951,0.9996542873509353,0.9981355307189407,0.9936377774635939,0.9876543209876543
21,0.019814754205029775,0.04953950315330573,0.9931460484007446,0.9876741693461951,0.9996500979649644,0.9981498904638657,0.9931291882263127,0.9876410531972059
22,0.01891736049970423,0.049022212313110804,0.9936255429570712,0.9876741693461951,0.9996817928257848,0.998665692501571,0.9936111268163058,0.9876410531972059
23,0.018869503482074217,0.049181231779684205,0.994020420826987,0.9876741693461951,0.9996414884824392,0.9981470185148806,0.9940048639782818,0.9876410531972059
24,0.01867372968988605,0.04912239593420765,0.9938793930163028,0.9882100750267953,0.9996259131314065,0.998641568130097,0.993863989820444,0.9881847475832438
25,0.018832523674422168,0.04942907131848995,0.99345630958425,0.9882100750267953,0.9996565976389831,0.9981194478046248,0.9934389140271493,0.9881847475832438
training_metrics.csv ADDED
@@ -0,0 +1,26 @@
epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
1,0.16319053076783863,0.06426756953598986,0.9319399785637728,0.9796355841371919,0.9828557740410238,0.9966926635488789,0.9308655416439847,0.9794372294372294
2,0.06834075130984997,0.06178499017976871,0.9768432334856434,0.9812433011789925,0.9958335609810347,0.9977633261304852,0.9766860712764447,0.9809056192034915
3,0.05956324981360049,0.05794612078708851,0.9795509674507813,0.9801714898177921,0.9968159075513607,0.9970418925454544,0.979419195503449,0.9799675148890092
4,0.05225409181792662,0.05390293830316933,0.9824561403508771,0.9817792068595927,0.9973848875829588,0.9979971027778639,0.9823405825904264,0.9817007534983854
5,0.05099076973386254,0.044286027682460004,0.9829920460314774,0.984994640943194,0.9976094008026417,0.9985341572380582,0.9828873059568067,0.9849785407725322
6,0.043280831175793406,0.0475954504689603,0.9854741354995205,0.9844587352625938,0.9982134536162212,0.9986375474015181,0.9853979415350591,0.9842988630211154
7,0.040034028190352185,0.055318317145492485,0.987504935973374,0.9839228295819936,0.998398373717598,0.9986318035035481,0.9874465130777296,0.9839400428265525
8,0.040712364526100824,0.053482549414757365,0.986545946860721,0.9823151125401929,0.9984449422510566,0.9981389770577227,0.9864914615842089,0.9823245848955544
9,0.03604965847263604,0.05389839756958354,0.9878998138432898,0.9839228295819936,0.998734180131549,0.9978104260938392,0.9878501231980514,0.983957219251337
10,0.03513301415206575,0.04746750639182579,0.9883793083996164,0.9866023579849946,0.9988209925462693,0.9983147403356045,0.9883299342850669,0.9865951742627346
11,0.03577863203289445,0.05294743896584802,0.9877587860326056,0.984994640943194,0.9988438790361309,0.9976857835078904,0.9877074718178214,0.9848812095032398
12,0.02777121727852717,0.04766687935284096,0.9910306312404806,0.9871382636655949,0.9992824264416935,0.9984755694787641,0.9909986413043478,0.9870967741935484
13,0.023669739699384892,0.050936045873778424,0.9921588537259548,0.9866023579849946,0.9994208718849524,0.9984836109359222,0.9921353400475275,0.9865663621708759
14,0.022428250333988077,0.05070090880827122,0.9932024595250183,0.9876741693461951,0.9994492954288129,0.9980034210656309,0.9931811108281696,0.9876543209876543
15,0.022516270096097465,0.047786054575749914,0.992892198341513,0.9871382636655949,0.9994642072880302,0.9986082535218711,0.9928708837840896,0.9870967741935484
16,0.021324889257141714,0.049530307028646255,0.9930050205900603,0.9876741693461951,0.9995353584319144,0.9986961351608119,0.9929888047042859,0.9876277568585261
17,0.01956209415003448,0.047512864640096374,0.9937101596434817,0.9887459807073955,0.9996440883520467,0.9986806266362929,0.9936943305528064,0.9887157442235357
18,0.019650975176927418,0.050015772845584096,0.9933998984599763,0.9882100750267953,0.9996132797380874,0.9981251917025947,0.993382727221311,0.9881847475832438
19,0.01757498224456425,0.050443994078988814,0.9938229818920291,0.9882100750267953,0.9997128751102291,0.9981188734148277,0.9938084871787622,0.9881847475832438
20,0.018919436069639462,0.049963348643956074,0.993653748519208,0.9876741693461951,0.9996542873509353,0.9981355307189407,0.9936377774635939,0.9876543209876543
21,0.019814754205029775,0.04953950315330573,0.9931460484007446,0.9876741693461951,0.9996500979649644,0.9981498904638657,0.9931291882263127,0.9876410531972059
22,0.01891736049970423,0.049022212313110804,0.9936255429570712,0.9876741693461951,0.9996817928257848,0.998665692501571,0.9936111268163058,0.9876410531972059
23,0.018869503482074217,0.049181231779684205,0.994020420826987,0.9876741693461951,0.9996414884824392,0.9981470185148806,0.9940048639782818,0.9876410531972059
24,0.01867372968988605,0.04912239593420765,0.9938793930163028,0.9882100750267953,0.9996259131314065,0.998641568130097,0.993863989820444,0.9881847475832438
25,0.018832523674422168,0.04942907131848995,0.99345630958425,0.9882100750267953,0.9996565976389831,0.9981194478046248,0.9934389140271493,0.9881847475832438
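To pull the best validation epoch out of `training_metrics.csv`, the standard `csv` module suffices. A small sketch over an excerpt of the rows above (on the full 25-epoch file the same code also selects epoch 17, the peak val_accuracy):

```python
import csv
import io

# Excerpt of training_metrics.csv from this commit (header + three epochs).
CSV_EXCERPT = """\
epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
1,0.16319053076783863,0.06426756953598986,0.9319399785637728,0.9796355841371919,0.9828557740410238,0.9966926635488789,0.9308655416439847,0.9794372294372294
5,0.05099076973386254,0.044286027682460004,0.9829920460314774,0.984994640943194,0.9976094008026417,0.9985341572380582,0.9828873059568067,0.9849785407725322
17,0.01956209415003448,0.047512864640096374,0.9937101596434817,0.9887459807073955,0.9996440883520467,0.9986806266362929,0.9936943305528064,0.9887157442235357
"""

rows = list(csv.DictReader(io.StringIO(CSV_EXCERPT)))
best = max(rows, key=lambda r: float(r["val_accuracy"]))
print(f"best epoch: {best['epoch']}, val_accuracy: {float(best['val_accuracy']):.4f}")
```

Replacing the embedded string with `open('training_metrics.csv')` runs the same selection against the uploaded file.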