# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.2685
- mAP: 0.2896
- mAP@50: 0.6734
- mAP@75: 0.2043
- mAP (small): -1.0
- mAP (medium): 0.1549
- mAP (large): 0.3044
- mAR@1: 0.0408
- mAR@10: 0.2803
- mAR@100: 0.4733
- mAR (small): -1.0
- mAR (medium): 0.2421
- mAR (large): 0.4972
- mAP (grape): 0.2896
- mAR@100 (grape): 0.4733

A value of -1.0 indicates that the evaluation set contains no objects in that size bucket, so the metric is undefined under COCO-style evaluation.
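mAP@50 counts a prediction as a true positive when its intersection-over-union (IoU) with a matched ground-truth box is at least 0.5, while mAP@75 requires 0.75. A minimal sketch of the IoU computation that underlies these thresholds (plain Python, boxes as `[x1, y1, x2, y2]`):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes [x1, y1, x2, y2]."""
    # Intersection rectangle
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
             + (box_b[2] - box_b[0]) * (box_b[3] - box_b[1]) - inter)
    return inter / union if union > 0 else 0.0

# A detection covering half of the ground-truth box reaches exactly IoU 0.5,
# so it counts toward mAP@50 but not mAP@75.
print(iou([0, 0, 10, 10], [0, 0, 10, 5]))   # 0.5
print(iou([0, 0, 10, 10], [0, 0, 10, 10]))  # 1.0
```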
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
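With `lr_scheduler_type: cosine` and no warmup listed, the learning rate follows a half-cosine decay from 5e-05 toward zero over the 1830 training steps (61 steps/epoch × 30 epochs, per the results table). A minimal sketch of that schedule, assuming zero warmup steps:

```python
import math

LR0 = 5e-05         # learning_rate from the hyperparameters above
TOTAL_STEPS = 1830  # 61 steps/epoch * 30 epochs, per the results table

def cosine_lr(step, lr0=LR0, total_steps=TOTAL_STEPS):
    """Half-cosine decay: lr0 at step 0, zero at the final step."""
    progress = step / total_steps
    return 0.5 * lr0 * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0))     # 5e-05 at the start of training
print(cosine_lr(915))   # 2.5e-05: half the peak rate at the midpoint
print(cosine_lr(1830))  # ~0.0 at the end of training
```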
### Training results
| Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP (grape) | mAR@100 (grape) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 61 | 1.6346 | 0.1421 | 0.4179 | 0.0503 | -1.0 | 0.0666 | 0.152 | 0.029 | 0.1704 | 0.3311 | -1.0 | 0.1092 | 0.3539 | 0.1421 | 0.3311 |
| No log | 2.0 | 122 | 1.5049 | 0.1757 | 0.4964 | 0.0709 | -1.0 | 0.0814 | 0.1864 | 0.0262 | 0.1978 | 0.3781 | -1.0 | 0.1855 | 0.398 | 0.1757 | 0.3781 |
| No log | 3.0 | 183 | 1.5005 | 0.156 | 0.4872 | 0.0547 | -1.0 | 0.0436 | 0.1675 | 0.0306 | 0.1893 | 0.362 | -1.0 | 0.1382 | 0.3851 | 0.156 | 0.362 |
| No log | 4.0 | 244 | 1.4297 | 0.1915 | 0.5511 | 0.0842 | -1.0 | 0.0976 | 0.2021 | 0.034 | 0.2072 | 0.3834 | -1.0 | 0.1908 | 0.4033 | 0.1915 | 0.3834 |
| No log | 5.0 | 305 | 1.3976 | 0.2206 | 0.5831 | 0.105 | -1.0 | 0.1182 | 0.2322 | 0.0316 | 0.2292 | 0.4091 | -1.0 | 0.1947 | 0.4312 | 0.2206 | 0.4091 |
| No log | 6.0 | 366 | 1.3509 | 0.2152 | 0.6007 | 0.1 | -1.0 | 0.102 | 0.2276 | 0.0316 | 0.2362 | 0.4203 | -1.0 | 0.1776 | 0.4453 | 0.2152 | 0.4203 |
| No log | 7.0 | 427 | 1.3724 | 0.224 | 0.6053 | 0.1031 | -1.0 | 0.1049 | 0.2369 | 0.0345 | 0.2394 | 0.4018 | -1.0 | 0.1868 | 0.424 | 0.224 | 0.4018 |
| No log | 8.0 | 488 | 1.3513 | 0.22 | 0.601 | 0.1175 | -1.0 | 0.1186 | 0.2317 | 0.0292 | 0.2441 | 0.4177 | -1.0 | 0.1895 | 0.4412 | 0.22 | 0.4177 |
| 1.4498 | 9.0 | 549 | 1.3314 | 0.2318 | 0.6336 | 0.1159 | -1.0 | 0.1465 | 0.2424 | 0.0337 | 0.2353 | 0.4227 | -1.0 | 0.2289 | 0.4427 | 0.2318 | 0.4227 |
| 1.4498 | 10.0 | 610 | 1.3243 | 0.2397 | 0.6146 | 0.1244 | -1.0 | 0.1173 | 0.2532 | 0.0377 | 0.2486 | 0.4226 | -1.0 | 0.1868 | 0.4469 | 0.2397 | 0.4226 |
| 1.4498 | 11.0 | 671 | 1.2976 | 0.255 | 0.6567 | 0.1328 | -1.0 | 0.1221 | 0.2697 | 0.0421 | 0.2604 | 0.4324 | -1.0 | 0.1987 | 0.4565 | 0.255 | 0.4324 |
| 1.4498 | 12.0 | 732 | 1.3261 | 0.2491 | 0.6486 | 0.1455 | -1.0 | 0.1193 | 0.2627 | 0.0356 | 0.2501 | 0.4378 | -1.0 | 0.2316 | 0.4591 | 0.2491 | 0.4378 |
| 1.4498 | 13.0 | 793 | 1.3027 | 0.2568 | 0.6481 | 0.1543 | -1.0 | 0.1451 | 0.2696 | 0.0397 | 0.2603 | 0.4334 | -1.0 | 0.25 | 0.4523 | 0.2568 | 0.4334 |
| 1.4498 | 14.0 | 854 | 1.3278 | 0.2566 | 0.6406 | 0.1619 | -1.0 | 0.118 | 0.2716 | 0.04 | 0.2598 | 0.4457 | -1.0 | 0.2 | 0.471 | 0.2566 | 0.4457 |
| 1.4498 | 15.0 | 915 | 1.3038 | 0.2582 | 0.639 | 0.1583 | -1.0 | 0.1343 | 0.2723 | 0.0362 | 0.2665 | 0.4504 | -1.0 | 0.2039 | 0.4757 | 0.2582 | 0.4504 |
| 1.4498 | 16.0 | 976 | 1.2797 | 0.2598 | 0.6437 | 0.1641 | -1.0 | 0.1357 | 0.2732 | 0.0386 | 0.2615 | 0.458 | -1.0 | 0.2237 | 0.4821 | 0.2598 | 0.458 |
| 1.1642 | 17.0 | 1037 | 1.2855 | 0.2704 | 0.6443 | 0.1712 | -1.0 | 0.1446 | 0.2845 | 0.0404 | 0.2698 | 0.4541 | -1.0 | 0.2487 | 0.4752 | 0.2704 | 0.4541 |
| 1.1642 | 18.0 | 1098 | 1.2894 | 0.2625 | 0.6503 | 0.1615 | -1.0 | 0.1269 | 0.2769 | 0.0387 | 0.258 | 0.4565 | -1.0 | 0.2184 | 0.481 | 0.2625 | 0.4565 |
| 1.1642 | 19.0 | 1159 | 1.2742 | 0.2754 | 0.6682 | 0.1886 | -1.0 | 0.1456 | 0.2897 | 0.0391 | 0.2706 | 0.4651 | -1.0 | 0.2289 | 0.4894 | 0.2754 | 0.4651 |
| 1.1642 | 20.0 | 1220 | 1.2764 | 0.2736 | 0.6651 | 0.1822 | -1.0 | 0.1411 | 0.2885 | 0.04 | 0.2677 | 0.464 | -1.0 | 0.2329 | 0.4878 | 0.2736 | 0.464 |
| 1.1642 | 21.0 | 1281 | 1.2587 | 0.2802 | 0.6697 | 0.1876 | -1.0 | 0.1557 | 0.2943 | 0.0396 | 0.2704 | 0.4692 | -1.0 | 0.2342 | 0.4934 | 0.2802 | 0.4692 |
| 1.1642 | 22.0 | 1342 | 1.2644 | 0.2819 | 0.6706 | 0.1836 | -1.0 | 0.1539 | 0.2961 | 0.0399 | 0.2716 | 0.4678 | -1.0 | 0.2316 | 0.4921 | 0.2819 | 0.4678 |
| 1.1642 | 23.0 | 1403 | 1.2664 | 0.2841 | 0.6736 | 0.1968 | -1.0 | 0.1454 | 0.2989 | 0.0394 | 0.2733 | 0.4727 | -1.0 | 0.2487 | 0.4958 | 0.2841 | 0.4727 |
| 1.1642 | 24.0 | 1464 | 1.2639 | 0.2866 | 0.6725 | 0.199 | -1.0 | 0.1453 | 0.3017 | 0.0418 | 0.2742 | 0.4747 | -1.0 | 0.2408 | 0.4988 | 0.2866 | 0.4747 |
| 0.9955 | 25.0 | 1525 | 1.2702 | 0.2847 | 0.6656 | 0.2011 | -1.0 | 0.1492 | 0.2997 | 0.0407 | 0.2789 | 0.4767 | -1.0 | 0.2474 | 0.5003 | 0.2847 | 0.4767 |
| 0.9955 | 26.0 | 1586 | 1.2608 | 0.2896 | 0.6766 | 0.2078 | -1.0 | 0.1485 | 0.3051 | 0.0412 | 0.2758 | 0.472 | -1.0 | 0.2355 | 0.4963 | 0.2896 | 0.472 |
| 0.9955 | 27.0 | 1647 | 1.2682 | 0.2908 | 0.6731 | 0.2145 | -1.0 | 0.147 | 0.3066 | 0.041 | 0.2805 | 0.4754 | -1.0 | 0.2382 | 0.4999 | 0.2908 | 0.4754 |
| 0.9955 | 28.0 | 1708 | 1.2645 | 0.29 | 0.676 | 0.2089 | -1.0 | 0.1541 | 0.3045 | 0.041 | 0.2803 | 0.4721 | -1.0 | 0.2447 | 0.4955 | 0.29 | 0.4721 |
| 0.9955 | 29.0 | 1769 | 1.2660 | 0.2888 | 0.6733 | 0.2036 | -1.0 | 0.1549 | 0.3035 | 0.0405 | 0.2802 | 0.4726 | -1.0 | 0.2421 | 0.4963 | 0.2888 | 0.4726 |
| 0.9955 | 30.0 | 1830 | 1.2685 | 0.2896 | 0.6734 | 0.2043 | -1.0 | 0.1549 | 0.3044 | 0.0408 | 0.2803 | 0.4733 | -1.0 | 0.2421 | 0.4972 | 0.2896 | 0.4733 |
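Validation mAP plateaus after roughly epoch 25; the highest value (0.2908) occurs at epoch 27, not at the final epoch. A small sketch of selecting the best checkpoint from these logged results (epoch, validation loss, mAP triples copied from the last rows of the table above):

```python
# (epoch, validation_loss, mAP) from the tail of the training results table
results = [
    (25, 1.2702, 0.2847),
    (26, 1.2608, 0.2896),
    (27, 1.2682, 0.2908),
    (28, 1.2645, 0.2900),
    (29, 1.2660, 0.2888),
    (30, 1.2685, 0.2896),
]

# Pick the checkpoint with the highest validation mAP
best_epoch, best_loss, best_map = max(results, key=lambda row: row[2])
print(best_epoch, best_map)  # 27 0.2908
```

In 🤗 Transformers, setting `load_best_model_at_end=True` together with `metric_for_best_model` in `TrainingArguments` automates this selection during training.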
### Framework versions
- Transformers 4.57.6
- PyTorch 2.9.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.2
## Model tree for rugarce/yolo_finetuned_fruits

Base model: [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny)