# glpn-nyu-finetuned-diode-230113-130735

This model is a fine-tuned version of [vinvino02/glpn-nyu](https://huggingface.co/vinvino02/glpn-nyu) on the diode-subset dataset. It achieves the following results on the evaluation set:
- Loss: 0.4320
- MAE: 0.4213
- RMSE: 0.6133
- Abs Rel: 0.4298
- Log MAE: 0.1697
- Log RMSE: 0.2216
- Delta1: 0.3800
- Delta2: 0.6396
- Delta3: 0.8189
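For reference, the metrics above are the standard monocular-depth-estimation measures computed between predicted and ground-truth depth maps. A minimal pure-Python sketch is below; it assumes invalid pixels are already masked out and that the logarithmic metrics use log10 (the exact base used by the evaluation script is an assumption here).

```python
import math

def depth_metrics(pred, target):
    """Standard depth metrics over two equal-length sequences of
    positive depth values (invalid pixels assumed masked out).
    Note: log-metric base (log10) is an assumption, not confirmed
    by this model card."""
    n = len(pred)
    abs_err = [abs(p - t) for p, t in zip(pred, target)]
    log_err = [abs(math.log10(p) - math.log10(t)) for p, t in zip(pred, target)]
    # max(p/t, t/p) is the usual "delta" accuracy ratio
    ratios = [max(p / t, t / p) for p, t in zip(pred, target)]
    return {
        "mae": sum(abs_err) / n,
        "rmse": math.sqrt(sum(e * e for e in abs_err) / n),
        "abs_rel": sum(abs(p - t) / t for p, t in zip(pred, target)) / n,
        "log_mae": sum(log_err) / n,
        "log_rmse": math.sqrt(sum(e * e for e in log_err) / n),
        "delta1": sum(r < 1.25 for r in ratios) / n,
        "delta2": sum(r < 1.25 ** 2 for r in ratios) / n,
        "delta3": sum(r < 1.25 ** 3 for r in ratios) / n,
    }
```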
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 24
- eval_batch_size: 48
- seed: 2022
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.15
- num_epochs: 75
- mixed_precision_training: Native AMP
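Taken together, these settings imply 72 optimizer steps per epoch and 5400 total steps, of which the first 15% (810 steps) are linear warmup. A minimal sketch of the resulting learning-rate curve is below; it assumes the usual "linear warmup, then linear decay to zero" semantics of the Trainer's `linear` scheduler, rather than reproducing its internals.

```python
BASE_LR = 3e-4           # learning_rate
TOTAL_STEPS = 5400       # 75 epochs x 72 steps/epoch (from the results table)
WARMUP_STEPS = int(0.15 * TOTAL_STEPS)  # lr_scheduler_warmup_ratio: 0.15 -> 810

def linear_lr(step):
    """Learning rate at a given optimizer step: linear warmup from 0
    to BASE_LR over WARMUP_STEPS, then linear decay back to 0 at
    TOTAL_STEPS (assumed scheduler semantics)."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / max(1, WARMUP_STEPS)
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / max(1, TOTAL_STEPS - WARMUP_STEPS))
```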
### Training results
| Training Loss | Epoch | Step | Validation Loss | MAE | RMSE | Abs Rel | Log MAE | Log RMSE | Delta1 | Delta2 | Delta3 |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:-------:|:-------:|:--------:|:------:|:------:|:------:|
1.0073 | 1.0 | 72 | 0.4927 | 0.4684 | 0.6425 | 0.5680 | 0.1955 | 0.2515 | 0.3154 | 0.5289 | 0.7834 |
0.4694 | 2.0 | 144 | 0.4560 | 0.4425 | 0.6285 | 0.4674 | 0.1818 | 0.2341 | 0.3395 | 0.6061 | 0.7873 |
0.4632 | 3.0 | 216 | 0.4817 | 0.4646 | 0.6341 | 0.5412 | 0.1930 | 0.2453 | 0.3181 | 0.5368 | 0.7491 |
0.4363 | 4.0 | 288 | 0.4589 | 0.4379 | 0.6228 | 0.4880 | 0.1793 | 0.2348 | 0.3588 | 0.6025 | 0.7952 |
0.4636 | 5.0 | 360 | 0.4767 | 0.4545 | 0.6301 | 0.5367 | 0.1878 | 0.2430 | 0.3279 | 0.5716 | 0.7705 |
0.4642 | 6.0 | 432 | 0.4437 | 0.4185 | 0.6200 | 0.4405 | 0.1689 | 0.2283 | 0.4071 | 0.6531 | 0.8091 |
0.409 | 7.0 | 504 | 0.4787 | 0.4542 | 0.6291 | 0.5399 | 0.1873 | 0.2430 | 0.3345 | 0.5679 | 0.7648 |
0.4081 | 8.0 | 576 | 0.4545 | 0.4359 | 0.6258 | 0.4554 | 0.1779 | 0.2311 | 0.3717 | 0.6035 | 0.7952 |
0.4146 | 9.0 | 648 | 0.4726 | 0.4523 | 0.6293 | 0.5108 | 0.1870 | 0.2403 | 0.3394 | 0.5692 | 0.7571 |
0.392 | 10.0 | 720 | 0.4643 | 0.4453 | 0.6249 | 0.5081 | 0.1831 | 0.2372 | 0.3380 | 0.5881 | 0.7917 |
0.3722 | 11.0 | 792 | 0.4670 | 0.4475 | 0.6245 | 0.4957 | 0.1838 | 0.2355 | 0.3413 | 0.5739 | 0.7689 |
0.4397 | 12.0 | 864 | 0.4548 | 0.4367 | 0.6262 | 0.4604 | 0.1780 | 0.2319 | 0.3664 | 0.6081 | 0.7903 |
0.43 | 13.0 | 936 | 0.4281 | 0.4223 | 0.6230 | 0.3974 | 0.1691 | 0.2207 | 0.3975 | 0.6426 | 0.7943 |
0.3976 | 14.0 | 1008 | 0.4592 | 0.4470 | 0.6249 | 0.4759 | 0.1827 | 0.2321 | 0.3482 | 0.5784 | 0.7507 |
0.4251 | 15.0 | 1080 | 0.4515 | 0.4366 | 0.6205 | 0.4589 | 0.1773 | 0.2285 | 0.3689 | 0.5990 | 0.7785 |
0.4007 | 16.0 | 1152 | 0.4859 | 0.4668 | 0.6347 | 0.5570 | 0.1939 | 0.2467 | 0.3156 | 0.5378 | 0.7265 |
0.376 | 17.0 | 1224 | 0.4529 | 0.4331 | 0.6195 | 0.4421 | 0.1752 | 0.2260 | 0.3795 | 0.6016 | 0.7702 |
0.4028 | 18.0 | 1296 | 0.5027 | 0.4775 | 0.6420 | 0.6169 | 0.1993 | 0.2569 | 0.3098 | 0.5228 | 0.7035 |
0.3816 | 19.0 | 1368 | 0.4869 | 0.4634 | 0.6342 | 0.5565 | 0.1924 | 0.2473 | 0.3276 | 0.5448 | 0.7370 |
0.4092 | 20.0 | 1440 | 0.4317 | 0.4155 | 0.6164 | 0.4083 | 0.1661 | 0.2218 | 0.4003 | 0.6569 | 0.8123 |
0.3673 | 21.0 | 1512 | 0.4433 | 0.4326 | 0.6208 | 0.4295 | 0.1750 | 0.2244 | 0.3751 | 0.6068 | 0.7879 |
0.3698 | 22.0 | 1584 | 0.4607 | 0.4322 | 0.6216 | 0.4981 | 0.1758 | 0.2354 | 0.3831 | 0.6163 | 0.7906 |
0.3771 | 23.0 | 1656 | 0.4668 | 0.4478 | 0.6255 | 0.5075 | 0.1841 | 0.2373 | 0.3390 | 0.5819 | 0.7697 |
0.4343 | 24.0 | 1728 | 0.4532 | 0.4331 | 0.6203 | 0.4722 | 0.1767 | 0.2312 | 0.3587 | 0.6166 | 0.8087 |
0.4011 | 25.0 | 1800 | 0.4499 | 0.4327 | 0.6213 | 0.4519 | 0.1755 | 0.2279 | 0.3716 | 0.6152 | 0.7844 |
0.3714 | 26.0 | 1872 | 0.4460 | 0.4254 | 0.6188 | 0.4495 | 0.1716 | 0.2278 | 0.3932 | 0.6352 | 0.7916 |
0.3436 | 27.0 | 1944 | 0.4360 | 0.4182 | 0.6165 | 0.4192 | 0.1682 | 0.2224 | 0.3894 | 0.6524 | 0.8145 |
0.3698 | 28.0 | 2016 | 0.4694 | 0.4536 | 0.6274 | 0.5040 | 0.1863 | 0.2369 | 0.3356 | 0.5667 | 0.7469 |
0.365 | 29.0 | 2088 | 0.4288 | 0.4139 | 0.6156 | 0.4025 | 0.1655 | 0.2199 | 0.4028 | 0.6623 | 0.8109 |
0.3723 | 30.0 | 2160 | 0.4337 | 0.4148 | 0.6141 | 0.4192 | 0.1661 | 0.2215 | 0.4044 | 0.6578 | 0.8073 |
0.365 | 31.0 | 2232 | 0.4529 | 0.4309 | 0.6192 | 0.4751 | 0.1755 | 0.2314 | 0.3770 | 0.6115 | 0.7909 |
0.3571 | 32.0 | 2304 | 0.4302 | 0.4151 | 0.6170 | 0.4134 | 0.1663 | 0.2227 | 0.4089 | 0.6611 | 0.8078 |
0.3727 | 33.0 | 2376 | 0.4599 | 0.4352 | 0.6214 | 0.4937 | 0.1776 | 0.2348 | 0.3659 | 0.6120 | 0.7949 |
0.3538 | 34.0 | 2448 | 0.4391 | 0.4257 | 0.6161 | 0.4404 | 0.1720 | 0.2248 | 0.3768 | 0.6317 | 0.8042 |
0.3306 | 35.0 | 2520 | 0.4393 | 0.4223 | 0.6198 | 0.4328 | 0.1702 | 0.2262 | 0.3886 | 0.6493 | 0.8062 |
0.3369 | 36.0 | 2592 | 0.4496 | 0.4316 | 0.6182 | 0.4642 | 0.1751 | 0.2289 | 0.3712 | 0.6124 | 0.8005 |
0.3389 | 37.0 | 2664 | 0.4573 | 0.4376 | 0.6213 | 0.4897 | 0.1787 | 0.2338 | 0.3628 | 0.6014 | 0.7932 |
0.3767 | 38.0 | 2736 | 0.4558 | 0.4366 | 0.6216 | 0.4840 | 0.1786 | 0.2334 | 0.3566 | 0.6064 | 0.7973 |
0.3462 | 39.0 | 2808 | 0.4580 | 0.4380 | 0.6221 | 0.4815 | 0.1785 | 0.2328 | 0.3640 | 0.6020 | 0.7850 |
0.3834 | 40.0 | 2880 | 0.4664 | 0.4459 | 0.6245 | 0.5155 | 0.1836 | 0.2385 | 0.3426 | 0.5782 | 0.7944 |
0.3564 | 41.0 | 2952 | 0.4452 | 0.4271 | 0.6175 | 0.4563 | 0.1733 | 0.2282 | 0.3749 | 0.6269 | 0.8081 |
0.3571 | 42.0 | 3024 | 0.4357 | 0.4189 | 0.6151 | 0.4360 | 0.1686 | 0.2243 | 0.3947 | 0.6482 | 0.8163 |
0.345 | 43.0 | 3096 | 0.4285 | 0.4130 | 0.6114 | 0.4173 | 0.1653 | 0.2202 | 0.4034 | 0.6611 | 0.8223 |
0.3163 | 44.0 | 3168 | 0.4473 | 0.4274 | 0.6176 | 0.4624 | 0.1732 | 0.2288 | 0.3790 | 0.6245 | 0.8095 |
0.3331 | 45.0 | 3240 | 0.4392 | 0.4214 | 0.6139 | 0.4429 | 0.1699 | 0.2244 | 0.3887 | 0.6388 | 0.8081 |
0.3574 | 46.0 | 3312 | 0.4487 | 0.4230 | 0.6156 | 0.4608 | 0.1710 | 0.2282 | 0.3860 | 0.6431 | 0.8063 |
0.3703 | 47.0 | 3384 | 0.4342 | 0.4176 | 0.6179 | 0.4286 | 0.1678 | 0.2247 | 0.3918 | 0.6668 | 0.8098 |
0.325 | 48.0 | 3456 | 0.4390 | 0.4238 | 0.6150 | 0.4500 | 0.1715 | 0.2256 | 0.3695 | 0.6334 | 0.8216 |
0.3494 | 49.0 | 3528 | 0.4364 | 0.4182 | 0.6165 | 0.4348 | 0.1680 | 0.2248 | 0.4041 | 0.6539 | 0.8104 |
0.3439 | 50.0 | 3600 | 0.4401 | 0.4252 | 0.6156 | 0.4414 | 0.1716 | 0.2243 | 0.3831 | 0.6260 | 0.8042 |
0.3235 | 51.0 | 3672 | 0.4459 | 0.4258 | 0.6173 | 0.4607 | 0.1728 | 0.2287 | 0.3819 | 0.6272 | 0.8106 |
0.3197 | 52.0 | 3744 | 0.4341 | 0.4205 | 0.6153 | 0.4291 | 0.1691 | 0.2226 | 0.3874 | 0.6429 | 0.8173 |
0.3231 | 53.0 | 3816 | 0.4499 | 0.4297 | 0.6180 | 0.4654 | 0.1745 | 0.2290 | 0.3730 | 0.6166 | 0.8053 |
0.3182 | 54.0 | 3888 | 0.4407 | 0.4242 | 0.6145 | 0.4501 | 0.1714 | 0.2252 | 0.3762 | 0.6366 | 0.8124 |
0.334 | 55.0 | 3960 | 0.4518 | 0.4335 | 0.6176 | 0.4773 | 0.1768 | 0.2304 | 0.3591 | 0.6065 | 0.8111 |
0.3198 | 56.0 | 4032 | 0.4505 | 0.4322 | 0.6173 | 0.4725 | 0.1760 | 0.2298 | 0.3637 | 0.6131 | 0.8025 |
0.3165 | 57.0 | 4104 | 0.4378 | 0.4248 | 0.6174 | 0.4369 | 0.1720 | 0.2246 | 0.3729 | 0.6377 | 0.8137 |
0.3269 | 58.0 | 4176 | 0.4372 | 0.4275 | 0.6156 | 0.4415 | 0.1730 | 0.2240 | 0.3675 | 0.6276 | 0.8095 |
0.3224 | 59.0 | 4248 | 0.4359 | 0.4244 | 0.6149 | 0.4351 | 0.1711 | 0.2231 | 0.3721 | 0.6366 | 0.8090 |
0.3104 | 60.0 | 4320 | 0.4317 | 0.4209 | 0.6146 | 0.4284 | 0.1696 | 0.2220 | 0.3799 | 0.6395 | 0.8179 |
0.3248 | 61.0 | 4392 | 0.4323 | 0.4207 | 0.6138 | 0.4268 | 0.1694 | 0.2216 | 0.3864 | 0.6386 | 0.8148 |
0.303 | 62.0 | 4464 | 0.4309 | 0.4189 | 0.6126 | 0.4264 | 0.1685 | 0.2213 | 0.3853 | 0.6453 | 0.8194 |
0.3126 | 63.0 | 4536 | 0.4308 | 0.4206 | 0.6141 | 0.4229 | 0.1693 | 0.2208 | 0.3783 | 0.6447 | 0.8162 |
0.3099 | 64.0 | 4608 | 0.4330 | 0.4239 | 0.6149 | 0.4298 | 0.1709 | 0.2218 | 0.3709 | 0.6323 | 0.8182 |
0.3075 | 65.0 | 4680 | 0.4322 | 0.4222 | 0.6144 | 0.4276 | 0.1701 | 0.2217 | 0.3784 | 0.6374 | 0.8159 |
0.3024 | 66.0 | 4752 | 0.4393 | 0.4269 | 0.6155 | 0.4456 | 0.1729 | 0.2249 | 0.3722 | 0.6245 | 0.8100 |
0.3319 | 67.0 | 4824 | 0.4385 | 0.4273 | 0.6155 | 0.4402 | 0.1728 | 0.2238 | 0.3722 | 0.6244 | 0.8085 |
0.3163 | 68.0 | 4896 | 0.4334 | 0.4215 | 0.6128 | 0.4305 | 0.1699 | 0.2216 | 0.3814 | 0.6379 | 0.8145 |
0.3219 | 69.0 | 4968 | 0.4298 | 0.4197 | 0.6131 | 0.4215 | 0.1688 | 0.2203 | 0.3821 | 0.6453 | 0.8170 |
0.3155 | 70.0 | 5040 | 0.4295 | 0.4199 | 0.6134 | 0.4219 | 0.1687 | 0.2204 | 0.3846 | 0.6453 | 0.8164 |
0.3265 | 71.0 | 5112 | 0.4294 | 0.4194 | 0.6123 | 0.4232 | 0.1687 | 0.2203 | 0.3804 | 0.6468 | 0.8203 |
0.3231 | 72.0 | 5184 | 0.4338 | 0.4231 | 0.6138 | 0.4333 | 0.1707 | 0.2222 | 0.3775 | 0.6340 | 0.8166 |
0.3077 | 73.0 | 5256 | 0.4327 | 0.4221 | 0.6134 | 0.4315 | 0.1702 | 0.2219 | 0.3800 | 0.6361 | 0.8185 |
0.3178 | 74.0 | 5328 | 0.4312 | 0.4203 | 0.6126 | 0.4278 | 0.1693 | 0.2212 | 0.3813 | 0.6417 | 0.8194 |
0.3157 | 75.0 | 5400 | 0.4320 | 0.4213 | 0.6133 | 0.4298 | 0.1697 | 0.2216 | 0.3800 | 0.6396 | 0.8189 |
### Framework versions
- Transformers 4.24.0
- PyTorch 1.12.1+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2