# glpn-nyu-finetuned-diode-230119-100058

This model is a fine-tuned version of [vinvino02/glpn-nyu](https://huggingface.co/vinvino02/glpn-nyu) on the diode-subset dataset. It achieves the following results on the evaluation set:
- Loss: 0.4305
- Mae: 0.4203
- Rmse: 0.6123
- Abs Rel: 0.4280
- Log Mae: 0.1694
- Log Rmse: 0.2214
- Delta1: 0.3813
- Delta2: 0.6446
- Delta3: 0.8152
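The metrics above are the standard monocular depth estimation measures. As a sketch of how they are usually defined (the exact evaluation code may differ, e.g. in masking of invalid pixels), given flat lists of predicted and ground-truth depths:

```python
import math

def depth_metrics(pred, target):
    """Standard depth-estimation metrics over paired depth values.

    A sketch of the usual definitions; not the exact evaluation code
    used to produce the numbers in this card.
    """
    n = len(pred)
    # Absolute error per pixel -> MAE, RMSE
    abs_err = [abs(p - t) for p, t in zip(pred, target)]
    mae = sum(abs_err) / n
    rmse = math.sqrt(sum(e * e for e in abs_err) / n)
    # Relative error, normalised by ground truth
    abs_rel = sum(abs(p - t) / t for p, t in zip(pred, target)) / n
    # Errors in log-depth space
    log_err = [abs(math.log(p) - math.log(t)) for p, t in zip(pred, target)]
    log_mae = sum(log_err) / n
    log_rmse = math.sqrt(sum(e * e for e in log_err) / n)
    # Threshold accuracies: fraction of pixels with max(p/t, t/p) < 1.25^k
    ratio = [max(p / t, t / p) for p, t in zip(pred, target)]
    delta = lambda thr: sum(r < thr for r in ratio) / n
    return {"mae": mae, "rmse": rmse, "abs_rel": abs_rel,
            "log_mae": log_mae, "log_rmse": log_rmse,
            "delta1": delta(1.25), "delta2": delta(1.25 ** 2),
            "delta3": delta(1.25 ** 3)}
```

Under these definitions, lower is better for the error metrics and higher is better for Delta1-Delta3 (a perfect prediction gives Delta1 = 1.0).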
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 48
- seed: 2022
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.15
- num_epochs: 75
- mixed_precision_training: Native AMP
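With a linear scheduler, a warmup ratio of 0.15, and 75 epochs of 72 steps each (5400 total steps, per the results table), the learning rate ramps up over the first 810 steps and then decays linearly to zero. A minimal sketch of that schedule, assuming the usual linear-warmup-then-linear-decay shape (not the Trainer's exact implementation):

```python
def linear_warmup_linear_decay(step, total_steps=5400,
                               warmup_ratio=0.15, base_lr=5e-5):
    """Learning rate at a given optimizer step for a linear schedule
    with warmup. Constants match this card's hyperparameters:
    75 epochs x 72 steps/epoch = 5400 steps, 0.15 * 5400 = 810 warmup steps.
    """
    warmup_steps = int(total_steps * warmup_ratio)  # 810
    if step < warmup_steps:
        # Linear ramp from 0 to base_lr
        return base_lr * step / warmup_steps
    # Linear decay from base_lr down to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

The peak learning rate of 5e-05 is reached at step 810 and the schedule hits zero at step 5400, the end of epoch 75.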
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mae | Rmse | Abs Rel | Log Mae | Log Rmse | Delta1 | Delta2 | Delta3 |
|:-------------:|:-----:|:----:|:---------------:|:---:|:----:|:-------:|:-------:|:--------:|:------:|:------:|:------:|
1.2807 | 1.0 | 72 | 0.9866 | 0.8312 | 1.0131 | 0.7179 | 0.5655 | 0.5924 | 0.0087 | 0.0200 | 0.0552 |
0.7396 | 2.0 | 144 | 0.4976 | 0.4741 | 0.6670 | 0.5279 | 0.1989 | 0.2567 | 0.3070 | 0.5470 | 0.7943 |
0.5018 | 3.0 | 216 | 0.4811 | 0.4630 | 0.6367 | 0.5198 | 0.1929 | 0.2446 | 0.3211 | 0.5440 | 0.7506 |
0.482 | 4.0 | 288 | 0.4726 | 0.4556 | 0.6337 | 0.4951 | 0.1893 | 0.2410 | 0.3306 | 0.5636 | 0.7663 |
0.4874 | 5.0 | 360 | 0.4813 | 0.4662 | 0.6355 | 0.5265 | 0.1941 | 0.2446 | 0.3179 | 0.5385 | 0.7278 |
0.4648 | 6.0 | 432 | 0.4681 | 0.4512 | 0.6309 | 0.4783 | 0.1869 | 0.2383 | 0.3430 | 0.5757 | 0.7527 |
0.4346 | 7.0 | 504 | 0.4637 | 0.4499 | 0.6292 | 0.4710 | 0.1859 | 0.2357 | 0.3453 | 0.5671 | 0.7644 |
0.4018 | 8.0 | 576 | 0.4790 | 0.4638 | 0.6349 | 0.5161 | 0.1928 | 0.2436 | 0.3255 | 0.5408 | 0.7338 |
0.4092 | 9.0 | 648 | 0.4559 | 0.4449 | 0.6267 | 0.4540 | 0.1827 | 0.2319 | 0.3541 | 0.5814 | 0.7692 |
0.3891 | 10.0 | 720 | 0.4619 | 0.4433 | 0.6259 | 0.4748 | 0.1823 | 0.2351 | 0.3579 | 0.5870 | 0.7742 |
0.3707 | 11.0 | 792 | 0.4624 | 0.4500 | 0.6269 | 0.4828 | 0.1851 | 0.2350 | 0.3421 | 0.5672 | 0.7638 |
0.4129 | 12.0 | 864 | 0.4648 | 0.4468 | 0.6265 | 0.4836 | 0.1836 | 0.2358 | 0.3533 | 0.5786 | 0.7625 |
0.4108 | 13.0 | 936 | 0.4474 | 0.4312 | 0.6187 | 0.4501 | 0.1752 | 0.2280 | 0.3801 | 0.6088 | 0.7887 |
0.3948 | 14.0 | 1008 | 0.4619 | 0.4498 | 0.6263 | 0.4853 | 0.1844 | 0.2344 | 0.3401 | 0.5721 | 0.7645 |
0.4009 | 15.0 | 1080 | 0.4619 | 0.4440 | 0.6244 | 0.4889 | 0.1820 | 0.2351 | 0.3563 | 0.5841 | 0.7751 |
0.3657 | 16.0 | 1152 | 0.4636 | 0.4491 | 0.6260 | 0.4936 | 0.1846 | 0.2360 | 0.3422 | 0.5734 | 0.7644 |
0.3605 | 17.0 | 1224 | 0.4353 | 0.4255 | 0.6153 | 0.4248 | 0.1715 | 0.2218 | 0.3844 | 0.6207 | 0.8008 |
0.3937 | 18.0 | 1296 | 0.4756 | 0.4609 | 0.6310 | 0.5281 | 0.1909 | 0.2423 | 0.3220 | 0.5461 | 0.7538 |
0.3453 | 19.0 | 1368 | 0.4698 | 0.4517 | 0.6270 | 0.5145 | 0.1863 | 0.2392 | 0.3360 | 0.5702 | 0.7689 |
0.3883 | 20.0 | 1440 | 0.4349 | 0.4240 | 0.6145 | 0.4311 | 0.1712 | 0.2230 | 0.3841 | 0.6321 | 0.8030 |
0.3482 | 21.0 | 1512 | 0.4339 | 0.4209 | 0.6146 | 0.4223 | 0.1694 | 0.2223 | 0.3967 | 0.6337 | 0.8036 |
0.3374 | 22.0 | 1584 | 0.4400 | 0.4289 | 0.6167 | 0.4431 | 0.1737 | 0.2254 | 0.3743 | 0.6191 | 0.7971 |
0.3516 | 23.0 | 1656 | 0.4395 | 0.4280 | 0.6171 | 0.4426 | 0.1737 | 0.2259 | 0.3710 | 0.6241 | 0.7998 |
0.3901 | 24.0 | 1728 | 0.4444 | 0.4324 | 0.6184 | 0.4562 | 0.1758 | 0.2280 | 0.3665 | 0.6118 | 0.7991 |
0.3587 | 25.0 | 1800 | 0.4326 | 0.4200 | 0.6129 | 0.4281 | 0.1690 | 0.2222 | 0.3920 | 0.6403 | 0.8073 |
0.3425 | 26.0 | 1872 | 0.4371 | 0.4231 | 0.6152 | 0.4341 | 0.1709 | 0.2242 | 0.3852 | 0.6372 | 0.7974 |
0.3252 | 27.0 | 1944 | 0.4381 | 0.4225 | 0.6140 | 0.4399 | 0.1705 | 0.2245 | 0.3851 | 0.6396 | 0.8065 |
0.3586 | 28.0 | 2016 | 0.4441 | 0.4304 | 0.6162 | 0.4488 | 0.1746 | 0.2258 | 0.3674 | 0.6179 | 0.7929 |
0.3389 | 29.0 | 2088 | 0.4240 | 0.4112 | 0.6100 | 0.4017 | 0.1640 | 0.2173 | 0.4152 | 0.6599 | 0.8128 |
0.3418 | 30.0 | 2160 | 0.4312 | 0.4195 | 0.6126 | 0.4211 | 0.1687 | 0.2206 | 0.3899 | 0.6435 | 0.8123 |
0.3454 | 31.0 | 2232 | 0.4301 | 0.4176 | 0.6126 | 0.4167 | 0.1674 | 0.2203 | 0.3974 | 0.6479 | 0.8089 |
0.3499 | 32.0 | 2304 | 0.4262 | 0.4154 | 0.6115 | 0.4081 | 0.1661 | 0.2184 | 0.3997 | 0.6578 | 0.8083 |
0.3649 | 33.0 | 2376 | 0.4429 | 0.4313 | 0.6171 | 0.4507 | 0.1753 | 0.2263 | 0.3641 | 0.6134 | 0.7982 |
0.3341 | 34.0 | 2448 | 0.4292 | 0.4207 | 0.6127 | 0.4161 | 0.1689 | 0.2192 | 0.3874 | 0.6415 | 0.8007 |
0.3323 | 35.0 | 2520 | 0.4402 | 0.4266 | 0.6148 | 0.4434 | 0.1728 | 0.2247 | 0.3754 | 0.6254 | 0.7983 |
0.3374 | 36.0 | 2592 | 0.4336 | 0.4233 | 0.6139 | 0.4277 | 0.1706 | 0.2219 | 0.3810 | 0.6362 | 0.8008 |
0.334 | 37.0 | 2664 | 0.4310 | 0.4230 | 0.6138 | 0.4240 | 0.1703 | 0.2209 | 0.3826 | 0.6345 | 0.8034 |
0.3471 | 38.0 | 2736 | 0.4372 | 0.4250 | 0.6144 | 0.4397 | 0.1720 | 0.2240 | 0.3780 | 0.6303 | 0.8046 |
0.3283 | 39.0 | 2808 | 0.4421 | 0.4301 | 0.6168 | 0.4497 | 0.1743 | 0.2259 | 0.3654 | 0.6209 | 0.7993 |
0.3418 | 40.0 | 2880 | 0.4340 | 0.4224 | 0.6137 | 0.4334 | 0.1703 | 0.2228 | 0.3857 | 0.6351 | 0.8054 |
0.3455 | 41.0 | 2952 | 0.4294 | 0.4174 | 0.6118 | 0.4212 | 0.1675 | 0.2203 | 0.3959 | 0.6469 | 0.8109 |
0.3229 | 42.0 | 3024 | 0.4291 | 0.4165 | 0.6121 | 0.4199 | 0.1671 | 0.2207 | 0.4035 | 0.6464 | 0.8103 |
0.352 | 43.0 | 3096 | 0.4393 | 0.4266 | 0.6154 | 0.4462 | 0.1729 | 0.2253 | 0.3744 | 0.6287 | 0.8049 |
0.3163 | 44.0 | 3168 | 0.4250 | 0.4113 | 0.6098 | 0.4112 | 0.1647 | 0.2187 | 0.4041 | 0.6620 | 0.8201 |
0.3284 | 45.0 | 3240 | 0.4358 | 0.4245 | 0.6138 | 0.4379 | 0.1716 | 0.2233 | 0.3745 | 0.6306 | 0.8106 |
0.3359 | 46.0 | 3312 | 0.4321 | 0.4217 | 0.6124 | 0.4283 | 0.1699 | 0.2210 | 0.3770 | 0.6412 | 0.8129 |
0.3406 | 47.0 | 3384 | 0.4238 | 0.4127 | 0.6104 | 0.4084 | 0.1653 | 0.2183 | 0.3982 | 0.6617 | 0.8177 |
0.3207 | 48.0 | 3456 | 0.4375 | 0.4275 | 0.6147 | 0.4435 | 0.1733 | 0.2243 | 0.3658 | 0.6262 | 0.8071 |
0.3338 | 49.0 | 3528 | 0.4331 | 0.4223 | 0.6142 | 0.4310 | 0.1705 | 0.2228 | 0.3846 | 0.6374 | 0.8071 |
0.3203 | 50.0 | 3600 | 0.4308 | 0.4212 | 0.6136 | 0.4253 | 0.1695 | 0.2213 | 0.3878 | 0.6407 | 0.8054 |
0.3238 | 51.0 | 3672 | 0.4379 | 0.4267 | 0.6148 | 0.4416 | 0.1727 | 0.2241 | 0.3723 | 0.6244 | 0.8036 |
0.3209 | 52.0 | 3744 | 0.4289 | 0.4187 | 0.6121 | 0.4178 | 0.1681 | 0.2198 | 0.3920 | 0.6461 | 0.8096 |
0.3198 | 53.0 | 3816 | 0.4376 | 0.4264 | 0.6145 | 0.4402 | 0.1724 | 0.2237 | 0.3708 | 0.6279 | 0.8066 |
0.3137 | 54.0 | 3888 | 0.4294 | 0.4180 | 0.6115 | 0.4242 | 0.1681 | 0.2208 | 0.3888 | 0.6494 | 0.8152 |
0.3238 | 55.0 | 3960 | 0.4416 | 0.4294 | 0.6158 | 0.4521 | 0.1743 | 0.2261 | 0.3645 | 0.6205 | 0.8069 |
0.3173 | 56.0 | 4032 | 0.4257 | 0.4142 | 0.6116 | 0.4145 | 0.1661 | 0.2198 | 0.4016 | 0.6586 | 0.8136 |
0.3173 | 57.0 | 4104 | 0.4303 | 0.4193 | 0.6123 | 0.4246 | 0.1687 | 0.2210 | 0.3879 | 0.6451 | 0.8118 |
0.3297 | 58.0 | 4176 | 0.4302 | 0.4219 | 0.6132 | 0.4259 | 0.1700 | 0.2211 | 0.3792 | 0.6394 | 0.8122 |
0.3261 | 59.0 | 4248 | 0.4319 | 0.4220 | 0.6131 | 0.4312 | 0.1702 | 0.2221 | 0.3781 | 0.6407 | 0.8142 |
0.3082 | 60.0 | 4320 | 0.4340 | 0.4234 | 0.6136 | 0.4346 | 0.1710 | 0.2228 | 0.3754 | 0.6373 | 0.8106 |
0.31 | 61.0 | 4392 | 0.4225 | 0.4120 | 0.6104 | 0.4073 | 0.1646 | 0.2181 | 0.4054 | 0.6626 | 0.8168 |
0.3065 | 62.0 | 4464 | 0.4313 | 0.4197 | 0.6125 | 0.4280 | 0.1690 | 0.2216 | 0.3854 | 0.6472 | 0.8127 |
0.3046 | 63.0 | 4536 | 0.4316 | 0.4202 | 0.6127 | 0.4268 | 0.1691 | 0.2213 | 0.3849 | 0.6448 | 0.8131 |
0.303 | 64.0 | 4608 | 0.4352 | 0.4241 | 0.6137 | 0.4373 | 0.1712 | 0.2231 | 0.3760 | 0.6364 | 0.8097 |
0.3094 | 65.0 | 4680 | 0.4318 | 0.4205 | 0.6128 | 0.4304 | 0.1695 | 0.2220 | 0.3828 | 0.6438 | 0.8140 |
0.3035 | 66.0 | 4752 | 0.4351 | 0.4233 | 0.6136 | 0.4386 | 0.1709 | 0.2235 | 0.3781 | 0.6388 | 0.8099 |
0.327 | 67.0 | 4824 | 0.4307 | 0.4203 | 0.6131 | 0.4280 | 0.1693 | 0.2216 | 0.3828 | 0.6463 | 0.8143 |
0.3175 | 68.0 | 4896 | 0.4325 | 0.4219 | 0.6137 | 0.4314 | 0.1701 | 0.2222 | 0.3809 | 0.6406 | 0.8135 |
0.3188 | 69.0 | 4968 | 0.4299 | 0.4203 | 0.6126 | 0.4271 | 0.1694 | 0.2214 | 0.3827 | 0.6440 | 0.8141 |
0.3158 | 70.0 | 5040 | 0.4304 | 0.4203 | 0.6126 | 0.4274 | 0.1694 | 0.2215 | 0.3832 | 0.6443 | 0.8133 |
0.3298 | 71.0 | 5112 | 0.4315 | 0.4219 | 0.6135 | 0.4292 | 0.1700 | 0.2218 | 0.3792 | 0.6423 | 0.8136 |
0.3246 | 72.0 | 5184 | 0.4323 | 0.4219 | 0.6129 | 0.4322 | 0.1703 | 0.2223 | 0.3769 | 0.6418 | 0.8133 |
0.3116 | 73.0 | 5256 | 0.4301 | 0.4198 | 0.6124 | 0.4264 | 0.1691 | 0.2213 | 0.3833 | 0.6459 | 0.8141 |
0.3192 | 74.0 | 5328 | 0.4301 | 0.4200 | 0.6125 | 0.4266 | 0.1691 | 0.2213 | 0.3819 | 0.6464 | 0.8156 |
0.3172 | 75.0 | 5400 | 0.4305 | 0.4203 | 0.6123 | 0.4280 | 0.1694 | 0.2214 | 0.3813 | 0.6446 | 0.8152 |
### Framework versions
- Transformers 4.24.0
- Pytorch 1.12.1+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2