
# AhamadShaik/SegFormer_PADDING_x.6

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. It achieves the following results on the evaluation set (values from the final epoch of the training log below):
- Train Loss: 0.0159
- Train Dice Coef: 0.8034
- Train IoU: 0.6803
- Validation Loss: 0.0223
- Validation Dice Coef: 0.8606
- Validation IoU: 0.7573
- Train LR: 1e-10
- Epoch: 99
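
A minimal inference sketch using the `transformers` TensorFlow classes. The preprocessing used during fine-tuning (input resolution and the padding scheme the repo name hints at) is not documented here, so loading the image processor from the `nvidia/mit-b0` base model with its defaults is an assumption:

```python
from PIL import Image
import tensorflow as tf
from transformers import SegformerImageProcessor, TFSegformerForSemanticSegmentation

# Assumption: processor config comes from the base model, since the
# fine-tuning preprocessing is not documented in this card.
processor = SegformerImageProcessor.from_pretrained("nvidia/mit-b0")
model = TFSegformerForSemanticSegmentation.from_pretrained(
    "AhamadShaik/SegFormer_PADDING_x.6"
)

image = Image.open("example.png").convert("RGB")  # hypothetical input file
inputs = processor(images=image, return_tensors="tf")
logits = model(**inputs).logits  # shape: (batch, num_labels, height/4, width/4)
mask = tf.argmax(logits, axis=1)  # per-pixel class ids at 1/4 input resolution
```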

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The original hyperparameter list was not preserved in this card. What can be read off the training log below: the learning rate started at 1e-04 and dropped stepwise (1e-04 → 5e-06 → 2.5e-07 → 1.25e-08 → 6.25e-10 → 1e-10), i.e. each step multiplied the rate by 0.05 until it hit a 1e-10 floor.
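
That pattern is consistent with a `ReduceLROnPlateau` callback. A sketch of a configuration that would reproduce it; only the factor and floor are read from the log, while the monitored metric and patience are assumptions:

```python
import tensorflow as tf

# factor=0.05 and min_lr=1e-10 match the logged learning-rate steps;
# monitor and patience are assumptions, not recoverable from the log.
lr_schedule = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss",
    factor=0.05,
    patience=5,
    min_lr=1e-10,
)
```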

### Training results

| Train Loss | Train Dice Coef | Train IoU | Validation Loss | Validation Dice Coef | Validation IoU | Train LR | Epoch |
|:----------:|:---------------:|:---------:|:---------------:|:--------------------:|:--------------:|:--------:|:-----:|
| 0.2116 | 0.3018 | 0.1931 | 0.0863 | 0.6813 | 0.5211 | 1e-04 | 0 |
| 0.0722 | 0.4966 | 0.3490 | 0.0565 | 0.7560 | 0.6108 | 1e-04 | 1 |
| 0.0544 | 0.5768 | 0.4227 | 0.0465 | 0.7728 | 0.6368 | 1e-04 | 2 |
| 0.0446 | 0.6305 | 0.4771 | 0.0379 | 0.8130 | 0.6869 | 1e-04 | 3 |
| 0.0422 | 0.6479 | 0.4950 | 0.0366 | 0.8005 | 0.6719 | 1e-04 | 4 |
| 0.0375 | 0.6776 | 0.5273 | 0.0315 | 0.8327 | 0.7155 | 1e-04 | 5 |
| 0.0351 | 0.6926 | 0.5428 | 0.0311 | 0.8340 | 0.7177 | 1e-04 | 6 |
| 0.0341 | 0.6967 | 0.5485 | 0.0295 | 0.8377 | 0.7228 | 1e-04 | 7 |
| 0.0307 | 0.7246 | 0.5794 | 0.0278 | 0.8444 | 0.7328 | 1e-04 | 8 |
| 0.0318 | 0.7119 | 0.5664 | 0.0278 | 0.8423 | 0.7297 | 1e-04 | 9 |
| 0.0284 | 0.7362 | 0.5940 | 0.0280 | 0.8435 | 0.7314 | 1e-04 | 10 |
| 0.0278 | 0.7382 | 0.5979 | 0.0284 | 0.8371 | 0.7232 | 1e-04 | 11 |
| 0.0268 | 0.7429 | 0.6030 | 0.0261 | 0.8504 | 0.7419 | 1e-04 | 12 |
| 0.0262 | 0.7464 | 0.6072 | 0.0285 | 0.8408 | 0.7280 | 1e-04 | 13 |
| 0.0247 | 0.7560 | 0.6189 | 0.0255 | 0.8505 | 0.7419 | 1e-04 | 14 |
| 0.0244 | 0.7580 | 0.6209 | 0.0249 | 0.8524 | 0.7450 | 1e-04 | 15 |
| 0.0221 | 0.7719 | 0.6385 | 0.0246 | 0.8503 | 0.7422 | 1e-04 | 16 |
| 0.0234 | 0.7623 | 0.6261 | 0.0233 | 0.8567 | 0.7516 | 1e-04 | 17 |
| 0.0253 | 0.7527 | 0.6147 | 0.0258 | 0.8481 | 0.7401 | 1e-04 | 18 |
| 0.0241 | 0.7597 | 0.6236 | 0.0258 | 0.8430 | 0.7331 | 1e-04 | 19 |
| 0.0230 | 0.7657 | 0.6310 | 0.0224 | 0.8571 | 0.7522 | 1e-04 | 20 |
| 0.0210 | 0.7755 | 0.6431 | 0.0220 | 0.8609 | 0.7577 | 1e-04 | 21 |
| 0.0195 | 0.7867 | 0.6572 | 0.0231 | 0.8578 | 0.7531 | 1e-04 | 22 |
| 0.0192 | 0.7880 | 0.6592 | 0.0226 | 0.8602 | 0.7568 | 1e-04 | 23 |
| 0.0185 | 0.7909 | 0.6630 | 0.0231 | 0.8591 | 0.7549 | 1e-04 | 24 |
| 0.0186 | 0.7906 | 0.6626 | 0.0221 | 0.8590 | 0.7551 | 1e-04 | 25 |
| 0.0196 | 0.7836 | 0.6531 | 0.0239 | 0.8550 | 0.7491 | 1e-04 | 26 |
| 0.0177 | 0.7975 | 0.6717 | 0.0223 | 0.8589 | 0.7549 | 5e-06 | 27 |
| 0.0173 | 0.7979 | 0.6727 | 0.0228 | 0.8585 | 0.7542 | 5e-06 | 28 |
| 0.0170 | 0.7980 | 0.6731 | 0.0215 | 0.8594 | 0.7556 | 5e-06 | 29 |
| 0.0168 | 0.8003 | 0.6755 | 0.0213 | 0.8616 | 0.7590 | 5e-06 | 30 |
| 0.0167 | 0.8016 | 0.6774 | 0.0211 | 0.8614 | 0.7587 | 5e-06 | 31 |
| 0.0167 | 0.8044 | 0.6807 | 0.0217 | 0.8598 | 0.7562 | 5e-06 | 32 |
| 0.0167 | 0.8048 | 0.6815 | 0.0211 | 0.8622 | 0.7599 | 5e-06 | 33 |
| 0.0164 | 0.8013 | 0.6773 | 0.0213 | 0.8621 | 0.7596 | 5e-06 | 34 |
| 0.0162 | 0.8025 | 0.6790 | 0.0216 | 0.8608 | 0.7578 | 5e-06 | 35 |
| 0.0163 | 0.8018 | 0.6784 | 0.0212 | 0.8615 | 0.7587 | 5e-06 | 36 |
| 0.0161 | 0.8043 | 0.6818 | 0.0211 | 0.8627 | 0.7605 | 2.5e-07 | 37 |
| 0.0161 | 0.8025 | 0.6793 | 0.0218 | 0.8604 | 0.7572 | 2.5e-07 | 38 |
| 0.0163 | 0.8039 | 0.6810 | 0.0211 | 0.8618 | 0.7592 | 2.5e-07 | 39 |
| 0.0159 | 0.8044 | 0.6816 | 0.0215 | 0.8622 | 0.7597 | 2.5e-07 | 40 |
| 0.0157 | 0.8068 | 0.6841 | 0.0213 | 0.8612 | 0.7584 | 2.5e-07 | 41 |
| 0.0159 | 0.8063 | 0.6837 | 0.0214 | 0.8615 | 0.7588 | 1.25e-08 | 42 |
| 0.0160 | 0.8040 | 0.6814 | 0.0217 | 0.8609 | 0.7578 | 1.25e-08 | 43 |
| 0.0159 | 0.8072 | 0.6852 | 0.0213 | 0.8616 | 0.7589 | 1.25e-08 | 44 |
| 0.0160 | 0.8062 | 0.6836 | 0.0215 | 0.8611 | 0.7581 | 1.25e-08 | 45 |
| 0.0159 | 0.8045 | 0.6820 | 0.0211 | 0.8623 | 0.7600 | 1.25e-08 | 46 |
| 0.0162 | 0.8027 | 0.6798 | 0.0210 | 0.8622 | 0.7599 | 6.25e-10 | 47 |
| 0.0160 | 0.8039 | 0.6807 | 0.0218 | 0.8606 | 0.7575 | 6.25e-10 | 48 |
| 0.0159 | 0.8093 | 0.6874 | 0.0220 | 0.8601 | 0.7566 | 6.25e-10 | 49 |
| 0.0159 | 0.8072 | 0.6841 | 0.0217 | 0.8622 | 0.7596 | 6.25e-10 | 50 |
| 0.0159 | 0.8045 | 0.6815 | 0.0213 | 0.8614 | 0.7586 | 6.25e-10 | 51 |
| 0.0159 | 0.8111 | 0.6894 | 0.0216 | 0.8615 | 0.7588 | 6.25e-10 | 52 |
| 0.0158 | 0.8066 | 0.6843 | 0.0213 | 0.8617 | 0.7592 | 1e-10 | 53 |
| 0.0161 | 0.8042 | 0.6813 | 0.0212 | 0.8618 | 0.7592 | 1e-10 | 54 |
| 0.0163 | 0.8058 | 0.6829 | 0.0221 | 0.8604 | 0.7570 | 1e-10 | 55 |
| 0.0164 | 0.8017 | 0.6785 | 0.0214 | 0.8612 | 0.7583 | 1e-10 | 56 |
| 0.0160 | 0.8059 | 0.6827 | 0.0210 | 0.8620 | 0.7595 | 1e-10 | 57 |
| 0.0162 | 0.8038 | 0.6805 | 0.0216 | 0.8616 | 0.7587 | 1e-10 | 58 |
| 0.0160 | 0.8022 | 0.6791 | 0.0222 | 0.8598 | 0.7562 | 1e-10 | 59 |
| 0.0161 | 0.8045 | 0.6812 | 0.0215 | 0.8614 | 0.7585 | 1e-10 | 60 |
| 0.0159 | 0.8026 | 0.6794 | 0.0213 | 0.8605 | 0.7572 | 1e-10 | 61 |
| 0.0161 | 0.8069 | 0.6846 | 0.0216 | 0.8608 | 0.7577 | 1e-10 | 62 |
| 0.0159 | 0.8088 | 0.6873 | 0.0209 | 0.8628 | 0.7607 | 1e-10 | 63 |
| 0.0161 | 0.8016 | 0.6783 | 0.0212 | 0.8616 | 0.7588 | 1e-10 | 64 |
| 0.0161 | 0.8031 | 0.6798 | 0.0213 | 0.8612 | 0.7583 | 1e-10 | 65 |
| 0.0161 | 0.8038 | 0.6811 | 0.0215 | 0.8601 | 0.7566 | 1e-10 | 66 |
| 0.0160 | 0.8052 | 0.6827 | 0.0216 | 0.8608 | 0.7576 | 1e-10 | 67 |
| 0.0161 | 0.8051 | 0.6825 | 0.0216 | 0.8610 | 0.7580 | 1e-10 | 68 |
| 0.0159 | 0.8055 | 0.6826 | 0.0218 | 0.8601 | 0.7568 | 1e-10 | 69 |
| 0.0159 | 0.8024 | 0.6793 | 0.0212 | 0.8617 | 0.7591 | 1e-10 | 70 |
| 0.0158 | 0.8043 | 0.6813 | 0.0214 | 0.8608 | 0.7578 | 1e-10 | 71 |
| 0.0161 | 0.8074 | 0.6850 | 0.0212 | 0.8610 | 0.7579 | 1e-10 | 72 |
| 0.0161 | 0.8066 | 0.6841 | 0.0216 | 0.8615 | 0.7586 | 1e-10 | 73 |
| 0.0159 | 0.8065 | 0.6841 | 0.0214 | 0.8611 | 0.7582 | 1e-10 | 74 |
| 0.0162 | 0.8039 | 0.6808 | 0.0212 | 0.8617 | 0.7591 | 1e-10 | 75 |
| 0.0160 | 0.8036 | 0.6801 | 0.0214 | 0.8616 | 0.7589 | 1e-10 | 76 |
| 0.0161 | 0.8100 | 0.6879 | 0.0211 | 0.8619 | 0.7595 | 1e-10 | 77 |
| 0.0161 | 0.8049 | 0.6816 | 0.0211 | 0.8616 | 0.7590 | 1e-10 | 78 |
| 0.0161 | 0.8037 | 0.6805 | 0.0221 | 0.8596 | 0.7558 | 1e-10 | 79 |
| 0.0159 | 0.8044 | 0.6816 | 0.0219 | 0.8615 | 0.7587 | 1e-10 | 80 |
| 0.0161 | 0.8031 | 0.6796 | 0.0214 | 0.8611 | 0.7581 | 1e-10 | 81 |
| 0.0160 | 0.8016 | 0.6782 | 0.0209 | 0.8622 | 0.7599 | 1e-10 | 82 |
| 0.0162 | 0.8040 | 0.6810 | 0.0211 | 0.8623 | 0.7601 | 1e-10 | 83 |
| 0.0159 | 0.8065 | 0.6844 | 0.0210 | 0.8624 | 0.7602 | 1e-10 | 84 |
| 0.0159 | 0.8064 | 0.6841 | 0.0216 | 0.8613 | 0.7585 | 1e-10 | 85 |
| 0.0159 | 0.8068 | 0.6851 | 0.0212 | 0.8626 | 0.7604 | 1e-10 | 86 |
| 0.0158 | 0.8049 | 0.6822 | 0.0222 | 0.8600 | 0.7564 | 1e-10 | 87 |
| 0.0161 | 0.8028 | 0.6797 | 0.0210 | 0.8621 | 0.7597 | 1e-10 | 88 |
| 0.0163 | 0.8050 | 0.6814 | 0.0218 | 0.8602 | 0.7567 | 1e-10 | 89 |
| 0.0159 | 0.8077 | 0.6858 | 0.0215 | 0.8611 | 0.7582 | 1e-10 | 90 |
| 0.0159 | 0.8067 | 0.6841 | 0.0213 | 0.8623 | 0.7599 | 1e-10 | 91 |
| 0.0160 | 0.8064 | 0.6837 | 0.0213 | 0.8615 | 0.7588 | 1e-10 | 92 |
| 0.0160 | 0.8073 | 0.6847 | 0.0209 | 0.8627 | 0.7606 | 1e-10 | 93 |
| 0.0159 | 0.8056 | 0.6833 | 0.0214 | 0.8612 | 0.7583 | 1e-10 | 94 |
| 0.0159 | 0.8073 | 0.6852 | 0.0213 | 0.8616 | 0.7590 | 1e-10 | 95 |
| 0.0158 | 0.8051 | 0.6832 | 0.0219 | 0.8615 | 0.7587 | 1e-10 | 96 |
| 0.0161 | 0.8053 | 0.6826 | 0.0220 | 0.8593 | 0.7555 | 1e-10 | 97 |
| 0.0161 | 0.8059 | 0.6832 | 0.0218 | 0.8608 | 0.7577 | 1e-10 | 98 |
| 0.0159 | 0.8034 | 0.6803 | 0.0223 | 0.8606 | 0.7573 | 1e-10 | 99 |
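
The Dice coefficient and IoU columns above are standard overlap metrics for segmentation masks. The exact implementations used during training are not published with this card; a minimal sketch of the common smoothed formulation, assuming binary masks:

```python
import tensorflow as tf

def dice_coef(y_true, y_pred, smooth=1e-6):
    """Dice coefficient: 2*|A ∩ B| / (|A| + |B|) on flattened binary masks."""
    y_true_f = tf.reshape(tf.cast(y_true, tf.float32), [-1])
    y_pred_f = tf.reshape(tf.cast(y_pred, tf.float32), [-1])
    intersection = tf.reduce_sum(y_true_f * y_pred_f)
    return (2.0 * intersection + smooth) / (
        tf.reduce_sum(y_true_f) + tf.reduce_sum(y_pred_f) + smooth
    )

def iou(y_true, y_pred, smooth=1e-6):
    """IoU (Jaccard index): |A ∩ B| / |A ∪ B| on flattened binary masks."""
    y_true_f = tf.reshape(tf.cast(y_true, tf.float32), [-1])
    y_pred_f = tf.reshape(tf.cast(y_pred, tf.float32), [-1])
    intersection = tf.reduce_sum(y_true_f * y_pred_f)
    union = tf.reduce_sum(y_true_f) + tf.reduce_sum(y_pred_f) - intersection
    return (intersection + smooth) / (union + smooth)
```

The `smooth` term keeps both metrics defined when a mask is empty; the training code may have used a different smoothing constant or a soft (probability-based) variant.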

### Framework versions

More information needed