---
tags:
- generated_from_keras_callback
---


# AhamadShaik/SegFormer_PADDING_LM

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. It achieves the following results on the evaluation set (final epoch; see the training results table below):

- Validation Loss: 0.0233
- Validation Dice Coef: 0.8506
- Validation IoU: 0.7439

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- Learning rate: started at 1e-04 and was progressively reduced to a floor of 1e-10 (see the Train LR column in the results table below)

The optimizer and remaining hyperparameters were not recorded.
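Although the full hyperparameter log is missing, the Train Lr column in the results table is consistent with a plateau-style decay that multiplies the learning rate by 0.05 and clips it at 1e-10. Both the factor and the floor are inferred from the table, not confirmed by the training code; a minimal sketch of the observed sequence:

```python
# Reproduce the learning-rate sequence observed in the results table:
# 1e-04 -> 5e-06 -> 2.5e-07 -> 1.25e-08 -> 6.25e-10 -> 1e-10.
# The decay factor (0.05) and floor (1e-10) are inferred, not recorded.
lr, factor, min_lr = 1e-4, 0.05, 1e-10
lrs = [lr]
for _ in range(5):
    lr = max(lr * factor, min_lr)  # clip at the assumed minimum LR
    lrs.append(lr)
print(lrs)
```

The final value is explained by the floor: 6.25e-10 × 0.05 would be 3.125e-11, which is clipped up to 1e-10, matching the last rows of the table.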

### Training results

| Train Loss | Train Dice Coef | Train IoU | Validation Loss | Validation Dice Coef | Validation IoU | Train LR | Epoch |
|:----------:|:---------------:|:---------:|:---------------:|:--------------------:|:--------------:|:--------:|:-----:|
| 0.1460 | 0.3657 | 0.2410 | 0.0908 | 0.4603 | 0.3168 | 1e-04 | 0 |
| 0.0610 | 0.5251 | 0.3760 | 0.1773 | 0.1542 | 0.0892 | 1e-04 | 1 |
| 0.0500 | 0.5831 | 0.4322 | 0.0806 | 0.5067 | 0.3659 | 1e-04 | 2 |
| 0.0432 | 0.6204 | 0.4699 | 0.1085 | 0.3757 | 0.2479 | 1e-04 | 3 |
| 0.0413 | 0.6306 | 0.4831 | 0.0771 | 0.5239 | 0.3646 | 1e-04 | 4 |
| 0.0374 | 0.6569 | 0.5086 | 0.0719 | 0.5267 | 0.3854 | 1e-04 | 5 |
| 0.0336 | 0.6770 | 0.5307 | 0.0540 | 0.6264 | 0.4881 | 1e-04 | 6 |
| 0.0302 | 0.7029 | 0.5592 | 0.0518 | 0.6516 | 0.5234 | 1e-04 | 7 |
| 0.0306 | 0.7010 | 0.5582 | 0.0704 | 0.5946 | 0.4483 | 1e-04 | 8 |
| 0.0285 | 0.7160 | 0.5744 | 0.0504 | 0.6951 | 0.5568 | 1e-04 | 9 |
| 0.0287 | 0.7245 | 0.5830 | 0.0357 | 0.7899 | 0.6630 | 1e-04 | 10 |
| 0.0273 | 0.7228 | 0.5825 | 0.0659 | 0.6279 | 0.4914 | 1e-04 | 11 |
| 0.0259 | 0.7344 | 0.5961 | 0.0357 | 0.7986 | 0.6716 | 1e-04 | 12 |
| 0.0257 | 0.7405 | 0.6010 | 0.0385 | 0.7970 | 0.6702 | 1e-04 | 13 |
| 0.0237 | 0.7434 | 0.6076 | 0.0364 | 0.8060 | 0.6841 | 1e-04 | 14 |
| 0.0227 | 0.7532 | 0.6192 | 0.0556 | 0.6927 | 0.5449 | 1e-04 | 15 |
| 0.0225 | 0.7546 | 0.6202 | 0.0242 | 0.8446 | 0.7356 | 5e-06 | 16 |
| 0.0207 | 0.7614 | 0.6312 | 0.0235 | 0.8482 | 0.7406 | 5e-06 | 17 |
| 0.0205 | 0.7676 | 0.6365 | 0.0235 | 0.8489 | 0.7414 | 5e-06 | 18 |
| 0.0200 | 0.7689 | 0.6389 | 0.0238 | 0.8497 | 0.7424 | 5e-06 | 19 |
| 0.0201 | 0.7693 | 0.6384 | 0.0237 | 0.8492 | 0.7418 | 5e-06 | 20 |
| 0.0195 | 0.7738 | 0.6438 | 0.0231 | 0.8504 | 0.7440 | 5e-06 | 21 |
| 0.0196 | 0.7749 | 0.6458 | 0.0234 | 0.8504 | 0.7436 | 5e-06 | 22 |
| 0.0192 | 0.7756 | 0.6464 | 0.0236 | 0.8482 | 0.7407 | 5e-06 | 23 |
| 0.0191 | 0.7741 | 0.6447 | 0.0231 | 0.8503 | 0.7435 | 5e-06 | 24 |
| 0.0191 | 0.7761 | 0.6466 | 0.0238 | 0.8493 | 0.7419 | 5e-06 | 25 |
| 0.0188 | 0.7781 | 0.6503 | 0.0237 | 0.8481 | 0.7405 | 5e-06 | 26 |
| 0.0192 | 0.7729 | 0.6440 | 0.0234 | 0.8483 | 0.7414 | 2.5e-07 | 27 |
| 0.0187 | 0.7849 | 0.6572 | 0.0241 | 0.8478 | 0.7398 | 2.5e-07 | 28 |
| 0.0188 | 0.7786 | 0.6501 | 0.0241 | 0.8484 | 0.7406 | 2.5e-07 | 29 |
| 0.0189 | 0.7815 | 0.6520 | 0.0232 | 0.8507 | 0.7439 | 2.5e-07 | 30 |
| 0.0185 | 0.7715 | 0.6440 | 0.0232 | 0.8505 | 0.7437 | 2.5e-07 | 31 |
| 0.0186 | 0.7764 | 0.6488 | 0.0233 | 0.8487 | 0.7416 | 1.25e-08 | 32 |
| 0.0189 | 0.7725 | 0.6438 | 0.0235 | 0.8492 | 0.7418 | 1.25e-08 | 33 |
| 0.0186 | 0.7767 | 0.6484 | 0.0237 | 0.8491 | 0.7414 | 1.25e-08 | 34 |
| 0.0186 | 0.7800 | 0.6517 | 0.0229 | 0.8503 | 0.7436 | 1.25e-08 | 35 |
| 0.0187 | 0.7758 | 0.6463 | 0.0232 | 0.8501 | 0.7433 | 1.25e-08 | 36 |
| 0.0187 | 0.7774 | 0.6497 | 0.0232 | 0.8496 | 0.7423 | 1.25e-08 | 37 |
| 0.0187 | 0.7791 | 0.6502 | 0.0234 | 0.8496 | 0.7424 | 1.25e-08 | 38 |
| 0.0189 | 0.7743 | 0.6446 | 0.0237 | 0.8501 | 0.7429 | 1.25e-08 | 39 |
| 0.0189 | 0.7770 | 0.6491 | 0.0234 | 0.8479 | 0.7402 | 1.25e-08 | 40 |
| 0.0187 | 0.7793 | 0.6507 | 0.0233 | 0.8507 | 0.7441 | 6.25e-10 | 41 |
| 0.0186 | 0.7788 | 0.6505 | 0.0231 | 0.8502 | 0.7434 | 6.25e-10 | 42 |
| 0.0188 | 0.7773 | 0.6491 | 0.0232 | 0.8510 | 0.7443 | 6.25e-10 | 43 |
| 0.0185 | 0.7775 | 0.6493 | 0.0229 | 0.8518 | 0.7456 | 6.25e-10 | 44 |
| 0.0187 | 0.7765 | 0.6487 | 0.0233 | 0.8491 | 0.7416 | 6.25e-10 | 45 |
| 0.0186 | 0.7804 | 0.6521 | 0.0234 | 0.8499 | 0.7430 | 1e-10 | 46 |
| 0.0187 | 0.7765 | 0.6482 | 0.0235 | 0.8486 | 0.7410 | 1e-10 | 47 |
| 0.0187 | 0.7777 | 0.6497 | 0.0233 | 0.8493 | 0.7419 | 1e-10 | 48 |
| 0.0187 | 0.7785 | 0.6498 | 0.0230 | 0.8502 | 0.7432 | 1e-10 | 49 |
| 0.0188 | 0.7813 | 0.6529 | 0.0235 | 0.8491 | 0.7418 | 1e-10 | 50 |
| 0.0186 | 0.7770 | 0.6498 | 0.0229 | 0.8504 | 0.7435 | 1e-10 | 51 |
| 0.0190 | 0.7764 | 0.6483 | 0.0232 | 0.8503 | 0.7437 | 1e-10 | 52 |
| 0.0189 | 0.7764 | 0.6480 | 0.0233 | 0.8500 | 0.7430 | 1e-10 | 53 |
| 0.0189 | 0.7744 | 0.6461 | 0.0231 | 0.8516 | 0.7449 | 1e-10 | 54 |
| 0.0188 | 0.7767 | 0.6485 | 0.0233 | 0.8499 | 0.7429 | 1e-10 | 55 |
| 0.0189 | 0.7729 | 0.6441 | 0.0234 | 0.8488 | 0.7413 | 1e-10 | 56 |
| 0.0186 | 0.7814 | 0.6531 | 0.0235 | 0.8486 | 0.7408 | 1e-10 | 57 |
| 0.0189 | 0.7772 | 0.6480 | 0.0237 | 0.8482 | 0.7405 | 1e-10 | 58 |
| 0.0187 | 0.7756 | 0.6477 | 0.0231 | 0.8511 | 0.7443 | 1e-10 | 59 |
| 0.0188 | 0.7783 | 0.6500 | 0.0234 | 0.8489 | 0.7415 | 1e-10 | 60 |
| 0.0186 | 0.7771 | 0.6484 | 0.0238 | 0.8482 | 0.7402 | 1e-10 | 61 |
| 0.0186 | 0.7776 | 0.6502 | 0.0231 | 0.8499 | 0.7429 | 1e-10 | 62 |
| 0.0185 | 0.7784 | 0.6504 | 0.0232 | 0.8496 | 0.7422 | 1e-10 | 63 |
| 0.0188 | 0.7797 | 0.6519 | 0.0234 | 0.8484 | 0.7406 | 1e-10 | 64 |
| 0.0189 | 0.7851 | 0.6566 | 0.0230 | 0.8518 | 0.7455 | 1e-10 | 65 |
| 0.0187 | 0.7795 | 0.6515 | 0.0237 | 0.8494 | 0.7420 | 1e-10 | 66 |
| 0.0188 | 0.7779 | 0.6489 | 0.0237 | 0.8470 | 0.7395 | 1e-10 | 67 |
| 0.0190 | 0.7751 | 0.6455 | 0.0243 | 0.8472 | 0.7391 | 1e-10 | 68 |
| 0.0188 | 0.7767 | 0.6486 | 0.0233 | 0.8502 | 0.7433 | 1e-10 | 69 |
| 0.0189 | 0.7819 | 0.6535 | 0.0231 | 0.8504 | 0.7436 | 1e-10 | 70 |
| 0.0188 | 0.7734 | 0.6452 | 0.0230 | 0.8508 | 0.7442 | 1e-10 | 71 |
| 0.0186 | 0.7784 | 0.6516 | 0.0234 | 0.8484 | 0.7414 | 1e-10 | 72 |
| 0.0187 | 0.7706 | 0.6424 | 0.0236 | 0.8483 | 0.7407 | 1e-10 | 73 |
| 0.0189 | 0.7720 | 0.6430 | 0.0237 | 0.8481 | 0.7401 | 1e-10 | 74 |
| 0.0189 | 0.7753 | 0.6464 | 0.0232 | 0.8505 | 0.7439 | 1e-10 | 75 |
| 0.0188 | 0.7759 | 0.6481 | 0.0232 | 0.8500 | 0.7427 | 1e-10 | 76 |
| 0.0188 | 0.7760 | 0.6479 | 0.0235 | 0.8494 | 0.7418 | 1e-10 | 77 |
| 0.0187 | 0.7828 | 0.6538 | 0.0231 | 0.8518 | 0.7456 | 1e-10 | 78 |
| 0.0188 | 0.7771 | 0.6489 | 0.0235 | 0.8488 | 0.7414 | 1e-10 | 79 |
| 0.0188 | 0.7766 | 0.6480 | 0.0235 | 0.8487 | 0.7411 | 1e-10 | 80 |
| 0.0187 | 0.7764 | 0.6492 | 0.0236 | 0.8497 | 0.7421 | 1e-10 | 81 |
| 0.0188 | 0.7769 | 0.6489 | 0.0232 | 0.8504 | 0.7434 | 1e-10 | 82 |
| 0.0190 | 0.7805 | 0.6507 | 0.0237 | 0.8494 | 0.7418 | 1e-10 | 83 |
| 0.0187 | 0.7752 | 0.6473 | 0.0231 | 0.8502 | 0.7431 | 1e-10 | 84 |
| 0.0189 | 0.7758 | 0.6472 | 0.0234 | 0.8484 | 0.7414 | 1e-10 | 85 |
| 0.0185 | 0.7735 | 0.6460 | 0.0234 | 0.8492 | 0.7417 | 1e-10 | 86 |
| 0.0185 | 0.7814 | 0.6534 | 0.0235 | 0.8490 | 0.7414 | 1e-10 | 87 |
| 0.0186 | 0.7762 | 0.6472 | 0.0234 | 0.8490 | 0.7415 | 1e-10 | 88 |
| 0.0189 | 0.7769 | 0.6481 | 0.0230 | 0.8514 | 0.7452 | 1e-10 | 89 |
| 0.0186 | 0.7776 | 0.6495 | 0.0238 | 0.8496 | 0.7422 | 1e-10 | 90 |
| 0.0188 | 0.7772 | 0.6486 | 0.0233 | 0.8496 | 0.7423 | 1e-10 | 91 |
| 0.0186 | 0.7743 | 0.6467 | 0.0231 | 0.8505 | 0.7436 | 1e-10 | 92 |
| 0.0188 | 0.7794 | 0.6505 | 0.0233 | 0.8503 | 0.7431 | 1e-10 | 93 |
| 0.0186 | 0.7739 | 0.6455 | 0.0237 | 0.8476 | 0.7395 | 1e-10 | 94 |
| 0.0188 | 0.7769 | 0.6477 | 0.0234 | 0.8492 | 0.7419 | 1e-10 | 95 |
| 0.0188 | 0.7689 | 0.6415 | 0.0236 | 0.8487 | 0.7409 | 1e-10 | 96 |
| 0.0194 | 0.7756 | 0.6476 | 0.0236 | 0.8504 | 0.7433 | 1e-10 | 97 |
| 0.0187 | 0.7792 | 0.6504 | 0.0231 | 0.8502 | 0.7436 | 1e-10 | 98 |
| 0.0186 | 0.7789 | 0.6508 | 0.0233 | 0.8506 | 0.7439 | 1e-10 | 99 |
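The Dice coefficient and IoU reported above are standard overlap metrics for binary segmentation masks. A minimal NumPy sketch of both, assuming hard binary masks; the exact smoothing term and thresholding used during this model's training are not recorded, so the `smooth` value here is an illustrative assumption:

```python
import numpy as np

def dice_coef(y_true, y_pred, smooth=1e-6):
    # Dice = 2*|A ∩ B| / (|A| + |B|), on flattened binary masks.
    # `smooth` (an assumed value) avoids division by zero on empty masks.
    y_true, y_pred = y_true.ravel(), y_pred.ravel()
    inter = np.sum(y_true * y_pred)
    return (2.0 * inter + smooth) / (np.sum(y_true) + np.sum(y_pred) + smooth)

def iou(y_true, y_pred, smooth=1e-6):
    # IoU = |A ∩ B| / |A ∪ B|, on flattened binary masks.
    y_true, y_pred = y_true.ravel(), y_pred.ravel()
    inter = np.sum(y_true * y_pred)
    union = np.sum(y_true) + np.sum(y_pred) - inter
    return (inter + smooth) / (union + smooth)

# Toy example: two 2x2 masks that overlap in one pixel.
a = np.array([[1, 1], [0, 0]], dtype=float)
b = np.array([[1, 0], [0, 0]], dtype=float)
print(round(dice_coef(a, b), 4))  # 2*1/(2+1) ≈ 0.6667
print(round(iou(a, b), 4))        # 1/2 = 0.5
```

Dice weights the intersection twice, so it is always at least as large as IoU on the same masks, which matches the gap between the two columns in the table.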

### Framework versions