# bedus-creation/t5-small-dataset-ii-lim-to-eng-002
This model is a fine-tuned version of mBART on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.2514
- Validation Loss: 0.3001
- Epoch: 99
## Model description
More information needed
## Intended uses & limitations
More information needed
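Pending the missing details above, here is a hypothetical usage sketch. The translation direction (Limbu to English) is inferred from the repo name `lim-to-eng`, and the loader calls are the standard `transformers` TensorFlow APIs, not code taken from this repository:

```python
# Hypothetical inference sketch for this checkpoint. Assumes the model is
# published on the Hugging Face Hub under this repo id with TF weights.
MODEL_ID = "bedus-creation/t5-small-dataset-ii-lim-to-eng-002"

def translate(text: str, max_length: int = 128) -> str:
    """Generate an English translation for an input sentence.

    Imports are local so the sketch can be read without transformers
    and TensorFlow installed.
    """
    from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = TFAutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="tf")
    output_ids = model.generate(**inputs, max_length=max_length)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Note that downloading the checkpoint requires network access; the function defers all heavy imports and I/O until it is actually called.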
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
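The `AdamWeightDecay` optimizer above is Adam with decoupled weight decay (AdamW). A minimal pure-Python sketch of a single update step using the card's hyperparameters (the function name and example values are illustrative, not from the training code):

```python
import math

def adamw_step(w, g, m, v, t,
               lr=2e-5, beta_1=0.9, beta_2=0.999,
               epsilon=1e-7, weight_decay_rate=0.01):
    """One AdamW update for a scalar weight w with gradient g.

    m and v are the running first/second moment estimates; t is the
    1-based step count. The weight decay term is applied directly to w,
    decoupled from the gradient-based Adam update.
    """
    m = beta_1 * m + (1.0 - beta_1) * g
    v = beta_2 * v + (1.0 - beta_2) * g * g
    m_hat = m / (1.0 - beta_1 ** t)   # bias-corrected first moment
    v_hat = v / (1.0 - beta_2 ** t)   # bias-corrected second moment
    w = w - lr * (m_hat / (math.sqrt(v_hat) + epsilon)
                  + weight_decay_rate * w)
    return w, m, v

# First step from w=1.0 with gradient 0.5: after bias correction the
# Adam term is ~1, so the update is roughly lr * (1 + 0.01) ≈ 2.02e-5.
w, m, v = adamw_step(1.0, 0.5, 0.0, 0.0, t=1)
```

Because the decay is decoupled, the effective regularization strength scales with the learning rate rather than being folded into the gradient, which is the distinction between AdamW and Adam with L2 regularization.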
### Training results
Train Loss | Validation Loss | Epoch |
---|---|---|
1.0068 | 0.4628 | 0 |
0.4954 | 0.3665 | 1 |
0.4239 | 0.3488 | 2 |
0.3989 | 0.3300 | 3 |
0.3810 | 0.3232 | 4 |
0.3678 | 0.3192 | 5 |
0.3601 | 0.3140 | 6 |
0.3523 | 0.3110 | 7 |
0.3461 | 0.3099 | 8 |
0.3426 | 0.3074 | 9 |
0.3385 | 0.3055 | 10 |
0.3347 | 0.3019 | 11 |
0.3316 | 0.3036 | 12 |
0.3284 | 0.2997 | 13 |
0.3253 | 0.2983 | 14 |
0.3230 | 0.3004 | 15 |
0.3204 | 0.2977 | 16 |
0.3191 | 0.2957 | 17 |
0.3161 | 0.2931 | 18 |
0.3150 | 0.2925 | 19 |
0.3131 | 0.2921 | 20 |
0.3114 | 0.2909 | 21 |
0.3088 | 0.2925 | 22 |
0.3081 | 0.2922 | 23 |
0.3071 | 0.2894 | 24 |
0.3057 | 0.2889 | 25 |
0.3030 | 0.2898 | 26 |
0.3032 | 0.2884 | 27 |
0.3018 | 0.2873 | 28 |
0.2995 | 0.2887 | 29 |
0.3000 | 0.2864 | 30 |
0.2986 | 0.2868 | 31 |
0.2981 | 0.2854 | 32 |
0.2965 | 0.2867 | 33 |
0.2953 | 0.2862 | 34 |
0.2959 | 0.2848 | 35 |
0.2941 | 0.2849 | 36 |
0.2933 | 0.2867 | 37 |
0.2925 | 0.2875 | 38 |
0.2905 | 0.2843 | 39 |
0.2911 | 0.2843 | 40 |
0.2897 | 0.2863 | 41 |
0.2888 | 0.2855 | 42 |
0.2875 | 0.2852 | 43 |
0.2884 | 0.2878 | 44 |
0.2868 | 0.2853 | 45 |
0.2855 | 0.2843 | 46 |
0.2846 | 0.2852 | 47 |
0.2844 | 0.2833 | 48 |
0.2834 | 0.2847 | 49 |
0.2831 | 0.2851 | 50 |
0.2818 | 0.2839 | 51 |
0.2821 | 0.2843 | 52 |
0.2798 | 0.2858 | 53 |
0.2801 | 0.2843 | 54 |
0.2798 | 0.2851 | 55 |
0.2785 | 0.2880 | 56 |
0.2790 | 0.2853 | 57 |
0.2775 | 0.2860 | 58 |
0.2776 | 0.2848 | 59 |
0.2766 | 0.2875 | 60 |
0.2758 | 0.2864 | 61 |
0.2753 | 0.2857 | 62 |
0.2741 | 0.2899 | 63 |
0.2731 | 0.2904 | 64 |
0.2728 | 0.2887 | 65 |
0.2728 | 0.2879 | 66 |
0.2714 | 0.2877 | 67 |
0.2715 | 0.2901 | 68 |
0.2704 | 0.2864 | 69 |
0.2705 | 0.2876 | 70 |
0.2694 | 0.2925 | 71 |
0.2683 | 0.2923 | 72 |
0.2668 | 0.2910 | 73 |
0.2676 | 0.2878 | 74 |
0.2666 | 0.2928 | 75 |
0.2656 | 0.2903 | 76 |
0.2649 | 0.2913 | 77 |
0.2642 | 0.2912 | 78 |
0.2643 | 0.2944 | 79 |
0.2636 | 0.2910 | 80 |
0.2631 | 0.2922 | 81 |
0.2625 | 0.2983 | 82 |
0.2617 | 0.2945 | 83 |
0.2609 | 0.2914 | 84 |
0.2609 | 0.2974 | 85 |
0.2594 | 0.2960 | 86 |
0.2597 | 0.2977 | 87 |
0.2589 | 0.2972 | 88 |
0.2583 | 0.2970 | 89 |
0.2562 | 0.2951 | 90 |
0.2565 | 0.3004 | 91 |
0.2556 | 0.2971 | 92 |
0.2555 | 0.2963 | 93 |
0.2541 | 0.2991 | 94 |
0.2548 | 0.3000 | 95 |
0.2540 | 0.3015 | 96 |
0.2527 | 0.3004 | 97 |
0.2528 | 0.3012 | 98 |
0.2514 | 0.3001 | 99 |
### Framework versions
- Transformers 4.33.2
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3