# Fine-tuned Falcon-7B Model for Medical Diagnosis
## Model Details

### Model Description
This model is a fine-tuned version of Falcon-7B, trained on Gretel.ai's "Symptom to Diagnosis" dataset (https://huggingface.co/datasets/gretelai/symptom_to_diagnosis) to provide preliminary diagnoses based on the symptom descriptions it is prompted with.
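Below is a minimal inference sketch. The adapter repo id and the prompt format are assumptions for illustration (they are not stated in this card); the base model id comes from the links in the next section.

```python
# Minimal inference sketch, assuming the fine-tuned weights are a PEFT adapter
# on top of tiiuae/falcon-7b. The adapter repo id below is hypothetical.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "tiiuae/falcon-7b"
adapter_id = "your-username/falcon-7b-symptom-to-diagnosis"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# Prompt template is an assumption; adjust to match the format used during fine-tuning.
prompt = "Symptoms: fever, persistent dry cough, and shortness of breath.\nDiagnosis:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```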
### Baseline Model
For more details about the baseline Falcon-7B model, please see the following links:
- https://huggingface.co/tiiuae/falcon-7b
- https://huggingface.co/blog/falcon
## Training procedure
The following `bitsandbytes` quantization config was used during training (an equivalent `BitsAndBytesConfig` sketch follows the list):
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
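As a sketch, the config above corresponds to roughly the following `BitsAndBytesConfig` in `transformers`; the exact training script is not part of this card, so treat this as a reconstruction of the listed values rather than the original code.

```python
# Reconstruction of the 4-bit NF4 quantization config listed above.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # load_in_4bit: True
    bnb_4bit_quant_type="nf4",              # bnb_4bit_quant_type: nf4
    bnb_4bit_use_double_quant=True,         # bnb_4bit_use_double_quant: True
    bnb_4bit_compute_dtype=torch.bfloat16,  # bnb_4bit_compute_dtype: bfloat16
)

# Loading the base model in 4-bit for QLoRA-style fine-tuning or inference.
model = AutoModelForCausalLM.from_pretrained(
    "tiiuae/falcon-7b",
    quantization_config=bnb_config,
    device_map="auto",
)
```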
### Framework versions
- PEFT 0.5.0.dev0