tiiuae/falcon-40b code-instruct finetune on databricks-dolly-15k

Finetuning Overview:

Model Used: tiiuae/falcon-40b
Dataset: Databricks-dolly-15k

Dataset Insights:

The databricks-dolly-15k dataset comprises over 15,000 instruction-following records written by thousands of Databricks employees. Built to support instruction tuning for ChatGPT-style interaction, it spans behavioral categories such as brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization.
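For reference, the dataset can be loaded directly from the Hugging Face Hub; the repository id and field names below follow the public databricks-dolly-15k release.

```python
from datasets import load_dataset

# Load the public instruction-tuning dataset from the Hugging Face Hub.
dolly = load_dataset("databricks/databricks-dolly-15k", split="train")

print(len(dolly))          # roughly 15k records
print(dolly.column_names)  # ['instruction', 'context', 'response', 'category']
```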

Note: Some records include Wikipedia-derived reference text, evident from bracketed citation numbers, e.g., [42]. Removing these markers is recommended for downstream applications; a minimal cleanup sketch follows below.
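As a minimal sketch of that cleanup, the bracketed citation markers can be stripped with a simple regular expression (the field names follow the dolly-15k schema; the record below is illustrative, not taken from the dataset):

```python
import re

def strip_citation_markers(text: str) -> str:
    """Remove bracketed Wikipedia citation numbers such as [42] from a passage."""
    return re.sub(r"\[\d+\]", "", text)

# Illustrative record; real records come from the dataset loaded above.
record = {
    "instruction": "When was the Eiffel Tower completed?",
    "context": "The Eiffel Tower was completed in 1889. [3]",
    "response": "The Eiffel Tower was completed in 1889.",
}
record["context"] = strip_citation_markers(record["context"]).strip()
```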

Finetuning Details:

The model was finetuned with MonsterAPI's no-code LLM finetuner, using the databricks-dolly-15k instruction/response pairs formatted with the prompt structure shown below.
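Because the finetuner is no-code, the exact training configuration is not reproduced here. As a rough, illustrative sketch only, a comparable parameter-efficient setup with Hugging Face peft and 4-bit quantization might look like the following; all values are assumptions for illustration, not the settings used for this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

base_model = "tiiuae/falcon-40b"

# Illustrative 4-bit quantization config so the 40B base fits on fewer GPUs.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    quantization_config=bnb_config,
    device_map="auto",
)

# Illustrative LoRA config; Falcon's fused attention projection is named "query_key_value".
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["query_key_value"],
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```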

Hyperparameters & Additional Details:


Prompt Structure:

### INSTRUCTION:
[instruction]

[context]

### RESPONSE:
[response]
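A small sketch of how a prompt in this format can be assembled and sent to the model; the checkpoint id and generation settings below are assumptions for illustration, not a prescribed inference recipe.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

def build_prompt(instruction: str, context: str = "") -> str:
    """Assemble a prompt in the format used during finetuning."""
    prompt = f"### INSTRUCTION:\n{instruction}\n\n"
    if context:
        prompt += f"{context}\n\n"
    prompt += "### RESPONSE:\n"
    return prompt

# Assumption: the finetuned weights load the same way as the base checkpoint.
model_id = "tiiuae/falcon-40b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = build_prompt(
    "Summarize the passage below in one sentence.",
    "Falcon-40B is a 40-billion-parameter causal decoder-only model released by TII.",
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```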

Loss Metrics:

Training loss: [training loss chart]


License: apache-2.0