Note

This model was trained in Flax/JAX using the OST library.

This model does not contain the original LLaMA weights, so it is fully open source.

Alternatively, use the Colab notebook for the easiest setup.

🚀 Colab 🤗

This model uses task classification, and conversations are formatted as turns between <|prompter|> and the answer, marked by <|ai|>:

Using the Model in Hugging Face Transformers

Examples 🚀

def prompt_to_instruction_lgem(text: str):
    return f"<|prompter|> {text} </s><|ai|>:"
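As a quick check, the helper above wraps a user message in the model's special tokens. A minimal, self-contained example (the question text is illustrative only):

```python
def prompt_to_instruction_lgem(text: str):
    return f"<|prompter|> {text} </s><|ai|>:"

# Wrap a user question in the conversation template
prompt = prompt_to_instruction_lgem("What is machine learning?")
print(prompt)  # <|prompter|> What is machine learning? </s><|ai|>:
```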

A generate method that streams the response token by token:


import torch
from IPython.display import clear_output

def generate(model_, input_ids_, tokenizer_, max_length: int = 256,
             temperature: float = 1.0, eos_token_id: int = 2):
  with torch.no_grad():
    prompt_length = input_ids_.shape[1]
    for _ in range(max_length):
      out = model_(
          input_ids=input_ids_,
          return_dict=True,
      )
      # Sample the next token from the temperature-scaled distribution
      probs = torch.nn.functional.softmax(out.logits[:, -1, :] / temperature, dim=-1)
      next_token = torch.multinomial(probs, num_samples=1)
      input_ids_ = torch.cat([input_ids_, next_token], dim=-1)
      # Re-print the response generated so far (everything after the prompt)
      clear_output(wait=True)
      print(f"\r{tokenizer_.decode(input_ids_[0, prompt_length:], skip_special_tokens=True)}", end='')
      if next_token[0].item() == eos_token_id:
        break
      yield tokenizer_.decode(next_token[0], skip_special_tokens=True)
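The core of the loop above is temperature sampling: logits are divided by the temperature before the softmax, so lower temperatures sharpen the distribution toward the most likely token. A minimal pure-Python sketch of that single step (no torch; the logit values are illustrative only):

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random.Random(0)):
    # Scale logits by temperature, then apply a numerically stable softmax
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Multinomial draw over the probabilities (what torch.multinomial does)
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i, probs
    return len(probs) - 1, probs

# Lower temperature concentrates probability on the top logit
token, probs = sample_with_temperature([2.0, 1.0, 0.1], temperature=0.5)
```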

Result

import socket

def check_internet_connection():
    try:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.connect(("www.google.com", 80))
        s.close()
        print("Internet connection is active.")
    except OSError:
        print("Internet connection is not active.")

if __name__ == "__main__":
    check_internet_connection()

Using the Model in OST

LGeM 🚀

from modules import LGeMForCausalLM

Or in Flax/JAX:

from modules import FlaxLGeMForCausalLM

Then launch training with:

python3 LGeM-train.py