
This model is a fine-tuned version of microsoft/phi-2, trained on the Alpaca instruction dataset (52k examples).

Training arguments (a sketch of the corresponding trainer setup follows the list):

num_train_epochs=1
per_device_train_batch_size=1
gradient_accumulation_steps=8
optim="paged_adamw_32bit"
logging_steps=25
learning_rate=2e-4
weight_decay=0.001
max_grad_norm=0.3
warmup_ratio=0.03
lr_scheduler_type="cosine"
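
The values above map directly onto Hugging Face `TrainingArguments`. Below is a minimal sketch of that configuration; the output directory is an illustrative assumption, and the card does not specify the trainer, dataset loading, or any LoRA/quantization setup, so those are omitted:

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed above (output_dir is assumed).
training_args = TrainingArguments(
    output_dir="./phi-2-alpaca",
    num_train_epochs=1,
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    optim="paged_adamw_32bit",   # paged 32-bit AdamW; requires bitsandbytes
    logging_steps=25,
    learning_rate=2e-4,
    weight_decay=0.001,
    max_grad_norm=0.3,
    warmup_ratio=0.03,
    lr_scheduler_type="cosine",
)
```

Note that the `paged_adamw_32bit` optimizer is provided through the bitsandbytes integration, so that package must be installed for this configuration to run.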

Prompt format:

Use the format below:

Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
{generated_output}
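
As a minimal sketch of how this prompt template can be used for inference with the Hugging Face transformers library (the example instruction, dtype/device settings, and generation parameters are illustrative assumptions, not part of the card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "KathirKs/phi-2_alpaca_52k"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"  # device_map="auto" needs accelerate
)

instruction = "Explain what a hash table is in one paragraph."  # hypothetical example
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    f"### Instruction:\n{instruction}\n\n### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens after the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```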
Model size: 2.78B parameters (FP16, safetensors).
