yunconglong/Mixtral_7Bx2_MoE_13B_DPO
Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · conversational · text-generation-inference
License: cc-by-nc-4.0
Mixtral MOE 2x7B
A Mixture-of-Experts (MoE) merge of the following models, built with mergekit and then fine-tuned with DPO:
mistralai/Mistral-7B-Instruct-v0.2
NurtureAI/neural-chat-7b-v3-16k
jondurbin/bagel-dpo-7b-v0.1
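The card does not publish the exact merge configuration. As an illustration only, a mergekit-moe merge of this kind is typically described by a YAML file along the following lines; the choice of base model, expert assignment, and the `positive_prompts` routing hints below are hypothetical, not the settings actually used for this model.

```yaml
# Hypothetical mergekit-moe config sketch (actual settings unpublished).
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden          # route tokens using hidden-state similarity to the prompts
dtype: bfloat16            # matches the BF16 tensor type reported for this model
experts:
  - source_model: NurtureAI/neural-chat-7b-v3-16k
    positive_prompts:
      - "example routing prompt for this expert"
  - source_model: jondurbin/bagel-dpo-7b-v0.1
    positive_prompts:
      - "example routing prompt for this expert"
```

With two 7B experts sharing attention layers, a merge of this shape yields roughly the 12.9B parameter count reported below; the DPO fine-tuning mentioned above would then be applied to the merged model as a separate step.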
Downloads last month: 2,594
Model size: 12.9B params (Safetensors)
Tensor type: BF16