
Support:
My upload speeds have been cooked and unstable lately.
Realistically I'd need to move to get a better provider.
If you'd like to and are able to, you can support that endeavor and others here (Ko-fi). I apologize for disrupting your experience.

GGUF-IQ-Imatrix quants for ChaoticNeutrals/Poppy_Porpoise-0.72-L3-8B.
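
If you prefer to fetch a quant programmatically instead of through the browser, the `huggingface_hub` client can download individual GGUF files. This is only a minimal sketch: the repo id and filename below are placeholders, not the actual names used in this repo, so substitute the quant level you want.

```python
from huggingface_hub import hf_hub_download

# Placeholder repo id and filename: replace them with this quant repo's real id
# and the GGUF quant file you want (e.g. an IQ or Q4_K_M imatrix variant).
model_path = hf_hub_download(
    repo_id="your-username/Poppy_Porpoise-0.72-L3-8B-GGUF-IQ-Imatrix",
    filename="Poppy_Porpoise-0.72-L3-8B-Q4_K_M-imat.gguf",
)
print("Saved to:", model_path)
```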

Original model information by the author:

"Poppy Porpoise" is a cutting-edge AI roleplay assistant based on the Llama 3 8B model, specializing in crafting unforgettable narrative experiences. With its advanced language capabilities, Poppy expertly immerses users in an interactive and engaging adventure, tailoring each adventure to their individual preferences.


Recommended ST Presets (updated for 0.72): Porpoise Presets

If you want to use vision functionality:

  • You must use the latest version of KoboldCpp.

To use the multimodal (vision) capabilities of this model, you need to load the specified mmproj file, which can be found inside this model repo: Llava MMProj.

  • You can load the mmproj through the corresponding section of the KoboldCpp interface, or programmatically as sketched below.

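For reference, here is a minimal sketch of loading the quantized model together with the LLaVA mmproj outside of KoboldCpp, using the llama-cpp-python bindings. The file names, context size, and prompt are placeholders rather than values taken from this repo; adapt them to the quant and mmproj files you actually downloaded.

```python
import base64

from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# Placeholder paths: substitute the quant and mmproj files downloaded from this repo.
MODEL_PATH = "Poppy_Porpoise-0.72-L3-8B-Q4_K_M-imat.gguf"
MMPROJ_PATH = "llava-mmproj-f16.gguf"


def image_to_data_uri(path: str) -> str:
    """Encode a local image as a base64 data URI so the chat handler can read it."""
    with open(path, "rb") as f:
        return "data:image/png;base64," + base64.b64encode(f.read()).decode()


# The mmproj (vision projector) is loaded through a LLaVA-style chat handler.
chat_handler = Llava15ChatHandler(clip_model_path=MMPROJ_PATH)

llm = Llama(
    model_path=MODEL_PATH,
    chat_handler=chat_handler,
    n_ctx=4096,  # context length; raise it if your hardware allows
)

response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": image_to_data_uri("scene.png")}},
                {"type": "text", "text": "Describe this scene for the start of a roleplay."},
            ],
        }
    ]
)
print(response["choices"][0]["message"]["content"])
```

The key point mirrored from the KoboldCpp instructions is that the text model and the mmproj are two separate GGUF files that must both be loaded for vision to work.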
