Why no open source llama/orca/platypus submissions?

#22
by lewington - opened

Hey guys, I notice that there are no open source llama/orca or other pure decoder-only submissions (OK, there's one, but it seems quite incomplete and the spelling makes me sus: https://huggingface.co/Shimin/LLaMA-embeeding).

Why is this?

Word on the street is that decoder-only models make bad embeddings in general, but has anyone tested this assumption with any rigor? I see that ada-002 is up there in the top 10, so they can't be awful.
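For context, here's a minimal sketch of how one could pull sentence embeddings out of a decoder-only model to test this themselves (mean-pooling the last hidden states over non-padding tokens). The checkpoint name is just a placeholder, not an actual leaderboard submission, and mean pooling is only one of several plausible strategies (last-token pooling is another common choice):

```python
# Sketch: sentence embeddings from a decoder-only (causal) LM via mean pooling.
# The checkpoint below is a placeholder; swap in any causal LM you want to test.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "huggyllama/llama-7b"  # placeholder checkpoint, not a recommendation
tokenizer = AutoTokenizer.from_pretrained(model_name)
if tokenizer.pad_token is None:
    # LLaMA-style tokenizers often ship without a pad token; reuse EOS for padding.
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModel.from_pretrained(model_name)
model.eval()

def embed(sentences):
    # Tokenize a batch of sentences with padding so they share one tensor.
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    hidden = out.last_hidden_state                # (batch, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1)  # (batch, seq_len, 1)
    # Mean-pool over real tokens only, ignoring padding positions.
    summed = (hidden * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1)
    return summed / counts

vectors = embed(["Decoder-only models as embedders?", "An unrelated sentence."])
print(vectors.shape)  # (2, hidden_dim)
```

Wrapping something like this in MTEB's model interface would be enough to get leaderboard-comparable numbers, so the lack of submissions seems more like nobody has bothered than a technical blocker.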

