
ONNX conversion of distiluse-base-multilingual-cased-v2

Conversion of sentence-transformers/distiluse-base-multilingual-cased-v2

This is a sentence-transformers ONNX model: it maps sentences and paragraphs to a 512-dimensional dense vector space and can be used for tasks like clustering or semantic search. This custom model outputs last_hidden_state, similarly to the original sentence-transformers implementation.

Usage (HuggingFace Optimum)

Using this model is straightforward once you have optimum installed:

python -m pip install optimum

You may also need the following:

python -m pip install onnxruntime
python -m pip install onnx
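
As an alternative, recent releases of optimum ship an onnxruntime extra that pulls in the ONNX Runtime dependencies in one step (assuming you are on a version that provides this extra):

python -m pip install "optimum[onnxruntime]"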

Then you can use the model like this:

from optimum.onnxruntime.modeling_ort import ORTModelForCustomTasks
from transformers import AutoTokenizer

model = ORTModelForCustomTasks.from_pretrained("lorenpe2/distiluse-base-multilingual-cased-v2")
tokenizer = AutoTokenizer.from_pretrained("lorenpe2/distiluse-base-multilingual-cased-v2")
inputs = tokenizer("I love burritos!", return_tensors="pt")
pred = model(**inputs)
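
Because the exported graph returns token-level hidden states rather than a ready-made sentence vector, the embedding still has to be pooled from pred. Below is a minimal sketch of attention-mask-aware mean pooling (the pooling mode listed under Full Model Architecture), assuming the model output exposes last_hidden_state as described above:

import torch

token_embeddings = pred["last_hidden_state"]            # (batch, seq_len, 768)
mask = inputs["attention_mask"].unsqueeze(-1).float()   # (batch, seq_len, 1)

# Average only over real (non-padding) tokens
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(sentence_embedding.shape)  # torch.Size([1, 768]) before the Dense projection to 512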

You can also leverage the pipeline API from transformers:

from transformers import pipeline

onnx_extractor = pipeline("feature-extraction", model=model, tokenizer=tokenizer)
text = "I love burritos!"
pred = onnx_extractor(text)
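
For a quick semantic-search sanity check, you can mean-pool the per-token features returned by the pipeline and compare two sentences with cosine similarity. This is an illustrative sketch, not part of the original card; the exact nesting of the pipeline output can vary between transformers versions:

import numpy as np

def embed(text):
    feats = np.asarray(onnx_extractor(text))
    feats = feats[0] if feats.ndim == 3 else feats   # (num_tokens, 768)
    return feats.mean(axis=0)                        # mean pooling over tokens

a = embed("I love burritos!")
b = embed("Burritos are my favourite food.")
cosine = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
print(cosine)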

Evaluation Results

For an automated evaluation of this model, see the Sentence Embeddings Benchmark: https://seb.sbert.net

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: DistilBertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
  (2): Dense({'in_features': 768, 'out_features': 512, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
)
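
The ONNX export stops at last_hidden_state, so modules (1) and (2) above still have to be applied to obtain the final 512-dimensional embedding. Below is a rough PyTorch sketch of that head; the Linear weights here are placeholders and would in practice be loaded from the original sentence-transformers model, which is an assumption about this export rather than something the card guarantees:

import torch
import torch.nn as nn

class SentenceHead(nn.Module):
    """Mean pooling followed by Dense(768 -> 512) with Tanh, mirroring modules (1) and (2)."""

    def __init__(self, in_features=768, out_features=512):
        super().__init__()
        # Placeholder weights; the real ones come from the original sentence-transformers model
        self.dense = nn.Linear(in_features, out_features, bias=True)
        self.activation = nn.Tanh()

    def forward(self, last_hidden_state, attention_mask):
        mask = attention_mask.unsqueeze(-1).float()
        # pooling_mode_mean_tokens: average over non-padding tokens
        pooled = (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
        return self.activation(self.dense(pooled))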

Citing & Authors

This model was trained by sentence-transformers.

If you find this model helpful, feel free to cite our publication Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks:

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "http://arxiv.org/abs/1908.10084",
}