Anything that relies on the Rust tokenizer implementation (through bindings to the Tokenizers library) always requires the tokenizer.json file.

Generated with:

```python
from transformers import AutoTokenizer

# AutoTokenizer prefers the fast (Rust-backed) tokenizer when one is available.
tokenizer = AutoTokenizer.from_pretrained("aware-ai/whisper-tiny-german")
assert tokenizer.is_fast

# Saving a fast tokenizer writes tokenizer.json alongside the other tokenizer files.
tokenizer.save_pretrained("...")
```
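
Once tokenizer.json is in the repo, the standalone Tokenizers bindings can load it directly from the Hub. A minimal sketch (the German example sentence is just an illustration):

```python
# Sketch: Tokenizer.from_pretrained fetches tokenizer.json from the Hub,
# so this only works once the file is present in the repository.
from tokenizers import Tokenizer

tok = Tokenizer.from_pretrained("aware-ai/whisper-tiny-german")
print(tok.encode("Guten Morgen").tokens)  # prints the subword tokens
```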
A\\Ware org

thanks for catching this

flozi00 changed pull request status to merged
