Where is the checkpoint for the tokenizer?

#1
by pootow - opened

When I tested the code from the model card homepage, I got an error at `tokenizer = OFATokenizer.from_pretrained(ckpt_dir)`. So where are the checkpoint files for the tokenizer?

The error message is:

Can't load tokenizer for 'G:/OFA-huge/'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'G:/OFA-huge/' is the correct path to a directory containing all relevant files for a OFATokenizer tokenizer.

By the way, I can load the model with no problem.

OK, I now know there are two files (vocab.json, merges.txt) missing, and I can use these two files from the ofa-large repo.
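For anyone hitting the same error: a quick way to diagnose it is to check which tokenizer files are actually present in the local checkpoint directory before calling `from_pretrained`. This is a minimal sketch; the required-file list assumes a GPT-2-style BPE tokenizer (which is what OFA's tokenizer uses: `vocab.json` plus `merges.txt`), and `missing_tokenizer_files` is a hypothetical helper name, not part of any library.

```python
from pathlib import Path
import tempfile

# Files a GPT-2-style BPE tokenizer (as used by OFA) expects.
# Assumption: additional files (e.g. tokenizer_config.json) may also
# be needed depending on the tokenizer version.
REQUIRED = ["vocab.json", "merges.txt"]

def missing_tokenizer_files(ckpt_dir):
    """Return the required tokenizer files absent from ckpt_dir."""
    ckpt = Path(ckpt_dir)
    return [name for name in REQUIRED if not (ckpt / name).exists()]

# Example: a directory containing only model weights, like the
# situation described above (model loads, tokenizer does not).
with tempfile.TemporaryDirectory() as d:
    Path(d, "pytorch_model.bin").touch()
    print(missing_tokenizer_files(d))  # → ['vocab.json', 'merges.txt']
```

If the list is non-empty, copying the named files into the checkpoint directory (e.g. from the ofa-large repo, as above) should make `OFATokenizer.from_pretrained(ckpt_dir)` work.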

pootow changed discussion status to closed

I just found the mistake! All the files are updated now!
