
😈 Imp

Introduction

To fit the MLC framework for mobile devices, we further apply 4-bit quantization (MLC's q4f16_1 scheme: 4-bit weights with float16 activations) to Imp-v1.5-3B-196, obtaining Imp-v1.5-3B-196-q4f16_1-MLC.

To use this model on mobile devices, please refer to the mlc-imp project.
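
For reference, the general-purpose MLC LLM Python engine can also load q4f16_1 MLC weights by model identifier. The snippet below is a minimal, text-only sketch that assumes the standard mlc_llm MLCEngine API and that these weights resolve from the Hugging Face Hub under the HF:// scheme; it is not the project's official usage path, and image inputs as well as on-device deployment are handled by mlc-imp.

from mlc_llm import MLCEngine

# Assumed model reference: quantized weights fetched from the Hugging Face Hub.
model = "HF://MILVLG/Imp-v1.5-3B-196-q4f16_1-MLC"

# Create the engine; MLC LLM compiles the model for the local device if needed.
engine = MLCEngine(model)

# Text-only chat request through the OpenAI-style API exposed by MLC LLM.
response = engine.chat.completions.create(
    messages=[{"role": "user", "content": "Describe what the Imp models are designed for."}],
    model=model,
    stream=False,
)
print(response.choices[0].message.content)

engine.terminate()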

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

Citation

If you use our model or refer to our work in your studies, please cite:

@article{imp2024,
  title={Imp: Highly Capable Large Multimodal Models for Mobile Devices},
  author={Shao, Zhenwei and Yu, Zhou and Yu, Jun and Ouyang, Xuecheng and Zheng, Lihao and Gai, Zhenbiao and Wang, Mingyang and Ding, Jiajun},
  journal={arXiv preprint arXiv:2405.12107},
  year={2024}
}
