Does chatglm3-6b support Flash Attention 2.0?
#51 opened by chentao111
Trying to load the model with Flash Attention 2.0 enabled raises:

ValueError: ChatGLMForConditionalGeneration does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co/model/chatglm3-6b/discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new
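A minimal sketch of a common workaround until support lands (the helper name and structure are my own, not from Transformers): try loading with `attn_implementation="flash_attention_2"` and fall back to the default attention when the model class raises this `ValueError`. With `transformers` installed, `loader` would be `AutoModel.from_pretrained` (ChatGLM also needs `trust_remote_code=True`).

```python
def load_with_fallback(loader, path, **kwargs):
    """Try Flash Attention 2 first; if the model class does not
    support it (ValueError, as with ChatGLMForConditionalGeneration),
    fall back to the default attention implementation."""
    try:
        return loader(path, attn_implementation="flash_attention_2", **kwargs)
    except ValueError:
        return loader(path, **kwargs)

# Hypothetical usage with transformers:
#   from transformers import AutoModel
#   model = load_with_fallback(AutoModel.from_pretrained,
#                              "THUDM/chatglm3-6b",
#                              trust_remote_code=True)
```

This keeps the script working on models that do support Flash Attention 2 while degrading gracefully on those that do not.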