Instructions for using shareAI/CodeLLaMA-chat-13b-Chinese with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use shareAI/CodeLLaMA-chat-13b-Chinese with Transformers:
```python
# Use a pipeline as a high-level helper.
# Note: this is a causal chat model, so the "text-generation" task is the
# appropriate pipeline (the extractive "question-answering" pipeline would not work).
from transformers import pipeline

pipe = pipeline("text-generation", model="shareAI/CodeLLaMA-chat-13b-Chinese")
```

```python
# Load the model directly. AutoModelForCausalLM loads the model with its
# language-modeling head, which is what generation requires.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("shareAI/CodeLLaMA-chat-13b-Chinese", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
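As a minimal sketch of driving the pipeline above: the helper below assembles a single-turn prompt. The Llama-2-style `[INST]`/`<<SYS>>` template is an assumption (CodeLLaMA derivatives commonly use it); verify the exact chat format against the model card before relying on it.

```python
# Hypothetical helper to format a single-turn chat prompt.
# The [INST] / <<SYS>> template is an assumption, not confirmed by the model card.
def build_prompt(user_message: str, system: str = "") -> str:
    sys_block = f"<<SYS>>\n{system}\n<</SYS>>\n\n" if system else ""
    return f"[INST] {sys_block}{user_message} [/INST]"

prompt = build_prompt("用Python写一个快速排序函数")
print(prompt)  # [INST] 用Python写一个快速排序函数 [/INST]

# Generation itself requires downloading the 13B weights, e.g.:
# from transformers import pipeline
# pipe = pipeline("text-generation", model="shareAI/CodeLLaMA-chat-13b-Chinese")
# print(pipe(prompt, max_new_tokens=256)[0]["generated_text"])
```

If the model ships a chat template in its tokenizer config, `tokenizer.apply_chat_template(...)` is the more robust way to build prompts than hand-formatting.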