Instructions for using effcot/Limo_llama with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- PEFT
How to use effcot/Limo_llama with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then apply the Limo_llama adapter on top of it
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "effcot/Limo_llama")
```

- Notebooks
- Google Colab
- Kaggle
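As a minimal sketch of what you might do after loading the adapter, the snippet below runs a single generation. The prompt and generation settings are illustrative assumptions, not part of the model card; the tokenizer is taken from the base model, which is the usual convention for PEFT adapters.

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the base model and apply the effcot/Limo_llama adapter (as above)
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "effcot/Limo_llama")

# The tokenizer comes from the base model; the prompt is an arbitrary example
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")
inputs = tokenizer("Explain chain-of-thought prompting in one sentence.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If you need a standalone checkpoint rather than a base-plus-adapter pair, PEFT's `model.merge_and_unload()` folds the adapter weights into the base model, which can then be saved with `save_pretrained`.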