Instructions to use nvidia/Alpamayo-R1-10B with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use nvidia/Alpamayo-R1-10B with Transformers:
```python
# Load model directly
from transformers import AlpamayoR1

model = AlpamayoR1.from_pretrained("nvidia/Alpamayo-R1-10B", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
Is GGUF possible? (#4, opened by 04RR)
Love this model! I wanted to know whether GGUF quants are possible, since this model's Cosmos-Reason + action-expert architecture isn't supported by llama.cpp. Would love to have the quants!