ConicCat/GLM-4.5-Architect-106B-A12B

Big Bottom Text

A finetune of GLM-4.5-Air to improve prose and writing quality and to remove the bulk of GLM-isms, using a Gutenberg-like methodology.

No particular attempt was made to preserve thinking ability; I recommend skipping thinking as in the GLM template, i.e. using `\n<think></think>` as a prefill.
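A minimal sketch of that prefill approach, assuming an OpenAI-compatible completions server (e.g. vLLM or llama.cpp) at a hypothetical local endpoint; the special tokens shown are illustrative of the GLM chat format, so check the model tokenizer's chat template for the exact strings:

```python
# Sketch: skip thinking by prefilling an empty <think></think> block
# in the assistant turn. BASE_URL and the template tokens below are
# assumptions, not part of the model card.
import requests

BASE_URL = "http://localhost:8000/v1"  # hypothetical local server


def build_prefilled_prompt(user_message: str) -> str:
    # Simplified GLM-style chat layout: the assistant turn ends with an
    # empty think block so the model proceeds straight to the answer.
    return (
        f"<|user|>\n{user_message}"
        "<|assistant|>\n<think></think>"
    )


def complete(user_message: str) -> str:
    # Raw /completions request so the prefill is kept verbatim.
    resp = requests.post(
        f"{BASE_URL}/completions",
        json={
            "model": "ConicCat/GLM-4.5-Architect-106B-A12B",
            "prompt": build_prefilled_prompt(user_message),
            "max_tokens": 512,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["text"]
```

If you use the `transformers` tokenizer instead, applying the chat template and appending `\n<think></think>` before generation achieves the same effect.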

Trained for 8 epochs on a variety of back-translated, critically acclaimed short-story anthologies.

Model size: 110B params (Safetensors)
Tensor types: BF16 · F32
