Bleta 🐝
Collection
The Bleta Collection represents the continuous evolution of Albania's digital services, offering innovative solutions tailored to the needs of citizens.
Bleta is a fine-tuned Gemma-4 2B model specialized for the Albanian language (Shqip), built to understand and generate natural, grammatically correct Albanian text.
Bleta (🐝) means "bee" in Albanian — a symbol of diligence and precision.
| Property | Value |
|---|---|
| Base Model | google/gemma-4-2b-it |
| Architecture | Gemma4ForConditionalGeneration |
| Parameters | ~2 Billion |
| Fine-tuning Method | LoRA → merged into full weights |
| Language | Albanian (sq), English (en) |
| License | Apache 2.0 |
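The "LoRA → merged into full weights" row means the low-rank adapter update was folded back into the base weights, so no adapter files are needed at inference time. A minimal sketch of that merge in pure Python (toy matrices and hypothetical rank/alpha values for illustration; real merges are done with peft's `merge_and_unload()`):

```python
def matmul(A, B):
    """Plain-Python matrix multiply for the toy example."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def merge_lora(W, A, B, alpha, r):
    """Fold the LoRA update into the frozen weight: W' = W + (alpha / r) * B @ A."""
    scale = alpha / r
    delta = matmul(B, A)
    return [[w + scale * d for w, d in zip(w_row, d_row)]
            for w_row, d_row in zip(W, delta)]

# Toy example: 2x2 weight matrix, rank-1 adapter
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen base weight (d x d)
B = [[1.0], [2.0]]             # LoRA "down" matrix (d x r)
A = [[0.5, 0.5]]               # LoRA "up" matrix (r x d)

merged = merge_lora(W, A, B, alpha=2, r=1)
print(merged)  # → [[2.0, 1.0], [2.0, 3.0]]
```

After the merge, the adapter matrices can be discarded; the merged weight behaves identically to base-plus-adapter at inference.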
Fine-tuned on klei1/bleta-sq-dataset-v1 — a curated Albanian language instruction dataset covering conversation, grammar, reasoning, and general knowledge.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "klei1/bleta-sq-2b"

# Load the tokenizer and model (bfloat16 keeps memory use low on recent GPUs)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# "Cili është kryeqyteti i Shqipërisë?" = "What is the capital of Albania?"
messages = [
    {"role": "user", "content": "Cili është kryeqyteti i Shqipërisë?"}
]

# Format the conversation with the chat template and move the ids to the model's device
inputs = tokenizer.apply_chat_template(
    messages,
    return_tensors="pt",
    add_generation_prompt=True,
).to(model.device)

outputs = model.generate(
    inputs,
    max_new_tokens=512,
    temperature=0.7,
    top_p=0.95,
    repetition_penalty=1.2,
    do_sample=True,
)

# Strip the prompt tokens and decode only the newly generated text
response = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
print(response)
```
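Under the hood, `apply_chat_template` wraps each turn in Gemma-family turn markers before tokenizing. The sketch below approximates that formatting in plain Python so the structure is visible; it is an illustration of the template's shape, not the tokenizer's exact output:

```python
def gemma_chat_prompt(messages):
    """Approximate the Gemma-family chat template: each turn is wrapped in
    <start_of_turn>...<end_of_turn>, and a trailing open "model" turn acts
    as the generation prompt."""
    parts = ["<bos>"]
    for m in messages:
        # The template uses the role name "model" for assistant turns
        role = "model" if m["role"] == "assistant" else m["role"]
        parts.append(f"<start_of_turn>{role}\n{m['content']}<end_of_turn>\n")
    parts.append("<start_of_turn>model\n")  # added by add_generation_prompt=True
    return "".join(parts)

prompt = gemma_chat_prompt([{"role": "user", "content": "Përshëndetje!"}])
print(prompt)
```

Letting the tokenizer build this string for you avoids subtle mismatches with the markers the model was fine-tuned on.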
| Parameter | Value | Notes |
|---|---|---|
| temperature | 0.7 | Balanced creativity |
| max_new_tokens | 400–512 | Prevents loops |
| repetition_penalty | 1.2 | Reduces repetition |
| top_p | 0.95 | Nucleus sampling |
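To make the two filtering parameters concrete, here is a small pure-Python sketch of what they do to a toy token distribution (illustrative values only; the real implementations live inside `model.generate`):

```python
def top_p_filter(probs, top_p=0.95):
    """Nucleus sampling: keep the smallest set of tokens whose cumulative
    probability reaches top_p, then renormalize."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    total = sum(probs[i] for i in kept)
    return {i: probs[i] / total for i in kept}

def apply_repetition_penalty(logits, generated_ids, penalty=1.2):
    """HF-style penalty: shrink positive logits and push negative logits
    further down for tokens that were already generated."""
    out = list(logits)
    for i in set(generated_ids):
        out[i] = out[i] / penalty if out[i] > 0 else out[i] * penalty
    return out

# Toy distribution over 5 token ids: the two least likely tokens fall
# outside the 0.95 nucleus and can never be sampled.
probs = [0.50, 0.30, 0.15, 0.04, 0.01]
filtered = top_p_filter(probs, top_p=0.95)
print(sorted(filtered))  # → [0, 1, 2]

# Tokens 0 and 1 were already generated, so their logits are dampened.
print(apply_repetition_penalty([3.0, -1.0, 0.5], [0, 1], penalty=1.5))  # → [2.0, -1.5, 0.5]
```

Raising `repetition_penalty` too far can degrade fluency, which is why a moderate 1.2 is recommended here.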
| Model | Params | Focus |
|---|---|---|
| bleta-sq-2b | 2B | Albanian general |
| bleta-meditor-27b | 27B | Medical + specialized |
| bleta-logjike-27b | 27B | Logic + reasoning |
| bleta-1B | 1B | Lightweight |
```bibtex
@misc{bleta_sq_2b_2026,
  title  = {Bleta SQ 2B: Gemma-4 Fine-tuned for Albanian Language},
  author = {klei1},
  year   = {2026},
  url    = {https://huggingface.co/klei1/bleta-sq-2b}
}
```
This model is released under the Apache 2.0 License.