arxiv:2508.02271
Jacob Nielsen
JacobBITLABS
Recent Activity
- authored a paper (about 1 month ago): Continual Quantization-Aware Pre-Training: When to transition from 16-bit to 1.58-bit pre-training for BitNet language models?
- authored a paper (about 1 month ago): DeToNATION: Decoupled Torch Network-Aware Training on Interlinked Online Nodes
- updated a dataset (3 months ago): JacobBITLABS/Polyp-Box