FineWeb: decanting the web for the finest text data at scale. Explore and download the FineWeb web-text dataset. • 1.34k
The Smol Training Playbook: the secrets to building world-class LLMs. • 3.17k
LFM2.5 Collection: post-trained and base LFM2.5 models • 30 items • Updated Apr 8 • 137
Trinity Collection: Arcee AI models in the Trinity family • 14 items • Updated Mar 25 • 30
DINOv3 Collection: foundation models producing excellent dense features, outperforming SotA without fine-tuning (https://arxiv.org/abs/2508.10104) • 15 items • Updated Mar 10 • 634
Kimi-K2 Collection: Moonshot's MoE LLMs with 1 trillion parameters, exceptional at agentic intelligence • 5 items • Updated Jan 27 • 173