A fully open-source large language model suite offering reproducible training, open weights, and instruction-tuned variants under the Apache 2.0 license.
Moxin-LLM is an open-source LLM family developed by the Moxin team, designed to prioritize full transparency, reproducibility, and broad accessibility. Trained on public datasets and released under the Apache 2.0 license, Moxin includes multiple model types: Base, Instruct, Reasoning, and VLM (Vision-Language Model), all with open weights, code, data, and logs.

Positioned as a transparent and reproducible alternative to closed models like GPT-4, Claude, and Gemini, Moxin-LLM also rivals open-source leaders like LLaMA 3.1, Mistral, and Qwen. It's one of the few models meeting all criteria of the Model Openness Framework (MOF), ensuring verifiable and open training pipelines.
Key features include:
Use cases include: