An open-source reproduction of Meta AI’s LLaMA models, offering permissively licensed weights in 3B, 7B, and 13B parameter sizes compatible with both PyTorch and JAX.
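For illustration, a minimal sketch of loading such PyTorch-compatible weights through the Hugging Face transformers library; it assumes the weights are published in Hugging Face format, and the repository ID below is a placeholder, not a name taken from this entry.

```python
# Minimal sketch: load permissively licensed LLaMA-style weights in PyTorch.
# Assumes Hugging Face-format weights; the repo ID is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-org/open-llama-reproduction-3b"  # placeholder repository ID

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float16)

prompt = "Open-source language models are"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```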
A fully open-source LLM framework from the Allen Institute for AI, providing model weights, training code, data, and evaluation tools in a transparent package.
An open-source low-code platform for rapidly building custom AI-powered apps and internal tools, with full code-level control.
A fully managed vector database service powered by Milvus, designed for enterprise-grade AI applications at any scale.
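As a rough illustration of the underlying Milvus workflow, a minimal sketch using the open-source pymilvus client; the endpoint, collection name, and toy vectors are assumptions for illustration only, not details from this entry.

```python
# Minimal sketch of a Milvus similarity-search workflow via pymilvus.
# Endpoint, collection name, and vectors are illustrative assumptions.
from pymilvus import MilvusClient

client = MilvusClient(uri="http://localhost:19530")  # local or managed endpoint

# Create a collection with an auto-generated schema ("id" key + "vector" field).
client.create_collection(collection_name="docs", dimension=4)

# Insert a few toy embeddings alongside their source text.
client.insert(
    collection_name="docs",
    data=[
        {"id": 1, "vector": [0.1, 0.2, 0.3, 0.4], "text": "first document"},
        {"id": 2, "vector": [0.4, 0.3, 0.2, 0.1], "text": "second document"},
    ],
)

# Run a similarity search for a query embedding; each hit carries an id,
# a distance score, and any requested output fields.
hits = client.search(
    collection_name="docs",
    data=[[0.1, 0.2, 0.3, 0.4]],
    limit=1,
    output_fields=["text"],
)
print(hits[0])
```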
A lightweight, self-hostable chat application and framework built with Rust, offering private in-app messaging and AI bot integration.
An open-source TypeScript backend framework for building robust, type-safe distributed systems with declarative infrastructure-as-code.
A fully open-source, 1B-parameter Chinese-centric language model trained from scratch, with complete access to code, data, checkpoints, and logs under the Apache 2.0 license.
A compact, open-source LLaMA-based language model (~1.1B parameters) pretrained on trillions of tokens under the Apache 2.0 license.
A fully open-source large language model suite offering reproducible training, open weights, and instruction-tuned variants under the Apache 2.0 license.
An authorization-as-a-service platform for apps, RAG pipelines, and agents, letting teams define authorization logic centrally and enforce access across their applications.
An open-source vector database and search engine for building AI applications, offering scalable similarity search with a convenient API.