An open-source durable execution platform that makes building reliable applications easier by handling failures automatically.
An autonomous agent platform for data and AI cost optimization on Snowflake, Databricks, EMR & Redshift.
Hyperscale object storage built for AI workloads, offering high performance, scalability, security, and S3 compatibility.
An open-source family of large language models developed by Alibaba Cloud, featuring a range of model sizes, with versions released under permissive licenses such as Apache 2.0.
An open-source reproduction of Meta AI’s LLaMA models, offering permissively licensed weights in 3B, 7B, and 13B parameter sizes compatible with both PyTorch and JAX.
A fully open-source LLM framework from the Allen Institute for AI, providing model weights, training code, data, and evaluation tools in a transparent package.
A fully-managed vector database service powered by Milvus, designed for enterprise-grade AI applications at any scale.
A fully open-source, 1B-parameter Chinese-centric language model trained from scratch, with complete access to code, data, checkpoints, and logs under the Apache 2.0 license.
A compact, open-source LLaMA-based language model (~1.1B parameters) pretrained on trillions of tokens under the Apache 2.0 license.
A fully open-source large language model suite offering reproducible training, open weights, and instruction-tuned variants under the Apache 2.0 license.
An open-source vector database and search engine for building AI applications, offering scalable similarity search with a convenient API.
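The similarity search at the heart of vector databases like the ones above can be illustrated in a few lines. This is a generic sketch of cosine-similarity nearest-neighbor lookup over a tiny in-memory index, not the API of any specific product; the vector values and document ids are made up for illustration.

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest(query, vectors, k=2):
    """Return the ids of the k vectors most similar to the query."""
    ranked = sorted(vectors.items(),
                    key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [vid for vid, _ in ranked[:k]]

# Tiny in-memory "index" of embedding vectors (illustrative only).
index = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}

print(nearest([1.0, 0.05, 0.0], index, k=2))
```

Production vector databases replace the brute-force scan above with approximate nearest-neighbor indexes (e.g. HNSW graphs) so that search stays fast as the collection grows to millions of vectors.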