Hyperscale object storage for AI workloads, offering high performance, scalability, security, and S3 compatibility.
An open-source family of large language models developed by Alibaba Cloud, available in a range of model sizes and released under permissive licenses such as Apache 2.0.
An open-source reproduction of Meta AI’s LLaMA models, offering permissively licensed weights at 3B, 7B, and 13B parameters, compatible with both PyTorch and JAX.
It offers best-in-class embeddings, rerankers, a web crawler and scraper, deep search, and small language models for multilingual and multimodal data search.
It helps teams turn technology into products faster, specializing in robotics, IIoT solutions, and open-source VPN infrastructure.
An open-source feature flag and remote configuration service for rolling out features, targeting user segments, and optimizing releases securely.
A fully open-source LLM framework from the Allen Institute for AI, providing model weights, training code, data, and evaluation tools in a transparent package.
It is a distributed, open-source graph database designed for scalability, capable of handling large graphs across multi-machine clusters.
An open-source Firebase alternative with GraphQL, providing a fully managed, extensible backend platform for building scalable apps.
A data orchestration platform for building, scheduling, and monitoring reliable AI and data pipelines.
An open-source low-code platform for building custom AI-powered apps and internal tools 10X faster, with full code-level control.