Jan is an open source local AI chat app that runs large language models entirely on your own hardware, providing a ChatGPT-style interface without an internet connection or cloud dependency.
ChatGPT and similar services require a constant connection to external servers; every message you type is processed remotely. For users who work in air-gapped environments, handle sensitive professional data, or simply want AI without a monthly subscription, the major providers offer no native offline option.
Jan runs models locally using llama.cpp under the hood. Open the app, download a model from the built-in Hub (Llama 3, Mistral, Phi, Gemma, and dozens more), and start chatting, fully offline. The same interface supports remote API connections to OpenAI, Anthropic, or any OpenAI-compatible endpoint if you want to switch between local and cloud inference.
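Because Jan speaks the OpenAI chat-completions wire format, switching between local and cloud inference is mostly a matter of changing the endpoint URL. As a sketch of what such a request looks like, here is a minimal example using only the Python standard library; the port `1337` and the model name are assumptions that vary by setup, so check your own Jan configuration before running it against a live server.

```python
import json
import urllib.request

# Assumed endpoint for a local OpenAI-compatible server; the port
# and model identifier below are examples, not guaranteed defaults.
JAN_URL = "http://localhost:1337/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "llama3-8b-instruct"):
    """Build an OpenAI-style chat-completion request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    req = urllib.request.Request(
        JAN_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return req, payload

if __name__ == "__main__":
    req, payload = build_chat_request("Summarize this file in one sentence.")
    # urllib.request.urlopen(req) would send the request when a local
    # server is running; here we only print the request body.
    print(json.dumps(payload, indent=2))
```

Pointing the same code at a cloud provider's OpenAI-compatible URL (plus an `Authorization` header) is all it takes to switch backends, which is the portability the paragraph above describes.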
Download the installer for macOS, Windows, or Linux from the GitHub releases page. Open the app and download a model from the built-in Hub to start chatting offline immediately. GPU acceleration (CUDA, Metal, Vulkan) is auto-detected; CPU-only mode works on any machine.
AGPL-3.0. Free to download and use. If you run a modified version as a network service, the AGPL requires you to make your modified source available to that service's users.
Jan is best for developers and privacy-conscious individuals who want fully offline AI chat with complete model control, teams in regulated industries running air-gapped environments, and anyone who wants to run local LLMs without Ollama's command-line workflow.
Unlike ChatGPT, Jan runs entirely on your device with no internet requirement: your conversations and documents never leave your machine, and you can switch between dozens of open models without changing providers or paying per message.