An open-source Swift package for on-device LLM inference on Apple platforms, built on llama.cpp. It exposes async APIs and broad customization options; alternative approaches include MLX, ggml, and Core ML–based models.
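To illustrate the kind of usage such a package enables, here is a minimal sketch of calling an async llama.cpp wrapper from Swift. The `LLM` type, its initializer parameters, and `respond(to:)` are assumptions for illustration only, not the actual API of the package described above.

```swift
import Foundation
// Hypothetical module name; the package is not named in this entry,
// so the import and the types below are illustrative only.
import LLM

@main
struct Demo {
    static func main() async throws {
        // Load a quantized GGUF model bundled with the app.
        guard let modelURL = Bundle.main.url(forResource: "model", withExtension: "gguf") else {
            fatalError("Add a GGUF model to the app bundle first.")
        }

        // Initialize the on-device model (inference runs locally via llama.cpp).
        // `contextLength` is an assumed parameter name.
        let llm = try await LLM(modelURL: modelURL, contextLength: 2048)

        // Async API: await the full completion (a streaming variant is also common).
        let answer = await llm.respond(to: "Summarize Swift concurrency in one sentence.")
        print(answer)
    }
}
```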
An open-source, user-friendly UI for connecting multiple AI models into workflows and integrating those workflows into your projects.