N8N2MCP is an open source bridge that converts n8n workflows into Model Context Protocol (MCP) servers, giving AI assistants like Claude and Cursor direct access to your n8n automations as callable tools without writing custom integration code.
The Problem
AI assistants like Claude and Cursor can take actions through tools, but connecting them to your existing n8n automation workflows requires writing custom API wrappers for each assistant. Every new AI client means more integration work, and the connections are not standardized or discoverable.
How N8N2MCP Solves It
N8N2MCP wraps your n8n workflows in the Model Context Protocol standard, which AI clients can discover and call automatically. Point N8N2MCP at your n8n instance, and any workflow becomes an MCP tool the AI can invoke. No custom connector per AI client; one configuration exposes all your automations to any MCP-compatible assistant. The project is released under the MIT license.
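To make "one configuration" concrete: MCP clients such as Claude Desktop register servers in a single JSON config file under an `mcpServers` key. A hypothetical entry for an N8N2MCP bridge might look like the sketch below; the server name, command, and flags are illustrative assumptions, not the project's documented CLI, and the URL is n8n's default local address.

```json
{
  "mcpServers": {
    "n8n-workflows": {
      "command": "n8n2mcp",
      "args": ["--n8n-url", "http://localhost:5678"]
    }
  }
}
```

Once an entry like this is in place, every workflow the bridge exposes appears to the client as a callable tool, with no further per-workflow setup.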
Key Features
- Converts any n8n workflow into an MCP server endpoint that AI assistants can discover and invoke
- Works with Claude Desktop, Cursor, and any MCP-compatible AI client without per-client integration code
- Uses n8n's existing visual editor: no additional scripting to expose workflows as tools
- Runs locally alongside n8n with no additional cloud dependency
- Opens n8n's 400+ integrations to any MCP-compatible AI assistant
Who It's For
N8N2MCP is best for developers and automation engineers who already run self-hosted n8n and want to give AI assistants programmatic access to their existing workflows without building custom API wrappers for each AI client.
Compared to n8n Webhook Triggers
Unlike n8n's built-in webhook triggers, which expose workflows as HTTP endpoints requiring per-client integration code, N8N2MCP uses the MCP standard so any compatible AI assistant discovers and calls your workflows automatically. One configuration replaces multiple custom connectors.
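The difference is easiest to see at the wire level. MCP is built on JSON-RPC 2.0, and every MCP client uses the same two methods: `tools/list` to discover what a server offers and `tools/call` to invoke a tool by name. The sketch below builds both messages in Python; the tool name `sync_crm_contacts` stands in for a hypothetical n8n workflow.

```python
import json

# MCP standardizes tool access over JSON-RPC 2.0. A client discovers tools
# with "tools/list" and invokes one with "tools/call" -- the same two
# messages work against every MCP server, which is why no per-client
# wrapper code is needed. "sync_crm_contacts" is a hypothetical workflow.

def mcp_request(method: str, params: dict, req_id: int) -> str:
    """Build a JSON-RPC 2.0 request as used by the MCP wire format."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# Step 1: discovery -- identical for every MCP server.
list_req = mcp_request("tools/list", {}, req_id=1)

# Step 2: invocation -- the client calls a discovered tool by name.
call_req = mcp_request(
    "tools/call",
    {"name": "sync_crm_contacts", "arguments": {"source": "hubspot"}},
    req_id=2,
)

print(list_req)
print(call_req)
```

With a raw webhook, by contrast, each client must be told the endpoint URL, HTTP method, auth scheme, and payload shape for every workflow; MCP moves all of that behind the standard discovery call.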

