Documentation
Welcome to the official nAIn documentation. nAIn is a cloud-native, agentic workflow orchestrator designed to be the "Next AI Node" for the autonomous era.
💡 nAIn is built to run on Cloudflare Workers, providing low-latency execution and global distribution for your AI agents.
Architecture
nAIn is composed of four primary layers that work in harmony to provide a seamless agentic experience:
- Control Plane: High-performance Worker API for orchestration.
- Dynamic Registry: Metadata-rich index of all authorized tools.
- Workflow Engine: Specialized runtime for structured LLM logic.
- Secure MCP Hub: Real-time bridge for connecting LLMs to your infrastructure.
Getting Started
Follow these steps to populate your registry and secure your first agent.
1. Managing Endpoints
Endpoints are the tools your agents can use. You can add them manually or via OpenAPI Import.
// Manual Endpoint Definition
{
  "endpoint_id": "send_alert",
  "method": "POST",
  "path": "/v1/alerts",
  "requires_auth": true
}
2. Secure Credentials
nAIn uses Encrypted Vaults (AES-GCM). Once saved, raw keys are never shown in the UI again. They are only injected during secure worker execution.
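As a rough sketch, a newly created credential might be described like this; the field names (credential_id, type, value) are illustrative assumptions, not a confirmed nAIn schema:
// Hypothetical credential entry (illustrative field names, not a confirmed schema)
{
  "credential_id": "alerts_api_key",
  "type": "api_key",
  "value": "YOUR_RAW_KEY"
}
// After saving, the value is AES-GCM encrypted and never displayed again;
// it is only injected at secure worker execution time.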
3. Creating an Agent
An Agent provides a secure token with Scoped Access. You must explicitly select which Endpoints and Credentials the agent can access.
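For illustration only, an agent's scope could pair specific endpoints with specific credentials. The field names below are hypothetical (nAIn's actual agent configuration lives in the UI and may differ), and the IDs reuse the examples above:
// Hypothetical scoped-agent definition (illustrative field names)
{
  "agent_name": "alerting-bot",
  "allowed_endpoints": ["send_alert"],
  "allowed_credentials": ["alerts_api_key"]
}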
Dynamic Tool Registry
nAIn provides Agents (Claude, GPT-4, etc.) with rich metadata and tool schemas. By leveraging the Agent's native reasoning for tool selection, nAIn eliminates internal embedding costs while improving tool-selection accuracy.
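For illustration, a registered endpoint can be surfaced to an agent as a tool descriptor along the lines of the MCP tools/list convention; the exact shape nAIn emits is not shown here, so treat the schema details as assumptions:
// Illustrative MCP-style tool descriptor (exact nAIn output may differ)
{
  "name": "send_alert",
  "description": "Send an alert via POST /v1/alerts.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "msg": { "type": "string", "description": "Alert message text" }
    },
    "required": ["msg"]
  }
}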
Workflow Engine
The nAIn engine supports JSON Mode Logic and structured DAG execution. Access previous node outputs using the {{$.nodes.NODE_ID.output}} syntax.
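A minimal sketch of two chained nodes, using the node shape shown under Workflow Syntax below. The top-level "nodes" wrapper and the get_user_profile endpoint are illustrative assumptions:
// Minimal two-node DAG sketch (the "nodes" wrapper and get_user_profile are illustrative)
{
  "nodes": [
    { "id": "get_user", "type": "http_request", "config": { "endpoint_id": "get_user_profile" } },
    {
      "id": "notify",
      "type": "http_request",
      "config": {
        "endpoint_id": "send_alert",
        "body": { "msg": "Hello {{$.nodes.get_user.output.name}}" }
      }
    }
  ]
}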
Nodes Reference
nAIn provides a rich set of nodes to build complex, resilient, and collaborative workflows.
| Node Type | Description | Key Config |
|---|---|---|
| http_request | Executes a registered API endpoint. | endpoint_id, body |
| tool_call | Calls a custom TS tool from the hub. | tool, input |
| branch | Conditional logic for dynamic paths. | condition, true_next |
| approval | (HITL) Pauses for manual review. | payload, notify: true |
| transform | Maps & renames data (JSONPath support). | mapping: { key: path } |
| filter | Keeps items that match a condition. | input_ref, condition |
| batch | Splits lists into manageable chunks. | input_ref, size |
| merge | Synchronizes parallel execution paths. | No config (waits for parents) |
| wait_event | Waits for an external Webhook/SSE event. | event_name |
| schedule | Delays execution until a specific time. | wait_seconds or at |
🔒 Note: The approval node triggers an automated Resend email for Pro accounts.
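For instance, a workflow can branch on a prior result and pause for human review. This is a sketch built from the keys in the table above; the condition expression format and the get_invoice node it references are assumptions:
// Sketch: branch into a human-in-the-loop approval step
// (condition format and the upstream get_invoice node are illustrative)
[
  {
    "id": "check_amount",
    "type": "branch",
    "config": {
      "condition": "{{$.nodes.get_invoice.output.total}} > 1000",
      "true_next": "manager_review"
    }
  },
  {
    "id": "manager_review",
    "type": "approval",
    "config": {
      "payload": { "invoice": "{{$.nodes.get_invoice.output.id}}" },
      "notify": true
    }
  }
]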
MCP Hub
Connect Claude Desktop, ChatGPT, or n8n directly to nAIn via the Model Context Protocol.
Authentication
- Bearer Token (Standard): Authorization: Bearer YOUR_TOKEN
- URL Parameter (Quick): ?token=YOUR_TOKEN
- Custom Header: X-nAIn-Token: YOUR_TOKEN
Pricing & Limits
| Resource | Free | Pro | Business |
|---|---|---|---|
| Endpoints | 20 | 250 | Unlimited |
| Credentials | 10 | 100 | Unlimited |
| Agents | 5 | 50 | Unlimited |
| Price | $0 | $19 | Custom |
Quick Integrations
🤖 Claude Desktop
{
  "mcpServers": {
    "nAIn": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/sdk",
        "launch-sse",
        "https://mcp-nain.automators.work/mcp/sse?token=YOUR_TOKEN"
      ]
    }
  }
}
⚡ n8n
Use an HTTP Request node with Authorization: Bearer YOUR_TOKEN.
Workflow Syntax
Pass data between nodes by referencing upstream outputs with the {{$.nodes.NODE_ID.output}} template syntax inside any config value:
{
  "id": "notify",
  "type": "http_request",
  "config": {
    "body": { "msg": "User {{$.nodes.get_user.output.name}} logged in" }
  }
}
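The transform node can reshape upstream output before passing it along. A sketch, assuming the mapping values are JSONPath expressions rooted at the workflow context (whether bare paths or the {{ }} template form are expected is an assumption):
// Sketch of a transform node (path root is illustrative)
{
  "id": "shape_user",
  "type": "transform",
  "config": {
    "mapping": {
      "display_name": "$.nodes.get_user.output.name",
      "contact": "$.nodes.get_user.output.email"
    }
  }
}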
Security & Trust
🔒 Zero-Knowledge: Credentials are encrypted using AES-GCM. We never see your raw keys.
🛡️ Scoped Isolation: Every token is sandboxed to the resources you authorize.