February 18, 2026
Written by Temps Team
Deployment dashboards are great until you're switching between six browser tabs to find a deployment log, check analytics, and scale your staging environment.
What if your AI agent could do all of that from a single conversation?
Today we're releasing the Temps MCP Server — a Model Context Protocol integration that gives AI assistants like Claude, Cursor, and Windsurf direct access to your entire self-hosted infrastructure.
The Model Context Protocol is an open standard that lets AI assistants call tools on external systems. Instead of copy-pasting URLs and reading JSON responses, your AI agent makes structured API calls and formats the results for you.
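Under the hood, MCP messages are JSON-RPC. A tool call might look roughly like the sketch below — the tool name and arguments are illustrative, not the exact Temps schema:

```typescript
// Sketch of an MCP JSON-RPC "tools/call" request.
// Tool name and arguments are illustrative; the real Temps schema may differ.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function makeToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>,
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

const request = makeToolCall(1, "get_page_detail", { page_path: "/pricing" });
console.log(JSON.stringify(request));
```

The AI agent builds this request from your natural-language question, and the server replies with structured results the agent can format for you.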
With the Temps MCP Server, your AI agent becomes a full infrastructure operator.
The MCP server covers every part of the Temps platform:
| Category | What you can do |
|---|---|
| Deployments | Trigger, roll back, pause, resume, cancel. Stream build logs with parsed pipeline stages. |
| Environments | Create staging/production environments, manage env vars, scale replicas, update resource limits. |
| Analytics | Query visitors, sessions, page views, bounce rates. Break down by country, browser, device, UTM parameters, referrer, and 15+ other properties. |
| Containers | List, start, stop, restart. Get resource metrics and runtime logs. |
| Domains & SSL | Add custom domains, verify DNS, renew SSL certificates, manage ACME challenges. |
| Services | Provision and manage PostgreSQL, Redis, S3, and MongoDB. |
| Backups | Configure schedules, manage S3 sources, trigger manual backups. |
| Monitoring | Create uptime monitors, check status, view response time history. |
| Error Tracking | List error groups, view stack traces, get error dashboards and stats. |
| Security | Run vulnerability scans, manage IP access rules, configure security policies. |
| And more | Webhooks, DNS providers, funnels, incidents, audit logs, notifications, load balancer routes, proxy logs, API keys, users. |
224 tools is a lot of context for an AI agent. That's why the server supports category filtering — load only the tools relevant to your workflow:
```json
{
  "mcpServers": {
    "temps": {
      "command": "npx",
      "args": ["@temps-sdk/mcp", "--tools", "deployments,analytics,projects"],
      "env": {
        "TEMPS_API_URL": "https://your-instance.example.com",
        "TEMPS_API_KEY": "tk_..."
      }
    }
  }
}
```
This loads ~33 tools instead of 224. Your AI agent responds faster, selects the right tool more reliably, and uses less context.
You can also filter via environment variable:

```shell
TEMPS_MCP_TOOLS=deployments,analytics npx @temps-sdk/mcp
```
Run `npx @temps-sdk/mcp --help` to see all 30 available categories.
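The filtering itself is simple to picture. Here is a minimal sketch of the idea, with made-up tool names and a simplified registry — the real server's internals will differ:

```typescript
// Minimal sketch of category-based tool filtering.
// Tool names and categories here are made up for illustration.
interface Tool {
  name: string;
  category: string;
}

const registry: Tool[] = [
  { name: "trigger_deployment", category: "deployments" },
  { name: "get_page_detail", category: "analytics" },
  { name: "renew_ssl_certificate", category: "domains" },
];

// Mirrors --tools deployments,analytics (or TEMPS_MCP_TOOLS=...).
function filterTools(tools: Tool[], spec: string): Tool[] {
  const wanted = new Set(spec.split(",").map((c) => c.trim()));
  return tools.filter((t) => wanted.has(t.category));
}

const loaded = filterTools(registry, "deployments,analytics");
console.log(loaded.map((t) => t.name)); // ["trigger_deployment", "get_page_detail"]
```

Everything outside the requested categories simply never gets registered, so the agent's tool list stays small.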
Here's what it looks like in practice.
You: What's the bounce rate on /pricing for the last 7 days?
Agent: [calls get_page_detail with page_path="/pricing"]
Page Detail: /pricing
| Metric | Value |
| --- | --- |
| Unique Visitors | 1,847 |
| Total Page Views | 2,312 |
| Avg Time on Page | 42.3s |
| Bounce Rate | 68.2% |
| Entry Rate | 31.5% |
| Exit Rate | 45.1% |
Top Referrers:
| Referrer | Visits | % |
| --- | --- | --- |
| google.com | 892 | 38.6% |
| twitter.com | 341 | 14.7% |
| (direct) | 289 | 12.5% |
You: Deploy my-app to production and stream the build logs
Agent: [calls trigger_deployment]
[calls get_deployment_logs]
Deployment #87 — Pipeline Stages
| # | Stage | Status | Duration |
| --- | --- | --- | --- |
| 1 | Clone | success | 3s |
| 2 | Build | success | 45s |
| 3 | Push Image | success | 12s |
| 4 | Deploy | success | 8s |
| 5 | Health Check | success | 5s |
All stages passed. Deployment is live.
You: Show me a breakdown of visitors by country for project 2
Agent: [calls get_property_breakdown with group_by="country"]
Country Breakdown — Project 2 (Total: 4,231)
| Country | Count | % |
| --- | --- | --- |
| United States | 1,456 | 34.4% |
| Germany | 523 | 12.4% |
| United Kingdom | 412 | 9.7% |
| France | 287 | 6.8% |
| Canada | 245 | 5.8% |
You: Scale the staging environment to 3 replicas
Agent: [calls scale_environment with replicas=3]
Environment scaled. Staging now running 3 replicas.
Every MCP client needs a Temps API key. Create one from your Temps dashboard under Settings > API Keys, or via CLI:

```shell
bunx @temps-sdk/cli api-keys create --name "mcp-agent" --role admin
```
Claude Desktop — add to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "temps": {
      "command": "npx",
      "args": ["@temps-sdk/mcp"],
      "env": {
        "TEMPS_API_URL": "https://your-temps-instance.com",
        "TEMPS_API_KEY": "tk_your_api_key"
      }
    }
  }
}
```
Claude Code — add to `.mcp.json` in your project root:

```json
{
  "mcpServers": {
    "temps": {
      "command": "npx",
      "args": ["@temps-sdk/mcp"],
      "env": {
        "TEMPS_API_URL": "https://your-temps-instance.com",
        "TEMPS_API_KEY": "tk_your_api_key"
      }
    }
  }
}
```
Cursor — add to `.cursor/mcp.json`:

```json
{
  "mcpServers": {
    "temps": {
      "command": "npx",
      "args": ["@temps-sdk/mcp", "--tools", "deployments,analytics"],
      "env": {
        "TEMPS_API_URL": "https://your-temps-instance.com",
        "TEMPS_API_KEY": "tk_your_api_key"
      }
    }
  }
}
```
Also works with Windsurf, Cline, OpenCode, OpenAI Codex, and any other MCP-compatible client.
No special syntax. Just ask your AI agent to manage your infrastructure in natural language.
The 14 analytics tools deserve special attention. You can query your traffic data the same way you'd ask a colleague:
- `get_unique_counts`
- `get_hourly_visits`
- `get_page_paths`
- `get_page_detail`
- `get_property_breakdown` with `group_by=referrer_hostname`
- `get_property_breakdown` with `group_by=browser`
- `get_property_breakdown` with `group_by=utm_campaign`
- `get_page_flow`
- `get_general_stats`
- `get_active_visitors`

21 properties are available for breakdowns: country, city, region, browser, device_type, operating_system, referrer_hostname, page_path, channel, language, utm_source, utm_medium, utm_campaign, utm_term, utm_content, and more.
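The percentage column the agent renders in breakdowns is just count over total. A quick sketch using the numbers from the country breakdown example earlier in this post:

```typescript
// Computing a breakdown's percentage column: count / total, one decimal.
// Counts and total are taken from the country breakdown example above.
const total = 4231;
const counts: Record<string, number> = {
  "United States": 1456,
  Germany: 523,
  "United Kingdom": 412,
};

function pct(count: number, total: number): string {
  return ((count / total) * 100).toFixed(1) + "%";
}

for (const [country, count] of Object.entries(counts)) {
  console.log(`${country}: ${count} (${pct(count, total)})`);
}
// United States: 1456 (34.4%)
// Germany: 523 (12.4%)
// United Kingdom: 412 (9.7%)
```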
This release also includes fixes that improve reliability:
- `scale_environment` and `update_environment_resources` now use the correct HTTP method (PUT instead of PATCH), and resource fields were corrected from string to integer types.
- `get_deployment_logs` now properly parses JSONL responses and renders a pipeline stage summary with formatted log output.
- The CLI package moved from `@temps/cli` to `@temps-sdk/cli`.

The MCP server is part of the Temps ecosystem and fully open source. Inspect the code, contribute tools, or fork it for your own platform.
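The JSONL parsing mentioned in the fixes boils down to reading one JSON object per line and grouping by stage. A rough sketch — the field names (`stage`, `status`) are assumptions for illustration, not the actual log schema:

```typescript
// Sketch of JSONL log parsing: one JSON object per line, grouped by stage.
// Field names (stage, status) are assumptions about the log schema.
interface LogLine {
  stage: string;
  status: string;
}

function parseJsonl(body: string): LogLine[] {
  return body
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as LogLine);
}

function stageSummary(lines: LogLine[]): Map<string, string> {
  const stages = new Map<string, string>();
  for (const line of lines) stages.set(line.stage, line.status); // last status wins
  return stages;
}

const body =
  '{"stage":"Clone","status":"success"}\n' +
  '{"stage":"Build","status":"running"}\n' +
  '{"stage":"Build","status":"success"}';
const summary = stageSummary(parseJsonl(body));
console.log(summary.get("Build")); // "success"
```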
The Temps MCP Server is available now. Install with `npx @temps-sdk/mcp` and start managing your infrastructure from any AI agent.