Deploy FastAPI to Production: The Easy Way with Temps
January 30, 2026
Written by Temps Team
Last updated January 30, 2026
Deploying a FastAPI application to production traditionally involves configuring Docker, setting up Nginx, managing SSL certificates, and integrating monitoring tools. With Temps, you skip all of that.
This guide shows you how to deploy FastAPI to production in minutes, with automatic HTTPS, built-in monitoring, and zero DevOps complexity.
What You'll Get
After this tutorial, your FastAPI app will have:
- Production deployment with automatic SSL
- Automatic Dockerfile generation (no Docker knowledge needed)
- Environment variable management
- Built-in error tracking (no Sentry setup)
- Request logging and analytics
- Custom domain support
Prerequisites
- A FastAPI application
- Git repository (GitHub, GitLab, or Bitbucket)
- Python 3.9+ project
Project Structure
Temps works with any FastAPI project structure. Here's a minimal example:
my-fastapi-app/
├── main.py # or app/main.py
├── requirements.txt # or pyproject.toml
└── .env # optional, for local development
main.py:
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"message": "Hello from FastAPI on Temps"}

@app.get("/health")
def health_check():
    return {"status": "healthy"}
requirements.txt:
fastapi>=0.109.0
uvicorn[standard]>=0.27.0
Deploy with Temps CLI
Step 1: Install Temps CLI
# macOS / Linux
curl -fsSL https://temps.sh/deploy.sh | bash
Step 2: Login and Create Project
bunx @temps-sdk/cli login
# Create a project and connect your repository
bunx @temps-sdk/cli projects create -n "My FastAPI App" -d "FastAPI application"
bunx @temps-sdk/cli projects git -p my-fastapi-app \
--owner your-org --repo your-fastapi-app --branch main --preset dockerfile -y
Temps automatically detects Python projects and configures optimal settings.
Step 3: Deploy
bunx @temps-sdk/cli deploy my-fastapi-app -b main -e production -y
Temps will:
- Detect your FastAPI application
- Generate an optimized Dockerfile
- Build your container
- Deploy to production
- Provision SSL certificate
Your app is live at your-app.temps.sh within seconds.
Automatic Dockerfile Generation
You don't need Docker knowledge. Temps generates an optimized Dockerfile based on your project:
Generated Dockerfile:
FROM python:3.11-slim
WORKDIR /app
# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy application
COPY . .
# Run with uvicorn
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
If you have a custom Dockerfile, Temps uses that instead.
Configuring Your FastAPI App
Entry Point Detection
Temps detects your FastAPI app through your Dockerfile. If you don't have one, Temps generates an optimized Dockerfile automatically.
Common entry point patterns:
- main.py → app
- app/main.py → app
- src/main.py → app
Custom Configuration
For advanced settings, provide your own Dockerfile:
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Custom entry point and port
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
Configure resource limits and replicas via the CLI:
bunx @temps-sdk/cli projects config -p my-fastapi-app --cpu-limit 1 --memory-limit 512 --replicas 2
Environment Variables
Adding Variables
# Single variable
bunx @temps-sdk/cli environments vars set -e production DATABASE_URL "postgresql://..."
bunx @temps-sdk/cli environments vars set -e production OPENAI_API_KEY "sk-..."
# From file
bunx @temps-sdk/cli environments vars import -e production -f .env.production
# List all
bunx @temps-sdk/cli environments vars list -e production
Accessing in FastAPI
from pydantic_settings import BaseSettings

class Settings(BaseSettings):
    database_url: str
    openai_api_key: str
    debug: bool = False

settings = Settings()
Environment variables are encrypted and injected at runtime.
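If you prefer to fail fast at startup without extra dependencies, the same idea can be sketched with the standard library alone. This is an illustration, not a Temps API; `require_env` and the default values are our own names:

```python
import os
from typing import Optional

def require_env(name: str, default: Optional[str] = None) -> str:
    """Return the environment variable, or raise at startup instead of at request time."""
    value = os.getenv(name, default)
    if value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# Fail at import time if configuration is incomplete (hypothetical defaults).
DATABASE_URL = require_env("DATABASE_URL", default="sqlite:///./dev.db")
DEBUG = os.getenv("DEBUG", "").lower() in {"1", "true", "yes"}
```

Raising at import time means a misconfigured deployment fails the health check immediately rather than erroring on the first request.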
Database Connections
PostgreSQL
Add your database URL as an environment variable:
bunx @temps-sdk/cli environments vars set -e production DATABASE_URL "postgresql://user:pass@host:5432/db"
Use with SQLAlchemy or asyncpg:
from sqlalchemy.ext.asyncio import create_async_engine
import os
DATABASE_URL = os.getenv("DATABASE_URL")
engine = create_async_engine(DATABASE_URL)
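A malformed DATABASE_URL is one of the most common deploy-time failures, and it can be caught before SQLAlchemy ever runs. A stdlib-only sanity check (the helper name and accepted schemes are our own choices):

```python
from urllib.parse import urlsplit

def check_database_url(url: str) -> dict:
    """Parse a postgresql:// URL and return its components for a quick sanity check."""
    parts = urlsplit(url)
    if parts.scheme not in {"postgresql", "postgresql+asyncpg"}:
        raise ValueError(f"Unexpected scheme: {parts.scheme!r}")
    if not parts.hostname:
        raise ValueError("Missing hostname in DATABASE_URL")
    return {
        "scheme": parts.scheme,
        "host": parts.hostname,
        "port": parts.port or 5432,  # Postgres default when no port is given
        "database": parts.path.lstrip("/"),
    }
```

Running this against your production URL locally is a cheap way to rule out quoting or copy-paste errors before a deploy.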
Redis
bunx @temps-sdk/cli environments vars set -e production REDIS_URL "redis://host:6379"
import redis
import os
r = redis.from_url(os.getenv("REDIS_URL"))
MongoDB
bunx @temps-sdk/cli environments vars set -e production MONGODB_URI "mongodb://user:pass@host:27017/db"
from motor.motor_asyncio import AsyncIOMotorClient
import os
client = AsyncIOMotorClient(os.getenv("MONGODB_URI"))
Built-in Monitoring
After deployment, your Temps dashboard includes:
Request Analytics
- Request count and latency
- Endpoint breakdown with timing
- Error rates by route
- Geographic distribution
Error Tracking
- Exception capture with stack traces
- Request context (headers, body, user)
- Error grouping and trends
No additional setup or subscriptions required.
API Documentation
FastAPI's automatic documentation works out of the box:
- Swagger UI: https://your-app.temps.sh/docs
- ReDoc: https://your-app.temps.sh/redoc
Both are accessible in production by default. To restrict access:
from fastapi import FastAPI
import os

app = FastAPI(
    docs_url="/docs" if os.getenv("ENABLE_DOCS") else None,
    redoc_url="/redoc" if os.getenv("ENABLE_DOCS") else None,
)
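One caveat: `os.getenv` returns the raw string, so `ENABLE_DOCS=false` is still truthy and would leave the docs enabled. A small helper avoids the surprise (`env_flag` is our own name, not a FastAPI or Temps API):

```python
import os

def env_flag(name: str, default: bool = False) -> bool:
    """Interpret common boolean spellings; unset or unrecognized values fall back to default."""
    value = os.getenv(name)
    if value is None:
        return default
    return value.strip().lower() in {"1", "true", "yes", "on"}
```

With this in place, `docs_url="/docs" if env_flag("ENABLE_DOCS") else None` behaves as the variable's spelling suggests.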
Production Best Practices
Health Checks
Always include a health endpoint:
@app.get("/health")
async def health():
    # Check database connection
    # Check external services
    return {"status": "healthy"}
Temps uses this for load balancing and auto-recovery.
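The comments above can be filled in by running dependency checks concurrently. A framework-agnostic sketch (the two check functions are placeholders; wire `health` to `@app.get("/health")` as shown earlier):

```python
import asyncio

async def check_database() -> bool:
    # Placeholder: replace with a real ping, e.g. `SELECT 1` against your pool.
    return True

async def check_cache() -> bool:
    # Placeholder: replace with e.g. `await redis.ping()`.
    return True

async def health() -> dict:
    """Run all checks concurrently and report overall status."""
    names = ["database", "cache"]
    results = await asyncio.gather(
        check_database(), check_cache(), return_exceptions=True
    )
    # A check passes only if it returned True; exceptions count as failures.
    checks = {n: (r is True) for n, r in zip(names, results)}
    status = "healthy" if all(checks.values()) else "degraded"
    return {"status": status, "checks": checks}
```

Returning per-check detail makes the load balancer's decision debuggable from the response body.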
CORS Configuration
from fastapi.middleware.cors import CORSMiddleware

app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://yourfrontend.com"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
Structured Logging
import logging
import json

class JSONFormatter(logging.Formatter):
    def format(self, record):
        return json.dumps({
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "message": record.getMessage(),
            "module": record.module,
        })

handler = logging.StreamHandler()
handler.setFormatter(JSONFormatter())
logging.getLogger().addHandler(handler)
Temps aggregates these logs in your dashboard.
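You can sanity-check the formatter's output locally by pointing it at an in-memory buffer instead of stderr (the class is repeated here so the snippet runs standalone):

```python
import io
import json
import logging

class JSONFormatter(logging.Formatter):
    def format(self, record):
        return json.dumps({
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "message": record.getMessage(),
            "module": record.module,
        })

# Route records into a StringIO buffer so we can inspect the JSON lines.
buffer = io.StringIO()
handler = logging.StreamHandler(buffer)
handler.setFormatter(JSONFormatter())
logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("order created")
entry = json.loads(buffer.getvalue().strip())
```

Each log line parses back as a JSON object, which is what a log aggregator expects.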
Scaling Your Application
Horizontal Scaling
Scale to multiple instances:
bunx @temps-sdk/cli environments scale -e production --replicas 3
Temps handles load balancing automatically.
Resource Allocation
Update resources as needed:
bunx @temps-sdk/cli projects config -p my-fastapi-app --cpu-limit 1 --memory-limit 1024
bunx @temps-sdk/cli deploy my-fastapi-app -b main -e production -y
Custom Domains
DNS Configuration
Add an A record pointing to your server:
| Type | Name | Value |
|---|---|---|
| A | api | YOUR_SERVER_IP |
Adding Your Domain
# Uses HTTP challenge by default
bunx @temps-sdk/cli domains add -d api.yourdomain.com
For wildcard domains, DNS challenge is required:
bunx @temps-sdk/cli domains add -d "*.yourdomain.com" --challenge=dns-01
With --challenge=dns-01, the CLI shows a TXT record to add. Once propagated, Temps completes ACME validation. For standard domains, the default http-01 challenge handles everything automatically.
SSL certificates are provisioned via Let's Encrypt.
Comparing Deployment Options
Temps vs Manual Docker + VPS
| Aspect | Temps | Docker + VPS |
|---|---|---|
| Setup time | 5 minutes | 1-2 hours |
| Docker knowledge | Not required | Required |
| SSL setup | Automatic | Manual (Let's Encrypt) |
| Nginx config | Not required | Required |
| Monitoring | Built-in | Set up Prometheus/Grafana |
| Updates | Automatic | Manual |
Temps vs Railway
| Aspect | Temps | Railway |
|---|---|---|
| Python detection | Automatic | Automatic |
| Free tier | Generous | Limited |
| Scaling | Automatic | Manual |
| Error tracking | Built-in | Not included |
| Database hosting | Available | Available |
Temps vs AWS Lambda
| Aspect | Temps | Lambda |
|---|---|---|
| Cold starts | None (containers) | 100-500ms |
| Timeout | Unlimited | 15 min max |
| Complexity | Low | High |
| Cost at scale | Predictable | Variable |
Common FastAPI Patterns
Background Tasks
from fastapi import BackgroundTasks

@app.post("/send-email")
async def send_email(
    email: str,
    background_tasks: BackgroundTasks,
):
    background_tasks.add_task(send_email_task, email)
    return {"message": "Email queued"}
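The `send_email_task` queued above is not defined in this guide; a stub showing its expected shape might look like this (swap the body for smtplib or your email provider's SDK):

```python
import logging

logger = logging.getLogger("emails")

def send_email_task(email: str) -> None:
    """Stub for the queued task; replace the body with a real email send."""
    # Runs after the response is returned, in FastAPI's worker pool.
    logger.info("Sending email to %s", email)
```

Note that BackgroundTasks run in the same process as the app; for heavy or retry-sensitive jobs, a dedicated queue such as Celery is the usual choice.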
Dependency Injection
from fastapi import Depends
from sqlalchemy.orm import Session

# SessionLocal and User come from your own database module.
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.get("/users")
def get_users(db: Session = Depends(get_db)):
    return db.query(User).all()
Middleware
import time

@app.middleware("http")
async def add_timing(request, call_next):
    start = time.time()
    response = await call_next(request)
    response.headers["X-Process-Time"] = str(time.time() - start)
    return response
Troubleshooting
Build Fails
Check build logs in the dashboard. Common issues:
- Missing dependencies in requirements.txt
- Python version mismatch
- Syntax errors in code
App Crashes on Start
- Check runtime logs: bunx @temps-sdk/cli logs
- Verify environment variables are set
- Test locally with the same configuration
Database Connection Errors
- Verify DATABASE_URL is correct
- Check that the database is accessible from the internet
- Whitelist Temps IPs if needed
Quick Reference
# Install CLI
curl -fsSL https://temps.sh/deploy.sh | bash
# Login
bunx @temps-sdk/cli login
# Create project and connect repo
bunx @temps-sdk/cli projects create -n "My API" -d "FastAPI application"
bunx @temps-sdk/cli projects git -p my-api --owner myorg --repo my-api --branch main --preset dockerfile -y
# Deploy
bunx @temps-sdk/cli deploy my-api -b main -e production -y
# View logs
bunx @temps-sdk/cli runtime-logs -p my-api -f
# Set environment variable
bunx @temps-sdk/cli environments vars set -e production KEY "value"
# Scale replicas
bunx @temps-sdk/cli environments scale -e production --replicas 3
# Add domain (HTTP challenge by default, use --challenge=dns-01 for wildcards)
bunx @temps-sdk/cli domains add -d api.example.com
Ready to deploy your FastAPI app? Get started at temps.sh or run:
curl -fsSL https://temps.sh/deploy.sh | bash && bunx @temps-sdk/cli deploy