Managed Services
Temps provides fully managed database and storage services that you can provision and link to your projects in seconds. Unlike platforms like Vercel that require external services, Temps includes everything you need.
Competitive Advantage: Where platforms like Vercel sell databases as separately billed add-ons (Vercel Postgres, Vercel KV), Temps includes managed PostgreSQL, Redis, and S3-compatible storage out of the box, with no additional accounts or billing required.
Available Services
Temps offers three types of managed services that integrate seamlessly with your projects:
- PostgreSQL — Production-ready relational database
- Redis — High-performance caching and pub/sub
- S3 Storage — Object storage compatible with MinIO and AWS S3
All services are provisioned on your infrastructure, giving you complete control over your data.
Quick provision via CLI
# Create a PostgreSQL database
temps service create postgres my-db \
--version 16 \
--storage 10GB
# Create a Redis instance
temps service create redis my-cache \
--version 7 \
--memory 512MB
# Create S3 storage
temps service create s3 my-storage \
--size 50GB
PostgreSQL Database
Creating a PostgreSQL Instance
Provision a production-ready PostgreSQL database with automatic backups and high availability:
Via Dashboard:
- Navigate to Services → Create Service
- Select PostgreSQL
- Choose version (13, 14, 15, or 16)
- Configure storage size and instance resources
- Click Create Database
Via CLI:
temps service create postgres production-db \
--version 16 \
--storage 20GB \
--memory 2GB \
--replicas 2
Configuration Options:
| Option | Description | Default |
|---|---|---|
| version | PostgreSQL version (13, 14, 15, or 16) | 16 |
| storage | Disk space allocation | 10GB |
| memory | RAM allocation | 1GB |
| replicas | Number of read replicas | 1 |
| backups | Automatic daily backups | Enabled |
Connecting to PostgreSQL
Once created, Temps generates a connection string you can use in your applications:
Node.js/TypeScript
import { Pool } from 'pg';
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  ssl: {
    rejectUnauthorized: true
  }
});
// Query example
const result = await pool.query('SELECT * FROM users WHERE id = $1', [userId]);
Python
import os
import psycopg2
conn = psycopg2.connect(
    os.environ['DATABASE_URL'],
    sslmode='require'
)
cursor = conn.cursor()
cursor.execute('SELECT * FROM users WHERE id = %s', (user_id,))
Connection String Format: postgresql://username:password@host:5432/database?sslmode=require
The connection string is automatically injected as DATABASE_URL when you link the service to a project.
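If a tool needs the individual connection fields rather than the URL, the format above can be unpacked with the standard library. A minimal sketch; the connection string shown is a made-up example, not a real credential:

```python
from urllib.parse import urlparse, parse_qs

def parse_database_url(url: str) -> dict:
    """Split a postgresql:// connection string into its parts."""
    parsed = urlparse(url)
    query = parse_qs(parsed.query)
    return {
        "user": parsed.username,
        "password": parsed.password,
        "host": parsed.hostname,
        "port": parsed.port or 5432,
        "database": parsed.path.lstrip("/"),
        "sslmode": query.get("sslmode", ["prefer"])[0],
    }

# Example with an illustrative connection string
parts = parse_database_url(
    "postgresql://app:s3cret@db.internal:5432/appdb?sslmode=require"
)
```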
Linking to Projects
Link your PostgreSQL database to one or more projects:
Link database to project
# Link database to project
temps service link production-db --project my-app
# Link to specific environment
temps service link production-db \
--project my-app \
--environment production
Database Migrations
Run migrations during deployment by configuring build hooks:
temps.json
{
"build": {
"command": "npm run build",
"beforeDeploy": [
"npm run db:migrate"
]
}
}
Redis Caching
Creating a Redis Instance
Provision a high-performance Redis instance for caching, sessions, or pub/sub:
Via Dashboard:
- Navigate to Services → Create Service
- Select Redis
- Choose version (6 or 7)
- Configure memory allocation
- Click Create Cache
Via CLI:
temps service create redis session-store \
--version 7 \
--memory 1GB \
--persistence enabled
Configuration Options:
| Option | Description | Default |
|---|---|---|
| version | Redis version (6 or 7) | 7 |
| memory | RAM allocation | 512MB |
| persistence | Enable RDB/AOF | Disabled |
| eviction | Eviction policy | allkeys-lru |
| maxclients | Max connections | 10000 |
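The default allkeys-lru policy evicts the least-recently-used key, regardless of TTL, once the memory limit is reached. Conceptually it behaves like this toy model (an illustrative sketch, not Redis internals):

```python
from collections import OrderedDict

class LRUCache:
    """Toy model of Redis's allkeys-lru: evict the least-recently-used key."""

    def __init__(self, max_keys: int):
        self.max_keys = max_keys
        self.data: OrderedDict[str, str] = OrderedDict()

    def set(self, key: str, value: str) -> None:
        if key in self.data:
            self.data.move_to_end(key)    # touching a key makes it "recent"
        self.data[key] = value
        if len(self.data) > self.max_keys:
            self.data.popitem(last=False)  # evict the oldest entry

    def get(self, key: str):
        if key not in self.data:
            return None
        self.data.move_to_end(key)         # reads also refresh recency
        return self.data[key]
```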
Connecting to Redis
Node.js with ioredis
import Redis from 'ioredis';
const redis = new Redis(process.env.REDIS_URL);
// Set a value
await redis.set('session:123', JSON.stringify(sessionData), 'EX', 3600);
// Get a value
const data = await redis.get('session:123');
Python with redis-py
import json
import os
import redis
r = redis.from_url(os.environ['REDIS_URL'])
# Set with expiration
r.setex('session:123', 3600, json.dumps(session_data))
# Get value
data = r.get('session:123')
Connection String Format: redis://username:password@host:6379/0
When linked to a project, Redis connection details are available as REDIS_URL, REDIS_HOST, REDIS_PORT, and REDIS_PASSWORD.
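If a client library wants individual fields, you can also derive them from REDIS_URL instead of reading the four separate variables. An illustrative sketch using the standard library:

```python
from urllib.parse import urlparse

def redis_parts(url: str) -> dict:
    """Break a redis:// URL into host/port/password/db fields."""
    p = urlparse(url)
    return {
        "host": p.hostname,
        "port": p.port or 6379,
        "password": p.password,
        "db": int(p.path.lstrip("/") or 0),
    }
```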
Common Use Cases
Session Storage
Store user sessions with automatic expiration:
await redis.set(`session:${userId}`, sessionData, 'EX', 86400); // 24 hours
Rate Limiting
Implement rate limiting with Redis counters:
const count = await redis.incr(`rate:${ipAddress}`);
if (count === 1) await redis.expire(`rate:${ipAddress}`, 60); // 60-second window
if (count > 100) throw new Error('Rate limit exceeded');
Caching
Cache expensive queries or API responses:
const cached = await redis.get(`cache:${key}`);
if (cached) return JSON.parse(cached);
const data = await fetchExpensiveData();
await redis.set(`cache:${key}`, JSON.stringify(data), 'EX', 300); // 5 minutes
Pub/Sub
Real-time messaging between services:
// Publisher
await redis.publish('notifications', JSON.stringify(message));
// Subscriber (ioredis delivers messages via the 'message' event)
await redis.subscribe('notifications');
redis.on('message', (channel, message) => {
  console.log('Received:', JSON.parse(message));
});
S3 Object Storage
Creating S3 Storage
Provision MinIO-compatible object storage for files, backups, and assets:
Via Dashboard:
- Navigate to Services → Create Service
- Select S3 Storage
- Configure storage size
- Set bucket policy (private/public)
- Click Create Storage
Via CLI:
temps service create s3 media-storage \
--size 100GB \
--public-read false
Configuration Options:
| Option | Description | Default |
|---|---|---|
| size | Storage allocation | 50GB |
| public-read | Public read access | false |
| versioning | Object versioning | Disabled |
| lifecycle | Auto-delete rules | None |
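A lifecycle rule deletes objects once they exceed a configured age, which comes down to simple date math against each object's last-modified time. An illustrative sketch of that decision (the rule shape here is hypothetical, not the Temps config format):

```python
from datetime import datetime, timedelta

def expired_keys(objects: dict[str, datetime], max_age_days: int,
                 now: datetime) -> list[str]:
    """Return keys whose last-modified time exceeds the rule's max age."""
    cutoff = now - timedelta(days=max_age_days)
    return [key for key, modified in objects.items() if modified < cutoff]
```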
Connecting to S3
Temps S3 is compatible with AWS S3 SDKs:
Node.js with AWS SDK v3
import { S3Client, PutObjectCommand, GetObjectCommand } from '@aws-sdk/client-s3';
const s3 = new S3Client({
  endpoint: process.env.S3_ENDPOINT,
  region: 'us-east-1',
  credentials: {
    accessKeyId: process.env.S3_ACCESS_KEY,
    secretAccessKey: process.env.S3_SECRET_KEY,
  },
});
// Upload a file
await s3.send(new PutObjectCommand({
  Bucket: 'media-storage',
  Key: 'uploads/image.jpg',
  Body: fileBuffer,
  ContentType: 'image/jpeg',
}));
// Download a file
const response = await s3.send(new GetObjectCommand({
  Bucket: 'media-storage',
  Key: 'uploads/image.jpg',
}));
Python with boto3
import boto3
import os
s3 = boto3.client(
    's3',
    endpoint_url=os.environ['S3_ENDPOINT'],
    aws_access_key_id=os.environ['S3_ACCESS_KEY'],
    aws_secret_access_key=os.environ['S3_SECRET_KEY'],
)
# Upload file
s3.upload_file('local-file.jpg', 'media-storage', 'uploads/image.jpg')
# Download file
s3.download_file('media-storage', 'uploads/image.jpg', 'downloaded.jpg')
Public URLs
Generate pre-signed URLs for temporary access:
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
const command = new GetObjectCommand({
  Bucket: 'media-storage',
  Key: 'uploads/private-document.pdf',
});
// URL expires in 1 hour
const url = await getSignedUrl(s3, command, { expiresIn: 3600 });
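Under the hood, a pre-signed URL embeds an expiry timestamp and an HMAC signature, so the storage server can verify the link without any database lookup. A simplified conceptual sketch of that mechanism (real S3 uses AWS Signature Version 4, which is more involved; the host and key names here are illustrative):

```python
import hashlib
import hmac

SECRET = b"demo-signing-key"  # stands in for the S3 secret key

def presign(bucket: str, key: str, expires_in: int, now: int) -> str:
    """Attach an expiry and an HMAC over (method, bucket, key, expiry)."""
    expiry = now + expires_in
    msg = f"GET\n{bucket}\n{key}\n{expiry}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"https://storage.example.com/{bucket}/{key}?expires={expiry}&sig={sig}"

def verify(bucket: str, key: str, expiry: int, sig: str, now: int) -> bool:
    """Reject expired links and links whose signature doesn't match."""
    if now > expiry:
        return False
    msg = f"GET\n{bucket}\n{key}\n{expiry}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

Because the signature covers the key and expiry, tampering with either invalidates the link.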
Service Management
Monitoring Service Health
View service metrics and health status in the dashboard:
- CPU & Memory Usage: Real-time resource consumption
- Connection Count: Active database/cache connections
- Storage Usage: Disk space utilization
- Uptime: Service availability percentage
- Query Performance: Slow query detection (PostgreSQL)
CLI monitoring
# View service status
temps service status production-db
# View service logs
temps service logs production-db --tail 100
# View connection count
temps service stats production-db
Backups
All services include automatic daily backups retained for 7 days (configurable up to 90 days):
Backup management
# List backups
temps service backup list production-db
# Create manual backup
temps service backup create production-db --name "pre-migration"
# Restore from backup
temps service restore production-db --backup backup-20240115-120000
# Download backup
temps service backup download production-db \
--backup backup-20240115-120000 \
--output backup.sql.gz
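The backup names above encode a timestamp (backup-YYYYMMDD-HHMMSS), so a restore script can pick the newest one by parsing the name. An illustrative sketch:

```python
from datetime import datetime

def latest_backup(names: list[str]) -> str:
    """Pick the most recent backup by the timestamp in its name."""
    def stamp(name: str) -> datetime:
        # e.g. "backup-20240115-120000" -> 2024-01-15 12:00:00
        _, date, clock = name.split("-")
        return datetime.strptime(date + clock, "%Y%m%d%H%M%S")
    return max(names, key=stamp)
```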
Scaling Services
Scale your services as your needs grow:
Scale services
# Scale PostgreSQL storage
temps service scale production-db --storage 50GB
# Scale Redis memory
temps service scale session-store --memory 2GB
# Add read replicas
temps service scale production-db --replicas 3
Scaling operations are performed with zero downtime. Your applications remain connected during the scaling process.
Security Best Practices
Connection Security
- Always use SSL/TLS connections (sslmode=require for PostgreSQL)
- Store credentials in environment variables, never in code
- Use the Temps-provided connection strings (automatically secured)
Network Access
- Services are only accessible from projects within your Temps instance
- Configure IP allowlists for external access if needed
- Use private networking for service-to-service communication
Credential Rotation
# Rotate database password
temps service rotate-password production-db
# Rotate S3 access keys
temps service rotate-keys media-storage
Connection strings are automatically updated in linked projects.
Audit Logging
All service access is logged for security auditing:
- Connection attempts (successful/failed)
- Query execution (optional for PostgreSQL)
- Object access (S3 operations)
- Configuration changes
Pricing & Resource Limits
Self-Hosted Advantage: Since Temps is self-hosted, managed services run on your infrastructure. There are no per-GB storage charges, no connection limits, and no metered bandwidth fees like other platforms.
Recommended Resource Allocation:
| Use Case | PostgreSQL | Redis | S3 Storage |
|---|---|---|---|
| Small Project | 1GB RAM, 10GB storage | 256MB | 10GB |
| Medium Project | 2GB RAM, 50GB storage | 1GB | 100GB |
| Large Project | 4GB+ RAM, 200GB+ storage | 4GB+ | 500GB+ |
Comparison with Other Platforms
| Feature | Temps | Vercel | Netlify | Railway |
|---|---|---|---|---|
| PostgreSQL | ✅ Built-in | ⚠️ Vercel Postgres (extra) | ❌ External only | ✅ Built-in |
| Redis | ✅ Built-in | ⚠️ Vercel KV (extra) | ❌ External only | ✅ Built-in |
| S3 Storage | ✅ MinIO-compatible | ⚠️ Vercel Blob (extra) | ❌ External only | ❌ External only |
| Connection Limits | ✅ None | ⚠️ Metered/capped | ⚠️ Metered/capped | ⚠️ Plan-based |
| Storage Cost | ✅ None (your infrastructure) | ⚠️ $0.15/GB/month | ⚠️ $0.10/GB/month | ⚠️ Pay per GB |
| Backups | ✅ Automatic | ⚠️ Enterprise only | ⚠️ Add-on | ✅ Automatic |
Next Steps
- Configure Environment Variables to inject service credentials
- Set up Resource Limits to allocate CPU and memory
- Enable Monitoring to track service health
- Configure Backups to customize backup retention