Serverless Functions & API Routes
Temps is a container platform, not a serverless runtime. There are no Lambda-style isolated functions. Instead, your framework's built-in API routes run inside the same container as your application — and Temps adds cron scheduling for background work.
How Temps differs from serverless platforms
On platforms like Vercel or AWS Lambda, each API route or function runs in an isolated ephemeral environment. On Temps, your entire application — frontend and API routes — runs in a single long-lived container.
What Temps provides
- Framework API routes (Next.js, Remix, SvelteKit, Nuxt, Astro)
- Scheduled cron jobs via `.temps.yaml`
- Persistent containers with in-memory state
- No cold starts
- No per-invocation billing
- Full access to the filesystem
What Temps does not provide
- Isolated per-function execution
- Auto-scaling to zero
- Edge functions (runs from your server, not CDN edge)
- Per-function memory/timeout configuration
- Separate function deployments
For most applications, the container model is simpler and more predictable. You avoid cold starts, function size limits, and the complexity of managing dozens of isolated function deployments.
Framework API routes
If your framework supports API routes or server endpoints, they work on Temps without any special configuration. The framework runs as a server inside the container and handles both page requests and API calls.
Next.js API routes
Next.js App Router route handlers and Pages Router API routes both work:
```ts
import { NextResponse } from 'next/server';

export async function GET() {
  const users = await db.query('SELECT * FROM users');
  return NextResponse.json(users);
}

export async function POST(request: Request) {
  const body = await request.json();
  const user = await db.insert('users', body);
  return NextResponse.json(user, { status: 201 });
}
```
Remix loaders and actions
app/routes/users.tsx
```ts
import { json } from '@remix-run/node';
import type { LoaderFunction, ActionFunction } from '@remix-run/node';

export const loader: LoaderFunction = async () => {
  const users = await db.query('SELECT * FROM users');
  return json(users);
};

export const action: ActionFunction = async ({ request }) => {
  const form = await request.formData();
  const user = await db.insert('users', Object.fromEntries(form));
  return json(user, { status: 201 });
};
```
SvelteKit endpoints
src/routes/api/users/+server.ts
```ts
import { json } from '@sveltejs/kit';

export async function GET() {
  const users = await db.query('SELECT * FROM users');
  return json(users);
}
```
Nuxt server routes
server/api/users.ts
```ts
export default defineEventHandler(async (event) => {
  const users = await db.query('SELECT * FROM users');
  return users;
});
```
All of these behave on Temps exactly as they do in local development. The framework handles its own routing; Temps simply delivers HTTP traffic to the container.
Cron jobs
Temps supports scheduled HTTP calls to your application via the `cron` section in `.temps.yaml`. This is the closest equivalent to scheduled serverless functions.
.temps.yaml
```yaml
cron:
  - path: /api/cron/cleanup
    schedule: "0 0 * * *"
    name: "Daily Cleanup"
  - path: /api/cron/send-digests
    schedule: "0 9 * * 1"
    name: "Weekly Digest"
  - path: /api/cron/sync-inventory
    schedule: "*/15 * * * *"
    name: "Inventory Sync"
```
| Field | Type | Description |
|---|---|---|
| `path` | string | The HTTP path to call. Temps sends a `GET` request to this path on schedule. |
| `schedule` | string | A cron expression in the standard five-field format: `minute hour day-of-month month day-of-week`. |
| `name` | string | A human-readable name displayed in the Temps dashboard. |
Implementing a cron endpoint
Your cron endpoint is a regular HTTP route. Keep it fast — the cron system sends a request and expects a response.
app/api/cron/cleanup/route.ts
```ts
export async function GET() {
  // Delete expired sessions
  const deleted = await db.query(
    'DELETE FROM sessions WHERE expires_at < NOW()'
  );

  return Response.json({
    deleted: deleted.rowCount,
    timestamp: new Date().toISOString(),
  });
}
```
Cron endpoints run inside your existing container — there is no separate function environment. Long-running cron tasks share resources with your web traffic. For heavy background processing, consider a dedicated worker container deployed as a separate Temps project.
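Because cron endpoints are ordinary HTTP routes, anything that can reach your application can trigger them. A common mitigation is to require a shared secret and reject all other callers. The sketch below is illustrative, not a Temps feature: the `CRON_SECRET` variable name, the `isAuthorizedCronRequest` helper, and the assumption that you can arrange for the scheduler's requests to carry an `Authorization` header are all choices you would make yourself.

```typescript
import { timingSafeEqual } from 'node:crypto';

// Constant-time check that the Authorization header carries the expected
// bearer secret. timingSafeEqual avoids leaking, via response timing,
// how much of the comparison matched.
export function isAuthorizedCronRequest(
  authHeader: string | null,
  secret: string,
): boolean {
  if (!authHeader) return false;
  const provided = Buffer.from(authHeader);
  const expected = Buffer.from(`Bearer ${secret}`);
  // timingSafeEqual requires equal-length buffers, so check length first.
  return provided.length === expected.length && timingSafeEqual(provided, expected);
}
```

In a route handler you would call this with `request.headers.get('authorization')` and a secret from the environment, and return a 401 before doing any work if it fails. If the scheduler cannot send custom headers, a hard-to-guess path segment is a cruder fallback.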
Common cron schedules
| Schedule | Expression | Description |
|---|---|---|
| Every minute | `* * * * *` | High-frequency polling |
| Every 15 minutes | `*/15 * * * *` | Regular sync jobs |
| Hourly | `0 * * * *` | Hourly reports |
| Daily at midnight | `0 0 * * *` | Nightly cleanup |
| Weekdays at 9am | `0 9 * * 1-5` | Business-hours tasks |
| Weekly on Monday | `0 9 * * 1` | Weekly digests |
| Monthly on the 1st | `0 0 1 * *` | Monthly reports |
Standalone API servers
If you are building a pure API (no frontend), deploy it as a regular web application. Temps auto-detects backend frameworks:
```ts
import express from 'express';

const app = express();
app.use(express.json());

app.get('/api/users', async (req, res) => {
  const users = await db.query('SELECT * FROM users');
  res.json(users);
});

const port = process.env.PORT || 3000;
app.listen(port, '0.0.0.0');
```
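Because the container is long-lived, it is also worth handling `SIGTERM` so in-flight requests can finish when the container is replaced during a deploy. Here is a minimal sketch using Node's built-in `http` module; with Express, `app.listen(...)` returns the same `http.Server`, so the wiring is identical. Whether your platform sends `SIGTERM` and allows a drain window is an assumption to verify for your setup.

```typescript
import { createServer } from 'node:http';

const server = createServer((_req, res) => {
  res.writeHead(200, { 'content-type': 'application/json' });
  res.end(JSON.stringify({ ok: true }));
});

server.listen(Number(process.env.PORT) || 3000, '0.0.0.0');

// On SIGTERM, stop accepting new connections, let in-flight
// requests complete, then exit cleanly.
process.on('SIGTERM', () => {
  server.close(() => process.exit(0));
});
```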
See the web applications guide for full framework support and auto-detection details.
When to use what
| Pattern | Best for | Temps approach |
|---|---|---|
| API routes within a framework | Full-stack apps with both pages and API endpoints | Deploy the framework — routes are included automatically |
| Standalone API server | Backend services, microservices | Deploy as a web application |
| Scheduled tasks | Cleanup, reports, sync jobs | Use cron jobs in `.temps.yaml` |
| Background processing | Heavy computation, queue processing | Deploy a separate worker container |
| Webhooks | Third-party integrations | Add an endpoint to your existing app |
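For the webhook row above, most third-party providers sign their payloads, and your endpoint should verify the signature before doing any work. The exact header name, encoding, and algorithm vary by provider, so the generic HMAC-SHA256 check below (including the `verifyWebhookSignature` name) is a sketch rather than any specific provider's scheme — consult the provider's documentation for the real details.

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Verifies that `signature` is the hex-encoded HMAC-SHA256 of `payload`
// under `secret`, using a constant-time comparison.
export function verifyWebhookSignature(
  payload: string,
  signature: string,
  secret: string,
): boolean {
  const expected = createHmac('sha256', secret).update(payload).digest('hex');
  if (signature.length !== expected.length) return false;
  return timingSafeEqual(Buffer.from(signature), Buffer.from(expected));
}
```

In the webhook route, read the raw request body (signatures are computed over the exact bytes, so verify before parsing JSON), check the signature, and return a 401 on mismatch.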
Migrating from serverless platforms
If you are moving from Vercel, Netlify, or AWS Lambda:
From Vercel serverless functions
Vercel functions in `api/` or `app/api/` are typically Next.js API routes. These work on Temps without changes — Next.js handles the routing regardless of where it runs.
If you used Vercel's `@vercel/og` or `@vercel/blob`, replace them with:
- Image generation: run your own image service or use a library like `sharp`
- Blob storage: use the Temps Blob SDK or S3-compatible storage
From AWS Lambda
Lambda handlers need to be wrapped in an HTTP server. Replace the Lambda handler signature with an Express/Fastify route:
Before (Lambda)
```ts
export async function handler(event) {
  const users = await getUsers();
  return { statusCode: 200, body: JSON.stringify(users) };
}
```
After (Express)
```ts
import express from 'express';

const app = express();

app.get('/api/users', async (req, res) => {
  const users = await getUsers();
  res.json(users);
});

app.listen(process.env.PORT || 3000, '0.0.0.0');
```
The business logic stays the same — you only change the HTTP interface.