Functions
Functions are the foundation of Aerostack. Every MCP server, every Skill, every Bot workflow, every Agent Endpoint — they all run on Functions under the hood. When you write a Function, you are writing the custom logic that powers your entire AI product.
If you can write TypeScript, you can build anything on Aerostack. Functions give you a fullstack edge runtime with Database, Cache, Queue, AI, Vector Search, and Storage — all built in, zero config.
Why Functions?
The problem
Building backends for AI apps today means stitching together 6+ services: a database, a cache, a queue, object storage, an AI provider, and a vector database. Each one has its own SDK, its own credentials, its own billing, and its own latency penalty. A single request can make 3-5 network hops before returning a response.
The Aerostack way
An Aerostack Function runs on Cloudflare’s edge — 300+ data centers worldwide. Every function gets six platform bindings injected automatically. They are not HTTP calls. They are native, in-process bindings that execute in the same worker process. A database query adds microseconds, not milliseconds.
```typescript
// This is a complete backend. No SDK clients. No credentials. No config.
export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Query your database — in-process, ~0ms
    const users = await env.DB.prepare('SELECT * FROM users WHERE active = 1').all()

    // Cache the result — same datacenter, ~0ms
    await env.CACHE.put('active-users', JSON.stringify(users.results), { expirationTtl: 300 })

    // Run AI inference — built-in, multi-provider
    const summary = await env.AI.run('@cf/meta/llama-3.1-8b-instruct', {
      messages: [{ role: 'user', content: `Summarize: ${JSON.stringify(users.results)}` }]
    })

    return Response.json({ users: users.results, summary: summary.response })
  }
}
```

One file. Six services. Zero configuration. Deployed globally in seconds.
How Functions Compare
| | Traditional Stack | Aerostack Functions |
|---|---|---|
| Database | fetch('https://your-db-api/query') — HTTP round trip | env.DB.prepare('SELECT ...').all() — in-process |
| Cache | redis.get(key) — network hop | env.CACHE.get(key) — same datacenter |
| Queue | sqs.sendMessage(...) — AWS API call | env.QUEUE.send(...) — native binding |
| AI | openai.chat(...) — external API | env.AI.run(model, ...) — built-in |
| Vector Search | pinecone.query(...) — external API | env.VECTORIZE.query(...) — native |
| Storage | s3.putObject(...) — AWS API call | env.STORAGE.put(...) — native |
| Latency per call | 50-200ms per service | ~0ms — all in-worker |
| Config | Manage credentials for each | Zero config — pre-wired |
| Deploy | Docker → Kubernetes → CDN | aerostack deploy → 300+ locations |
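Because a cache hit costs effectively nothing in-process, the cache-aside pattern becomes trivial to apply everywhere. Here is a minimal sketch; `KVLike` is a narrow stand-in for the CACHE binding's `get`/`put` surface as used in the snippet above, and the `expirationTtl` option name is taken from that snippet rather than a verified Aerostack type.

```typescript
// Minimal cache-aside helper. KVLike is an illustrative interface
// mirroring the CACHE binding's get/put calls shown on this page.
interface KVLike {
  get(key: string): Promise<string | null>
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>
}

async function cached<T>(
  cache: KVLike,
  key: string,
  ttlSeconds: number,
  loader: () => Promise<T>,
): Promise<T> {
  const hit = await cache.get(key)
  if (hit !== null) return JSON.parse(hit) as T // hit: skip the loader entirely
  const value = await loader()                  // miss: compute once
  await cache.put(key, JSON.stringify(value), { expirationTtl: ttlSeconds })
  return value
}
```

Wrapping a database query in `cached(env.CACHE, 'active-users', 300, () => ...)` turns repeated reads into same-datacenter lookups without any extra infrastructure.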
Functions Are the Engine Behind Everything
Every product on Aerostack is powered by Functions:
- MCP Servers — Your Function becomes a tool that any AI agent (Claude, GPT, Cursor, etc.) can call
- Skills — Your Function runs on a schedule or reacts to events (cron jobs, webhooks, triggers)
- Bots — Your Function provides the intelligence layer for Discord, Telegram, Slack, and WhatsApp bots
- Agent Endpoints — Your Function becomes a REST API that AI agents call autonomously
The custom logic you write in a Function is what makes your AI product unique. Everything else — routing, auth, billing, scaling — Aerostack handles for you.
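As a sketch of how one module can back several of these products at once, the Workers-style default export used earlier can expose multiple entry points. The `scheduled()` signature and `ScheduledEvent` shape below are assumptions based on the standard Workers runtime, not documented Aerostack types.

```typescript
// One Function module, two entry points: fetch() serves HTTP traffic
// (Agent Endpoints, MCP), scheduled() reacts to cron triggers (Skills).
// ScheduledEvent is an assumed shape, not a verified Aerostack type.
type ScheduledEvent = { cron: string; scheduledTime: number }

const handler = {
  async fetch(request: Request): Promise<Response> {
    // HTTP path: respond to agents, bots, or plain REST clients
    return new Response(JSON.stringify({ ok: true }), {
      headers: { 'content-type': 'application/json' },
    })
  },
  async scheduled(event: ScheduledEvent): Promise<void> {
    // Cron path: run periodic work (aggregation, cleanup, digests)
    console.log(`cron fired: ${event.cron}`)
  },
}

export default handler
```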
What You Can Build
| Use Case | How It Works |
|---|---|
| Custom API backend | REST endpoints with DB, cache, and auth — no separate server |
| AI-powered search | Embed queries → vector search → re-rank with LLM → return results |
| RAG pipeline | Ingest docs → chunk → embed → vector store → semantic query from bots/MCP |
| Bot intelligence layer | Function queries your DB, runs AI analysis, caches results, returns to bot via MCP |
| Real-time data pipeline | Webhook → enqueue → AI processing → vector DB → cached result |
| Scheduled analytics | Cron function aggregates DB → caches dashboard → queues Slack notification |
| Webhook processor | Accept Stripe/GitHub/Slack webhooks → validate → enqueue → process async |
| Smart form handler | Receive submission → validate → store → queue confirmation email |
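The webhook-processor row above follows a simple shape: validate, enqueue, respond fast, process later. A minimal sketch, with `QueueLike` mirroring the `env.QUEUE.send(...)` call from the comparison table; the secret header name and shared-secret check are illustrative (real webhook providers like Stripe and GitHub use HMAC signatures instead).

```typescript
// Webhook pattern: reject unauthenticated requests, enqueue valid
// payloads for async processing, acknowledge immediately.
// QueueLike and the x-webhook-secret header are illustrative, not an
// Aerostack API.
interface QueueLike { send(message: unknown): Promise<void> }

async function handleWebhook(
  request: { headers: Map<string, string>; body: unknown },
  queue: QueueLike,
  secret: string,
): Promise<{ status: number }> {
  // Reject requests without the expected shared secret. A production
  // handler would verify the provider's HMAC signature here instead.
  if (request.headers.get('x-webhook-secret') !== secret) {
    return { status: 401 }
  }
  // Enqueue and acknowledge right away so the provider never times out;
  // the heavy work happens in the queue consumer.
  await queue.send(request.body)
  return { status: 202 }
}
```

Responding 202 before doing the work is what keeps webhook endpoints fast and retry-safe: the queue absorbs bursts, and failures in processing never surface as delivery errors to the sender.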
The Platform Bindings
Every Function gets six bindings through env:
```typescript
interface Env {
  DB: Database            // SQL database (SQLite-compatible)
  CACHE: Cache            // Key-value cache with TTL
  QUEUE: Queue            // Background job processing
  AI: AI                  // Multi-provider LLM inference
  VECTORIZE: VectorSearch // Semantic vector search
  STORAGE: Storage        // Object storage (zero egress fees)
}
```

All six are in-process bindings — not HTTP calls. They execute within the same worker process on the same machine. This is why Aerostack Functions are fundamentally faster than any approach that calls services over the network.
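To show how two of these bindings compose, here is a sketch of the "vector search, then LLM" half of a RAG query. The `VectorLike` and `AILike` interfaces mirror the `env.VECTORIZE.query` and `env.AI.run` calls shown on this page; the `topK` option, match shape, and `metadata.text` field are illustrative assumptions, not a verified Aerostack API.

```typescript
// Retrieve the nearest chunks for an embedded query, then hand them to
// the model as context. Interface shapes here are assumptions based on
// the snippets elsewhere on this page.
interface VectorMatch { id: string; score: number; metadata?: { text?: string } }
interface VectorLike {
  query(vector: number[], opts: { topK: number }): Promise<{ matches: VectorMatch[] }>
}
interface AILike {
  run(model: string, input: { messages: { role: string; content: string }[] }): Promise<{ response: string }>
}

async function answerWithContext(
  vectorize: VectorLike,
  ai: AILike,
  queryEmbedding: number[],
  question: string,
): Promise<string> {
  // Nearest-neighbor lookup: in-process, so retrieval adds no network hop
  const { matches } = await vectorize.query(queryEmbedding, { topK: 3 })
  const context = matches.map(m => m.metadata?.text ?? '').join('\n')
  // Ground the model's answer in the retrieved chunks
  const result = await ai.run('@cf/meta/llama-3.1-8b-instruct', {
    messages: [{ role: 'user', content: `Context:\n${context}\n\nQuestion: ${question}` }],
  })
  return result.response
}
```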
Deep dive into all platform bindings →
Get Started
- Quick Start (2 min) — Pick a template, deploy it, and have a working function in 2 minutes.
- Templates — 8 starter templates: REST APIs, WebSocket chat, AI streaming, multiplayer games, and more.
- Build a Bookmarks API — Step-by-step tutorial: build a real API with database and cache from scratch.
- Power Your AI Apps — How Functions connect to MCP, Skills, Bots, and Agents to build complete AI products.