
Functions

Functions are the foundation of Aerostack. Every MCP server, every Skill, every Bot workflow, every Agent Endpoint — they all run on Functions under the hood. When you write a Function, you are writing the custom logic that powers your entire AI product.

If you can write TypeScript, you can build anything on Aerostack. Functions give you a fullstack edge runtime with Database, Cache, Queue, AI, Vector Search, and Storage — all built in, zero config.


Why Functions?

The problem

Building backends for AI apps today means stitching together 6+ services: a database, a cache, a queue, object storage, an AI provider, and a vector database. Each one has its own SDK, its own credentials, its own billing, and its own latency penalty. A single request can make 3-5 network hops before returning a response.

The Aerostack way

An Aerostack Function runs on Cloudflare’s edge — 300+ data centers worldwide. Every Function gets six platform bindings injected automatically. They are not HTTP calls; they are native, in-process bindings that execute in the same worker process. A database query adds microseconds, not milliseconds.

// This is a complete backend. No SDK clients. No credentials. No config.
export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Query your database — in-process, ~0ms
    const users = await env.DB.prepare('SELECT * FROM users WHERE active = 1').all()
 
    // Cache the result — same datacenter, ~0ms
    await env.CACHE.put('active-users', JSON.stringify(users.results), { expirationTtl: 300 })
 
    // Run AI inference — built-in, multi-provider
    const summary = await env.AI.run('@cf/meta/llama-3.1-8b-instruct', {
      messages: [{ role: 'user', content: `Summarize: ${JSON.stringify(users.results)}` }]
    })
 
    return Response.json({ users: users.results, summary: summary.response })
  }
}

One file. Six services. Zero configuration. Deployed globally in seconds.


How Functions Compare

|  | Traditional Stack | Aerostack Functions |
| --- | --- | --- |
| Database | fetch('https://your-db-api/query') — HTTP round trip | env.DB.prepare('SELECT ...').all() — in-process |
| Cache | redis.get(key) — network hop | env.CACHE.get(key) — same datacenter |
| Queue | sqs.sendMessage(...) — AWS API call | env.QUEUE.send(...) — native binding |
| AI | openai.chat(...) — external API | env.AI.run(model, ...) — built-in |
| Vector Search | pinecone.query(...) — external API | env.VECTORIZE.query(...) — native |
| Storage | s3.putObject(...) — AWS API call | env.STORAGE.put(...) — native |
| Latency per call | 50-200ms per service | ~0ms — all in-worker |
| Config | Manage credentials for each | Zero config — pre-wired |
| Deploy | Docker → Kubernetes → CDN | aerostack deploy → 300+ locations |

Functions Are the Engine Behind Everything

Every product on Aerostack is powered by Functions:

  • MCP Servers — Your Function becomes a tool that any AI agent (Claude, GPT, Cursor, etc.) can call
  • Skills — Your Function runs on a schedule or reacts to events (cron jobs, webhooks, triggers)
  • Bots — Your Function provides the intelligence layer for Discord, Telegram, Slack, and WhatsApp bots
  • Agent Endpoints — Your Function becomes a REST API that AI agents call autonomously
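As a sketch of how one Function can power several of these surfaces, the block below shares one piece of logic between an HTTP handler (an Agent Endpoint) and a cron handler (a Skill). The fetch/scheduled handler shapes follow Cloudflare Workers conventions, which this page says Aerostack runs on; the minimal Env type and helper name are illustrative assumptions, not the platform's actual types.

```typescript
// Illustrative: one Function, two entry points, one shared piece of logic.
// The minimal Env type here is an assumption for the example.
interface Env {
  DB: { prepare(sql: string): { all(): Promise<{ results: unknown[] }> } }
}

// Shared custom logic, reused by every entry point.
async function activeUserCount(env: Env): Promise<number> {
  const { results } = await env.DB.prepare('SELECT * FROM users WHERE active = 1').all()
  return results.length
}

export default {
  // Agent Endpoint / custom API: runs once per HTTP request.
  async fetch(_request: Request, env: Env): Promise<Response> {
    return Response.json({ active: await activeUserCount(env) })
  },

  // Skill: runs on a cron schedule instead of a request.
  async scheduled(_event: unknown, env: Env): Promise<void> {
    console.log(`active users: ${await activeUserCount(env)}`)
  },
}
```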

The custom logic you write in a Function is what makes your AI product unique. Everything else — routing, auth, billing, scaling — Aerostack handles for you.


What You Can Build

| Use Case | How It Works |
| --- | --- |
| Custom API backend | REST endpoints with DB, cache, and auth — no separate server |
| AI-powered search | Embed queries → vector search → re-rank with LLM → return results |
| RAG pipeline | Ingest docs → chunk → embed → vector store → semantic query from bots/MCP |
| Bot intelligence layer | Function queries your DB, runs AI analysis, caches results, returns to bot via MCP |
| Real-time data pipeline | Webhook → enqueue → AI processing → vector DB → cached result |
| Scheduled analytics | Cron function aggregates DB → caches dashboard → queues Slack notification |
| Webhook processor | Accept Stripe/GitHub/Slack webhooks → validate → enqueue → process async |
| Smart form handler | Receive submission → validate → store → queue confirmation email |
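The "Webhook processor" pattern above can be sketched as: validate the payload synchronously, hand the heavy work to the queue, and acknowledge immediately. This is a minimal sketch under assumptions — the Env shape, event type, and status codes are illustrative, and real webhook handlers would also verify the provider's signature.

```typescript
// Illustrative webhook event shape; real providers send richer payloads.
interface WebhookEvent { id: string; type: string }

// Minimal Env assumption for this example.
interface Env {
  QUEUE: { send(message: unknown): Promise<void> }
}

// Cheap synchronous validation, so the webhook can be acknowledged fast.
function parseEvent(body: string): WebhookEvent | null {
  try {
    const data = JSON.parse(body)
    if (typeof data.id === 'string' && typeof data.type === 'string') {
      return { id: data.id, type: data.type }
    }
    return null
  } catch {
    return null
  }
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const event = parseEvent(await request.text())
    if (!event) return new Response('bad payload', { status: 400 })
    await env.QUEUE.send(event) // process asynchronously in the background
    return new Response('accepted', { status: 202 })
  },
}
```

Returning 202 before the real processing happens keeps the provider's retry timers happy; the queue consumer can fail and retry without the webhook ever seeing an error.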

The Platform Bindings

Every Function gets six bindings through env:

interface Env {
  DB: Database             // SQL database (SQLite-compatible)
  CACHE: Cache             // Key-value cache with TTL
  QUEUE: Queue             // Background job processing
  AI: AI                   // Multi-provider LLM inference
  VECTORIZE: VectorSearch  // Semantic vector search
  STORAGE: Storage         // Object storage (zero egress fees)
}

All six are in-process bindings — not HTTP calls. They execute within the same worker process on the same machine. This is why Aerostack Functions are fundamentally faster than any approach that calls services over the network.
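Because the bindings are in-process, patterns that would be latency-prohibitive across a network become cheap. A cache-aside read combining the DB and CACHE bindings might look like the sketch below; the method signatures are assumed from the example earlier on this page, and the key name and TTL are illustrative.

```typescript
// Minimal Env assumption matching the bindings used in this example.
interface Env {
  DB: { prepare(sql: string): { all(): Promise<{ results: unknown[] }> } }
  CACHE: {
    get(key: string): Promise<string | null>
    put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>
  }
}

async function getActiveUsers(env: Env): Promise<unknown[]> {
  // 1. Try the cache first; a miss costs almost nothing in the same datacenter.
  const cached = await env.CACHE.get('active-users')
  if (cached) return JSON.parse(cached)

  // 2. On a miss, query the database in-process.
  const { results } = await env.DB.prepare('SELECT * FROM users WHERE active = 1').all()

  // 3. Populate the cache for the next five minutes.
  await env.CACHE.put('active-users', JSON.stringify(results), { expirationTtl: 300 })
  return results
}
```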

Deep dive into all platform bindings →


Get Started