Vercel is a cloud platform for deploying frontend applications with support for server-side rendering, static assets, serverless functions, and edge middleware. It integrates with Git to trigger deployments and environment-specific builds on each push.
Each commit results in a new deployment using framework-specific build logic and scoped environment variables. While it handles frontend and API logic, it relies on external services for databases and other backend components.
Getting Started with Vercel: Setup and Deployment
Vercel integrates with Git repositories and triggers deployments on each push. It auto-detects frameworks like Next.js, Svelte, and Astro, applying corresponding build presets. Configuration defaults can be overridden via vercel.json. Deployments can also be initiated manually using the Vercel CLI, with builds executed in isolated environments.
Installing the Vercel CLI
The Vercel CLI enables project linking, local testing, and manual deployments. Install it globally via npm or run ephemeral commands using npx.
npm install -g vercel

Linking a Project
Run vercel link inside a project directory to connect it to a Vercel project. This creates a .vercel directory storing project metadata.
vercel link

Deploying with Environment Variables
Vercel injects environment variables at both build time and runtime; code reads them through process.env, and the platform sets VERCEL_ENV to indicate the current environment (production, preview, or development). Use .env.local for local development and the dashboard (or vercel env) for production secrets.
echo "API_KEY=12345" > .env.local
vercel env add API_KEY

Explanation:
- Creates a local environment file named .env.local, setting API_KEY to 12345 for local development.
- vercel env add API_KEY registers the same variable with the Vercel project so deployments can use it.
- Files like .env.local are automatically loaded by frameworks such as Next.js and Vite.
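Once deployed, application code reads these variables through process.env. A minimal sketch (getConfig is an illustrative helper, not a Vercel API):

```javascript
// Illustrative helper: reads the custom API_KEY variable plus the
// VERCEL_ENV value the platform sets ("production", "preview", or
// "development"), falling back to "development" outside Vercel.
function getConfig(env = process.env) {
  return {
    apiKey: env.API_KEY ?? null,
    environment: env.VERCEL_ENV ?? 'development',
  };
}

// Example with explicit values, as in local development:
const config = getConfig({ API_KEY: '12345' });
```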
Configuring vercel.json for Routing
vercel.json defines routing behavior, header rules, and URL rewrites for Vercel deployments. This configuration overrides framework defaults and applies routing logic at the edge. Routing behavior is evaluated before function execution, allowing control over request handling.
{
  "rewrites": [{ "source": "/old", "destination": "/new" }],
  "headers": [
    {
      "source": "/(.*)",
      "headers": [{ "key": "X-Custom-Header", "value": "value" }]
    }
  ]
}

Explanation:
- Defines a URL rewrite rule in the Vercel configuration file (vercel.json).
- Applies custom HTTP response headers to all routes matching the source pattern (/(.*)).
- Adds a header X-Custom-Header: value to every response.
Verifying Vercel CLI Installation
To verify that the Vercel CLI is installed correctly, run the following command in a terminal:
vercel --version
# or shorthand:
vc -v

Expected Output
If the Vercel CLI is installed, the command prints the installed version (e.g., 32.5.3).
Deploying from Git
Connect a GitHub or GitLab repository in the Vercel dashboard. Pushes to the main branch trigger production deployments; pull requests generate preview deployments with their own URLs.
Vercel’s Core Architecture
Vercel uses a serverless and edge-based architecture. It serves frontend code as static assets and handles dynamic logic through serverless or edge functions. Deployments are versioned and read-only, allowing rollback and preview by commit.
Static content is delivered from the nearest edge CDN node. Dynamic requests are routed to serverless or edge functions based on project settings. Vercel integrates with Git, where each push triggers a build tied to a specific commit.
Builds run in isolated environments using framework presets or custom steps defined in vercel.json. The resulting output is deployed to Vercel’s distributed edge network.
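As a sketch, overriding the build in vercel.json might look like the following (the commands and output directory are illustrative; buildCommand, installCommand, and outputDirectory are the documented override fields):

```json
{
  "installCommand": "npm ci",
  "buildCommand": "npm run build",
  "outputDirectory": "dist"
}
```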
Serverless Function Execution
Vercel deploys each serverless function as an isolated bundle. Cold starts are mitigated through pre-warming and retained instances, and functions scale to zero after a period of inactivity. CPU and memory allocation depend on the plan and function configuration; compute-heavy workloads should be given larger memory allocations.
// api/task.js
export default async function handler(req, res) {
  const response = await fetch('https://db.example.com/tasks');
  const data = await response.json();
  res.setHeader('Cache-Control', 's-maxage=60');
  res.json(data);
}

Explanation:
- Defines a serverless API route handler in api/task.js (Vercel's file-based functions directory).
- Exports an async function that receives req (incoming HTTP request) and res (HTTP response object).
- The file-based API routing maps this function to the /api/task endpoint.
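A hypothetical caller of this route (loadTasks is illustrative); because of the s-maxage=60 header above, repeat requests within a minute can be served from the edge cache without re-invoking the function:

```javascript
// Fetches the task list from the /api/task route above. The fetch
// implementation is injectable so the logic can run without a network.
async function loadTasks(fetchImpl = fetch) {
  const res = await fetchImpl('/api/task');
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```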
Edge Middleware
Edge middleware intercepts incoming requests and applies conditional logic before routing. Logic runs at edge locations without invoking serverless functions. Middleware can perform tasks such as authentication checks, header rewrites, or redirects based on request data.
export function middleware(request) {
  const url = new URL(request.url);
  if (url.pathname.startsWith('/admin') && !request.cookies.has('auth')) {
    // Response.redirect requires an absolute URL, so build one
    // from the incoming request.
    return Response.redirect(new URL('/login', request.url));
  }
}

Explanation:
- Parses the full request URL into a URL object.
- Verifies that the auth cookie is absent, meaning the user is unauthenticated.
- Redirects unauthenticated requests under /admin to /login before they reach the origin.
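In Next.js, middleware can also declare which paths it runs on via an exported config object, so unrelated requests never invoke it. A sketch matching the rule above:

```javascript
// Next.js matcher convention: run middleware only for /admin routes.
export const config = {
  matcher: '/admin/:path*',
};
```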
Video Workflow With Vercel
Vercel deploys and serves web applications, and its infrastructure can support video processing and delivery workflows. Video segments are cached at the edge via the CDN, while transcoding runs in regional serverless functions. Video requests are routed based on file size and viewer location, with byte-range caching enabling partial delivery. Static video assets can be optimized at build time. Efficient streaming depends on correct cache headers and transcoding workflows.
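Byte-range delivery works as follows: the player sends a Range header, and the server answers with 206 Partial Content for just that slice. A sketch of the header parsing involved (parseRange is an illustrative helper, not a Vercel API):

```javascript
// Parses an HTTP Range header such as "bytes=0-1023" into start/end
// byte offsets, clamped to the file size. Returns null when the header
// is absent or malformed, in which case the full file is served.
function parseRange(header, fileSize) {
  const match = /^bytes=(\d*)-(\d*)$/.exec(header ?? '');
  if (!match || (match[1] === '' && match[2] === '')) return null;
  // "bytes=-500" means the last 500 bytes of the file.
  if (match[1] === '') {
    const suffix = Number(match[2]);
    return { start: Math.max(fileSize - suffix, 0), end: fileSize - 1 };
  }
  const start = Number(match[1]);
  const end = match[2] === '' ? fileSize - 1 : Math.min(Number(match[2]), fileSize - 1);
  if (start > end) return null;
  return { start, end };
}
```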
Video Delivery Architecture on Vercel
Vercel's video infrastructure combines edge caching for static video files with on-demand processing in serverless functions. Video requests route through edge locations with TLS termination and fall back to the origin for authenticated content or custom transformations.
Edge Caching for Video Assets
Vercel caches video segments at edge locations using content-addressable storage with SHA-256 hashed filenames. MP4, WebM, and HLS formats are served with the correct MIME types, and manifest files are Brotli-compressed. Byte-range requests allow seeking without transferring entire files.
// vercel.json video caching configuration
{
  "headers": [
    {
      "source": "/videos/(.*)\\.(mp4|webm)",
      "headers": [
        { "key": "Cache-Control", "value": "public, max-age=604800" },
        { "key": "Accept-Ranges", "value": "bytes" }
      ]
    }
  ]
}

Explanation:
- Applies the header rules to all video files under the /videos/ path.
- Targets file extensions .mp4 and .webm using a regular expression.
- Sets the Cache-Control header to cache the video file for 7 days (604800 seconds).
- public allows the response to be stored by shared caches (e.g., CDNs).
Serverless Video Processing
Vercel runs video transcoding and analysis in serverless functions, which can be configured with enough memory for compute-heavy work. Functions handle uploads via multipart/form-data and write output to object storage. Maximum execution duration depends on the plan and function type; edge functions have far stricter time limits than standard serverless functions, so long-running transcodes belong in standard functions.
// api/transcode.js
// processMultipartRequest and ffmpeg are illustrative helpers,
// standing in for a multipart parser and an ffmpeg wrapper.
export default async function handler(req, res) {
  const { file } = await processMultipartRequest(req);
  const transcoded = await ffmpeg(file)
    .format('hls')
    .outputOptions(['-hls_time 6', '-hls_list_size 0'])
    .run();
  res.setHeader('Content-Type', 'application/vnd.apple.mpegurl');
  res.send(transcoded);
}

Explanation:
- Declares a Next.js API route handler for the /api/transcode endpoint.
- Exports an asynchronous function that receives the request (req) and response (res) objects.
- processMultipartRequest handles raw file data extraction from req.
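The -hls_time 6 and -hls_list_size 0 options above correspond to a playlist of 6-second segments with no rolling window. A sketch of the manifest such a job emits (segment names are illustrative):

```javascript
// Builds a minimal HLS media playlist (.m3u8) for a fixed number of
// segments of equal duration, ending with EXT-X-ENDLIST for VOD content.
function buildHlsPlaylist(segmentCount, segmentDuration = 6) {
  const lines = [
    '#EXTM3U',
    '#EXT-X-VERSION:3',
    `#EXT-X-TARGETDURATION:${segmentDuration}`,
    '#EXT-X-MEDIA-SEQUENCE:0',
  ];
  for (let i = 0; i < segmentCount; i++) {
    lines.push(`#EXTINF:${segmentDuration.toFixed(1)},`, `segment${i}.ts`);
  }
  lines.push('#EXT-X-ENDLIST');
  return lines.join('\n');
}
```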
Video Authentication Workflow
Vercel can enforce signed URLs and JWT verification for protected video content. Middleware intercepts requests before they reach the origin, validating tokens with minimal added latency; requests with invalid tokens are rejected at the edge without invoking serverless compute.
// middleware.js
// verifyJWT is an illustrative helper standing in for real JWT validation.
export function middleware(request) {
  // In Next.js Edge Middleware, cookies.get returns an object; read .value.
  const token = request.cookies.get('token')?.value;
  if (!verifyJWT(token)) {
    return new Response('Unauthorized', { status: 401 });
  }
}

Explanation:
- Declares and exports a middleware function compatible with Next.js Edge Middleware.
- Receives the request object for inspecting headers, cookies, or URL.
