
serverExternalPackages: How One Config Line Cut Our Next.js Build from 30 GB to 1.16 GB

Our Next.js app had never had a production build. It ran next dev in production since v0.10.x. When we finally tried next build, it OOM-killed at 28 GB. 10 attempts later, we found the fix.

- RSS reduction: 26x
- Peak RSS before: 30 GB
- Peak RSS after: 1.16 GB
- Warm build time: 50s

The Setup: next dev in Production

Our production app had been running next dev for months. Nobody noticed because: (a) dev mode actually works — just slowly, (b) we were focused on features not infrastructure, and (c) the VPS had enough RAM to absorb dev mode's overhead. We only discovered it when we tried to add a proper build step.

The moment we ran next build for the first time, the process consumed 28 GB of RSS and got OOM-killed by the kernel. Our 8-core VPS has 16 GB of RAM plus swap. The build never stood a chance.

What followed was a 10-attempt, multi-day odyssey through Turbopack OOM kills, Webpack swap thrashing, Tailwind scanner explosions, and ultimately the discovery that one configuration entry changes everything.

Why the Build Uses 30 GB

When you run next build, the bundler resolves the dependency tree for every imported package — for both client and server bundles. If your server code imports @aws-sdk/client-s3, the bundler traces all 50–80 transitive dependencies (@aws-sdk/middleware-*, @smithy/*, etc.) and builds module graphs for both sides.

Our app imports 12 server-only packages, several of which have massive dependency trees: the DuckDB native bindings (duckdb, duckdb-async), @aws-sdk/client-s3, the Postgres drivers (pg, pg-boss, pg-copy-streams), parquetjs, @modelcontextprotocol/sdk, and an assortment of CLI and markdown utilities.

None of these packages run in browsers. But the bundler doesn't know that. It resolves every import, builds every module graph, and holds all of it in memory simultaneously. At scale, this means gigabytes of ASTs and module metadata that serve no purpose in the client bundle.
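For a concrete sense of how little it takes: one static server-side import is enough to trigger the full trace. A hypothetical route handler (the bucket and key names are ours, not from the original app):

```typescript
// app/api/export/route.ts -- a server-only route handler.
// Without serverExternalPackages, this single import makes the bundler
// resolve every @smithy/* and @aws-sdk/middleware-* transitive dep.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

export async function POST(request: Request) {
  const body = new Uint8Array(await request.arrayBuffer());
  await s3.send(
    new PutObjectCommand({
      Bucket: "example-bucket", // hypothetical
      Key: "export.parquet",    // hypothetical
      Body: body,
    })
  );
  return Response.json({ ok: true });
}
```

This code never runs in a browser, yet the bundler still walks its dependency graph for both compilation targets.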

The 10 Attempts

Here's every approach we tried, with measured results on an 8-core, 16 GB VPS:

| #  | Bundler   | Key Change                                     | Peak RSS | Time  | Result     |
|----|-----------|------------------------------------------------|----------|-------|------------|
| 1  | Turbopack | None (first ever build)                        | 28 GB    |       | OOM        |
| 2  |           | Tailwind v3 downgrade                          |          |       | Broke CSS  |
| 3  | Turbopack | Stop pdl-web service                           | 28 GB    |       | OOM        |
| 4  | Turbopack | @source + stop ALL services                    | ~31 GB   |       | Fragile    |
| 5a | Turbopack | + outputFileTracingExcludes                    | 31 GB    |       | OOM        |
| 5b | Turbopack | + move data/ directory                         | 31 GB    |       | OOM        |
| 6  | Webpack   | @source + excludes (no serverExternalPackages) | 30 GB    | 5 min | Swap       |
| 7  | Turbopack | + serverExternalPackages                       | 31 GB    |       | OOM        |
| 8  | Turbopack | + all optimizations combined                   | 31 GB    |       | OOM        |
| 9  | Webpack   | + serverExternalPackages                       | ???      |       | Unmeasured |
| 10 | Webpack   | serverExternalPackages (measured)              | 1.16 GB  | 50s   | Baseline   |

Attempt 4 is the most interesting failure. Turbopack did succeed once — but only after stopping every other process on the VPS and letting it use 31 GB (RAM + swap). The build was fragile and unrepeatable. Turbopack's Rust-based allocator (jemalloc) doesn't respect V8-style GC pressure signals, so memory grows monotonically until the OOM killer intervenes.

The Turbopack Dead End

Attempts 7 and 8 prove something important: serverExternalPackages doesn't help Turbopack on a memory-constrained machine. Even with 12 packages excluded, Turbopack still OOM-killed at 31 GB.

The reason is architectural. Turbopack uses Rust with jemalloc. Once memory is allocated, jemalloc rarely returns it to the OS — it holds freed pages for reuse. Webpack, by contrast, runs in V8's managed heap where the garbage collector can reclaim memory between compilation phases. On a 16 GB VPS, this difference is fatal.

Turbopack is not always faster

If your machine has abundant RAM (32+ GB), Turbopack may build faster. But on VPS instances, CI runners, or any environment where memory is constrained, Webpack with serverExternalPackages is the reliable choice. Don't assume the newer bundler is better for your environment.

The Fix: serverExternalPackages

serverExternalPackages tells Next.js: "these packages are server-only. Don't resolve their dependency trees for client bundles. Just require() them at runtime."

The bundler stops tracing imports for these packages entirely. No ASTs, no module graphs, no transitive dependency resolution for the client side. The packages are loaded via Node.js's native require() at runtime, exactly as they would be in a non-bundled Node.js application.
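Under the hood this is the same idea as Webpack's externals option. A standalone sketch of the equivalent mechanism (this illustrates the concept, not what Next.js literally emits):

```typescript
// webpack.config.ts -- mark "pg" as external for a Node target.
// The emitted bundle then contains require("pg") instead of pg's
// source and its entire dependency graph.
import type { Configuration } from "webpack";

const config: Configuration = {
  target: "node",
  externals: {
    pg: "commonjs pg", // resolved by Node's require() at runtime
  },
};

export default config;
```

serverExternalPackages applies this treatment per package across both the server and client compilations, which is why the module-graph memory simply never gets allocated.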

next.config.ts
const nextConfig = {
  // Force Webpack (Turbopack OOMs on 16 GB VPS)
  // Run: next build --webpack

  serverExternalPackages: [
    "duckdb", "duckdb-async",
    "@aws-sdk/client-s3",
    "pg", "pg-boss", "pg-copy-streams",
    "parquetjs", "@modelcontextprotocol/sdk",
    "commander", "tsx", "bottleneck", "dotenv",
    "gray-matter", "remark", "remark-html",
  ],
};

That's it. This single configuration entry dropped peak RSS from 30 GB to 1.16 GB — a 26x reduction. Build time dropped from 5 minutes (with swap thrashing) to 50 seconds (warm cache). Swap usage went from ~3 GB to zero.

The Misleading Attempt #9

This is worth calling out because it almost derailed us. Attempt #9 applied serverExternalPackages to Webpack and appeared to still use 30 GB. We almost concluded the fix didn't work.

The problem: the .next cache was stale from a previous build without the config. The bundler was reusing cached module graphs that still included the resolved dependency trees. Attempt #10 ran after a clean cache, and the RSS dropped to 1.16 GB.

Always clean .next after config changes

If you add or modify serverExternalPackages, delete the .next directory before building. Stale caches will use the old module graphs and you won't see the improvement. Run rm -rf .next && next build --webpack.

Supporting Cast

serverExternalPackages is the critical fix, but two other configurations contributed to a clean build:

Tailwind @source Directive

Tailwind CSS v4 scans for class names by default, walking the entire file tree. On our VPS, this included multi-gigabyte data directories, Parquet files, and node_modules. The @source directive constrains the scanner to only look at actual source files:

globals.css
@import "tailwindcss";
@source "../src/**/*.{ts,tsx}";
@source "../components/**/*.{ts,tsx}";

outputFileTracingExcludes

This tells Next.js's file tracing (used for standalone builds and deployment) to skip large directories during build analysis:

next.config.ts
outputFileTracingExcludes: {
  "/**": [
    "./data/**",
    "./public/uploads/**",
    "./.git/**",
  ],
},

Neither of these alone solved the OOM problem. Attempt #6 had both of these but not serverExternalPackages — it still peaked at 30 GB. They're good practice, but serverExternalPackages is what actually fixes the memory explosion.


How to Audit Your Own Build

If your Next.js build is consuming more memory than expected, here's how to diagnose it:

1. Measure peak RSS

bash
# Clean build with measurement
rm -rf .next
/usr/bin/time -v npm run build 2>&1 | grep "Maximum resident"
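GNU time reports the peak as "Maximum resident set size (kbytes)". A small awk step (our addition, assuming GNU time on Linux) converts that to GB for easier comparison against the numbers in this post:

```shell
# Clean build, then report peak RSS in GB.
# "Maximum resident set size (kbytes)" puts the value in field $6.
rm -rf .next
/usr/bin/time -v npm run build 2>&1 \
  | awk '/Maximum resident set size/ { printf "Peak RSS: %.2f GB\n", $6 / 1048576 }'
```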

2. Identify server-only packages

Any package that:

- ships native binaries or compiled bindings (duckdb, pg)
- drags in a large transitive dependency tree (@aws-sdk/client-s3 and its @smithy/* deps)
- is only ever imported from server code: route handlers, server actions, background jobs

...should go in serverExternalPackages.
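One quick way to build a candidate list is to grep for bare-package imports in your source tree and review which ones only appear in server code. A rough sketch (the src/ path and the import regex are assumptions, adjust for your layout):

```shell
# List every external package imported anywhere under src/.
# Relative imports (starting with ".") never match the pattern.
grep -rhoE "from ['\"][@a-z][^'\"]+['\"]" src/ \
  | sed -E "s/from ['\"]([^'\"]+)['\"]/\1/" \
  | sort -u
```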

3. Check if Turbopack or Webpack is better for your machine

bash
# Turbopack (default in Next.js 15+)
rm -rf .next && /usr/bin/time -v npx next build

# Webpack (explicit)
rm -rf .next && /usr/bin/time -v npx next build --webpack

If Turbopack OOMs and Webpack doesn't, you've hit the jemalloc vs V8 GC boundary. Stick with Webpack.
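If you want to watch that monotonic growth happen, you can sample the build's RSS from /proc while it runs. A Linux-only sketch (the one-second interval is arbitrary):

```shell
# Start the build in the background and print its RSS every second.
npx next build --webpack &
pid=$!
while kill -0 "$pid" 2>/dev/null; do
  # /proc/<pid>/status reports "VmRSS: <n> kB"; convert to GB.
  awk '/VmRSS/ { printf "RSS: %.2f GB\n", $2 / 1048576 }' "/proc/$pid/status"
  sleep 1
done
```

A Webpack build shows the sampled RSS falling between compilation phases as V8's GC runs; a Turbopack build on the same machine climbs steadily until it finishes or is killed.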

The Complete next.config.ts Pattern

Here's the production configuration pattern we use. Copy and adapt the serverExternalPackages list for your own dependencies:

next.config.ts
import type { NextConfig } from "next";

const nextConfig: NextConfig = {

  // Server-only packages: don't resolve these for client bundles.
  // Each entry prevents the bundler from tracing the package's
  // entire dependency tree for client-side compilation.
  serverExternalPackages: [
    // Native binaries
    "duckdb", "duckdb-async",

    // AWS SDK (50-80 transitive deps)
    "@aws-sdk/client-s3",

    // Database drivers
    "pg", "pg-boss", "pg-copy-streams",

    // File format / protocol libs
    "parquetjs",
    "@modelcontextprotocol/sdk",

    // CLI and runtime utilities
    "commander", "tsx", "bottleneck", "dotenv",

    // Markdown processing
    "gray-matter", "remark", "remark-html",
  ],

  // Exclude large directories from file tracing
  outputFileTracingExcludes: {
    "/**": [
      "./data/**",
      "./public/uploads/**",
      "./.git/**",
    ],
  },
};

export default nextConfig;

When You Need This

- Memory-constrained build environments: small VPS instances, CI runners, anywhere the full module graph won't fit in RAM
- Server-only dependencies with large transitive trees: AWS SDK, database drivers, native binaries like duckdb

When You Don't Need This

- Machines with abundant RAM (32+ GB), where the bundler can hold every module graph comfortably
- Apps with few or no server-only packages, where there is little tree to prune


Maintenance Rule

Keep serverExternalPackages updated

When you add a new server-only dependency to package.json, also add it to serverExternalPackages in next.config.ts. Forgetting this will cause build RSS to spike as the bundler resolves the new package's full dependency tree for client bundles. Make this part of your PR review checklist.
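That checklist item can also be enforced automatically. A minimal sketch of a CI guard, assuming you maintain the server-only list by hand (findMissingExternals is a hypothetical helper of ours, not a Next.js API):

```typescript
// Hypothetical CI guard: given package.json dependencies, a
// hand-maintained list of server-only package names, and the raw text
// of next.config.ts, return the packages that should be listed in
// serverExternalPackages but aren't.
function findMissingExternals(
  dependencies: Record<string, string>,
  serverOnly: string[],
  nextConfigSource: string
): string[] {
  return serverOnly.filter(
    (name) => name in dependencies && !nextConfigSource.includes(`"${name}"`)
  );
}

// In a CI script you would read both files, then fail the build:
//   const missing = findMissingExternals(pkg.dependencies, SERVER_ONLY, cfg);
//   if (missing.length > 0) { console.error(missing); process.exit(1); }
```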

Key Takeaways

  1. serverExternalPackages is the fix. It dropped our build from 30 GB to 1.16 GB — a 26x reduction.
  2. Turbopack isn't always better. On memory-constrained machines, Webpack's V8 GC wins over Turbopack's jemalloc.
  3. Clean your cache. Always rm -rf .next after changing build config. Stale caches hide improvements.
  4. Measure with /usr/bin/time -v. Don't guess at memory usage — Maximum resident set size is the number that matters.
  5. Infrastructure debt compounds silently. Our app ran next dev in production for months. Nobody noticed until we tried to build it properly.