Next.js 16.2.2 standalone + Cache Components: cached internal streamed fetches cause unbounded arrayBuffers growth and OOM #92287
Description
Link to the code that reproduces this issue
https://github.com/mdotk/next-standalone-memory-repro
To Reproduce
- Clone the repro repo and install dependencies.

  ```bash
  npm install
  ```

- Build the app.

  ```bash
  npm run build
  ```

- Start the standalone server.

  ```bash
  PORT=3025 npm run start:standalone
  ```

- In another shell, run the reproducing load.

  ```bash
  BASE_URL=http://127.0.0.1:3025 \
  DURATION_MS=180000 \
  CONCURRENCY=64 \
  PAGE_WEIGHT_KB=2048 \
  API_WEIGHT_KB=2048 \
  MIX=page,api,page,page \
  MAX_LOGGED_FAILURES=20 \
  npm run load
  ```

- Sample memory during the run.

  ```bash
  curl 'http://127.0.0.1:3025/api/health?sample=1'
  ```

Current vs. Expected behavior
Current:
On next@16.2.2 with output: "standalone" and cacheComponents: true, the standalone server shows rapid unbounded memory growth under sustained high-cardinality traffic once the app is doing cached internal server-side fetch() calls against a streamed JSON route.
In one local run:
- baseline was about 95 MB rss
- after ~28s, memory reached about 1.58 GB rss / 589 MB arrayBuffers
- after ~38s, memory reached about 2.36 GB rss / 1.04 GB arrayBuffers
- after ~62s, memory reached about 2.42 GB rss / 1.95 GB arrayBuffers
- after ~180s, memory reached about 3.43 GB rss / 4.31 GB arrayBuffers
The standalone server then died with:
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
While the server is destabilizing, the load generator starts receiving:
`ECONNRESET`, `UND_ERR_SOCKET`, and `ECONNREFUSED` errors.
Expected:
Memory should stabilize or at least be reclaimed after responses complete. Large temporary spikes under load are one thing, but the standalone process should not continue retaining arrayBuffers/external memory until it exits with OOM.
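As a note on measurement: the rss / external / arrayBuffers figures above come from Node's `process.memoryUsage()`. A minimal sampler in that spirit (a sketch only; the repro's actual `/api/health` handler may differ) looks like this:

```javascript
// Sketch of a memory sampler similar in spirit to /api/health?sample=1.
// process.memoryUsage() exposes the categories tracked in this report:
// rss, external, and arrayBuffers.
function sampleMemory() {
  const { rss, heapUsed, external, arrayBuffers } = process.memoryUsage();
  const mb = (bytes) => Math.round((bytes / 1024 / 1024) * 100) / 100;
  return {
    rssMb: mb(rss),
    heapUsedMb: mb(heapUsed),
    externalMb: mb(external),
    arrayBuffersMb: mb(arrayBuffers),
  };
}

console.log(sampleMemory());
```

Polling an endpoint that returns this object during the load run is how the growth curve above was captured.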
Provide environment information
Operating System:
macOS 15.x arm64
Node.js version:
25.1.0
Next.js version:
16.2.2
Output mode:
standalone
Other config:
cacheComponents: true
compress: false

Additional context
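To make the configuration concrete, the settings listed under the environment section amount roughly to this `next.config.js` (a sketch assembled from the values above; the repro repo is authoritative, and the exact placement of `cacheComponents` may differ by Next.js version):

```javascript
// next.config.js (sketch; see the repro repo for the exact file)
module.exports = {
  output: "standalone",    // run via the generated standalone server
  cacheComponents: true,   // the Cache Components flag named in this report
  compress: false,         // compression disabled, per the environment info
};
```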
A few things make this feel close to the existing response-retention / runtime-retention family:
- the failing repro requires cached internal server-side `fetch()`
- the internal fetched route returns a streamed JSON response
- the exploding categories are `rss`, `external`, and especially `arrayBuffers`
- the app is generating many unique request paths over time
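The stream/consume shape involved can be sketched in plain Node (hypothetical names; the repro uses a Next.js route handler and an internal cached `fetch()`, but the underlying pattern is the same: a JSON body backed by a `ReadableStream` of `Uint8Array` chunks, consumed server-side):

```javascript
// Sketch of a streamed JSON response and a server-side consumer.
// Each enqueued chunk passes through ArrayBuffer-backed memory, which is
// what surfaces under the `arrayBuffers` category when responses are
// retained instead of released after consumption.
function streamedJsonResponse(payload) {
  const encoder = new TextEncoder();
  const json = JSON.stringify(payload);
  const body = new ReadableStream({
    start(controller) {
      // Enqueue the JSON in fixed-size chunks, as a streamed route would.
      const chunkSize = 1024;
      for (let i = 0; i < json.length; i += chunkSize) {
        controller.enqueue(encoder.encode(json.slice(i, i + chunkSize)));
      }
      controller.close();
    },
  });
  return new Response(body, {
    headers: { "content-type": "application/json" },
  });
}

// The caller buffers and parses the stream, analogous to awaiting
// res.json() on an internal fetch().
async function consume() {
  const res = streamedJsonResponse({ weightKb: 2048, data: "x".repeat(8192) });
  return res.json();
}
```

In the failing case, memory that should be released once `consume()` resolves instead appears to accumulate across many unique request paths.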
I also ran the same repro app locally with `next start` as a control. I know that is not the supported way to run an app configured with `output: "standalone"`, so I am not presenting it as the main repro, but it is a useful signal:
- with `next start`, the app still showed large temporary growth under the same load
- after the shorter control run stopped, memory recovered instead of the server dying
- with the standalone server, the same app kept climbing and eventually exited with OOM
That difference made me file this specifically against the standalone runtime path.
This also overlaps with symptoms in:
- Memory leak causing OOM still occurs in Next.js 16.0.10 (also tested with 16.0.1) when using output: standalone with fetch requests. #90433
- Memory leak (ArrayBuffer/WriteWrap retention) in Next.js 16.1.6 stable — fixed in canary but not released #90895
- Memory leak: ArrayBuffer/WriteWrap retention in 16.1.6 standalone (fixed in canary, not released) #90898
- controller[kState].transformAlgorithm is not a function #75994