MeatButton

Your Node.js App Has a Memory Leak

Why it gets slower over time and eventually crashes

Your app works fine when you restart it. Then, over hours or days, it gets slower. Response times increase. Eventually it crashes with:

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed -
JavaScript heap out of memory

Or the Linux OOM killer terminates it (you'll see Killed in the logs). You restart it, everything's fast again, and the cycle repeats.

That's a memory leak. Your app is using memory and never releasing it. Eventually it runs out.

How to confirm it's a memory leak

First, verify that memory is actually growing over time:

# Watch your app's memory usage (replace with your process name)
watch -n 5 'ps -o pid,rss,command -p $(pgrep -f "node")'

# Or use top/htop and sort by memory
htop

If the RSS (Resident Set Size) number keeps growing and never comes back down, you have a leak.

You can also add monitoring inside your app:

// Log memory usage every 30 seconds
setInterval(() => {
    const usage = process.memoryUsage();
    console.log(`Memory: ${Math.round(usage.heapUsed / 1024 / 1024)}MB / ${Math.round(usage.heapTotal / 1024 / 1024)}MB`);
}, 30000);

If heapUsed keeps climbing, you have a leak.

Common causes in AI-generated code

Event listeners that never get removed. This is the number one cause. Every time a component mounts or a function runs, it adds an event listener. But it never removes it. Listeners accumulate, each holding references to data that can't be garbage collected.

// LEAKY — adds a new listener every time, never removes
app.get('/data', (req, res) => {
    emitter.on('update', (data) => {
        // This listener never gets removed
    });
    res.json({ ok: true });
});

// FIXED — remove listener when done
app.get('/data', (req, res) => {
    const handler = (data) => { /* ... */ };
    emitter.on('update', handler);
    req.on('close', () => emitter.removeListener('update', handler));
    res.json({ ok: true });
});

Growing arrays or objects. A cache, queue, or log buffer that grows without bounds.

// LEAKY — array grows forever
const requestLog = [];
app.use((req, res, next) => {
    requestLog.push({ url: req.url, time: Date.now() });
    next();
});

// FIXED — limit the size
const requestLog = [];
const MAX_LOG_SIZE = 1000;
app.use((req, res, next) => {
    requestLog.push({ url: req.url, time: Date.now() });
    if (requestLog.length > MAX_LOG_SIZE) requestLog.shift();
    next();
});

Closures holding references. Functions that capture variables in their scope and prevent garbage collection.

// LEAKY — 'log' references bigData, and V8 gives every closure in
// a scope one shared context, so 'handler' keeps bigData alive too
function processRequest(bigData) {
    const log = () => console.log(bigData.length);
    return function handler() {
        // handler never touches bigData, but it holds the same
        // context object that 'log' put bigData into
        return 'done';
    };
}
// If handler is stored somewhere long-lived (event listener, cache),
// bigData stays in memory with it
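The fix is to copy out the small piece the inner function actually needs, so nothing in the returned closure's scope references the large object. A sketch (names mirror the example above; the return string is illustrative):

```javascript
// FIXED — extract just the primitive you need, then let bigData go
function processRequest(bigData) {
    const size = bigData.length; // small copy, no reference to bigData
    return function handler() {
        // only 'size' is captured — bigData can be garbage collected
        // as soon as processRequest returns
        return `done (${size} items processed)`;
    };
}
```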

Unclosed database connections or streams. Every request opens a connection but doesn't close it. The connection pool fills up, each connection holding allocated memory.

// LEAKY — connection never closed on error
app.get('/users', async (req, res) => {
    const conn = await pool.getConnection();
    const users = await conn.query('SELECT * FROM users');
    // If query throws, conn.release() never runs
    conn.release();
    res.json(users);
});

// FIXED — use try/finally
app.get('/users', async (req, res) => {
    const conn = await pool.getConnection();
    try {
        const users = await conn.query('SELECT * FROM users');
        res.json(users);
    } finally {
        conn.release();
    }
});

setInterval without clearInterval. Intervals that are created but never stopped, especially in modules that get reloaded during development.
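A minimal sketch of the pattern and its fix (the `startPolling` name and the `state` object are illustrative):

```javascript
// LEAKY — the interval runs forever and keeps 'state' (and the
// closure) reachable, even after nothing else uses them
function startPolling(state) {
    setInterval(() => {
        state.lastPoll = Date.now();
    }, 1000);
}

// FIXED — keep the handle and return a way to stop
function startPollingSafe(state) {
    const timer = setInterval(() => {
        state.lastPoll = Date.now();
    }, 1000);
    return () => clearInterval(timer); // call this when done
}
```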

How to find the leak

Option 1: Heap snapshots. Take two heap snapshots at different times and compare them to see what's growing.

// Add this to your app
const v8 = require('v8');

// Hit this endpoint to take a snapshot
app.get('/debug/heap', (req, res) => {
    // writeHeapSnapshot blocks the event loop while it writes,
    // and returns the path of the .heapsnapshot file on disk
    const filename = v8.writeHeapSnapshot();
    res.json({ file: filename });
});

// Open the .heapsnapshot file in Chrome DevTools
// (Memory tab → Load)

Option 2: The --inspect flag. Start your app with node --inspect app.js, open chrome://inspect in Chrome, and use the Memory tab to take snapshots and track allocations in real time.

Option 3: Process of elimination. Comment out sections of your code and see if the leak stops. Start with middleware, then routes, then libraries. When you comment something out and memory stabilizes, you found the leaky section.

Quick mitigations

While you're finding the leak, these buy you time:

# Raise the heap size limit (the default depends on your Node
# version and available RAM — historically ~1.5GB on 64-bit)
node --max-old-space-size=4096 app.js

# Restart automatically when memory gets too high (using PM2)
pm2 start app.js --max-memory-restart 500M

These aren't fixes — they're bandaids. The leak still exists. But they keep your app running while you hunt it down.

Why AI-generated code leaks

AI writes code that works in the moment — for a single request, a single user, a single test. It doesn't think about what happens over thousands of requests across hours of uptime. Memory management is about the lifecycle of data, and AI doesn't reason about time. It adds event listeners without cleanup, creates caches without eviction, and opens connections without ensuring they close. Each one works perfectly once. Over time, they compound.

Can't find the leak?

MeatButton connects you with Node.js developers who know how to trace memory leaks in production apps. They can take heap snapshots, identify the offending code, and fix it without rebuilding your app. First one's free.

Get MeatButton