Executing Shell Commands in Node.js: Callbacks vs Promises vs Promisify

Handling Node shell commands can be tricky. Should you use Callbacks, manual Promises, or an abstraction layer like Promisify?

When your Node.js or Electron application needs to interact with the OS, it uses the child_process module. While spawn is better for large data streams, exec is the go-to for command-line utilities because it buffers the output and provides a simple interface.

However, how you consume that interface impacts your application's memory usage, error handling, and maintainability.

Let's look at the three primary ways to handle asynchronous execution.

The Low-Level Foundation: The Callback Pattern

The exec function spawns a shell and runs the command within that shell. Internally, Node.js allocates a buffer (defaulting to 1MB) to store the stdout and stderr results.

import { exec } from 'child_process'

const options = {
  cwd: '/path/to/repo', // Current Working Directory
  env: { ...process.env, GIT_TERMINAL_PROMPT: '0' }, // custom env vars
  timeout: 10000, // Kill process if it exceeds 10s
  maxBuffer: 1024 * 1024 * 10, // Extend buffer to 10MB
}

exec('git status --porcelain', options, (error, stdout, stderr) => {
  if (error) {
    // error.code: The exit code of the process (e.g., 128 for Git errors)
    // error.signal: The signal that terminated the process (e.g., SIGTERM)
    console.error(`Process exited with code ${error.code}`)
    return
  }
  // stdout and stderr are strings containing the full output
  console.log(stdout)
})

Memory Management: The callback only fires once the process is closed and the entire output is buffered. If a command outputs 2MB and your maxBuffer is 1MB, the callback fires with an ERR_CHILD_PROCESS_STDIO_MAXBUFFER error, and the process is killed.

Execution Context: This pattern is strictly "Push" based. You are handing control to the OS and waiting for a signal to return.

The Problem (Sequential Dependency): If you need to check git status, then git fetch, then git pull, each command must be launched from inside the previous callback. Beyond the readability cost of deep nesting, every closure retains the buffered output of the steps before it until the innermost callback returns, which can hold more memory than expected in long-running processes.

Using a Promise Wrapper

By wrapping exec in a new Promise, you transition the function from a Push-based callback to a Pull-based awaitable.

import { exec } from 'child_process'

function execAsyncManual(command, options = {}) {
  return new Promise((resolve, reject) => {
    // Logic executed immediately upon Promise instantiation
    const child = exec(command, options, (error, stdout, stderr) => {
      if (error) {
        // Reject with a real Error instance, enriched with context
        // (rejecting with a plain object loses the stack trace)
        reject(Object.assign(new Error(error.message), {
          exitCode: error.code,
          command,
          stderr,
        }))
      } else {
        // Resolve only on successful exit (code 0)
        resolve({ stdout, stderr })
      }
    })

    // We can even attach listeners to the child process object if needed
    child.on('spawn', () => console.log(`PID ${child.pid} started.`))
  })
}

Control Flow: This enables try/catch blocks with async/await. Under the hood, the JavaScript engine transforms an async function into a state machine: instead of nesting closures, execution pauses at each await, letting the Event Loop process other tasks while waiting for the OS kernel to signal process completion.

Why Manual? Some CLI tools (like curl or ffmpeg) are "noisy." They might write progress logs to stderr. A manual wrapper lets you write logic like: if (stderr && !stderr.includes('Warning')) reject().

The Trap: If you forget to handle the reject, you will trigger an unhandledRejection event, which, since Node.js 15, crashes your entire process by default.

The Abstraction: Promisify

Node's util.promisify checks for a custom implementation attached via the util.promisify.custom symbol (which child_process.exec provides); for ordinary functions, it falls back to assuming a standard error-first callback as the last argument.

import { exec } from 'child_process'
import { promisify } from 'util'

const execAsync = promisify(exec)

async function gitWorkflow() {
  try {
    // Destructuring the returned object
    const { stdout, stderr } = await execAsync('git pull origin main')

    // Technical Note: If stderr is not empty, it doesn't mean the command failed.
    // promisify only rejects if the error argument in the callback is truthy.
  } catch (err) {
    // err is an instance of Error but with extra properties:
    // err.stdout, err.stderr, err.code, err.signal, err.cmd
    console.error(`Command [${err.cmd}] failed with: ${err.stderr}`)
  }
}

Standardization: Thanks to exec's custom promisify implementation, the Promise always resolves to an object with { stdout, stderr } keys. This is critical for team-based projects where custom wrappers might return different formats.

Microtask Queue: Using await places the continuation of your function into the Microtask Queue. This is higher priority than the Macrotask Queue (where setTimeout lives), meaning your code resumes almost instantly after the OS process finishes.

Limitations: In a plain await flow you only get the final result. The ChildProcess object (the PID, the .stdin stream, etc.) is still reachable, but only via the child property of the returned Promise, which means you must capture the Promise before awaiting it.

One critical pitfall deserves its own section: Shell Injection.

When you use exec, Node.js calls /bin/sh -c [command] (on Unix) or cmd.exe /d /s /c [command] (on Windows). The shell is an interpreter.

If your code looks like this:

execAsync(`cat ${userInput}`)

And a user provides:

"important.txt; rm -rf /"

The shell interprets the ; as a command separator. It finishes the cat and then immediately starts the rm.

Never pass raw input to exec. Use a strict Whitelisting Regex. If you need to handle complex arguments with spaces and special characters, consider using child_process.spawn instead, which passes arguments as an array and does not spawn a shell by default, bypassing the injection risk entirely.


For 90% of use cases in Electron or Backend services, promisify(exec) is the superior choice for maintainability. However, if you are building a tool that requires real-time monitoring of a process (like a progress bar based on PID stats), the Manual Wrapper or raw Callbacks are necessary to maintain a handle on the ChildProcess instance.


