
Understanding Streams and Pipelines in Node.js

Roopsagar K

Full-stack Developer | Freelancer | Building AI-powered SaaS Solutions

Sep 14, 2025 | 3 min read


When I first started working with Node.js, I mostly used methods like fs.readFile or fs.writeFile to handle files. They work fine for small files, but as soon as the files get larger, these methods become memory-hungry: the entire file has to be loaded into memory before any processing can happen.

That’s when I discovered Streams.

What are Streams?

Streams are a way of handling data piece by piece, rather than all at once.

Think of a stream like a water pipe — data flows through it in chunks.
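
You can watch the chunks arrive by listening to a readable stream's data events. Here's a minimal sketch (the 64 KB highWaterMark just makes the chunk size explicit; it's also the default for file streams):

import fs from "fs";

const stream = fs.createReadStream("largefile.txt", { highWaterMark: 64 * 1024 });

stream.on("data", (chunk) => {
    console.log(`Received a chunk of ${chunk.length} bytes`);
});
stream.on("end", () => console.log("Finished reading"));
stream.on("error", (err) => console.error(err));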

Node.js has four main types of streams:

  • Readable: Data can be read (e.g., fs.createReadStream)

  • Writable: Data can be written (e.g., fs.createWriteStream)

  • Duplex: Both readable and writable (e.g., network sockets)

  • Transform: Data can be modified as it’s read/written (e.g., compression); a small sketch follows this list
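
To make the Transform type concrete, here's a minimal sketch of a custom transform that upper-cases text as it flows through (input.txt and output.txt are placeholder file names):

import { Transform } from "stream";
import fs from "fs";

// A Transform stream receives a chunk, modifies it, and passes it on
const upperCase = new Transform({
    transform(chunk, encoding, callback) {
        callback(null, chunk.toString().toUpperCase());
    },
});

fs.createReadStream("input.txt")
    .pipe(upperCase)
    .pipe(fs.createWriteStream("output.txt"));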

Example 1: File Streaming with Express

Instead of loading a whole file into memory, we can send it chunk by chunk to the client:

import express from "express";
import fs from "fs";
import path from "path";

const app = express();

app.get("/download", (req, res) => {
    const filePath = path.join("./", "largefile.txt");
    const stream = fs.createReadStream(filePath);
    // Pipe file directly to HTTP response
    stream.pipe(res);
    stream.on("error", (err) => {
        console.error(err);
        // If streaming has already started we can't send a status code,
        // so only respond with a 500 when no data has been sent yet
        if (!res.headersSent) {
            res.status(500).send("Error reading file");
        } else {
            res.end();
        }
    });
});

app.listen(3000, () => {
    console.log("Server running at http://localhost:3000");
});

Here, the client starts receiving data immediately instead of waiting for the entire file. This makes large downloads fast and memory-efficient.
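
A nice property of pipe is that it also handles backpressure: if the client consumes data slowly, the file stream is paused until the response buffer drains, so memory use stays bounded. Conceptually, pipe does something like this for you (a simplified sketch, not the real implementation):

// Roughly what stream.pipe(res) manages for you
stream.on("data", (chunk) => {
    // write() returns false when the destination's internal buffer is full
    if (!res.write(chunk)) {
        stream.pause();
        // resume reading once the destination has drained
        res.once("drain", () => stream.resume());
    }
});
stream.on("end", () => res.end());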

Example 2: Compression with Pipelines

Node.js also provides the stream.pipeline utility, which makes chaining multiple streams easier and safer: if any stream in the chain errors, pipeline destroys all of them and surfaces the error, which plain .pipe() does not do for you.

import { pipeline } from "stream";
import { promisify } from "util";
import fs from "fs";
import zlib from "zlib";

const pipe = promisify(pipeline);

async function compressFile() {
    // Read -> gzip -> write, streamed chunk by chunk so the whole
    // file is never held in memory
    await pipe(
        fs.createReadStream("largefile.txt"),
        zlib.createGzip(),
        fs.createWriteStream("largefile.txt.gz")
    );
    console.log("File compressed successfully!");
}

compressFile().catch(console.error);

Here, we’re reading a file, compressing it on the fly, and writing the compressed version — all in a streaming fashion.
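
On Node 15 and later you can skip the promisify step entirely, because a promise-based pipeline ships in the built-in stream/promises module:

import { pipeline } from "stream/promises";
import fs from "fs";
import zlib from "zlib";

// Same compression as above, using the promise-based pipeline
await pipeline(
    fs.createReadStream("largefile.txt"),
    zlib.createGzip(),
    fs.createWriteStream("largefile.txt.gz")
);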

⚡ Why Streams Matter

  • Efficiency: Handle gigabytes of data without exhausting memory.

  • Speed: Start processing or sending data immediately.

  • Flexibility: Easy to chain transformations (e.g., compression, encryption); a sketch follows below.
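
For instance, because Node's crypto cipher objects are themselves Transform streams, compression and encryption can be chained in a single pipeline. A rough sketch, with a throwaway key and IV purely for illustration (a real app would derive and store these securely):

import { pipeline } from "stream";
import fs from "fs";
import zlib from "zlib";
import crypto from "crypto";

// Illustrative only: never generate-and-forget keys in real code
const key = crypto.randomBytes(32);
const iv = crypto.randomBytes(16);

pipeline(
    fs.createReadStream("largefile.txt"),
    zlib.createGzip(),
    crypto.createCipheriv("aes-256-ctr", key, iv),
    fs.createWriteStream("largefile.txt.gz.enc"),
    (err) => {
        if (err) console.error("Pipeline failed:", err);
        else console.log("File compressed and encrypted!");
    }
);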

What’s Next?

Today, I learned how powerful streams are for file operations and response handling in Node.js. Next, I want to explore streaming APIs and how streams integrate with WebSockets or real-time apps.

If you’re still using fs.readFile / fs.writeFile for everything, try switching to streams. They’ll make your Node.js apps much more scalable and efficient.
