willfarrell/datastream

<datastream>

Commonly used stream patterns for the Web Streams API and NodeJS Streams.

If you're iterating over an array more than once, it's time to use streams.



  • @datastream/core
    • pipeline
    • pipejoin
    • streamToArray
    • streamToString
    • isReadable
    • isWritable
    • makeOptions
    • createReadableStream
    • createTransformStream
    • createWritableStream

Streams

  • Readable: The start of a pipeline of streams that injects data into a stream.
  • PassThrough: Does not modify the data, but listens to the data and prepares a result that can be retrieved.
  • Transform: Modifies data as it passes through.
  • Writable: The end of a pipeline of streams that stores data from the stream.
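The four roles can be sketched with the standard Web Streams API (the names below are illustrative, not @datastream exports):

```javascript
// Readable: injects data into the pipeline.
const readable = new ReadableStream({
  start (controller) {
    controller.enqueue('hello')
    controller.enqueue('world')
    controller.close()
  }
})

// Transform: modifies each chunk as it passes through.
const uppercase = new TransformStream({
  transform (chunk, controller) {
    controller.enqueue(chunk.toUpperCase())
  }
})

// PassThrough: forwards chunks unmodified while accumulating a result.
let totalLength = 0
const stringLength = new TransformStream({
  transform (chunk, controller) {
    totalLength += chunk.length
    controller.enqueue(chunk)
  }
})

// Writable: terminates the pipeline and stores the data.
const chunks = []
const writable = new WritableStream({
  write (chunk) {
    chunks.push(chunk)
  }
})

const done = readable
  .pipeThrough(uppercase)
  .pipeThrough(stringLength)
  .pipeTo(writable)
// After `done` resolves: chunks = ['HELLO', 'WORLD'], totalLength = 10
```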

Basics

  • @datastream/string
    • stringReadableStream [Readable]
    • stringLengthStream [PassThrough]
    • stringOutputStream [PassThrough]
  • @datastream/object
    • objectReadableStream [Readable]
    • objectCountStream [PassThrough]
    • objectBatchStream [Transform]
    • objectOutputStream [PassThrough]

Common

Advanced

Setup

npm install @datastream/core @datastream/{module}

Flows

stateDiagram-v2

    [*] --> fileRead*: path
    [*] --> fetchResponse: URL
    [*] --> sqlCopyTo*: SQL
    [*] --> stringReadable: string
    [*] --> stringReadable: string[]
    [*] --> objectReadable: object[]
    [*] --> createReadable: blob

    readable --> charsetDetect: binary
    charsetDetect --> [*]

    readable --> decryption
    decryption --> passThroughBuffer: buffer

    readable --> decompression
    decompression --> passThroughBuffer: buffer
    passThroughBuffer --> charsetDecode: buffer
    charsetDecode --> passThroughString: string
    passThroughString --> parse: string
    parse --> validate: object
    validate --> passThroughObject: object
    passThroughObject --> transform: object

    transform --> format: object
    format --> charsetEncode: string
    charsetEncode --> compression: buffer
    compression --> writable: buffer

    charsetEncode --> encryption: buffer
    encryption --> writable: buffer

    state readable {
        fileRead*
        fetchResponse
        sqlCopyTo*
        createReadable
        stringReadable
        objectReadable
        awsS3Get
        awsDynamoDBQuery
        awsDynamoDBScan
        awsDynamoDBGet
    }

    state decompression {
        brotliDecompression
        gzipDecompression
        deflateDecompression
        zstdDecompression*
        protobufDecompression*
    }

    state decryption {
      decryption*
    }

    state parse {
      csvParse
      jsonParse*
      xmlParse*
    }

    state passThroughBuffer {
      digest
    }

    state passThroughString {
      stringLength
      stringOutput
    }

    state passThroughObject {
      objectCount
      objectOutput
    }

    state transform {
      objectBatch
      objectPivotLongToWide
      objectPivotWideToLong
      objectKeyValue
    }

    state format {
      csvFormat
      jsonFormat*
      xmlFormat*
    }

    state compression {
        brotliCompression
        gzipCompression
        deflateCompression
        zstdCompression*
        protobufCompression*
    }

    state encryption {
      encryption*
    }

    state writable {
        fileWrite*
        fetchRequest*
        sqlCopyFrom*
        awsS3Put
        awsDynamoDBPut
        awsDynamoDBDelete
    }
    writable --> [*]

* possible future package

Write your own

Readable

NodeJS Streams
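For example, a minimal object-mode Readable built on node:stream (names here are illustrative):

```javascript
import { Readable } from 'node:stream'

// Push a fixed set of values, then signal end-of-stream with null.
function createNumberReadable () {
  const values = [1, 2, 3]
  return new Readable({
    objectMode: true,
    read () {
      this.push(values.length ? values.shift() : null)
    }
  })
}
```

Consume it with `for await (const value of createNumberReadable()) { … }`.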

Web Streams API
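The same idea with a Web Streams `ReadableStream`, wrapping any synchronous iterable (an illustrative helper, not a @datastream export):

```javascript
// Enqueue one value per pull; close when the iterable is exhausted.
function createIterableReadable (iterable) {
  const iterator = iterable[Symbol.iterator]()
  return new ReadableStream({
    pull (controller) {
      const { value, done } = iterator.next()
      if (done) controller.close()
      else controller.enqueue(value)
    }
  })
}
```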

Transform

NodeJS Streams
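A minimal node:stream Transform, uppercasing each chunk (an illustrative sketch):

```javascript
import { Transform } from 'node:stream'

// Pass each chunk to the callback after modifying it.
function createUppercaseTransform () {
  return new Transform({
    objectMode: true,
    transform (chunk, encoding, callback) {
      callback(null, String(chunk).toUpperCase())
    }
  })
}
```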

Web Streams API
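The equivalent with a Web Streams `TransformStream` (illustrative):

```javascript
// Enqueue the modified chunk onto the readable side.
function createUppercaseStream () {
  return new TransformStream({
    transform (chunk, controller) {
      controller.enqueue(String(chunk).toUpperCase())
    }
  })
}
```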

Writable

NodeJS Streams
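A minimal node:stream Writable that collects chunks into an array (illustrative):

```javascript
import { Writable } from 'node:stream'

// Invoke the callback once each chunk has been stored.
function createCollectWritable (sink) {
  return new Writable({
    objectMode: true,
    write (chunk, encoding, callback) {
      sink.push(chunk)
      callback()
    }
  })
}
```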

Web Streams API
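The same collector as a Web Streams `WritableStream` (illustrative):

```javascript
// write() may return a promise for backpressure; here it stores synchronously.
function createCollectWritableStream (sink) {
  return new WritableStream({
    write (chunk) {
      sink.push(chunk)
    }
  })
}
```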

End-to-End Examples

NodeJS: Import CSV into SQL database

Read a CSV file, validate its structure, pivot the data, then copy it into the database compressed.

  • fs.createReadStream
  • gzip
  • cryptoDigest
  • charsetDecode
  • csvParse
  • countChunks
  • validate
  • changeCase (pascal to snake)
  • parquet?
  • csvFormat
  • postgresCopyFrom
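The shape of that pipeline can be sketched with plain node:stream primitives. The CSV handling below is deliberately naive (no quoting support), and the stage names mirror the list above rather than real @datastream exports:

```javascript
import { Transform } from 'node:stream'
import { pipeline } from 'node:stream/promises'
import { createGzip } from 'node:zlib'

// Naive CSV row parser: splits each line on commas (no quoting support).
const csvParse = () => new Transform({
  objectMode: true,
  transform (line, encoding, callback) {
    callback(null, String(line).split(','))
  }
})

// Validate: fail the pipeline on rows with the wrong column count.
const validate = (columns) => new Transform({
  objectMode: true,
  transform (row, encoding, callback) {
    if (row.length !== columns) callback(new Error(`expected ${columns} columns`))
    else callback(null, row)
  }
})

// Format rows back to CSV text before compressing.
const csvFormat = () => new Transform({
  objectMode: true,
  transform (row, encoding, callback) {
    callback(null, row.join(',') + '\n')
  }
})

// Wire the stages together. The caller supplies the ends, e.g. a
// fs.createReadStream(path) split into lines, and fs.createWriteStream(out).
async function importCsv (lineSource, destination, columns) {
  await pipeline(lineSource, csvParse(), validate(columns), csvFormat(), createGzip(), destination)
}
```

A real import would swap the naive stages for a proper CSV parser and a database COPY sink.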

WebWorker: Validate and collect metadata about file prior to upload

  • cryptoDigest
  • charsetDetect
  • jsonParse?
  • validate

WebWorker: Upload file compressed

Upload file with brotli compression?

WebWorker: Decompress protobuf compressed JSON requests

Fetch a protobuf file, decompress it, then parse the JSON.
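A hedged sketch of the decompression half using the standard `DecompressionStream` (gzip stands in for the wire format here, and the URL is hypothetical):

```javascript
// Fetch a gzip-compressed JSON payload, decompress it in-stream, then parse.
// The endpoint and response shape are hypothetical.
async function fetchCompressedJson (url) {
  const response = await fetch(url)
  const decompressed = response.body.pipeThrough(new DecompressionStream('gzip'))
  return JSON.parse(await new Response(decompressed).text())
}
```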

streams

  • filter

  • file (docs only?)

examples

  • fetch
  • node:fs
  • input type=file
  • readable string/array/etc

License

Licensed under the MIT License. Copyright (c) 2026 Will Farrell and contributors.
