Commonly used stream patterns for the Web Streams API and Node.js streams.
If you're iterating over an array more than once, it's time to use streams.
| Package | Streams |
| ------- | ------- |
| `@datastream/core` | `pipeline`, `pipejoin`, `streamToArray`, `streamToString`, `isReadable`, `isWritable`, `makeOptions`, `createReadableStream`, `createTransformStream`, `createWritableStream` |
- Readable: The start of a pipeline of streams that injects data into a stream.
- PassThrough: Does not modify the data, but listens to the data and prepares a result that can be retrieved.
- Transform: Modifies data as it passes through.
- Writable: The end of a pipeline of streams that stores data from the stream.
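These four roles map directly onto the native Web Streams API (global in Node.js 18+). A minimal sketch, with illustrative stand-ins rather than actual @datastream exports:

```javascript
// Readable: injects data into the pipeline.
const readable = new ReadableStream({
  start (controller) {
    for (const chunk of ['a', 'b', 'c']) controller.enqueue(chunk)
    controller.close()
  }
})

// PassThrough: forwards chunks unchanged while accumulating a side result.
let count = 0
const passThrough = new TransformStream({
  transform (chunk, controller) {
    count += 1
    controller.enqueue(chunk)
  }
})

// Transform: modifies each chunk as it passes through.
const transform = new TransformStream({
  transform (chunk, controller) {
    controller.enqueue(chunk.toUpperCase())
  }
})

// Writable: terminates the pipeline and stores the data.
const stored = []
const writable = new WritableStream({
  write (chunk) {
    stored.push(chunk)
  }
})

// stored ends up as ['A', 'B', 'C'] and count as 3 once `done` resolves.
const done = readable
  .pipeThrough(passThrough)
  .pipeThrough(transform)
  .pipeTo(writable)
```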
| Package | Streams |
| ------- | ------- |
| `@datastream/string` | `stringReadableStream` [Readable], `stringLengthStream` [PassThrough], `stringOutputStream` [PassThrough] |
| `@datastream/object` | `objectReadableStream` [Readable], `objectCountStream` [PassThrough], `objectBatchStream` [Transform], `objectOutputStream` [PassThrough] |
| `@datastream/fetch` | `fetchResponseStream` [Readable] |
| `@datastream/charset[/{detect,decode,encode}]` | `charsetDetectStream` [PassThrough], `charsetDecodeStream` [Transform], `charsetEncodeStream` [Transform] |
| `@datastream/compression[/{gzip,deflate}]` | `gzipCompressionStream` [Transform], `gzipDecompressionStream` [Transform], `deflateCompressionStream` [Transform], `deflateDecompressionStream` [Transform] |
| `@datastream/digest` | `digestStream` [PassThrough] |
| `@datastream/csv[/{parse,format}]` | `csvParseStream` [Transform], `csvFormatStream` [Transform] |
| `@datastream/validate` | `validateStream` [Transform] |
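As a rough illustration of how the `@datastream/core` helpers compose, here is a hypothetical sketch of what `pipejoin` and `streamToArray` might look like over native Web Streams (this is an assumption about their behavior, not the library's actual implementation):

```javascript
// Hypothetical sketches of pipejoin/streamToArray over native Web Streams
// (Node.js 18+); not the library's actual implementation.
const pipejoin = (streams) =>
  streams.reduce((readable, next) => readable.pipeThrough(next))

const streamToArray = async (stream) => {
  const chunks = []
  for await (const chunk of stream) chunks.push(chunk)
  return chunks
}

// Usage: join a source with a transform, then collect the output.
const source = new ReadableStream({
  start (controller) {
    for (const chunk of ['x', 'y']) controller.enqueue(chunk)
    controller.close()
  }
})
const upper = new TransformStream({
  transform (chunk, controller) { controller.enqueue(chunk.toUpperCase()) }
})

const result = streamToArray(pipejoin([source, upper]))
// result resolves to ['X', 'Y']
```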
```shell
npm install @datastream/core @datastream/{module}
```

```mermaid
stateDiagram-v2
  [*] --> fileRead*: path
  [*] --> fetchResponse: URL
  [*] --> sqlCopyTo*: SQL
  [*] --> stringReadable: string
  [*] --> stringReadable: string[]
  [*] --> objectReadable: object[]
  [*] --> createReadable: blob
  readable --> charsetDetect: binary
  charsetDetect --> [*]
  readable --> decryption
  decryption --> passThroughBuffer: buffer
  readable --> decompression
  decompression --> passThroughBuffer: buffer
  passThroughBuffer --> charsetDecode: buffer
  charsetDecode --> passThroughString: string
  passThroughString --> parse: string
  parse --> validate: object
  validate --> passThroughObject: object
  passThroughObject --> transform: object
  transform --> format: object
  format --> charsetEncode: string
  charsetEncode --> compression: buffer
  compression --> writable: buffer
  charsetEncode --> encryption: buffer
  encryption --> writable: buffer
  state readable {
    fileRead*
    fetchResponse
    sqlCopyTo*
    createReadable
    stringReadable
    objectReadable
    awsS3Get
    awsDynamoDBQuery
    awsDynamoDBScan
    awsDynamoDBGet
  }
  state decompression {
    brotliDecompression
    gzipDecompression
    deflateDecompression
    zstdDecompression*
    protobufDecompression*
  }
  state decryption {
    decryption*
  }
  state parse {
    csvParse
    jsonParse*
    xmlParse*
  }
  state passThroughBuffer {
    digest
  }
  state passThroughString {
    stringLength
    stringOutput
  }
  state passThroughObject {
    objectCount
    objectOutput
  }
  state transform {
    objectBatch
    objectPivotLongToWide
    objectPivotWideToLong
    objectKeyValue
  }
  state format {
    csvFormat
    jsonFormat*
    xmlFormat*
  }
  state compression {
    brotliCompression
    gzipCompression
    deflateCompression
    zstdCompression*
    protobufCompression*
  }
  state encryption {
    encryption*
  }
  state writable {
    fileWrite*
    fetchRequest*
    sqlCopyFrom*
    awsS3Put
    awsDynamoDBPut
    awsDynamoDBDelete
  }
  writable --> [*]
```

\* possible future package
Read a CSV file, validate the structure, pivot data, then save compressed.
- fs.createReadStream
- gzip
- cryptoDigest
- charsetDecode
- csvParse
- countChunks
- validate
- changeCase (pascal to snake)
- parquet?
- csvFormat
- postgresCopyFrom
- cryptoDigest
- charsetDetect
- jsonParse?
- validate
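The `csvParse` step above can be sketched as a toy line-splitting Transform over native Web Streams. This is illustrative only: `@datastream/csv` handles quoting, escapes, and charsets that this sketch does not.

```javascript
// Toy CSV-parsing Transform: splits incoming text into rows keyed by the
// header line. Illustrative only; not @datastream/csv's implementation.
const csvParseStream = () => {
  let header
  let buffered = ''
  return new TransformStream({
    transform (chunk, controller) {
      buffered += chunk
      const lines = buffered.split('\n')
      buffered = lines.pop() // keep the trailing partial line for next chunk
      for (const line of lines) {
        const cells = line.split(',')
        if (!header) { header = cells; continue }
        controller.enqueue(Object.fromEntries(header.map((key, i) => [key, cells[i]])))
      }
    },
    flush (controller) {
      if (buffered && header) {
        const cells = buffered.split(',')
        controller.enqueue(Object.fromEntries(header.map((key, i) => [key, cells[i]])))
      }
    }
  })
}

// Usage: note the second row is split across chunk boundaries.
const csvSource = new ReadableStream({
  start (controller) {
    controller.enqueue('id,name\n1,ada\n2,')
    controller.enqueue('lin\n')
    controller.close()
  }
})

const rows = (async () => {
  const out = []
  for await (const row of csvSource.pipeThrough(csvParseStream())) out.push(row)
  return out
})()
// rows resolves to [{ id: '1', name: 'ada' }, { id: '2', name: 'lin' }]
```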
Upload file with brotli compression?
Fetch protobuf file, decompress, parse JSON
- filter
- file (docs only?)
  - fetch
  - node:fs
  - input type=file
  - readable string/array/etc
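The last pattern, turning a plain string or array into a stream, can be sketched with a hypothetical `createReadableStream` helper over native Web Streams (the name is borrowed from `@datastream/core`; the real implementation may differ):

```javascript
// Hypothetical sketch of createReadableStream: normalize a string or an
// array of chunks into a ReadableStream (native Web Streams, Node.js 18+).
const createReadableStream = (input) => {
  const chunks = typeof input === 'string' ? [input] : input
  return new ReadableStream({
    start (controller) {
      for (const chunk of chunks) controller.enqueue(chunk)
      controller.close()
    }
  })
}

const fromString = createReadableStream('hello')
const fromArray = createReadableStream(['a', 'b'])
```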
Licensed under the MIT License. Copyright (c) 2026 Will Farrell and contributors.