bulk-insert

A writable stream that batches.

Example

const AWS = require('aws-sdk') // used for the Kinesis client in the flush example below
const createBulk = require('bulk-insert')

const kinesis = new AWS.Kinesis()

const onError = (err) => {
  if (err) console.error(err.stack || err)
}

const writable = createBulk({
  limit: 500, // maximum # of documents to insert at one time
  interval: '0.5s', // minimum interval between flushes
  onError,
  flush (data) {
    // `data` will be an array
    kinesis.putRecords({
      Records: data.map((x) => ({
        Data: JSON.stringify(x),
        PartitionKey: 'some_key'
      })),
      StreamName: 'some_stream_name'
    }, onError)
  }
})

writable.write({
  some: 'data'
})

writable.write({
  some: 'more data'
})
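
When no more writes are coming, close the stream so anything still buffered is flushed and the process can exit (see the API below):

writable.close()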

API

const writable = bulkInsert(options)

Options:
  • limit<Integer> (default: 1000) - the maximum number of documents to insert at one time
  • interval<Number|String> (default: '300ms') - the minimum interval between flushes
  • onError<Function> - an optional function called with any flush() errors
  • flush<Function> - a function with the signature (data<Array>) => {}
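
A minimal sketch using the defaults above (the flush body is a placeholder you would replace with your own persistence logic):

const bulkInsert = require('bulk-insert')

const writable = bulkInsert({
  limit: 1000,       // flush once 1000 documents are buffered
  interval: '300ms', // or the number 300; wait at least 300ms between flushes
  onError: (err) => { if (err) console.error(err.stack || err) },
  flush (data) {
    // `data` is an array of everything written since the last flush
    console.log('flushing %d documents', data.length)
  }
})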

writable.write(data)

Write data to the stream. Writes are buffered and handed to flush() in batches of at most limit documents, no more often than every interval.

writable.flush()

Flush all the data immediately.
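
For example, to push a pending batch through without waiting for limit or interval to trigger it:

writable.write({ some: 'data' })
writable.flush() // the buffered documents are passed to flush(data) immediately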

writable.close()

Flushes immediately and unrefs all future timers, allowing the process to exit gracefully. You can still write to the stream after closing it, but you should not.
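
A common pattern (a sketch; the choice of signal is an assumption, not part of this module) is to close the stream on shutdown so buffered documents are not lost:

process.on('SIGTERM', () => {
  writable.close() // flush whatever is buffered and let the process exit
})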