s3-write-stream #

Stability: experimental.
Pipe data straight to an S3 key of your choice.
This is a writable stream that takes data and uploads it to Amazon S3 using its multipart upload API. It's ideal for handling generated content when you don't know the content's length ahead of time, and saves you from file system hacks or buffering everything in memory before the upload.
Internally, errors are retried with a Fibonacci backoff, smoothing over the stray failed requests that tend to tear apart long-running S3 uploads.
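Concretely, a Fibonacci backoff spaces retries out according to the Fibonacci sequence, so delays grow quickly but less aggressively than exponential backoff. A minimal sketch of the idea (illustrative only — the module's actual retry timing, caps, and jitter may differ):

```js
// Compute the first `retries` backoff delays, scaled by `baseMs`.
function fibonacciDelays(retries, baseMs) {
  var delays = []
  var a = 1
  var b = 1
  for (var i = 0; i < retries; i++) {
    delays.push(a * baseMs)
    var next = a + b
    a = b
    b = next
  }
  return delays
}

console.log(fibonacciDelays(5, 100)) // → [ 100, 100, 200, 300, 500 ]
```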
The end result is that uploading files to S3 is as simple as this:

```js
var fs = require('fs')

var upload = require('s3-write-stream')({
    accessKeyId: process.env.AWS_ACCESS_KEY
  , secretAccessKey: process.env.AWS_SECRET_KEY
  , Bucket: 'photo-album'
})

fs.createReadStream(__dirname + '/photo_001.jpg')
  .pipe(upload('photo_001.jpg'))
```

Usage ##


createStream = require('s3-write-stream')(opts) ###

Initiates the s3-write-stream module with your AWS configuration. The following properties are required:
  • opts.accessKeyId: your AWS access key id.
  • opts.secretAccessKey: your AWS secret access key.

It's also recommended that you include opts.Bucket to define the default S3 bucket you want to upload to.

createStream(key|opts) ###

Creates and returns a writable stream that you can pipe data into. You can either:
  • pass the upload's key as a string to determine the location you
want to upload to. By default, files uploaded this way will be public.
  • pass in an opts object, whose properties are passed on to the
initial upload call via aws-sdk.
Note that if you haven't already specified a default bucket, you'll need to do so here, and hence will need to use the opts form.
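For the opts form, a sketch of what the per-upload parameters might look like. The property names follow aws-sdk's S3 multipart upload conventions; the key, ACL, and ContentType values here are illustrative, not required:

```js
// Per-upload parameters, passed on to aws-sdk by createStream(opts).
// Bucket is required here if no default bucket was configured at init.
var params = {
  Bucket: 'photo-album',
  Key: 'photos/photo_002.jpg',
  ACL: 'private',           // string-key uploads default to public
  ContentType: 'image/jpeg'
}

// var upload = createStream(params)
// fs.createReadStream('photo_002.jpg').pipe(upload)
```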

License ##

MIT. See LICENSE.md for details.