Upload to AWS S3 Using Node.js
S3 is a serverless cloud object storage service provided by AWS. That means you can store any file in S3 without maintaining backend machines; managing those servers is AWS's responsibility. It is also very cheap; you can check the pricing for each region you plan to use. For example, as of 2020, in US East (Ohio) the S3 Standard class costs $0.023 per GB for the first 50 TB per month, and the price also changes depending on how frequently you access the files. If you just want to keep files in S3 for a long time for archival purposes, you can use S3 Glacier Deep Archive, which is $0.00099 per GB.
In this blog post, I'll show some sample code for uploading files to AWS S3 with Node.js.
Set Up an IAM User with S3 Permissions
The first thing to do before writing any Node.js code is to set up a user with S3 permissions so it can access your S3 bucket. I assume you already have an AWS account. To keep things simple you could create a user and attach the AmazonS3FullAccess managed policy for now, but for production you should always grant the least privileges needed. Also, if you will be running the Node.js code on EC2 or Lambda, you should create an IAM role and attach it to the EC2 instance or Lambda function instead of creating a user.
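You can do all of this in the AWS console. Alternatively, as a rough sketch, the same aws-sdk package used below can create the user programmatically; the user name s3-upload-demo is just an example, and this assumes the credentials running the script are allowed to manage IAM:
const AWS = require("aws-sdk");
const iam = new AWS.IAM();

async function createS3User() {
  // Create a demo user and attach the AmazonS3FullAccess managed policy.
  await iam.createUser({ UserName: "s3-upload-demo" }).promise();
  await iam.attachUserPolicy({
    UserName: "s3-upload-demo",
    PolicyArn: "arn:aws:iam::aws:policy/AmazonS3FullAccess"
  }).promise();
  // The returned access key pair is what you plug into the S3 client below.
  const { AccessKey } = await iam.createAccessKey({ UserName: "s3-upload-demo" }).promise();
  console.log(AccessKey.AccessKeyId, AccessKey.SecretAccessKey);
}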
Required packages for the sample code
The following packages are used by the code snippets below. The aws-sdk (v2) and axios packages can be installed with npm install aws-sdk axios; fs is a built-in Node.js module, needed for the local-file example at the end.
const AWS = require("aws-sdk");
const axios = require("axios");
const fs = require("fs");
Update AWS Config and create S3 Instance
Here, you update the AWS config to tell the SDK which region your bucket is in, and pass the access key ID and secret access key of the IAM user to the S3 instance.
AWS.config.update({
  region: "us-west-2"
});
const s3 = new AWS.S3({
  accessKeyId: '{the access key id for the IAM user}',
  secretAccessKey: '{the secret access key for the IAM user}'
});
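Hard-coding keys works for a quick test, but for anything real you should keep them out of the source. A minimal sketch, assuming the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables are set (aws-sdk v2 also reads ~/.aws/credentials automatically):
// No credentials passed here: the SDK resolves them from environment variables
// or the shared credentials file at runtime.
const s3 = new AWS.S3({ region: "us-west-2" });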
Sample code to list all the buckets
s3.listBuckets((err, data) => {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data);
  }
});
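If you prefer promises over callbacks, aws-sdk v2 also lets you call .promise() on any request; for example, listing just the bucket names:
s3.listBuckets().promise()
  .then(data => console.log("Success", data.Buckets.map(b => b.Name)))
  .catch(err => console.log("Error", err));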
Sample code to upload a text file to S3
var textObjectParams = {Bucket: 'your bucket name', Key: '{the file name you want to create}.txt', Body: 'Yo! My first text!'};
s3.upload(textObjectParams, (err, data) => {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data);
  }
});
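On success, the data passed to the callback includes the object's URL in data.Location, along with Bucket, Key, and ETag. The same upload can also be written with async/await via .promise(); a sketch (the uploadText wrapper and hello.txt key are just illustrative names):
async function uploadText() {
  const params = {Bucket: 'your bucket name', Key: 'hello.txt', Body: 'Yo! My first text!'};
  const data = await s3.upload(params).promise();
  console.log("Uploaded to", data.Location);
}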
Sample code to upload a JSON file to S3
var sampleData = {
  name: 'sample',
  count: '999'
};
var jsonObjectParams = {Bucket: '{Your bucket name}', Key: 'jsonFile.json', Body: JSON.stringify(sampleData)};
s3.upload(jsonObjectParams, (err, data) => {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data);
  }
});
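Without a ContentType, the object is typically stored with a generic binary content type. If other clients will read this file directly, it can help to set the MIME type explicitly; a sketch of the same params with ContentType added:
var jsonObjectParams = {
  Bucket: '{Your bucket name}',
  Key: 'jsonFile.json',
  Body: JSON.stringify(sampleData),
  ContentType: 'application/json' // lets clients interpret the object as JSON
};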
Sample code to upload an image from a URL to S3
The following example uses Axios to get the image file as a stream and then uploads it to S3.
axios({
  method: 'GET',
  url: 'https://picsum.photos/536/354',
  responseType: 'stream'
}).then(res => {
  var objectParams = {Bucket: '{Your bucket name}', Key: 'imageFile.png', Body: res.data};
  s3.upload(objectParams, (err, data) => {
    if (err) {
      console.log("Error", err);
    } else {
      console.log("Success", data);
    }
  });
});
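One caveat: picsum.photos serves JPEGs by default, so the .png key above is only a label. If you want the stored object to carry the right MIME type, you can forward the Content-Type reported by the source; a sketch:
axios({
  method: 'GET',
  url: 'https://picsum.photos/536/354',
  responseType: 'stream'
}).then(res => {
  var objectParams = {
    Bucket: '{Your bucket name}',
    Key: 'imageFile.jpg',
    Body: res.data,
    ContentType: res.headers['content-type'] // e.g. image/jpeg
  };
  return s3.upload(objectParams).promise();
}).then(data => console.log("Success", data.Location))
  .catch(err => console.log("Error", err));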
Sample code to upload an image file from local disk
const content = fs.readFileSync('yourImage.png'); // reads the whole file into memory as a Buffer
var objectParams = {Bucket: '{Your bucket name}', Key: 'new image file name.png', Body: content};
s3.upload(objectParams, (err, data) => {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data);
  }
});
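For large files, a read stream avoids loading the whole file into memory; a sketch of the same upload using fs.createReadStream:
const stream = fs.createReadStream('yourImage.png');
var streamParams = {Bucket: '{Your bucket name}', Key: 'new image file name.png', Body: stream};
s3.upload(streamParams, (err, data) => {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data.Location);
  }
});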
Above are a few examples showing how to list buckets and upload different kinds of files to AWS S3 using Node.js.