Amazon S3, or Amazon Simple Storage Service, is a service offered by Amazon Web Services (AWS) that provides object storage through a web service interface. Amazon S3 uses the same scalable storage infrastructure that Amazon.com uses to run its global e-commerce network.
The scope of this article is to learn how to upload files to S3 using Node.js.
Github Repo
Prerequisites
- Node.js
- Express.js
- Multer
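Assuming a fresh Node.js project, the dependencies can be installed with npm (`aws-sdk` here refers to the v2 SDK used later in this article):

```shell
# install the web framework, the multipart parser, and the AWS SDK v2
npm install express multer aws-sdk
```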
As you can see in the diagram above, the browser initiates the request by selecting a file from the local file system and sends it to the server. The server receives the file and forwards it to the AWS S3 bucket. Once S3 processes the request, it sends the status back to the server, and the server relays that status to the browser.
Node.js Server
I am going to use express.Router()
; if you're not familiar with it, you can refer here. To receive the file at the Node.js
server, I'll use Multer
let multer = require('multer');
let upload = multer({ dest: 'kyc/' });
const S3 = require('./S3');

router.post("/upload", upload.single('user__profile__image'), async (req, res) => {
  try {
    let docLoc = await S3.uploadDoc(req.file, 'client/profile', req.user.cid);
    res.send(docLoc);
  } catch (e) {
    console.log(e);
    res.status(500).send(e);
  }
});
You can see I have defined an /upload
route to upload the file, and S3.uploadDoc
is a function that uploads the given file to S3. Once S3.uploadDoc
executes without any error, it resolves with docLoc
, the upload result containing the location of the file inside the S3 bucket.
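For context, Multer attaches the uploaded file's metadata to req.file; with the disk storage configured above it looks roughly like this (every value below is an illustrative placeholder, not real data):

```javascript
// Rough shape of req.file when using multer({ dest: 'kyc/' });
// all values are hypothetical placeholders
const file = {
  fieldname: "user__profile__image", // must match upload.single(...)
  originalname: "avatar.jpg",        // file name on the user's machine
  mimetype: "image/jpeg",
  destination: "kyc/",               // the dest directory passed to multer
  filename: "d7c1e2",                // random name multer assigns on disk
  path: "kyc/d7c1e2",                // what we later pass to fs.createReadStream
  size: 24681,                       // size in bytes
};

console.log(file.path); // → "kyc/d7c1e2"
```

Note that originalname keeps the user's file name, while path points at the randomly named copy Multer wrote to disk; S3.uploadDoc uses both.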
S3.uploadDoc
Now, let's build the function to upload the file into the S3 bucket. I'll use aws-sdk
to work with AWS
services. There is a sequence of steps involved in uploading the file into the S3 bucket:
- update the AWS config
- get the s3 bucket
- create uploadParams and select the location
- createReadStream to read the file
- upload the file
- return the result
update AWS config
AWS.config.update({
  region: process.env.REGION,
  accessKeyId: process.env.ACCESS_KEY,
  secretAccessKey: process.env.SECRET_KEY,
});
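The config above reads the credentials from environment variables. A minimal way to set them before starting the server might look like this (the variable names REGION, ACCESS_KEY, and SECRET_KEY are the ones this article's code expects; the values are placeholders):

```shell
# placeholder values; substitute your own region and IAM credentials
export REGION="us-east-1"
export ACCESS_KEY="YOUR_ACCESS_KEY_ID"
export SECRET_KEY="YOUR_SECRET_ACCESS_KEY"
```

Keeping credentials out of the source code this way means they never end up in version control.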
get s3 bucket
Select the S3 bucket where you want to upload the file.
const s3 = new AWS.S3({
  apiVersion: "2006-03-01",
  params: { Bucket: "resources-logistics" },
});
create uploadParams and select location
const uploadParams = {
  // the directory where the file will be uploaded
  Bucket: `resources-logistics/profile`,
  Key: "",
  Body: "",
};
createReadStream to read the file
// doc.path is the file path in your local file system
let fileStream = fs.createReadStream(doc.path);
Full code
const AWS = require("aws-sdk");
const fs = require("fs");

function uploadDoc(doc, preset, cid) {
  let originalname = doc.originalname.split(".");
  // get the file extension, like jpg, png, etc.
  const docExt = originalname[originalname.length - 1];
  // update AWS config
  AWS.config.update({
    region: process.env.REGION,
    accessKeyId: process.env.ACCESS_KEY,
    secretAccessKey: process.env.SECRET_KEY,
  });
  const s3 = new AWS.S3({
    apiVersion: "2006-03-01",
    params: { Bucket: "resources-logistics" },
  });
  return new Promise((resolve, reject) => {
    const uploadParams = {
      Bucket: `resources-logistics/${preset}`,
      Key: "",
      Body: "",
    };
    // this fileStream will stream the file from the local file system to S3
    let fileStream = fs.createReadStream(doc.path);
    // if there is any error while reading the file from the local file system
    // or while streaming it, reject the promise with the err object
    fileStream.on("error", function (err) {
      reject(err);
    });
    // set the body as a stream
    uploadParams.Body = fileStream;
    // for Key, check the docs
    uploadParams.Key = cid
      ? `${cid}${originalname
          .slice(0, originalname.length - 1)
          .join("-")}.${docExt}`
      : doc.originalname;
    // upload the file
    s3.upload(uploadParams, function (err, data) {
      if (err) {
        reject(err);
      }
      if (data) {
        resolve(data);
      }
    });
  });
}

module.exports = {
  uploadDoc,
};
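The ternary expression that builds uploadParams.Key is dense, so here is the same naming logic pulled out into a standalone helper for illustration (buildKey is a hypothetical name, not something the module above exports):

```javascript
// Same Key-naming logic as in uploadDoc, isolated for illustration.
// buildKey is a hypothetical helper, not part of the module above.
function buildKey(originalFilename, cid) {
  const parts = originalFilename.split(".");
  const ext = parts[parts.length - 1];
  // with a cid: "<cid><name parts joined by '-'>.<ext>"
  // without a cid: the original file name unchanged
  return cid
    ? `${cid}${parts.slice(0, parts.length - 1).join("-")}.${ext}`
    : originalFilename;
}

console.log(buildKey("profile.photo.jpg", "42")); // → "42profile-photo.jpg"
console.log(buildKey("avatar.png", null));        // → "avatar.png"
```

Prefixing the Key with the client's cid keeps one client's uploads from overwriting another's file of the same name inside the shared bucket prefix.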