Are you looking for an API to upload and download binary files to AWS S3 with the AWS SDK v3 and Node.js? Look no further! In this step-by-step guide, we will walk you through the process of achieving this seamlessly with a Codehooks.io API and easy deployment to the cloud.
The complete source code is available on GitHub.
Set Up AWS S3
To begin, make sure you have an AWS account and an S3 bucket created. Note down the bucket name and region for future reference.
Then get your secure access credentials to use with the AWS S3 API; this excellent Medium guide from @shamnad.p.s shows how.
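The access key you create does not need full S3 rights. As a minimal sketch, an IAM policy scoped to reading and writing objects in a single bucket could look like this (the bucket name YOUR_BUCKET is a placeholder; adjust to yours):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET/*"
    }
  ]
}
```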
Create a Codehooks.io project and install required packages
If you're new to Codehooks.io, you can read the getting started guide here.
Install the official Codehooks command line interface (CLI) first.
npm install codehooks -g
Then, after logging in, create a new project. In this example we'll name our project s3project, but choose any project name you prefer.
coho create s3project
cd s3project
In your project directory, initialise the project and install the necessary packages by running the following commands:
npm init es6 -y
npm install codehooks-js
npm install @aws-sdk/client-s3
Writing the Code
Now, let's dive into the code that handles uploading files to and downloading files from AWS S3.
Import the required packages:
import { app } from 'codehooks-js'
import { S3Client, GetObjectCommand, PutObjectCommand } from '@aws-sdk/client-s3'
import { PassThrough } from 'stream'
Configure the AWS SDK with your AWS credentials and region:
Use the CLI or the Admin UI to add your secret environment variables. Note the --encrypted flag, which prevents the actual values of the variables from being read back in plain text.
coho set-env AWS_ACCESS_KEY_ID 'YOUR_KEY' --encrypted
coho set-env AWS_SECRET_ACCESS_KEY 'YOUR_SECRET' --encrypted
coho set-env AWS_BUCKET 'YOUR_BUCKET' --encrypted
coho set-env AWS_REGION 'YOUR_REGION' --encrypted
Use the CLI command coho info to inspect and verify that the variables are set correctly.
With the necessary secrets in place, the application can read them as regular process.env values.
const { AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION, AWS_BUCKET } = process.env;
const s3config = {
  region: AWS_REGION,
  credentials: {
    accessKeyId: AWS_ACCESS_KEY_ID,
    secretAccessKey: AWS_SECRET_ACCESS_KEY
  }
}
const s3client = new S3Client(s3config);
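If one of these variables is missing, the v3 SDK may silently fall back to its default credential provider chain rather than fail. A small guard like the following (not part of the original example; the variable names match the coho set-env calls above) fails fast at startup instead:

```javascript
// Fail fast if any required variable is missing from an env object
function requireEnv(names, env = process.env) {
  const missing = names.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
  return names.map((name) => env[name]);
}

// Example with a stand-in env object instead of the real process.env
const [keyId, region] = requireEnv(
  ['AWS_ACCESS_KEY_ID', 'AWS_REGION'],
  { AWS_ACCESS_KEY_ID: 'AKIA-EXAMPLE', AWS_REGION: 'eu-west-1' }
);
console.log(region); // 'eu-west-1'
```

Calling requireEnv with all four variable names at startup turns a misconfigured deployment into an immediate, readable error instead of a confusing S3 failure later.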
The next step is to create an API that processes file uploads from any client and stores the files in an AWS S3 bucket.
API to Upload a binary file to AWS S3:
The API to upload a data stream to AWS S3 uses the PutObjectCommand with a stream that we can pipe data from the client into.
// API to POST a binary data stream
app.post('/upload/single', async (req, res) => {
  try {
    // get size, type and filename from destructured header values
    const { 'content-length': ContentLength, 'content-type': ContentType, filename } = req.headers;
    const input = {
      "Bucket": AWS_BUCKET,
      "Key": `tmp/${filename}`, // emulate a file system path: /bucketname/tmp/filename
      "Body": new PassThrough(), // stream to pipe data through
      "ContentLength": parseInt(ContentLength, 10), // header values arrive as strings
      "ContentType": ContentType
    };
    // pipe binary request data to S3 stream
    req.pipe(input.Body);
    // create put command
    const command = new PutObjectCommand(input);
    // execute put object command
    const response = await s3client.send(command);
    // return data to client
    res.json(response);
  } catch (error) {
    // an error occurred, return 400 status to client
    res.status(400).end(error.message)
  }
})
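The PassThrough in the handler above is just a relay: whatever the client streams into the request body flows out the other end unchanged, and the SDK reads it as the object body. The pattern can be sketched locally without S3 (the stream content here is made up):

```javascript
import { PassThrough, Readable } from 'stream';

// Collect everything flowing out of a stream into a string
async function collect(stream) {
  const chunks = [];
  for await (const chunk of stream) {
    chunks.push(chunk);
  }
  return Buffer.concat(chunks).toString('utf8');
}

// Simulate the request body being piped through a PassThrough,
// just like req.pipe(input.Body) in the upload handler
const body = new PassThrough();
Readable.from([Buffer.from('binary '), Buffer.from('payload')]).pipe(body);

const received = await collect(body);
console.log(received); // 'binary payload'
```

Because PassThrough buffers nothing beyond its internal high-water mark, large uploads stream through with constant memory use instead of being held in RAM.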
Let's test the upload API with curl by uploading a test file from the local filesystem. Use coho info --examples to inspect your project API endpoint. In this example it's: https://s3project-kvqx.api.codehooks.io/dev/upload/single
curl --location 'https://s3project-kvqx.api.codehooks.io/dev/upload/single' \
--header 'x-apikey: XXXXX' \
--header 'filename: einstein.jpg' \
--header 'Content-Type: image/jpeg' \
--data '@./Einstein_tongue.jpg'
API to Download a binary file from AWS S3:
The API to download a data stream from AWS S3 uses the GetObjectCommand with a stream that can pipe data back to the client.
// API to GET a binary data stream from AWS S3
app.get('/download/:file', async (req, res) => {
  try {
    // filename from route
    const { file } = req.params;
    const input = {
      "Bucket": AWS_BUCKET,
      "Key": `tmp/${decodeURI(file)}` // decode filename; objects live under /bucket/tmp/file
    };
    // Create get command
    const command = new GetObjectCommand(input);
    // Send get command
    const response = await s3client.send(command);
    // set content-type
    res.set('content-type', response.ContentType)
    // stream data back to client
    response.Body.pipe(res.writable)
  } catch (error) {
    // an error occurred, return 400 status to client
    res.status(400).end(error.message)
  }
})
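Because the file name travels as a URL path segment, clients should percent-encode it, and the route decodes it again, as the handler above does with decodeURI. A quick illustration of the round trip (the file name is just an example):

```javascript
// A file name with a space must be percent-encoded in the URL path
const filename = 'Einstein tongue.jpg';
const encoded = encodeURIComponent(filename);
console.log(encoded); // 'Einstein%20tongue.jpg'

// The download route reverses this before building the S3 key
const key = `tmp/${decodeURI(encoded)}`;
console.log(key); // 'tmp/Einstein tongue.jpg'
```

Note that decodeURI leaves reserved escapes such as %2F untouched; pairing encodeURIComponent on the client with decodeURIComponent on the server is a stricter alternative.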
Easy deployment to the cloud:
After finishing the code for the AWS S3 API, we simply deploy the application to the cloud with this command:
coho deploy
The command should output the following on success.
Project: s3project-kvqx Space: dev
Deployed Codehook successfully! 🙌
Now that we understand the basic operations required to upload and download binary files to AWS S3, let's go ahead and create a simple web client that uses the API.
Happy days 🙌
Client web app to test the API
Let's create a simple (and ugly) web page to upload and download a binary file to AWS S3 through the Codehooks API.
<html>
  <body>
    <h2>AWS S3 upload and download API demo</h2>
    <label> upload binary file to read here </label>
    <input type="file" id="inputFile">
    <div id="mydiv"></div>
    <script src="test.js"></script>
  </body>
</html>
Screenshot of the ugly web page:
Note that the web page refers to a JavaScript file, test.js, which contains the following code.
/*
* Web client to upload a binary file to a Codehooks.io AWS S3 API
*/
var URL = 'https://<YOUR_PROJECT_ID_HERE>.api.codehooks.io';
var APIKEY = 'YOUR_API_TOKEN_HERE';
async function uploadFile(file, cb) {
  var myHeaders = new Headers();
  myHeaders.append("x-apikey", APIKEY);
  myHeaders.append("filename", file.name);
  myHeaders.append("content-type", file.type);
  myHeaders.append("content-length", file.size); // browsers may override this based on the actual body
  var requestOptions = {
    method: 'POST',
    headers: myHeaders,
    body: file, // file stream
    redirect: 'follow'
  };
  try {
    var response = await fetch(`${URL}/dev/upload/single`, requestOptions)
    console.log(response.status, response.statusText);
    var result = await response.text();
    console.log(result);
    cb(result);
  } catch (error) {
    console.error(error)
  }
}
// HTML5 file event
function fileChange(theEvent) {
  var theFile = theEvent.target.files[0]; // target the loaded file
  uploadFile(theFile, (result) => {
    var mydiv = document.getElementById("mydiv"); // matches <div id="mydiv"> in the page
    var aTag = document.createElement('a');
    aTag.setAttribute('href', `${URL}/dev/download/${encodeURIComponent(theFile.name)}`);
    aTag.setAttribute('target', '_blank');
    aTag.innerText = theFile.name;
    mydiv.appendChild(aTag);
    mydiv.appendChild(document.createElement("br"));
    if (theFile.type.startsWith('image')) {
      var imgTag = document.createElement('img');
      imgTag.setAttribute('src', `${URL}/dev/download/${encodeURIComponent(theFile.name)}`);
      imgTag.setAttribute('width', '200px');
      mydiv.appendChild(imgTag);
      mydiv.appendChild(document.createElement("br"));
    }
  });
}
// get html element for file: <input type="file" id="inputFile">
document.getElementById("inputFile").addEventListener("change", fileChange, false); // listener for file upload button
You'll find the complete example source code for AWS S3 upload and download of binary files on our GitHub account.
Remember to replace AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION and AWS_BUCKET with your actual AWS credentials, bucket and region.
If you have read this far and even tried out the example code in your own project, you're all set to conquer the world with apps and APIs that use the power of AWS S3 for binary file persistence and much more.
Happy coding!