
Angular With NodeJS Image Upload To AWS S3 - EASY!!

Pato ・ Updated ・ 5 min read

Have you always wondered how to upload an image in your Angular app to your S3 Bucket on Amazon?

I'm going to teach you a very easy way to do it using Angular, NodeJS, and AWS S3, without going in depth into IAM and other security details that would make this tutorial even longer. If you want to learn more about those, please do some extra research. Remember: for a real project, NEVER use your root credentials.

Prerequisites

-Angular CLI installed
-NodeJS installed
-An AWS account (don't worry, you won't be charged for anything we do in this tutorial)
-Some basic understanding of AWS, Angular, and NodeJS


AWS Keys

1) Go to your AWS Console.
2) In the top right, click your account name, then click My Security Credentials.

3) A modal will appear, click on "Continue To Security Credentials".

4) Click on Access Keys.
5) Click on Create New Access Key.

6) Copy/Paste your Keys in a SECURE PLACE.


S3 Bucket

1) Navigate to the AWS S3 service.
2) Create a new S3 bucket.

3) Give your bucket a name and click NEXT and NEXT again.

4) Uncheck the "Block all public access" checkbox and click NEXT.

5) Click on Create Bucket.

Angular App

1) Clone the following REPO:

Note: Inside the folder FINAL, you will find the complete code of this project. I have set up the architecture of this super tiny app so you don't waste time doing so.

2) Go to your app.component.html and paste the following code:

Note: Don't forget to change the URL of the image source to use your bucket name.

   <input (change)="onImagePicked($event)" placeholder="Upload Image"
   type="file" />
   <button (click)="onImageUpload()">Upload Image</button>

    <div *ngIf="imageUrl">
     Preview Image from AWS
     <br />
     <img width="200px" src="https://YOUR-S3-BUCKET.s3.amazonaws.com/{{ imageUrl }}" />
    </div>

3) Go to your app.component.ts and paste the following line at the top of the file:

   import { ImageUploadService } from './image-upload.service';

4) Go to your app.component.ts and paste the following code:

   imageObj: File;
   imageUrl: string;

   constructor(private imageUploadService: ImageUploadService) {}

   onImagePicked(event: Event): void {
    const FILE = (event.target as HTMLInputElement).files[0];
    this.imageObj = FILE;
   }

   onImageUpload() {
    const imageForm = new FormData();
    imageForm.append('image', this.imageObj);
    this.imageUploadService.imageUpload(imageForm).subscribe(res => {
      this.imageUrl = res['image'];
    });
   }
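The string 'image' passed to imageForm.append must match the field name the backend expects in upload.array('image', 1). Here is a quick sketch of that contract in plain JavaScript (Node 18+ ships FormData and Blob globally; the file contents below are fake, just for illustration):

```javascript
// Mirrors what the Angular component builds before posting to the backend.
const imageForm = new FormData();
const fakeFile = new Blob(['fake image bytes'], { type: 'image/png' });

// 'image' must match the field name in upload.array('image', 1) on the server.
imageForm.append('image', fakeFile, 'cat.png');

console.log(imageForm.has('image')); // true
```

If the two names drift apart, multer rejects the request with an "Unexpected field" error, so it's worth keeping them in one shared constant in a larger app.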

5) Go to your image-upload.service.ts and paste the following code:

   imageUpload(imageForm: FormData) {
    console.log('image uploading');
    return this.http.post('http://localhost:3000/api/v1/upload', imageForm);
   }

NodeJS Server

1) Go to the folder called BACKEND

Note: In your terminal, run the following to install the packages needed for this project.

   npm i --save multer multer-s3 aws-sdk dotenv nodemon

-multer and multer-s3 are the packages that handle the picture uploading.
-aws-sdk gives us access to the AWS libraries.
-dotenv gives us access to the environment variables.

FYI: The nodemon package is only there so you don't have to restart the server manually every time you make a change. It is not needed for uploading images to S3.
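The repo's package.json presumably wires npm start to nodemon; a typical script block would look like this (the script and file names below are an assumption, not copied from the repo):

```json
{
  "scripts": {
    "start": "nodemon server.js"
  }
}
```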

2) Go to your .env file and enter the keys we generated from AWS (the ones you saved in a secure place).
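The variable names must match what file-upload.js reads from process.env. A .env file along these lines should work (placeholder values; never commit real keys to version control):

```
ACCESS_KEY_ID=YOUR_ACCESS_KEY_ID
SECRET_ACCESS_KEY=YOUR_SECRET_ACCESS_KEY
```

dotenv loads these into process.env when the server starts.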

3) Inside of server.js, paste the following code:

   /* Where 'image' is the name of the property sent from Angular via the FormData and 1 is the max number of files to upload */
   app.post('/api/v1/upload', upload.array('image', 1), (req, res) => {
    /* This will be the response sent from the backend to the frontend */
    res.send({ image: req.file });
   });
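Worth noting: with upload.array, multer normally puts the uploaded files on req.files, not req.file; this handler can return req.file only because the key function in file-upload.js assigns the generated key to it. The request/response contract can be sketched with hypothetical stand-ins for req and res (these are not real Express objects):

```javascript
// Mirrors the route body above; req and res are hypothetical stand-ins.
function uploadHandler(req, res) {
  res.send({ image: req.file });
}

// Simulate a request after the multer-s3 key function has set req.file:
const req = { file: '1700000000000cat.png' };
let sent;
const res = { send: (body) => { sent = body; } };
uploadHandler(req, res);

console.log(sent.image); // '1700000000000cat.png'
```

The Angular component reads res['image'] from this response and appends it to the bucket URL to build the preview image source.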

4) Inside of the file-upload.js paste the following code:

Note: Don't forget to change the region and S3 bucket name in the following code.

   const aws = require('aws-sdk');
   const multer = require('multer');
   const multerS3 = require('multer-s3');
   const dotenv = require('dotenv');

   dotenv.config();

   aws.config.update({
    secretAccessKey: process.env.SECRET_ACCESS_KEY,
    accessKeyId: process.env.ACCESS_KEY_ID,
    region: 'YOUR AWS REGION' // E.g. us-east-1
   });

   const s3 = new aws.S3();

   /* In case you want to validate your file type */
   const fileFilter = (req, file, cb) => {
    if (file.mimetype === 'image/jpeg' || file.mimetype === 'image/png') {
     cb(null, true);
    } else {
     cb(new Error('Wrong file type, only upload JPEG and/or PNG!'), false);
    }
   };

   const upload = multer({
    fileFilter: fileFilter,
    storage: multerS3({
     acl: 'public-read',
     s3: s3,
     bucket: 'YOUR S3 BUCKET NAME',
     key: function(req, file, cb) {
      /* I'm using Date.now() to make sure my file has a unique name */
      req.file = Date.now() + file.originalname;
      cb(null, Date.now() + file.originalname);
     }
    })
   });

   module.exports = upload;
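The two plain functions inside file-upload.js, the mimetype filter and the key generator, can be exercised without multer or S3 at all. A minimal sketch (makeKey is a hypothetical helper name mirroring the inline key function above):

```javascript
// Same logic as the fileFilter in file-upload.js: accept JPEG/PNG, reject others.
const fileFilter = (req, file, cb) => {
  if (file.mimetype === 'image/jpeg' || file.mimetype === 'image/png') {
    cb(null, true);
  } else {
    cb(new Error('Wrong file type, only upload JPEG and/or PNG!'), false);
  }
};

// Hypothetical helper mirroring the inline key function: a timestamp prefix
// keeps repeated uploads of the same filename from overwriting each other.
function makeKey(originalname, now = Date.now()) {
  return now + originalname; // number + string coerces to a string
}

fileFilter(null, { mimetype: 'image/png' }, (err, accepted) => {
  console.log(err, accepted); // null true
});
console.log(makeKey('cat.png', 1700000000000)); // '1700000000000cat.png'
```

Date.now() only has millisecond resolution, so two simultaneous uploads of the same filename could still collide; a UUID would be safer in production.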

5) Lastly, in your server.js file, add the following line at the top of the file:

   const upload = require('./middleware/file-upload');

Time to test our app

1) Navigate to your BACKEND folder and run the following command in your terminal to start your backend server:

   npm start

2) Navigate to your Angular app folder and run the following command in your terminal to start your frontend:

   ng serve --open

Note: Make sure your backend and frontend servers are running.

3) In the browser where your Angular app is running, upload an image. You should see something like the following:

Google Developer Expert on Angular and Web Technologies | Auth0 Ambassador | Media Developer Expert for Cloudinary | Technical Coach at SpringBoard



Did you really allow "Everyone" to write to your S3 bucket? You should use your AWS credentials for authorization, not allow everyone full access.


Yeah I did it on purpose.

1) AWS allows public access to your S3 bucket; that's why the option exists. But you have to be careful depending on the app you are working on and what you want people to have access to. If you really care about security then yes, use authorization; if you don't, then authorization is irrelevant, as in this tutorial.

2) I added an extra step just in case people care about having authorization when writing objects into the S3 bucket.


I think public write access should not be endorsed, especially when it’s unnecessary because you are authenticating in the backend. I fear some people will just follow the example without thinking about it.

Maybe I can do another tutorial that goes more in depth on security etc., but I want to keep this one as short as possible. As I mentioned at the beginning of the tutorial, basic knowledge of NodeJS and AWS is a requirement. I have erased those steps :) Thanks for the feedback

I think the article was otherwise great, looking forward to more :)

I appreciate your feedback. Feel free to let me know if you see anything else that looks sketchy or wrong lol


This is one huge security trap. Using root security credentials, completely open buckets, cloning random services, then giving them access to those root credentials.

Stay well clear.


Thanks for the feedback, I have erased those steps. But just to clarify, with the word "trap" you are making it seem like I want to trick people into giving me their credentials, which is not true. FYI, they are not cloning any random service. They are cloning the repo with the structure of the project so they don't have to build it themselves, since I assume they already know some NodeJS and Angular, as I specified at the beginning of the tutorial.


It was maybe worded a bit harsh, sorry. Thanks for fixing it.

Hmmm, looking closer, it's still using root credentials. You really should at least put a big "never ever, ever do this, use roles" warning in there (or better, show the steps).

I'm gonna add that to the note I have at the top: "...without going in depth into IAM and other security details that would make this tutorial even longer. If you want to learn more about those, please do some extra research." Thanks for the feedback


Perfect timing according to my task list. Thank you for this article!