Matt Morgan

Mocking AWS with Jest (and TypeScript)

This seems like a very punny subject, mocking the world's leading cloud provider. I tried to think of a few chuckles, but then I got to thinking about writing blog posts for free to promote the projects of companies that are worth hundreds of billions of dollars, and maybe the joke is just on me?

Anyway, no need to think on that one too much. I like working with these tools a lot and I like sharing the ways I've found to use them together. Some readers may think Jest is a lot like some of the other JavaScript/NodeJS testing frameworks, but it has a few features that I really think set it apart from anything else I've used. One of them is that it ships with a very powerful mocking capability. That is the subject of this article. The other features I like about Jest are snapshots and test tables, but they will need to be covered another time.


The Road to Mocking

I've written a lot of tests in my time. I believe strongly in automated testing as a practice to keep systems stable and to enable innovation, refactoring and even to automate maintenance updates. The most important thing I've learned about testing is that it is a skill that requires some work as well as a smart approach.

Several years ago, I worked on a team that wrote "unit tests" in Java that mostly ran against a development copy of a relational database (Oracle, to make matters worse). If the tests couldn't negotiate a connection to that database, they would fail. Whenever the database schema changed, some tests would probably need to be updated, or they'd fail. Sometimes tests would fail because a row was deleted or added or because time had passed (how dare it!).

These brittle tests were more trouble than they were worth! In an effort to get some value out of unit testing, I struggled with some of the mocking libraries that were available at the time, such as the ironically-named EasyMock and the diminutive (???) Mockito. I did manage to write a few decent tests, but they took a long time to write and the return on investment just wasn't there to mock everything. As far as I know, that system still exists and still works but I'd be surprised if the unit tests are providing much value.

I'm happy to say I never again wrote a "unit test" against a live database after that. I did spend a bunch of time figuring out how to use Docker to build out a development environment and run tests against a pre-seeded database. Putting a full database (including tables and data) into a Docker image is actually a fantastic way to go if you are using RDBMS. I know that some say "mock the database and test the business layer", but that won't catch my app spitting out invalid or nonsensical SQL. Anyway, that has worked well for me and putting that Dockerized database into a CI pipeline has also produced good results for me.

As I started to get more into AWS, I reached for something else in that familiar pattern: Docker images that would give me a "good enough" implementation so I could develop offline. So I used localstack. If you just need a couple of AWS services, like an S3 bucket, localstack can work out very well. But if you're really going cloud native? It's miserable. There's basically no documentation. Apparently I'm to assume everything works just like the real AWS, except of course that isn't true and the gaps are for me to figure out. The biggest frustration I had with localstack wasn't running an AWS service itself, it was the effort of putting that service into some state where I could reasonably run a unit test against it. For example, if you want a test to run that involves an S3 bucket, you might think you can just create the bucket as part of a docker-compose up, but localstack doesn't really give you a hook to do things like that, so you wind up with race conditions and "flaky" tests. I actually did get this to work, but much like my Mockito tests, it was too much effort.

That's the key thing. It's not enough to have the perfect mock. It must be easy to set up and use. If it isn't, the ROI on its use will be too low. Eventually tests break, get skipped or become permanent TODOs. Testing should be a natural part of the development flow, not a big pile of extra work to do at the end of a feature implementation.

A good test setup is an amazing thing. I know that not everyone shares this view, but I like to see 100% unit test coverage on the projects I work on. I want to see unit tests on the simple bits, because I know they won't always be simple. Something is difficult to test? That's likely a code smell or maybe we haven't figured out a good way to manage the dependency in test. If we resolve those issues, getting the test coverage in isn't so hard and shouldn't take so long.
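
If you want the tooling to hold the line on that goal, Jest can enforce it for you. Here's a minimal sketch of a jest.config.ts coverage threshold (the option names are standard Jest config, but any real project's config will carry more settings than this):

import type { Config } from '@jest/types';

const config: Config.InitialOptions = {
  // Fail the run if coverage drops below 100% anywhere.
  collectCoverage: true,
  coverageThreshold: {
    global: { branches: 100, functions: 100, lines: 100, statements: 100 },
  },
};

export default config;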

So that brings me to Jest and AWS. The latest project I've worked on has been all Lambda, API Gateway, DynamoDB and other services. As I was ramping up for this project, I spent a lot of time thinking about developer workflows and testing. I spent a lot of time seeing how far I could go with SAM and even looked at focusing on the Docker angle of SAM.

In the end, we decided on a more cloud-native approach. We are using CDK without any local execution environment at all. We use unit tests and Jest mocks to write the code, then deploy to a developer-namespaced stack in our development account for integration testing. When we consider a feature complete, we open a pull request, get a review and eventually merge into the main branch, at which point continuous integration builds the main branch in our dev account, fires off some additional tests, then ships to a higher environment.
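
To make "developer-namespaced stack" concrete, here's a minimal sketch of what the CDK entry point can look like. PetsStack and DEV_NAMESPACE are illustrative names, and I'm assuming CDK v2 (aws-cdk-lib) here:

import { App } from 'aws-cdk-lib';
import { PetsStack } from '../lib/pets-stack'; // hypothetical stack construct

const app = new App();

// Each developer deploys an isolated copy of the stack into the shared dev account,
// e.g. DEV_NAMESPACE=matt produces "pets-service-matt".
const namespace = process.env.DEV_NAMESPACE ?? 'dev';
new PetsStack(app, `pets-service-${namespace}`);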

This is actually working really well for us because of the approach we use to mock AWS. The mocks we're using are extremely light and require very little maintenance or work to set up. This lets us focus on our custom code.

Mocking AWS

If you never mocked anything with Jest because you find the documentation too confusing, you likely aren't alone. I suppose it's challenging to be a maintainer of Jest, attempting to be the premier JavaScript testing tool while supporting both pre- and post-ES6 code, ESM, TypeScript, JSX, Babel, NodeJS, etc. All that stuff doesn't make it easy to find your way as a consumer of said documentation. Additionally, there are some of what I consider to be traps! Jest mentions a DynamoDB mocking library right in their docs. So that's the thing, right?
(Screenshot: the suggested DynamoDB mocking library's requirements, including aws-sdk and a Java runtime.)
Well, now we know why they call it Jest, I guess, because you HAVE GOT TO BE KIDDING ME! The aws-sdk dependency makes sense, but it runs on Java? No way am I adding Java as a dependency just to run a unit test. And even if this did work well for me, what happens when I want to include SQS or S3 or any other AWS service and all I have is a very specific DynamoDB mock? No, I need a way to mock the whole thing.

The good stuff in the Jest documentation is the part on Manual Mocks. My advice is to ignore all the material on ES6 Class Mocks, as it will only draw you away from the right way to do this: put the modules you want to mock under __mocks__ in the root directory (adjacent to node_modules) of your project. If you read the docs, you are basically going to provide your own mock version of whatever the module is, which sounds like a lot of work until you consider jest.fn().
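
If you haven't leaned on jest.fn() before, here's the whole trick in miniature (nothing AWS-specific yet, just a throwaway function):

// jest.fn() creates a function that records its calls and lets you script its return value.
const greet = jest.fn().mockReturnValue('hello');

test('jest.fn() records calls and fakes return values', () => {
  expect(greet('Matt')).toBe('hello');
  expect(greet).toHaveBeenCalledWith('Matt');
  expect(greet).toHaveBeenCalledTimes(1);
});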

DynamoDB Mock

Okay, enough talk. Here's how I mock AWS using Jest. Let's start with some code that updates an item in DynamoDB. All code samples are available.

import { DynamoDB } from 'aws-sdk';

const db = new DynamoDB.DocumentClient();

interface Pet {
  legCount: number;
  likesIceCream: boolean;
  name: string;
}

export const savePet = async (tableName: string, pet: Pet): Promise<void> => {
  await db
    .put({
      TableName: tableName,
      Item: {
        PK: pet.name,
        ...pet,
      },
    })
    .promise();
};

The way "Manual Mocks" work in jest is that imports will look for modules in a __mocks__ directory before they go to the regular node_modules source, so effectively I can intercept aws-sdk with a copy of my own. This works by comparing the import path so when I import from aws-sdk if I have __mocks__/aws-sdk.ts, that will intercept my import and replace the module with my mock.

Now you may be thinking that my plan to rewrite all of AWS SDK as a mock doesn't sound so lightweight after all, but that's where Jest really shines. I'm going to be able to provide only the bits I need while ignoring all the internals. Here's a basic mock that can be used with the code above.

export const awsSdkPromiseResponse = jest.fn().mockReturnValue(Promise.resolve(true));

const putFn = jest.fn().mockImplementation(() => ({ promise: awsSdkPromiseResponse }));

class DocumentClient {
  put = putFn;
}

export const DynamoDB = {
  DocumentClient,
};

I'm using DocumentClient in my code, so that's what the mock sdk will need to expose. Even though DynamoDB itself is a class in the sdk, here I'm just pulling a static class off of it, so this will work. I'm only calling one method on DocumentClient, so that's the only mock I need to provide for now.

What about the function implementation? If you look at my code, I'm calling the put method and then promise() on the object it returns, so that's just what my mock does. It returns an object with a promise method on it (just as the real sdk does) and my code calls that method, which is another mock that returns a promise resolving to the boolean true.
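
To see that chain in isolation, here's what a single call looks like against the mock (a throwaway illustration rather than part of the sample project):

import { DynamoDB } from '../__mocks__/aws-sdk';

const db = new DynamoDB.DocumentClient();

test('put().promise() resolves with the mock default', async () => {
  const result = await db.put({ TableName: 'Pets', Item: { PK: 'Fluffy' } }).promise();
  // awsSdkPromiseResponse defaults to Promise.resolve(true), so that's what comes back.
  expect(result).toBe(true);
});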

Putting all that together, I can now write a unit test that looks like this.

import { DynamoDB } from '../__mocks__/aws-sdk';
import { savePet } from './savePet';

const db = new DynamoDB.DocumentClient();

describe('savePet method', () => {
  test('Save Fluffy', async () => {
    const fluffy = { legCount: 4, likesIceCream: true, name: 'Fluffy', PK: 'Fluffy' };
    await savePet('Pets', fluffy);
    expect(db.put).toHaveBeenCalledWith({ TableName: 'Pets', Item: fluffy });
  });
});

Note that it is not necessary to explicitly mock the sdk or import my mock. The only reason I did that is to be able to use toHaveBeenCalledWith in my test.
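
To illustrate, a hypothetical test that doesn't assert on the SDK call needs no mock wiring at all:

import { savePet } from './savePet';

test('savePet resolves without touching real AWS', async () => {
  // Jest picks up __mocks__/aws-sdk.ts automatically, so no real network call is made.
  await expect(savePet('Pets', { legCount: 4, likesIceCream: true, name: 'Fluffy' })).resolves.toBeUndefined();
});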

Import Paths

Some developers have the practice of not importing the entire sdk but just individual clients. This can lead to smaller Lambda sizes if you use any kind of bundling and tree-shaking such as webpack or parcel. I'm aware you can avoid bundling aws-sdk entirely by setting it as an external, but some benchmarks have shown that to be a worse practice performance-wise. In any case, suit yourself, but I like importing just the clients as it makes my code feel cleaner and makes the individual mocks smaller.

So here's the same code refactored to import only individual clients.

The implementation:

import { DocumentClient } from 'aws-sdk/clients/dynamodb';

const db = new DocumentClient();

interface Pet {
  legCount: number;
  likesIceCream: boolean;
  name: string;
}

export const savePet = async (tableName: string, pet: Pet): Promise<void> => {
  await db
    .put({
      TableName: tableName,
      Item: {
        PK: pet.name,
        ...pet,
      },
    })
    .promise();
};

The mock (now in __mocks__/aws-sdk/clients/dynamodb.ts):

export const awsSdkPromiseResponse = jest.fn().mockReturnValue(Promise.resolve(true));

const putFn = jest.fn().mockImplementation(() => ({ promise: awsSdkPromiseResponse }));

export class DocumentClient {
  put = putFn;
}

And finally the test:

import { DocumentClient } from '../__mocks__/aws-sdk/clients/dynamodb';
import { savePet } from './savePet';

const db = new DocumentClient();

describe('savePet method', () => {
  test('Save Fluffy', async () => {
    const fluffy = { legCount: 4, likesIceCream: true, name: 'Fluffy', PK: 'Fluffy' };
    await savePet('Pets', fluffy);
    expect(db.put).toHaveBeenCalledWith({ TableName: 'Pets', Item: fluffy });
  });
});

As you can see, nothing much has changed, so it's easy to choose the approach that works best for your project.

Returning Data

So far we have a pretty good way to just ignore the fact that DynamoDB does stuff that isn't too relevant to our code, but how can we reuse the same mock when we want to test a get request or otherwise inspect the return value from a call to an AWS service? That's where our friend awsSdkPromiseResponse comes into play. Because that is a Jest mock which is exported, we can alter the return value on the fly.

Let's take a get operation:

import { DocumentClient } from 'aws-sdk/clients/dynamodb';

const db = new DocumentClient();

interface Pet {
  legCount: number;
  likesIceCream: boolean;
  name: string;
}

export const getPet = async (tableName: string, petName: string): Promise<Pet> => {
  const response = await db.get({ TableName: tableName, Key: { PK: petName } }).promise();
  if (response.Item) {
    return <Pet>response.Item;
  } else {
    throw new Error(`Couldn't find ${petName}!`);
  }
};

(Note: please don't copy-paste your interfaces! This is just to make the examples clearer.)

Okay, so the table design is quite simple here with a PK that is the pet's name. If we pass along the right name, we can access that Pet item. If not, we get an error. Let's build out the mock a bit more to accommodate the new functionality.

export const awsSdkPromiseResponse = jest.fn().mockReturnValue(Promise.resolve(true));

const getFn = jest.fn().mockImplementation(() => ({ promise: awsSdkPromiseResponse }));

const putFn = jest.fn().mockImplementation(() => ({ promise: awsSdkPromiseResponse }));

export class DocumentClient {
  get = getFn;
  put = putFn;
}

I could even use exactly the same mock for both getFn and putFn, but doing that would make it a bit harder to test a workflow in which I was trying to count the number of gets vs. puts in a test. Again, this is a pretty basic design decision that you could pivot on without much trouble.
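
As a quick sketch of that counting scenario (using the same files as above), the separate mock functions keep the tallies independent:

import { DocumentClient } from '../__mocks__/aws-sdk/clients/dynamodb';
import { savePet } from './savePet';

const db = new DocumentClient();

test('two puts, no gets', async () => {
  const fluffy = { legCount: 4, likesIceCream: true, name: 'Fluffy' };
  await savePet('Pets', fluffy);
  await savePet('Pets', { ...fluffy, name: 'Rover' });
  // Because putFn and getFn are separate mocks, the counts don't bleed into each other.
  expect(db.put).toHaveBeenCalledTimes(2);
  expect(db.get).not.toHaveBeenCalled();
});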

So based on the above, I could write another test like this:

import { DocumentClient } from '../__mocks__/aws-sdk/clients/dynamodb';
import { getPet } from './getPet';

const db = new DocumentClient();

describe('getPet method', () => {
  test('Get Fluffy', async () => {
    await getPet('Pets', 'Fluffy');
    expect(db.get).toHaveBeenCalledWith({ TableName: 'Pets', Key: { PK: 'Fluffy' } });
  });
});

Of course there are two big problems with this test.

  1. I might actually have some code downstream that cares about this response and wants to do something with it - and I'm not getting that here.
  2. I'll hit my error condition every time because the mock isn't returning the expected type.

The way to fix that is to alter the value returned by our mock sdk response:

import { DocumentClient, awsSdkPromiseResponse } from '../__mocks__/aws-sdk/clients/dynamodb';
import { getPet } from './getPet';

const db = new DocumentClient();

describe('getPet method', () => {
  test('Get Fluffy', async () => {
    const fluffy = { legCount: 4, likesIceCream: true, name: 'Fluffy', PK: 'Fluffy' };
    awsSdkPromiseResponse.mockReturnValueOnce(Promise.resolve({ Item: fluffy }));

    const pet = await getPet('Pets', 'Fluffy');
    expect(db.get).toHaveBeenCalledWith({ TableName: 'Pets', Key: { PK: 'Fluffy' } });
    expect(pet).toEqual(fluffy);
  });
});

Using mockReturnValueOnce here gives me the response I'm expecting from the sdk at which point I can continue processing. Our tests are passing now! But our coverage has slipped because we aren't hitting the error condition.

Mocking Errors

This is so easy, it's basically cheating, since we already had an error above. We only need to put it into a test. We can use try and catch to surround a call that throws an error and then test the error response. It's a best practice to tell Jest how many assertions to expect when putting assertions inside catch blocks. Otherwise the code could NOT throw an error and the test might still pass.

  test(`Can't find Rover`, async () => {
    expect.assertions(1);
    try {
      await getPet('Pets', 'Rover');
    } catch (e) {
      expect(e.message).toBe(`Couldn't find Rover!`);
    }
  });
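
If you'd rather skip the try/catch plumbing, Jest's rejects matcher expresses the same assertion in a single line (equivalent to the test above):

  test(`Can't find Rover (rejects matcher)`, async () => {
    await expect(getPet('Pets', 'Rover')).rejects.toThrow(`Couldn't find Rover!`);
  });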

Let's try something a little harder. What if AWS is broken and we want to see what happens to our function? (Spoiler: it doesn't work.) Instead of having awsSdkPromiseResponse resolve to a value our code treats as an error, we can have it return a rejected promise.

  test(`DynamoDB doesn't work`, async () => {
    awsSdkPromiseResponse.mockReturnValueOnce(Promise.reject(new Error('some error')));
    expect.assertions(1);
    try {
      await getPet('Pets', 'Rover');
    } catch (e) {
      expect(e.message).toBe(`some error`);
    }
  });

(It's left as an exercise to the reader to decide what kind of errors to handle here).

Test Data Persistence

In short, we don't do that. Some other mocks and frameworks attempt to create a persistent data store and mimic a real database. To me, this is antithetical to a good unit test. In short order we'll have tests that rely on other tests to put data in a certain state and that is not a good place to end up. A good unit test is completely independent as well as deterministic. We can achieve that by not mocking a DynamoDB database but by mocking the API we use to communicate with it.
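
One small bit of hygiene that supports that independence: clear recorded calls between tests so an assertion like toHaveBeenCalledTimes never leaks state from a neighboring test. jest.clearAllMocks wipes call history but leaves the default resolved value on awsSdkPromiseResponse in place:

beforeEach(() => {
  // Reset call counts and recorded arguments; implementations and default return values survive.
  jest.clearAllMocks();
});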

TypeScript

If you aren't a fan of TypeScript, all of this could theoretically be done in JavaScript, but I'm not sure it's as solid an idea. One of the reasons this works well is that the DocumentClient has a pretty type-opinionated API. If I pass an invalid payload to my db.put call, it'll fail in linting and my IDE will warn me I'm writing invalid code. With tools like VSCode, you can get some of that benefit even without TypeScript, but I wouldn't want to try this without any type hints at all. It's too likely to put you in a world where all your code seems to work and your tests pass, but nothing works when you deploy it.
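
If you want the mock itself to benefit from those types, one option (a sketch, not something the sample code in this post does) is a type-only import from the real client inside the mock file. An import type is erased at compile time, so it doesn't undo the interception:

// __mocks__/aws-sdk/clients/dynamodb.ts
import type { DocumentClient as RealDocumentClient } from 'aws-sdk/clients/dynamodb';

export const awsSdkPromiseResponse = jest.fn().mockReturnValue(Promise.resolve(true));

// Typing the params keeps the mock honest about what the real put() accepts.
const putFn = jest
  .fn()
  .mockImplementation((_params: RealDocumentClient.PutItemInput) => ({ promise: awsSdkPromiseResponse }));

export class DocumentClient {
  put = putFn;
}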

Next Steps

There's a lot I left off here because I just wanted to focus on the Jest mocks. After trying a few different things, my team is still bundling Lambda with webpack. Webpack has a learning curve, but it works well and is fast. As noted above, we're not really using SAM very much anymore and the team I'm working with is mostly relying on unit tests and deploying their own stacks to a development environment. In fact, we have constructed our application in such a way that the Lambda and CDK tests run together and it works beautifully.
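
To give a flavor of what "the Lambda and CDK tests run together" means, a CDK test is just another Jest test in the same run. Here's a hedged sketch using the CDK assertions module, again assuming CDK v2 and the hypothetical PetsStack from earlier:

import { App } from 'aws-cdk-lib';
import { Template } from 'aws-cdk-lib/assertions';
import { PetsStack } from '../lib/pets-stack'; // hypothetical stack construct

test('PetsStack declares a DynamoDB table', () => {
  const stack = new PetsStack(new App(), 'PetsStackTest');
  // Synthesizes the stack to CloudFormation and asserts on the resulting template.
  Template.fromStack(stack).resourceCountIs('AWS::DynamoDB::Table', 1);
});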


So that's Jest, a great productivity tool if you like seeing a whole bunch of tests go green in a short amount of time. A couple of other great features we've made solid use of are snapshots and test tables, but since I've gone on and on already, I'll save that for another post.

Top comments (7)

Brett Ryan

This is an excellent article. In the last example for Returning Data, should the mock return value not be a promise?

awsSdkPromiseResponse.mockReturnValueOnce({ Item: fluffy });

might be the following?

awsSdkPromiseResponse.mockReturnValueOnce(Promise.resolve({
  Item: fluffy
}));

Matt Morgan

Thanks for reading, Brett, and good catch. It works the way I had it because you can await 'hello', but it's more accurate to have the Promise there.

Benjamin Houdu

Good article! Are you not afraid you'll generate too many mocks if your codebase uses many dynamo queries? I personally prefer using localstack to mock ddb, with an init phase and a cleanup after. Less reliance on mock code, and well tested; also it should behave more closely to the final behaviour (arguable).

Matt Morgan

I'm not mocking the query, just the API, so I have to mock each method (get, put, etc) of the API, but don't need to write individual mocks for each query.

As I wrote in the post, I have experience with localstack. Needing to get each service into the correct state (setup, teardown) for tests to run in a continuous integration pipeline (which now needs Docker or localstack installed, of course) was considerably more work for me on the projects I employed that technique than writing a few jest mocks was. For my money, localstack is nearly as much overhead as just using a non-production AWS account and the latter is considerably easier to debug.

But ultimately my approach is a different approach. I am deliberately avoiding testing the AWS service. I'm only testing my own code - how does it form the correct payload for a call to an AWS service and how does it handle the expected response? That seems appropriate for a unit test. Thanks for reading and your thoughts!

James Starkie

Hi Matt, thanks for this article, absolute lifesaver! I'm used to mocking the AWS SDK in JavaScript, but moving to TypeScript was giving me some real headaches, so using your approach is a big help.

Matt Morgan

Great to hear, James!

Ricardo Sueiras

Great post Matt!