JTK

Node Project Skeleton Part I: Setup

I've worked with Node.js for some time now. As someone who works a lot with both Python and JavaScript, a no-frills Node/Express app is the tool I reach for in the same situations where I'd use Flask: something minimal, where I don't need a large ecosystem and just want to get up and running quickly.

This is only my opinion of course, but if Flask is a micro-framework, Express feels like a micro-micro-framework. The docs strike me as sparse, and it is sometimes difficult to get past all the Hello World noise in search results and figure out what a standard structure for a Node/Express API even looks like.

This project was born out of a desire to give myself a standard boilerplate setup, including tests and dependencies, that I could reuse in the future instead of re-deriving a reasonable application structure each time.

Audience

While a lot of what I'm doing is pretty human-readable, I'm going to move fast and cover a lot: mongo, express, mocha, chai, and chai-http. This will probably be most meaningful to someone who knows JavaScript and has an interest in, or exposure to, one or more of these technologies.

This post specifically deals with basic account creation and seed data for MongoDB, so if you know that already or aren't trying to recreate the app on your own, feel free to skip ahead to part II.

Ten thousand foot view of MongoDB

MongoDB falls into a class of databases called "NoSQL", collectively defined by what they are not: SQL. Mongo data is organized into collections instead of tables, and documents instead of rows.
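To make that mapping concrete, here is a sketch of a document as a plain JavaScript object. (The field names and values below are placeholders for illustration, not the project's real seed data.)

```javascript
// A MongoDB document is a JSON-like object. Where a row in a SQL table
// has a fixed set of columns, a document's fields can vary between
// documents in the same collection.
const pokemonDocument = {
  name: "bulbasaur",       // would be a column in a SQL table
  icon: "<icon-string>",   // placeholder for an icon image string
};

// A collection holds documents the way a table holds rows:
const pokemonCollection = [pokemonDocument];
```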

MongoDB Atlas is a cloud hosting and management platform that provides a GUI for working with data in a cloud-hosted mongo instance, and we will use it throughout this post. It's a great tool, and I strongly prefer it to some others I've seen, such as the interface for AWS DynamoDB.

Getting to work

Our objectives here are as follows: we are going to create a new organization > project > cluster > database, with affiliated collections, twice: once to represent a production database, and once to represent a test database. The test database will be used later on when we demo the integration tests that are part of my project skeleton.

We will also create a user, give them access, and get connection strings to our two databases that will be used for our running Node app.

The data we are going to use is inspired by a set of adorable pokemon icons. The API itself is simple: it returns an icon image string and a pokemon name.

And lastly, there is a starter repo that will be used throughout this tutorial and the following tutorials, which you can clone here.

With that said, let's get started!

First, you are going to need to make a MongoDB Atlas account here.

You will be prompted to create a user and account. Feel free to change the names if you plan on customizing the project boilerplate, but if you want to see the full app running you will either need to keep mine, or update the name references in following steps.

Creating a user and organization in MongoDB Atlas

We will need to create a cluster, where our databases will live, and you'll need to be sure to select the free/shared option to avoid charges. (One nice thing about MongoDB Atlas is that the free options are flexible enough that you can have multiple free database instances on the shared tier if you are careful about how you structure your work.)

Creating a new database cluster

Use the default cluster options when given the opportunity.

Selecting default options for the cluster

From the cluster overview page, we are going to click the COLLECTIONS button, then select add collection, which will also prompt us to create our first database.

Create collection and database

We will then choose the "Add your own data" option to get started with collections.

Add your own data option from collections screen

Our sample API is going to have two main endpoints: /pokemon and /equipment. Some of the icons are pokeballs, potions, etc., while others are actual pokemon. So we are going to add another collection under our database called equipment.
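As a rough, framework-free sketch of the split (placeholder data, not the real seed documents; the actual routes get wired up in Part II), each endpoint simply returns the documents of its collection:

```javascript
// Framework-free sketch of what the two endpoints will return.
// The names and icon strings below are placeholders.
const db = {
  pokemon:   [{ name: "pikachu",  icon: "<icon-string>" }],
  equipment: [{ name: "pokeball", icon: "<icon-string>" }],
};

// GET /pokemon   -> every document in the pokemon collection
const listPokemon = () => db.pokemon;

// GET /equipment -> every document in the equipment collection
const listEquipment = () => db.equipment;
```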

Adding equipment collection to database

We will then use the INSERT DOCUMENT option on each of our new collections to create some seed data. (You can find seed data in the cloned repo under /sample-data.)

Adding sample pokemon to pokemon collection

Adding sample equipment to equipment collection

We are now going to repeat the last couple steps to create an identical test database, with its own pokemon and equipment collections.

Creating pokepic-test db

Add sample pokemon to test db pokemon collection

Add equipment collection to test database

Add seed data to new equipment collection under our test database

We are now going to navigate up one level, and then hit CONNECT.

Navigate back to overview

Select connect

Here you should be prompted to create a user and select whitelisted IP addresses for incoming traffic to our database. We are going to use CIDR notation to allow any IP address to access this database by whitelisting 0.0.0.0/0. (Allowing all IPs is convenient for a demo like this one, but you'd want to restrict it for anything handling sensitive data.)

We are also going to create an authenticated user.

Edit CIDR notation

Confirms new user and CIDR settings

We are then going to select the Connect Application option, which will give us a URI string that links our node app with our mongo data.

Connect application

Copy the URI because it will be needed shortly.

Copy URI string

Last but not least, we are going to create a .env file in the cloned repo with our project boilerplate. It will need to be set up as follows:

MONGO_URI=mongodb+srv://<username>:<password>@cluster0.xxxx.mongodb.net/pokepic?retryWrites=true&w=majority
MONGO_URI_TEST=mongodb+srv://<username>:<password>@cluster0.xxxx.mongodb.net/pokepic-test?retryWrites=true&w=majority

With `<username>` and `<password>` subbed in with your own credentials.
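One way the app can pick between the two connection strings is to key off NODE_ENV. This is a sketch, assuming .env has already been loaded into process.env (for example via the dotenv package) and that NODE_ENV is set to "test" when the test suite runs:

```javascript
// Sketch: choose the test URI when NODE_ENV is "test", else the prod URI.
// Assumes .env has already been loaded into process.env
// (e.g. with the dotenv package).
function pickMongoUri(env = process.env) {
  return env.NODE_ENV === "test" ? env.MONGO_URI_TEST : env.MONGO_URI;
}
```

With this in place, the same startup code serves both the running app and the integration tests, and neither ever touches the other's database.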

Wrapping up

With those steps taken care of, you will have created a user, project, cluster, and prod and test database instances for use in the following walkthrough.

In Part II, we will get more into the project structure and functionality, so stay tuned!
