
Hoang Le


07 best practices when using AWS SSM Parameter Store

This post first appeared on our blog.

(Image: Cloud Encryption)

Security is one of the five pillars of the AWS Well-Architected Framework, and you can achieve it by applying best practices and principles in IAM, encryption, compliance, and governance. Of course, best practices alone aren't enough; you always need to keep learning. In this post, I only share our best practices and tips for working with AWS SSM Parameter Store. By sharing them, I hope to encourage you to build and deploy secure and reliable applications, and I also hope to hear your feedback.

As you know, AWS Lambda supports native environment variables: you can easily define and add any environment variables you want during deployment, or change them in the AWS Management Console. But native environment variables come with some disadvantages:

  • They are stored as plain text, so their values are easy to see. You have the option to encrypt them in the console using KMS, but decrypting them on every invocation still increases your bill.
  • They are hard to share across projects and teams, which adds complexity to your applications and services. More complexity requires more time to operate and increases cost, so you won't meet the conditions of the Operational Excellence pillar of the Well-Architected Framework.
  • As Yan Cui points out, it is hard to implement fine-grained access to sensitive data.

What is AWS Systems Manager Parameter Store (aka SSM Parameter Store)?

AWS Systems Manager Parameter Store provides secure, hierarchical storage for configuration data management and secrets management. You can store data such as passwords, database strings, and license codes as parameter values. You can store values as plain text or encrypted data. You can then reference values by using the unique name that you specified when you created the parameter. Highly scalable, available, and durable, Parameter Store is backed by the AWS Cloud. ~AWS

What are the benefits?

There are a lot of benefits to using AWS SSM Parameter Store; I just copied the following from the AWS documentation:

  • Use a secure, scalable, hosted secrets management service with no servers to manage.
  • Improve your security posture by separating your data from your code.
  • Store configuration data and secure strings in hierarchies and track versions.
  • Control and audit access at granular levels.
  • Configure change notifications and trigger automated actions for both parameters and parameter policies.
  • Tag parameters individually, and then secure access from different levels, including operational, parameter, Amazon EC2 tag, and path levels.
  • Reference AWS Secrets Manager secrets by using Parameter Store parameters.
  • Use Parameter Store parameters with other Systems Manager capabilities and AWS services to retrieve secrets and configuration data from a central store.
  • Configure integration with the AWS services for encryption, notification, monitoring, and auditing.

So now that you understand what SSM Parameter Store is and its challenges, let's talk about how we use it by reviewing our best practices and tips:

#1 - Organizing parameters into hierarchies

AWS provides detailed instructions on how to organize your SSM Parameter Store so you can define and manage parameters easily. Following its best practices can make your life easier. Below are a couple of formats/conventions that our team normally uses (a short sketch follows the list):

  • /environment/service-name/type/application-name/parameter_name, e.g. /prod/billing/databases/invoicing-portal/db_connection_string
  • You can also add your department name, e.g. /prod/human-resource/employee/user_list
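
For illustration, here is a minimal sketch that creates a parameter under such a hierarchy using the AWS SDK for JavaScript (v2); the parameter name and value are made up for this example:

const AWS = require('aws-sdk');

const ssm = new AWS.SSM({ region: 'us-east-1' });

const createParameter = async () => {
  // Hypothetical parameter following the
  // /environment/service-name/type/application-name/parameter_name convention
  await ssm
    .putParameter({
      Name: '/prod/billing/databases/invoicing-portal/db_connection_string',
      Value: 'jdbc://username:password@db.domain.com:3306/invoicing',
      Type: 'SecureString', // encrypted with the default AWS managed KMS key
      Overwrite: true
    })
    .promise();
};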

#2 - Consistent naming convention

Using a well-defined hierarchy helps you to manage and retrieve parameters more efficiently, but you also need to use a consistent naming convention across your AWS account, your departments, and your teams.

By adopting this best practice, you reduce review effort by letting reviewers focus on critical business logic rather than syntax and naming standards, which increases your productivity and quality and, in turn, your customer satisfaction.
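
One lightweight way to keep names consistent is a tiny helper that builds parameter names from the convention's parts. This is a hypothetical sketch (buildParameterName is not part of any library, just an illustration):

// Hypothetical helper: builds names following the
// /{environment}/{service}/{type}/{application}/{parameter} convention
const buildParameterName = (environment, service, type, application, parameter) =>
  `/${environment}/${service}/${type}/${application}/${parameter}`;

const name = buildParameterName(
  'prod',
  'billing',
  'databases',
  'invoicing-portal',
  'db_connection_string'
);

console.log(name); // => /prod/billing/databases/invoicing-portal/db_connection_string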

#3 - Restrict IAM permissions

AWS SSM Parameter Store usually holds your sensitive information, so restricting permissions is required to improve the security of your application. Each parameter has a unique resource ARN per account and region, so you can easily define roles and policies based on the parameter hierarchy.

Below is a sample policy from the official AWS documentation that shows how to restrict access to Parameter Store:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ssm:*"
            ],
            "Resource": "arn:aws:ssm:us-east-2::parameter/*"
        },
        {
            "Effect": "Deny",
            "Action": [
                "ssm:GetParametersByPath"
            ],
            "Condition": {
                "StringEquals": {
                    "ssm:Recursive": [
                        "true"
                    ]
                }
            },
            "Resource": "arn:aws:ssm:us-east-2:123456789012:parameter/Dev/ERP/Oracle/*"
        },
        {
            "Effect": "Deny",
            "Action": [
                "ssm:PutParameter"
            ],
            "Condition": {
                "StringEquals": {
                    "ssm:Overwrite": [
                        "false"
                    ]
                }
            },
            "Resource": "arn:aws:ssm:us-east-2:123456789012:parameter/*"
        }
    ]
}

#4 - Combine related values, such as a database connection string, into a single parameter and keep them together (co-location)

By using consistent hierarchies and naming conventions, you can achieve this. Keeping all related parameters together makes them easy to find and retrieve, and using fewer parameters can reduce your bill.

Instead of using four separate parameters for the database connection string, as below:

/{env}/{service}/databases/master/host = db.domain.com
                                 /user = username
                                 /password = password
                                 /port = 3306

we combine them into a single parameter using a standard connection string format:

/{env}/{service}/databases/master/db_connection = jdbc://username:password@db_host:port/database_name

Using a community library such as connection-string-parser, you can easily parse the parameter value and use it to open a connection; see the code snippet below:

import { createConnection as createConnectionPromise, Connection } from 'promise-mysql';
import { ConnectionStringParser } from 'connection-string-parser';

const parseConnectionString = (dialect: string, connectionUri: string) => {
  const connectionParser = new ConnectionStringParser({
    scheme: dialect || 'mysql',
    hosts: []
  });
  const connectionStrParams = connectionParser.parse(connectionUri);

  return {
    host: connectionStrParams.hosts[0].host,
    port: connectionStrParams.hosts[0].port || 3306,
    database: connectionStrParams.endpoint,
    user: connectionStrParams.username,
    password: connectionStrParams.password
  };
};

export const createConnection = (connectionUri: string): Promise<Connection> => {
  return createConnectionPromise(parseConnectionString('mysql', connectionUri));
};

#5 - Use tool/library to fetch, cache, and export to environment variables at runtime

Depending on your configuration (for example, higher throughput or advanced parameters), you are charged for API interactions with SSM Parameter Store: every time you retrieve a parameter from the store, you can increase your bill. What can you do to reduce that cost?

By default, the maximum throughput for retrieving parameters via the API is 1,000 transactions per second. How do you manage this and avoid throughput-exceeded errors?

Yan Cui wrote an article describing the reasons why you should use AWS SSM Parameter Store over Lambda environment variables; he also mentions approaches for caching and cache expiration using his custom client library.

Our team uses middy middleware to deal with cross-cutting concerns outside the business logic, such as input parsing and validation, output serialization, and error handling. Application configuration is another aspect that every developer needs to work out and manage to run the business logic. Out of the box, middy provides an ssm middleware that fetches and caches parameters from AWS SSM Parameter Store; it also supports assigning parameter values to environment variables.

Here is a sample of how to use middy to fetch and cache parameters from the store:

const middy = require('middy');
const { ssm } = require('middy/middlewares');

export const handler = middy((event, context, cb) => {
  // You can access the parameter value inside function handler
  console.log(process.env.HARVESTAR_PCMSS_DB_CONNECTION);

  // Your business logic here
}).use(
  ssm({
    cache: true,
    names: {
      // Should have a prefix that includes this microservice, i.e. pcmss
      HARVESTAR_PCMSS_DB_CONNECTION: '/dev/harvestar/pcmss/db_connection'
    }
  })
);

There are some alternative open-source libraries out there as well.

Do you really trust the community package?

I have heard from some people that they don't want to assign values to environment variables (i.e. the variables you can access through the process.env global object in the Node.js runtime). If that is your case, I have some advice:

  • Instead of assigning values to environment variables, you have the option to assign them to the AWS Lambda context object when using the middy/ssm middleware (see the sketch after this list).
  • To avoid leaking sensitive information such as database credentials, accessing the /tmp directory, or running a child process while executing your serverless functions, you can use the @puresec/function-shield library. We are also using it in our production environment.
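
Here is a minimal sketch of the first option. It assumes the ssm middleware in your middy version supports the setToContext option (check the documentation of the version you use); the parameter name is the same hypothetical one as above:

const middy = require('middy');
const { ssm } = require('middy/middlewares');

export const handler = middy((event, context, cb) => {
  // With setToContext enabled, the value is attached to the context object
  // instead of process.env
  console.log(context.HARVESTAR_PCMSS_DB_CONNECTION);

  // Your business logic here
}).use(
  ssm({
    cache: true,
    setToContext: true, // assign values to the context object, not process.env
    names: {
      HARVESTAR_PCMSS_DB_CONNECTION: '/dev/harvestar/pcmss/db_connection'
    }
  })
);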

TIP - Avoid fetching parameters at build/deploy time; fetch them at runtime instead. Otherwise, you have to redeploy every time a parameter changes.

#6 - Using hardcoded environment variables for your local development

Do you need your function to fetch directly from AWS SSM Parameter Store when it runs locally? The answer is that it's optional: for your local environment, you might not need AWS SSM Parameter Store at all and can use a .env file to keep your local variables. Below are some approaches you can use to achieve that; note that you still need to test your function with your chosen approach in an AWS environment:

  • Use the env-cmd library to load, extract, and assign variables to the process.env global object. By running the env-cmd serverless offline command, you can access all variables defined in your .env file.
  • Use the serverless-secrets-plugin to define environment variables in a secure manner; it makes them easier to share across the team, and you can commit the encrypted file.

Using the same code as before with a small modification, you can skip fetching from AWS Parameter Store and reduce your bill:

const middy = require('middy');
const { ssm } = require('middy/middlewares');

const isLocalEnv = process.env.IS_OFFLINE || process.env.IS_LOCAL;

export const handler = middy((event, context, cb) => {
  // You can access the parameter value inside function handler
  console.log(process.env.HARVESTAR_PCMSS_DB_CONNECTION);

  // Your business logic here
}).use(
  ssm({
    cache: true,
    // By setting paramsLoaded, you tell the middleware
    // not to fetch from AWS SSM
    paramsLoaded: isLocalEnv,
    names: {
      // Should have a prefix that includes this microservice, i.e. pcmss
      HARVESTAR_PCMSS_DB_CONNECTION: '/dev/harvestar/pcmss/db_connection'
    }
  })
);

#7 - Pay attention to services limits

Like other AWS services, AWS SSM Parameter Store has limits, such as the maximum number of parameters per account and region, the maximum parameter value size, and the maximum parameter history. Understanding these limits helps us design and build highly reliable applications. For example, avoid storing large items in a parameter because of the size limit (4 KB for standard and 8 KB for advanced parameters). Refer to the AWS service limits documentation for the others.
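
If you really need a value larger than 4 KB, one option is the advanced parameter tier (note that advanced parameters are charged). Below is a minimal sketch using the AWS SDK for JavaScript (v2) with a hypothetical parameter name:

const AWS = require('aws-sdk');

const ssm = new AWS.SSM({ region: 'us-east-1' });

const putLargeParameter = async (value) => {
  // With Tier: 'Advanced', the value can be up to 8 KB;
  // the standard tier is limited to 4 KB
  await ssm
    .putParameter({
      Name: '/dev/harvestar/pcmss/large_config', // hypothetical name
      Value: value,
      Type: 'SecureString',
      Tier: 'Advanced',
      Overwrite: true
    })
    .promise();
};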

Conclusion

By applying these best practices, you can build applications that are more reliable, secure, efficient, and cost-effective on the cloud.

I hope this post gives you some ideas and saves you time. There are many more interesting and useful articles out there, so find and read them to learn more. Feel free to share your recommendations or suggestions in the comments below.

Thank you for reading!

Top comments (4)

Davide de Paolis:

Very nice article, and thanks for the awesome tip about Middy!

Dinesh Rathee:

Impressive!

Aditya:

How do I add the key/value pair to a React .env file after fetching it from AWS?

Hoang Le (author):

If you are using a .env file locally, I recommend the env-cmd library: it parses the variables defined in the .env file and assigns them to environment variables. For the local environment, we skip fetching from AWS since you can define your variables easily.