
Sean Ziegler

Posted on • Originally published at seanjziegler.com

A complete guide to using the AWS Systems Manager Parameter Store in your cloud applications

Nearly every developer uses environment variables. They're great for setting values that change based on the environment the system runs in. Unfortunately, they don't scale well: there are ways to keep them in sync across many machines and deployments, but it isn't a fun process. AWS offers a product I think can take the place of environment variables (or at least reduce your reliance on them). The AWS Systems Manager Parameter Store is a perfect solution for building modern cloud applications.

What is AWS Systems Manager Parameter Store?

AWS Systems Manager is a product designed to help you manage large groups of servers deployed into the cloud. For instance, it provides a remote connection to systems, security and patch updates, remote command execution, and other administration tasks at scale.

It also provides a feature called the Parameter Store. The Parameter Store is a superb place to keep centralized configuration data like API keys, database connection strings, passwords, and other settings.

Why use the AWS Systems Manager Parameter Store?

The Parameter Store is a great way to keep configuration out of your application itself and improve your ability to deploy across several environments. It has a few advantages over other methods of managing configuration:

  • Easy to update from a central interface
  • Hierarchy structure
  • Supports encryption to store secrets like passwords
  • Supports versioning and roll back of parameters
  • Allows access control, both for IAM users and roles
  • Ability to audit parameter access using CloudTrail
  • Supports throughput of up to 1,000 transactions per second (higher throughput must be enabled in your account settings)

I choose parameters over environment variables because I can update the parameter in one location and the changes are instantly available to any code using the parameter.

Hierarchy Structure

Perhaps the most interesting thing about the Parameter Store is the hierarchy structure. Hierarchies are parameters whose names start with a slash and use slashes to build a path. They are a great way to organize parameters in a manageable fashion. I often create separate paths for dev, test, and prod:

/dev/API_KEY
/dev/DB_STRING
/test/API_KEY
/test/DB_STRING
/prod/API_KEY
/prod/DB_STRING

This is a painless way to separate and manage parameters even when you have thousands of them.

Even with multiple environments, each part of your application can request the data it needs using the hierarchy structure.
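
As a quick illustration of why this is handy, here's a minimal Boto3 sketch (Boto3 is covered in more detail at the end of this post) that pulls an entire environment's parameters in one pass. It assumes your AWS credentials and region are already configured and that parameters exist under the /dev path shown above.

import boto3

ssm = boto3.client('ssm')

# Walk the whole /dev hierarchy (Recursive=True also picks up nested paths
# like /dev/email/API_KEY) and decrypt any SecureStrings along the way.
config = {}
paginator = ssm.get_paginator('get_parameters_by_path')
for page in paginator.paginate(Path='/dev', Recursive=True, WithDecryption=True):
    for param in page['Parameters']:
        config[param['Name']] = param['Value']

print(config.get('/dev/API_KEY'))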

Parameter Types

A parameter is a piece of data stored within AWS Systems Manager Parameter Store. AWS provides no validation on any parameters (with one exception covered later).

There are three types of Parameter Store parameters (and a fourth kinda-weird bonus type).

  • String
  • StringList
  • SecureString

String

Strings are exactly what you expect: any block of text, such as Hello World, test, or wow this is a great blog post.

StringList

StringList is, again, rather intuitive. A StringList is a collection of strings separated by commas. Cat,Dog,Rabbit and Mercury,Mars,Melons are two examples.
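
One thing worth noting: when you read a StringList back, you get the whole comma-separated value as a single string, so in code you'll typically split it yourself. A minimal sketch (the parameter name /test/animals is made up for the example):

import boto3

ssm = boto3.client('ssm')

# A StringList comes back as one comma-separated string, e.g. "Cat,Dog,Rabbit"
value = ssm.get_parameter(Name='/test/animals')['Parameter']['Value']
animals = value.split(',')  # ['Cat', 'Dog', 'Rabbit']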

SecureString

SecureString is used for sensitive data like passwords and API keys. Data stored in a SecureString parameter is encrypted using keys managed by the AWS Key Management Service. You should know that these parameters are free to use, but AWS will charge you for the Key Management Service as usual.

Subtype - AMI Datatype

There is one strange "bonus" type you should know about. When creating a String parameter, you can pass the additional --data-type option and specify an Amazon Machine Image (AMI) ID as the value.

aws ssm put-parameter \
    --name "/amis/linux/golden-ami" \
    --type "String" \
    --data-type "aws:ec2:image" \
    --value "ami-12345abcdeEXAMPLE"

The Parameter Store will validate that the value is a valid AMI, and you'll then be able to use the AMI in other services by referencing the parameter instead of hard-coding the ID.
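
For completeness, here's roughly what the same call looks like from Boto3; the DataType argument mirrors the CLI's --data-type flag, and the name and AMI ID are the same placeholders as above.

import boto3

ssm = boto3.client('ssm')

# Store an AMI ID; Parameter Store checks that the value is a valid AMI
# before the parameter can be used elsewhere.
ssm.put_parameter(
    Name='/amis/linux/golden-ami',
    Type='String',
    DataType='aws:ec2:image',
    Value='ami-12345abcdeEXAMPLE',
)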

Parameter Tiers

There are two parameter tiers: standard parameters and advanced parameters. Advanced parameters support parameter policies, which can set a parameter expiration, notify you when a parameter expires, and let you know if a parameter hasn't changed in a while.

You can upgrade parameters to advanced parameters, but you can never downgrade to a standard parameter. There's really no reason to use an advanced parameter unless you run up against one of the limits below or you need the advanced policies they offer for notifications.

Standard Parameters

  • Free
  • Max of 10,000 parameters (per region)
  • 4KB max size

Advanced Parameters

  • Paid
  • Max of 100,000 parameters (per region)
  • 8KB max size
  • Supports parameter policies

Intelligent Tiering

This option is a blend of the two standard options. When you select intelligent tiering, the Parameter Store will inspect each parameter to see if it requires advanced features. If it does, the store automatically upgrades the parameter to the advanced tier.

Intelligent tiering helps control costs and prevents failures that would otherwise occur when you hit the limit on standard parameters or try to store a value larger than 4KB. If you don't mind spending the money on advanced parameters, it's worth considering.
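
The tier can also be chosen per parameter when you write it. A minimal Boto3 sketch, assuming a made-up parameter name and value:

import boto3

ssm = boto3.client('ssm')

# Let Parameter Store decide: the parameter stays standard (free) unless its
# size or features require the advanced tier.
ssm.put_parameter(
    Name='/dev/BIG_CONFIG_BLOB',
    Type='String',
    Value='...a value that might exceed 4KB...',
    Tier='Intelligent-Tiering',
)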

Managing Parameters using the AWS CLI

You'll need to install and configure the AWS CLI if you haven't already.

Creating parameters is very easy. There's a built-in command, put-parameter, to create them.

aws ssm put-parameter --name "/test/email/liscence" --type "String" --value "XM56HE1I9M2AC9W30UM1"

To create a SecureString, add the --key-id option and specify a KMS key ID or ARN.

aws ssm put-parameter --name "/test/email/password" --value '4G00DPA$$W0RD' --type "SecureString" --key-id "arn:aws:kms:us-east-2:123456789012:key/1a2b3c4d-1a2b-1a2b-1a2b-1a2b3c4d5e"

Getting parameters is even more fun. To get a parameter by name, use get-parameters.

aws ssm get-parameters --names "/dev/API_KEY" "/test/API_KEY" "/prod/API_KEY"

To get the decrypted value of a SecureString parameter, add the --with-decryption flag; without it, you get back the encrypted ciphertext.

aws ssm get-parameters --names "/prod/API_KEY" --with-decryption

You can get all the parameters under a hierarchy path

aws ssm get-parameters-by-path --path "/dev"

and use describe-parameters to query parameters by type.

aws ssm describe-parameters --filters "Key=Type,Values=StringList"
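
If you'd rather run that last query from code, here's a rough Boto3 equivalent (other filter keys work the same way):

import boto3

ssm = boto3.client('ssm')

# List all StringList parameters in this account and region
response = ssm.describe_parameters(
    ParameterFilters=[{'Key': 'Type', 'Option': 'Equals', 'Values': ['StringList']}]
)
for param in response['Parameters']:
    print(param['Name'], param['Type'])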

Parameter Versioning

Versioning is another great feature of the parameter store. If you overwrite a parameter that already exists, the parameter's version will increment.

aws ssm put-parameter --name "/test/email/liscence" --type "String" --value "XM56HE1I9M2AC9W30UM1"

aws ssm put-parameter --name "/test/email/liscence" --type "String" --value "XM56HE1I000000000000" --overwrite

Let's inspect the parameter's history.

aws ssm get-parameter-history --name "/test/email/liscence"

{
    "Parameters": [
        {
            "Name": "/test/email/liscence",
            "Type": "String",
            "LastModifiedDate": "2020-05-22T16:25:04.303000-04:00",
            "LastModifiedUser": "arn:aws:iam::<redacted>",
            "Value": "XM56HE1I9M2AC9W30UM1",
            "Version": 1,
            "Labels": [],
            "Tier": "Standard",
            "Policies": []
        },
        {
            "Name": "/test/email/liscence",
            "Type": "String",
            "LastModifiedDate": "2020-05-22T16:25:38.281000-04:00",
            "LastModifiedUser": "arn:aws:iam::<redacted>",
            "Value": "XM56HE1I000000000000",
            "Version": 2,
            "Labels": [],
            "Tier": "Standard",
            "Policies": []
        }
    ]
}
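
Once a parameter has multiple versions, you can pin a read to a specific one by appending :version to the name. A minimal Boto3 sketch (version 1 here refers to the first value in the history above):

import boto3

ssm = boto3.client('ssm')

# Append ":<version>" to the name to read an older version explicitly;
# without the suffix you always get the latest version.
old_value = ssm.get_parameter(Name='/test/email/liscence:1')['Parameter']['Value']
print(old_value)  # XM56HE1I9M2AC9W30UM1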

Parameter Policies

Parameter policies allow you to set expirations for parameters, get notified when a parameter expires, and also get notified if a parameter hasn't changed in a while. Don't ask me why policies are only good for these three things, but that's how it works. Maybe AWS will add more options.

You can set an expiration time with policies.

{
   "Type":"Expiration",
   "Version":"1.0",
   "Attributes":{
      "Timestamp":"2018-12-02T21:34:33.000Z"
   }
}

You can set up notifications if a parameter is expiring.

{
   "Type":"ExpirationNotification",
   "Version":"1.0",
   "Attributes":{
      "Before":"15",
      "Unit":"Days"
   }
}

And finally, you can set up notifications if a parameter has not changed in a set time period.

{
   "Type":"NoChangeNotification",
   "Version":"1.0",
   "Attributes":{
      "After":"20",
      "Unit":"Days"
   }
}

Assigning a parameter policy with the AWS CLI is relatively straightforward.

aws ssm put-parameter \
    --name "/dev/apikey" \
    --value "XM56HE1I9M2AC9W30UM1" \
    --type "String" \
    --overwrite \
    --policies "[{policies-enclosed-in-brackets-and-curly-braces}]"
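
Since policies are an advanced-tier feature, don't forget the tier when you do this programmatically. A minimal Boto3 sketch using the expiration policy format from above; the name, value, and timestamp are placeholders:

import json

import boto3

ssm = boto3.client('ssm')

# Policies are passed as a JSON string and require the advanced tier
expiration_policy = [{
    'Type': 'Expiration',
    'Version': '1.0',
    'Attributes': {'Timestamp': '2030-01-01T00:00:00.000Z'}  # placeholder date
}]

ssm.put_parameter(
    Name='/dev/apikey',
    Type='String',
    Value='XM56HE1I9M2AC9W30UM1',
    Tier='Advanced',
    Policies=json.dumps(expiration_policy),
    Overwrite=True,
)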

Setting up service roles for EC2

If you want to use the Parameter Store with other services (you probably do), you'll need to grant that service access via a service role.

I've included a policy here which grants access to the Parameter Store to whatever service assumes the role it's attached to (just don't forget to attach this policy to your service role).

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowSSMAccess",
            "Effect": "Allow",
            "Action": [
                "ssm:PutParameter",
                "ssm:DeleteParameter",
                "ssm:GetParameterHistory",
                "ssm:GetParametersByPath",
                "ssm:GetParameters",
                "ssm:GetParameter",
                "ssm:DescribeParameters",
                "ssm:DeleteParameters"
            ],
            "Resource": "arn:aws:ssm:*:*:parameter/*"
        }
    ]
}

If you require more strict access control, you can limit access to read-only or only allow access to certain parameters.
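
For example, a tighter variant might allow only reads, and only under the /prod path. A sketch (adjust the path and actions to your needs):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowReadOnlyProdParameters",
            "Effect": "Allow",
            "Action": [
                "ssm:GetParameter",
                "ssm:GetParameters",
                "ssm:GetParametersByPath"
            ],
            "Resource": "arn:aws:ssm:*:*:parameter/prod/*"
        }
    ]
}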

Using Boto3 to interact with the Parameter Store

Obviously, no service would be complete without a way to interact with it from code, and the Parameter Store is no exception. The AWS SDK, in my case Boto3 since I use Python, offers a straightforward way to interface with the Parameter Store.

For example, in the code below I pick one of three prefixes based on a single environment variable set on the host (you can do this with EC2 user data or a Dockerfile), and then my application knows which set of parameters to retrieve. I use it for a variety of things including API keys, log file locations, ports, debug status, and more.

# pip install boto3
import os

import boto3

ssm = boto3.client('ssm', region_name='us-east-1')
env = os.environ.get('env')

# Select SSM parameters based on the 'env' environment variable
if env == 'DEV':
    prefix = '/<project>-dev/'

elif env == 'TEST':
    prefix = '/<project>-test/'

elif env == 'PROD':
    prefix = '/<project>-prod/'

# If env is not set, raise error to stop server from starting
else:
    raise AttributeError('No value set for environment type (env)')

secrets = {
    'ENV' : env,
    'DEBUG' : ssm.get_parameter(Name= prefix + 'DEBUG')['Parameter']['Value'],
    'PORT' : ssm.get_parameter(Name= prefix + 'PORT')['Parameter']['Value'],
    'FLASK_SECRET_KEY' : ssm.get_parameter(Name= prefix + 'FLASK_SECRET_KEY', WithDecryption=True)['Parameter']['Value'],

    'COGNITO_ID' : ssm.get_parameter(Name= prefix + 'COGNITO_ID', WithDecryption=True)['Parameter']['Value'],
    'CLIENT_ID' : ssm.get_parameter(Name= prefix + 'CLIENT_ID', WithDecryption=True)['Parameter']['Value'],
    'INVOKE_API' : ssm.get_parameter(Name= prefix + 'INVOKE_API', WithDecryption=True)['Parameter']['Value'],
    'API_GATEWAY_KEY' : ssm.get_parameter(Name= prefix + 'API_GATEWAY_KEY', WithDecryption=True)['Parameter']['Value'],
    'SNS_CONTACT_ARN' : ssm.get_parameter(Name= prefix + 'SNS_CONTACT_ARN')['Parameter']['Value'],

    'RAINFOREST_API_KEY' : ssm.get_parameter(Name= prefix + 'RAINFOREST_API_KEY', WithDecryption=True)['Parameter']['Value'],

    'LOG_FILE' : ssm.get_parameter(Name= prefix + 'LOG_FILE')['Parameter']['Value']
}

Don't forget to add the service role for any machine that will run code that accesses the Parameter Store!

Conclusion

That's everything you need to know to integrate the AWS Systems Manager Parameter Store into your applications. Hopefully, you understand the Parameter Store a little better now and can incorporate it into your future work.

If you enjoyed this content, follow me on Twitter to see more like this!
