Anirban Das for AWS Community Builders

Get a Custom Weekly Inventory of EC2 Instances Using a Lambda Function

Introduction:

In any project running on AWS, a weekly or monthly inventory of EC2 instances is essential for keeping a compact record of the instances and their attributes, especially when the environment is large. From a compliance perspective, it also helps auditors understand the details precisely: when an instance was built, which image it runs and when that image was created, whether the SSM agent is working, and so on. Details like these can play a key role during a compliance assessment.

Pattern:

(Architecture diagram: an EventBridge schedule triggers the Lambda function, which stores the report in S3 and sends it via SES.)

Functionalities:

This is a straightforward solution for producing a weekly EC2 inventory report: EventBridge acts as the scheduler, and a Lambda function serves as the triggered service that holds the actual Python script. Once the inventory file is generated, Lambda pushes it to an S3 bucket for long-term storage and, in parallel, sends an email notification through SES (Simple Email Service).

Let's Get Started:

1. Create a Lambda Function:
Create a Lambda function with the Python 3.11 runtime.

Before uploading the source code, note that the Lambda Python runtime doesn't bundle external community modules such as xmltodict or openpyxl, so any third-party dependency has to be packaged together with the function code. While developing locally, install the modules into the project folder itself with "pip3 install openpyxl -t . --no-user", and then compress the entire project directory into a zip file.
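If you prefer to script the packaging step too, here is a minimal sketch using only the Python standard library; the project folder name ec2-inventory and the archive name function are placeholder assumptions:

import shutil

# Zip the whole project folder (handler code plus the locally installed
# openpyxl package) into function.zip, the deployment package for Lambda.
shutil.make_archive('function', 'zip', root_dir='ec2-inventory')

With the package built, the Python code below goes into the function itself and generates the inventory file: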

import boto3
import openpyxl
import time
import os
from botocore.exceptions import ClientError
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.application import MIMEApplication

client = boto3.client('ec2', region_name = 'us-east-1')
ssm_client = boto3.client('ssm', region_name = 'us-east-1')
s3 = boto3.client('s3', region_name = 'us-east-1')
ses = boto3.client('ses',region_name='us-east-1')

# Collect the attributes of every EC2 instance, write them into an Excel workbook
# under /tmp and upload the workbook to S3.
def instance_attributes():
    data = [['InstanceName', 'InstanceID', 'InstanceType', 'PrivateIP', 'Operating System', 'State', 'SGID','SubnetID','VPCID','AvailabilityZone','SSM', 'ImageID', 'ImageCreated', 'OSDisk', 'OSDiskID', 'OSDiskSize'],]
    name = ins_id = None  # stay defined even if no instances or Name tags are found
    resp = client.describe_instances()
    for res in resp['Reservations']:
        for ins in res['Instances']:
            os_name = ins['PlatformDetails']  # renamed to avoid shadowing the imported os module
            priv_ip = ins['PrivateIpAddress']
            state = ins['State']['Name']
            subnetid = ins['SubnetId']
            vpcid = ins['VpcId']
            sgid = [sg['GroupId'] for sg in ins['SecurityGroups']][0]
            ins_id = ins['InstanceId']
            image_id = ins['ImageId']
            image_date = [img['CreationDate'] for img in client.describe_images(ImageIds=[image_id])['Images']]
            image_final_date = image_date[0] if len(image_date) >= 1 else 'Image may not be available'
            ins_ty = ins['InstanceType']
            monitoring_stat = ins['Monitoring']['State']
            launch_time = ins['LaunchTime']
            az = ins['Placement']['AvailabilityZone']
            ssm_agent_stat = ssm_client.get_connection_status(Target=ins_id)['Status']
            volname = [vol['DeviceName'] for vol in ins['BlockDeviceMappings']]
            volid = [vol['Ebs']['VolumeId'] for vol in ins['BlockDeviceMappings']]
            root_disk_name = volname[0]
            root_disk_id = volid[0]
            root_vol_size = [sz['Size'] for sz in client.describe_volumes(VolumeIds=[root_disk_id])['Volumes']][0]
            #last_reboot = ssm_client.send_command(DocumentName='AWS-RunShellScript', Parameters={'commands': [cmd]}, InstanceIds=ins_id)
            for tag_name in ins.get('Tags', []):
                if tag_name['Key'] == 'Name':
                    name = tag_name['Value']
                    data.append([name, ins_id, ins_ty, priv_ip, os_name, state, sgid, subnetid, vpcid, az, ssm_agent_stat, image_id, image_final_date, root_disk_name, root_disk_id, root_vol_size])

    if name and ins_id:
        print('All the attributes are extracted from Instances')
        time.sleep(1)
        print('Creating an Inventory')
        workbook = openpyxl.Workbook()
        worksheet = workbook.active
        worksheet.title = 'Inventory'
        for row_data in data:
            worksheet.append(row_data)
        path = '/tmp/ec2-inventory.xlsx'
        workbook.save(path)
        print(f'Success!! Workbook created at {path} location', '\n')
        s3.upload_file(path, 'cf-templates-czmyfizwsx0a-us-east-1', 'ec2-inventory.xlsx')
    else:
        print('Fail!! Please check the syntax once again', '\n')

# Build a multipart email with the inventory workbook attached and send it via SES.
def sendemail():
    SENDER = '<emailid registered in SES>'
    RECIPIENT = '<emailid registered in SES>'
    SUBJECT = "Weekly Instances Report"
    #CONFIGURATION_SET = "ConfigSet"
    ATTACHMENT = "/tmp/ec2-inventory.xlsx"
    BODY_TEXT = "Hello Team,\n\nPlease find the attached instance inventory report.\n\nRegards,\nAnirban Das\nCloud Operations Team"
    BODY_HTML = """\
    <html>
    <head></head>
    <body>
    <p>Hello Team</p>
    <p>Please find the attached instance inventory report.</p>
    <p>Regards,</p>
    <p>Anirban Das</p>
    <p>Cloud Operations Team</p>
    </body>
    </html>
    """
    CHARSET = "utf-8"
    msg = MIMEMultipart('mixed')
    # Add subject, from and to lines.
    msg['Subject'] = SUBJECT 
    msg['From'] = SENDER 
    msg['To'] = RECIPIENT
    # Create a multipart/alternative child container.
    msg_body = MIMEMultipart('alternative')

    # Encode the text and HTML content and set the character encoding. This step is
    # necessary if you're sending a message with characters outside the ASCII range.
    textpart = MIMEText(BODY_TEXT.encode(CHARSET), 'plain', CHARSET)
    htmlpart = MIMEText(BODY_HTML.encode(CHARSET), 'html', CHARSET)

    # Add the text and HTML parts to the child container.
    msg_body.attach(textpart)
    msg_body.attach(htmlpart)

    # Define the attachment part and encode it using MIMEApplication.
    att = MIMEApplication(open(ATTACHMENT, 'rb').read())

    # Add a header to tell the email client to treat this part as an attachment,
    # and to give the attachment a name.
    att.add_header('Content-Disposition','attachment',filename=os.path.basename(ATTACHMENT))

    # Attach the multipart/alternative child container to the multipart/mixed
    # parent container.
    msg.attach(msg_body)

    # Add the attachment to the parent container.
    msg.attach(att)
    #print(msg)
    try:
        #Provide the contents of the email.
        response = ses.send_raw_email(
            Source=SENDER,
            Destinations=[
                RECIPIENT
            ],
            RawMessage={
                'Data':msg.as_string(),
            },
            #ConfigurationSetName=CONFIGURATION_SET
        )
    # Display an error if something goes wrong. 
    except ClientError as e:
        print(e.response['Error']['Message'])
    else:
        print(f"Email sent! Message ID: {response['MessageId']}")

def lambda_handler(event, context):
    instance_attributes()
    sendemail()

Open the Lambda function you created and upload the zip file.
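If you update the code often, the zip can also be pushed with boto3 instead of the console; this is just a sketch, and the function name ec2-weekly-inventory and the file function.zip are placeholder assumptions:

import boto3

lambda_client = boto3.client('lambda', region_name='us-east-1')

# Upload the packaged code to the existing function (name and path are examples).
with open('function.zip', 'rb') as f:
    lambda_client.update_function_code(
        FunctionName='ec2-weekly-inventory',
        ZipFile=f.read(),
    )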

2. Create an EventBridge Rule:

  • Open the AWS Management Console and go to the EventBridge service.

  • In the left navigation pane, click the "Rules" option.

  • Click "Create Rule" and fill in the "Name" and "Description" fields.

  • Choose rule type as “Schedule”.

  • Click "Continue in EventBridge Scheduler" to proceed.


  • Choose "Recurring schedule" as the occurrence and "Cron-based schedule" as the schedule type.

  • Here, the schedule is set up so that the Lambda function is triggered every Sunday at 12:00. The corresponding cron expression is cron(00 12 ? * SUN *).

  • A flexible window is not used here, so it is set to "Off", which means the schedule triggers exactly at the specified time.


  • Choose the Lambda function created in step 1 as the target.

  • Finally, submit the request (a boto3 sketch of the same schedule follows these steps).

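For reference, the same recurring schedule can also be created programmatically through the EventBridge Scheduler API; a minimal boto3 sketch, where the schedule name, Lambda ARN and IAM role ARN are placeholder assumptions:

import boto3

scheduler = boto3.client('scheduler', region_name='us-east-1')

# Recurring cron schedule: every Sunday at 12:00, no flexible window,
# targeting the inventory Lambda function via a role that allows
# scheduler.amazonaws.com to invoke it.
scheduler.create_schedule(
    Name='weekly-ec2-inventory',
    ScheduleExpression='cron(00 12 ? * SUN *)',
    FlexibleTimeWindow={'Mode': 'OFF'},
    Target={
        'Arn': 'arn:aws:lambda:us-east-1:123456789012:function:ec2-weekly-inventory',
        'RoleArn': 'arn:aws:iam::123456789012:role/scheduler-invoke-lambda',
    },
)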

3. Create Verified Identities in SES:

  • Open the Simple Email Service console and click the "Verified identities" option.

  • Choose "Email address" as the identity type and provide the email address.

  • Once created, you will receive an email from AWS; click the confirmation link to approve the identity request.

  • After that, check the status of the identity; it must show "Verified" (a boto3 sketch of the same flow follows these steps).
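The verification request and the status check can also be done with boto3; a minimal sketch, where ops-team@example.com is a placeholder address:

import boto3

ses = boto3.client('ses', region_name='us-east-1')

# Sends the verification email from AWS to the address.
ses.verify_email_identity(EmailAddress='ops-team@example.com')

# After the confirmation link is clicked, the status should read 'Success',
# which appears as "Verified" in the SES console.
attrs = ses.get_identity_verification_attributes(Identities=['ops-team@example.com'])
print(attrs['VerificationAttributes']['ops-team@example.com']['VerificationStatus'])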

Testing:

Since the EventBridge rule is created with a cron schedule, the function will be triggered automatically at the time we specified. For ad-hoc testing, we can use the "Test" option in Lambda.

The Test option normally requires some JSON content, because Lambda's handler takes an event and a context as input, and deleting the event body would result in an error. That's why we use an empty JSON document, i.e. {}.
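The same ad-hoc test can also be run outside the console with boto3; a minimal sketch, where the function name ec2-weekly-inventory is a placeholder assumption:

import boto3

lambda_client = boto3.client('lambda', region_name='us-east-1')

# Synchronous invocation with an empty JSON event, mirroring the console "Test" run.
resp = lambda_client.invoke(
    FunctionName='ec2-weekly-inventory',
    InvocationType='RequestResponse',
    Payload=b'{}',
)
print(resp['StatusCode'])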


Below is the output you’ll get after testing through Lambda.

(Screenshot: Lambda test output.)

Below is the kind of email you'll receive, with the EC2 inventory attached.

Mail Snapshot:

(Screenshot: the email received, with ec2-inventory.xlsx attached.)

Inventory Snapshot:

(Screenshot: the generated EC2 inventory workbook.)

~~ Happy Ending ~~

Hope this blog helps you understand how a good inventory can be generated for small as well as large environments. Here we picked only a few attributes of the EC2 instances and their volumes, but many more can be included in the same inventory depending on the requirement. 🙂🙂

Cheers!
