Andrew Casarsa
Part II: Notes on Studying for AWS Practitioners Exam

I've been working on the Cloud Guru AWS Cloud Practitioners course, which I started through a 7-day free trial. At this point, I've finished all the lectures but have decided to keep a monthly membership for review and other specialty courses. I'm planning to schedule my exam in a couple of weeks!

In Part I we covered an overview of high-level topics and the support plans that would be covered on the exam.

In Part II we'll focus on S3, which is one of the oldest AWS services and features heavily on the exam.

Official definition

Amazon S3 has a simple web services interface that you can use to store and retrieve any amount of data, at any time, from anywhere on the web. It gives any developer access to the same highly scalable, reliable, fast, inexpensive data storage infrastructure that Amazon uses to run its own global network of web sites. The service aims to maximize the benefits of scale and to pass those benefits on to developers.


Basics to know

Simple Definition

Amazon Simple Storage Service is storage for the Internet. It is designed to make web-scale computing easier for developers.

Overview summary

  • S3 is Object-based (i.e. allows you to upload flat files)
  • Files can be sized from 0 Bytes to 5 TB
  • Unlimited Storage
  • Files are stored in Buckets (essentially folders in the cloud)
  • S3 uses a universal namespace, which means that bucket names must be unique globally. An S3 URL looks like this: https://bucket-name.s3.region.amazonaws.com/key-name
  • Not suitable for installing an operating system or database on (S3 is object storage, not block storage)
  • Successful uploads will generate an HTTP 200 status code
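Because bucket names are globally unique, every object gets a globally unique URL. A minimal sketch of the virtual-hosted-style URL format (the bucket name, region, and key here are placeholders):

```python
def s3_object_url(bucket: str, region: str, key: str) -> str:
    """Build the virtual-hosted-style URL for an S3 object."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

# Example with a hypothetical bucket and key:
print(s3_object_url("my-unique-bucket", "us-east-1", "photos/cat.jpg"))
# https://my-unique-bucket.s3.us-east-1.amazonaws.com/photos/cat.jpg
```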

S3 Objects

S3 is object-based: it is a key-value store.

  • Key (this is the name of the object -- like the filename)
  • Value (This is the data and is made up of a sequence of bytes)
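The key-value model can be pictured as a dictionary. This toy sketch is only an analogy, not real S3 code:

```python
# Toy analogy: a bucket maps object keys to byte values.
bucket = {}

# "PUT": the key is the object's full name, the value is its bytes.
bucket["reports/2020/summary.txt"] = b"quarterly numbers"

# "GET": retrieve the value by its exact key.
data = bucket["reports/2020/summary.txt"]
```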

Data Consistency Model for S3

  • Read after Write consistency for PUTS of new Objects
  • Eventual Consistency for overwrite PUTS and DELETES (can take some time to propagate)

What does this mean?

  • If you write a new file and read it immediately afterward, you will be able to view that data.
  • If you update an existing file or delete a file and read it immediately, you may get the older version, or you may not.
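A toy sketch of this behavior (a simplified model for study purposes, not how S3 is actually implemented): new writes are immediately visible, while overwrites only show up after propagation.

```python
class ToyStore:
    """Toy model of the consistency behavior above (not real S3 internals)."""

    def __init__(self):
        self.primary = {}   # where writes land first
        self.replica = {}   # what reads see; lags behind on overwrites

    def put(self, key, value):
        self.primary[key] = value
        if key not in self.replica:
            # Brand-new object: immediately readable (read-after-write).
            self.replica[key] = value
        # Overwrites reach the replica only after propagation.

    def get(self, key):
        return self.replica.get(key)

    def propagate(self):
        # Eventually, the replica catches up with the primary.
        self.replica = dict(self.primary)

store = ToyStore()
store.put("a.txt", "v1")
first_read = store.get("a.txt")   # "v1" -- new objects are visible right away
store.put("a.txt", "v2")
stale_read = store.get("a.txt")   # still "v1" until propagation finishes
store.propagate()
fresh_read = store.get("a.txt")   # "v2" after propagation
```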


Other S3 Features

  • Tiered Storage Available
  • Lifecycle Management
  • Versioning (can restore previous versions of files)
  • Encryption
  • Secure your data using Access Control Lists and Bucket Policies.
    • ACLs work on a per-file (object) basis
      • e.g. a payroll spreadsheet (payroll.xls) that only the payroll admin should be able to access
    • Bucket Policies apply across the entire bucket
      • e.g. deny public access
      • bucket-level security
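A bucket policy is a JSON document attached to the whole bucket. Here is an illustrative policy structure denying reads of every object; the bucket name and Sid are placeholders, and a real policy would be tailored to your account:

```python
import json

# Illustrative bucket policy; "my-example-bucket" is a placeholder.
# Note: an explicit Deny overrides any Allow, so this statement blocks
# s3:GetObject for every principal on all objects in the bucket.
deny_public_read = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyPublicRead",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-example-bucket/*",
        }
    ],
}

print(json.dumps(deny_public_read, indent=2))
```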

S3 Storage Classes

General Purpose

  • Standard
    • Low latency and high throughput performance
    • High durability, availability, and performance object storage for frequently accessed data.

Unknown or changing access

  • Intelligent-Tiering
    • monitors access patterns and automatically moves data into the appropriate access tier (no performance loss or operational overhead)

Infrequent Access

  • Standard-IA
    • for data accessed infrequently but requiring rapid access when it is needed
  • One Zone-IA
    • a lower-cost option for infrequently accessed data that doesn't need the resilience of multiple Availability Zones


Archive

  • Glacier
    • low-cost storage for data archiving -- retrieval times range from minutes to hours
  • Glacier Deep Archive
    • lowest-cost storage, for archives where a retrieval time of 12 hours is acceptable

S3 on Outposts

  • Outposts storage class
    • delivers object storage to on-premises AWS Outposts environments
    • for workloads with local data residency requirements, and to satisfy demanding performance needs by keeping data close to on-premises applications

The exam questions will tend to focus on when a company might want one storage option over another. For a deep dive into Storage Classes check out Amazon's docs.
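As a study aid, the decision points above can be sketched as a simple lookup. The storage class names are real, but this selection logic is my own simplification, not an official AWS decision tree:

```python
def suggest_storage_class(access_pattern: str,
                          retrieval_hours_ok: float = 0.0,
                          single_az_ok: bool = False) -> str:
    """Map an access pattern to a plausible S3 storage class (study aid)."""
    if access_pattern == "frequent":
        return "STANDARD"               # low latency, high throughput
    if access_pattern == "unknown":
        return "INTELLIGENT_TIERING"    # let S3 move data between tiers
    if access_pattern == "infrequent":
        # One Zone-IA is cheaper but lives in a single Availability Zone.
        return "ONEZONE_IA" if single_az_ok else "STANDARD_IA"
    if access_pattern == "archive":
        # Deep Archive is cheapest if you can wait up to 12 hours.
        return "DEEP_ARCHIVE" if retrieval_hours_ok >= 12 else "GLACIER"
    raise ValueError(f"unknown access pattern: {access_pattern}")

print(suggest_storage_class("archive", retrieval_hours_ok=12))
# DEEP_ARCHIVE
```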



Need to know Bucket info for the Exam

  • Bucket names share a universal namespace, which means they must be globally unique.
  • Buckets can be viewed globally, but each bucket must be assigned to an individual region.
  • You can replicate buckets from one region to another using Cross Region Replication (essentially a backup of your bucket in a second region).
  • You can change storage class and encryption settings on the fly.
  • Remember that there are different storage classes (see above).
  • S3 Transfer Acceleration lets users upload to a nearby edge location, which then transfers the data to your bucket over Amazon's optimized network.
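Transfer Acceleration works by swapping the normal endpoint for a dedicated accelerate endpoint. A sketch of that URL format (the bucket name and key are placeholders):

```python
def s3_accelerate_url(bucket: str, key: str) -> str:
    """Transfer Acceleration endpoint format for an S3 object."""
    return f"https://{bucket}.s3-accelerate.amazonaws.com/{key}"

print(s3_accelerate_url("my-unique-bucket", "backups/db.dump"))
# https://my-unique-bucket.s3-accelerate.amazonaws.com/backups/db.dump
```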

3 Ways to Restrict Bucket Access

  • Bucket Policies - Applies across the whole bucket
  • Object Access Control Lists (ACLs) - Apply to individual files
  • IAM Policies to Users & Groups - Applies to Users & Groups (I'll cover IAM quickly below)

To restrict access to an entire bucket, use bucket policies; to restrict access to an individual object, use access control lists.
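An IAM identity policy looks similar to a bucket policy, with one structural difference: there is no Principal element, because the policy applies to whichever user, group, or role it is attached to. An illustrative read-only policy (the bucket name and Sid are placeholders):

```python
import json

# Illustrative IAM identity policy granting read-only access to one bucket.
# Note the absence of "Principal" -- it is implied by whatever identity
# the policy is attached to.
read_only_s3 = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadOnlyExampleBucket",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::my-example-bucket",
                "arn:aws:s3:::my-example-bucket/*",
            ],
        }
    ],
}

print(json.dumps(read_only_s3, indent=2))
```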

Identity Access Management (IAM)

IAM allows you to create users, groups, and roles and give different levels of access to them. Use this to restrict access to the information stored in your buckets.

IAM is global: when you create a user or a group, it is created across all regions rather than in a single one. Roles, when applied to a service instance, take effect immediately.

IAM Roles

IAM roles are a secure way to grant permissions to entities that you trust. Examples of entities include the following:

  • IAM user in another account
  • Application code running on an EC2 instance that needs to perform actions on AWS resources
  • An AWS service that needs to act on resources in your account to provide its features
  • Users from a corporate directory who use identity federation with SAML
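For the EC2 case above, the role's trust policy names the EC2 service as the principal allowed to assume it. This is the standard service-principal form of that document:

```python
import json

# Trust policy allowing EC2 instances to assume this role. Once the role is
# attached to an instance, code on it receives short-lived credentials
# automatically instead of relying on long-lived access keys.
ec2_trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(ec2_trust_policy, indent=2))
```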

IAM roles issue keys that are valid for short durations, making them a more secure way to grant access.

Always use roles in place of access keys and secret access keys. They are safer and easier to manage. This may be an exam question!

That covers things for now. Next up we'll dive into Elastic Compute Cloud (EC2).
