This is a submission for the AWS Amplify Fullstack TypeScript Challenge.
What I Built
I built a small Angular application that serves as a back office for a pet-selling site. The idea is to register, on the platform, information about each pet and its seller, commonly called the owner.
To use the application, password/username (email) authentication is required; users who do not have an account can create one. Once signed in, the user first records the owner's information, then registers the pets one by one, selecting an owner from the previously created list.
Demo and Code
Source code --> GitHub repo
Live demo --> app
Demo (1/3)
Password/email login
Demo (3/3)
Registering a pet
Integrations
The following AWS services have been integrated:
- S3
- AppSync
- Lambda
- DynamoDB
- Cognito/IAM
1. Serverless DynamoDB & AppSync
All data entered by users is saved in DynamoDB tables. Here are the table schema and the Amplify configuration:
// `a` comes from '@aws-amplify/backend'; PetCategory is an application enum.
export const petsSchema = {
  PetOwner: a
    .model({
      OwnerID: a.id(),
      Name: a.string().required(),
      Email: a.email().required(),
      Phone: a.string(),
      Bio: a.string(),
      Picture: a.string(),
    })
    .authorization((allow) => allow.publicApiKey()),
  PetKind: a.enum(['FEMALE', 'MALE']),
  // Build the enum values from the app's numeric PetCategory enum (keep only the names).
  PetCategorySchema: a.enum(
    Object.keys(PetCategory).filter((v) => isNaN(v as any))
  ),
  Pet: a
    .model({
      ID: a.id(),
      NickelName: a.string().required(),
      Price: a.float().required(),
      Category: a.ref('PetCategorySchema').required(),
      Breed: a.string(),
      Rate: a.float().default(0.0),
      Kind: a.ref('PetKind'),
      BornDate: a.date().required(),
      Weight: a.float().required(),
      PetBio: a.string(),
      OwnerID: a.id(),
      //Owner: a.belongsTo('PetOwner', 'OwnerID'),
      Images: a.string().array(),
      DefaultImage: a.string(),
    })
    .authorization((allow) => allow.publicApiKey()),
};
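For context, a schema object like this still has to be handed to defineData to become the AppSync API backed by DynamoDB. Below is a minimal sketch of that step, assuming a typical Amplify Gen 2 layout; the import path and the API-key lifetime are assumptions, not copied from the repo.

// amplify/data/resource.ts (sketch)
import { a, defineData, type ClientSchema } from '@aws-amplify/backend';
import { petsSchema } from './pets-schema'; // assumed location of the object above

const schema = a.schema(petsSchema);

// Exported so the Angular client can be typed with generateClient<Schema>().
export type Schema = ClientSchema<typeof schema>;

export const data = defineData({
  schema,
  authorizationModes: {
    // Matches the allow.publicApiKey() rules used in the models above.
    defaultAuthorizationMode: 'apiKey',
    apiKeyAuthorizationMode: { expiresInDays: 30 },
  },
});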
2. S3
For storing the pets' images and the owners' photos, I used the S3 storage service. How did I store the images in S3? When registering a pet or an owner, I first create a record in the DynamoDB table to get its ID, then I upload the file(s) to the S3 bucket, and finally I update the record with the ObjectKey of each file.
Amplify S3 configuration:
export const storage = defineStorage({
  name: 'amplifyPetShop',
  triggers: {
    // ... (the onUpload trigger is shown in the Lambda section below)
  },
  access: (allow) => ({
    'owner-pictures/*': [
      allow.guest.to(['read']),
      allow.authenticated.to(['read', 'delete', 'write']),
    ],
    'pets-thumbs/*': [
      allow.guest.to(['read']),
      allow.resource(createImageThumbs).to(['read', 'write', 'delete']),
      allow.entity('identity').to(['read', 'write', 'delete']),
    ],
    'pets/*': [
      allow.guest.to(['read']),
      allow.resource(createImageThumbs).to(['read']), // Lambda resource
      allow.entity('identity').to(['read', 'write', 'delete']),
    ],
  }),
});
And here is the client-side code that uploads the selected files (uploadData comes from aws-amplify/storage, from from RxJS):

const files =...; // <-- list of File
const prefix = "pets/"; // keys must live under pets/ so the access rules and the trigger apply
const arr = [...files].map((file) =>
  from(
    uploadData({
      data: file,
      path: `${prefix}${uuid}_${file.name}`, // uuid: a unique id generated per file
    }).result.then((result) => ({ result, original: file.name }))
  )
);
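To make the save flow described above concrete, here is a rough end-to-end sketch: create the Pet record, upload its images under pets/, then update the record with the resulting object keys. The model and field names come from the schema shown earlier; everything else (the helper name, the Schema import path) is an assumption.

import { generateClient } from 'aws-amplify/data';
import { uploadData } from 'aws-amplify/storage';
import type { Schema } from '../../../amplify/data/resource'; // assumed path

const client = generateClient<Schema>();
type PetInput = Parameters<typeof client.models.Pet.create>[0];

// Hypothetical helper illustrating the three-step flow.
async function savePet(petInput: PetInput, files: File[]) {
  // 1) Create the record first to obtain its id.
  const { data: pet } = await client.models.Pet.create(petInput);
  if (!pet) return;

  // 2) Upload each image under the pets/ prefix, using the record id in the key.
  const uploads = await Promise.all(
    files.map(
      (file) =>
        uploadData({ data: file, path: `pets/${pet.id}_${file.name}` }).result
    )
  );

  // 3) Update the record with the object keys of the uploaded files.
  await client.models.Pet.update({
    id: pet.id,
    Images: uploads.map((u) => u.path),
    DefaultImage: uploads[0]?.path ?? null,
  });
}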
3. Serverless Lambda
In order to automate, at low cost, the creation of miniaturized versions of the images (thumbnails), I opted for a Lambda that is triggered every time a file is uploaded to the pets folder. This Lambda runs a function that generates a thumbnail image and uploads it in turn to the S3 bucket. Here are the Amplify configuration and the code of the function in question.
Amplify configuration
// defineFunction comes from '@aws-amplify/backend'.
export const createImageThumbs = defineFunction({
  name: 'gen-image-thumbs',
  entry: './handler.ts',
  runtime: 20, // Node.js 20 runtime
  memoryMB: 256,
});
And the Lambda function code:
import type { S3Event, S3Handler } from 'aws-lambda';
import {
  GetObjectCommand,
  PutObjectCommand,
  S3Client,
} from '@aws-sdk/client-s3';
import sharp from 'sharp';

const s3Client = new S3Client();

export const handler: S3Handler = async (event: S3Event) => {
  // Keys of the objects uploaded in this event (the thumbnail is built for the first one).
  const objectKeysUploaded = event.Records.map(
    (record) => record.s3.object.key
  );
  const srcBucket = process.env.AMPLIFY_PET_SHOP_BUCKET_NAME;
  const srcKey = objectKeysUploaded[0];
  if (srcKey.startsWith('pets/')) {
    const dstKey = `pets-thumbnails/${srcKey.split('pets/')[1]}`;
    // Download the original image that was just uploaded.
    const originalImage = await s3Client.send(
      new GetObjectCommand({
        Bucket: srcBucket,
        Key: srcKey,
      })
    );
    // Resize it to a 128px-wide thumbnail.
    const imageBytes = await originalImage.Body!.transformToByteArray();
    const resizedImage = await sharp(imageBytes).resize(128).toBuffer();
    // Upload the thumbnail under the destination prefix.
    const command = new PutObjectCommand({
      Bucket: srcBucket,
      Key: dstKey,
      Body: resizedImage,
    });
    await s3Client
      .send(command)
      .then((value) => {
        console.log(`Thumbnail uploaded for object ${srcKey}`, value);
      })
      .catch((reason) => {
        console.error(reason);
      });
  }
};
For the Lambda to be invoked by the S3 resource, we need to grant it the necessary rights; likewise, the Lambda function needs rights to read from and write to the S3 bucket. Here is the Amplify config:
'pets-thumbnails/*': [
...
allow.resource(createImageThumbs).to(['read', 'write', 'delete']),
],
'pets/*': [
...
allow.resource(createImageThumbs).to(['read']),
],
The first rule grants the Lambda read/write rights on the bucket so it can upload the generated thumbnails, and the second grants it read rights so it can download the original file.
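On the client side, the guest read rule is what lets the Angular app display those thumbnails to visitors. Here is a small sketch of how a thumbnail URL could be resolved from the original object key; the key convention mirrors the Lambda above, and the helper itself is hypothetical.

import { getUrl } from 'aws-amplify/storage';

// Derive the thumbnail key from the original pets/ key and resolve a URL for it.
async function thumbnailUrl(originalKey: string): Promise<string> {
  const thumbKey = originalKey.replace(/^pets\//, 'pets-thumbnails/');
  const { url } = await getUrl({ path: thumbKey });
  return url.toString();
}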
And finally here is the trigger configuration:
export const storage = defineStorage({
  name: 'amplifyPetShop',
  triggers: {
    onUpload: defineFunction({
      entry: '../functions/create-image-thumbs/handler.ts',
      environment: {
        TARGET_BUCKET_NAME: 'pets-thumbs',
      },
    }),
  },
  ...
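For completeness, all of these resources (auth, data, storage, and the thumbnail function) are typically tied together in amplify/backend.ts. The following is only a sketch of what that wiring usually looks like, not the project's actual file:

// amplify/backend.ts (sketch)
import { defineBackend } from '@aws-amplify/backend';
import { auth } from './auth/resource';       // Cognito email/password sign-in
import { data } from './data/resource';       // AppSync + DynamoDB models
import { storage } from './storage/resource'; // S3 bucket with the onUpload trigger
import { createImageThumbs } from './functions/create-image-thumbs/resource';

defineBackend({
  auth,
  data,
  storage,
  createImageThumbs,
});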
Connected Components and/or Feature Full
This project was developed with Angular. For the sign-in and sign-up pages I used the Amplify UI library for Angular (@aws-amplify/ui-angular) to manage authentication and user registration; the rest of the project was built with PrimeNG components.
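As an illustration (a sketch assuming a standard Amplify UI setup, not the project's exact code), the Angular authenticator typically wraps the application like this:

// app.component.ts (sketch)
import { Component } from '@angular/core';
import { Amplify } from 'aws-amplify';
import { AmplifyAuthenticatorModule } from '@aws-amplify/ui-angular';
import outputs from '../amplify_outputs.json'; // generated by the Amplify backend

Amplify.configure(outputs);

@Component({
  selector: 'app-root',
  standalone: true,
  imports: [AmplifyAuthenticatorModule],
  template: `
    <amplify-authenticator>
      <ng-template amplifySlot="authenticated" let-user="user" let-signOut="signOut">
        <!-- back-office pages (owners, pets) render here once signed in -->
        <button (click)="signOut()">Sign out</button>
      </ng-template>
    </amplify-authenticator>
  `,
})
export class AppComponent {}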