AWS Batch enables developers, scientists, and engineers to easily and efficiently run hundreds of thousands of batch computing jobs on AWS. AWS Batch dynamically provisions the optimal quantity and type of compute resources (e.g., CPU or memory optimized instances) based on the volume and specific resource requirements of the batch jobs submitted.
A batch job has a defined start and end, as opposed to a continuous job that runs 24/7.
It is a fully managed batch processing service at any scale.
It provisions the right number and type of EC2 instances or Spot Instances for processing the batch jobs, based on their volume and resource requirements.
You submit or schedule batch jobs and AWS manages the rest.
Jobs run as Docker container images.
Example use cases:
Pulling data from a source at night
Backups at midnight
Financial data processing each day
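As a sketch, a nightly job like the ones above could be submitted with the boto3 Batch client's `submit_job` call. The queue name, job definition name, and command here are placeholder assumptions, not values from the source; the actual API call is shown commented out since it requires AWS credentials and pre-registered resources:

```python
# Build the parameters for an AWS Batch job submission.
# "nightly-data-pull", "my-job-queue", and "my-job-def:1" are
# hypothetical names used only for illustration.
params = {
    "jobName": "nightly-data-pull",
    "jobQueue": "my-job-queue",        # assumed: an existing job queue
    "jobDefinition": "my-job-def:1",   # assumed: a registered job definition
    "containerOverrides": {
        # Override the container's default command for this run
        "command": ["python", "pull_data.py"],
        "environment": [{"name": "RUN_DATE", "value": "2024-01-01"}],
    },
}

# With credentials configured, the job would be submitted like so:
#   import boto3
#   response = boto3.client("batch").submit_job(**params)
#   print(response["jobId"])
print(params["jobName"])
```

AWS Batch then places the job on the queue, provisions compute as needed, and runs the container until the job's command exits, which is what makes it a batch job rather than a 24/7 service.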