Optimizing AWS Batch Workloads: Concurrent Batch Jobs Execution using Lambda, S3, and ECR
<p><strong>Situation</strong>: While building Machine Learning models for 70 different countries (one model per country), we needed to run a specific set of post-processing tasks for each country. To streamline this and trigger 70 separate Batch jobs concurrently, each corresponding to a distinct row in a 70-row configuration file, I combined AWS Batch, Lambda, S3, ECS, and ECR into an efficient, fully automated workflow.</p>
<p><strong>Task</strong>: The task is to set up a serverless system that can concurrently trigger the 70 AWS Batch jobs, each based on one row of the configuration file. To accomplish this, the system is split into two coordinated parts: a Lambda function acts as the <strong><em>orchestrator</em></strong> and the AWS Batch container code serves as the <strong><em>worker</em></strong> for these jobs, as sketched below.</p>
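<p>Below is a minimal sketch of what such an orchestrator Lambda could look like, assuming the configuration file is a CSV in S3 with a <code>country_code</code> column; the bucket, key, job queue, and job definition names used here are placeholders for illustration, not the actual project resources.</p>
<pre><code>import csv

import boto3

s3 = boto3.client("s3")
batch = boto3.client("batch")

# Placeholder resource names; replace with your own values
CONFIG_BUCKET = "ml-postprocessing-config"
CONFIG_KEY = "countries.csv"
JOB_QUEUE = "postprocessing-queue"
JOB_DEFINITION = "postprocessing-job-def"


def lambda_handler(event, context):
    # Read the 70-row configuration file from S3
    obj = s3.get_object(Bucket=CONFIG_BUCKET, Key=CONFIG_KEY)
    rows = csv.DictReader(obj["Body"].read().decode("utf-8").splitlines())

    job_ids = []
    for row in rows:
        # Submit one Batch job per country, passing the country code
        # to the worker container as an environment variable
        response = batch.submit_job(
            jobName=f"postprocess-{row['country_code']}",
            jobQueue=JOB_QUEUE,
            jobDefinition=JOB_DEFINITION,
            containerOverrides={
                "environment": [
                    {"name": "COUNTRY_CODE", "value": row["country_code"]}
                ]
            },
        )
        job_ids.append(response["jobId"])

    return {"submittedJobs": len(job_ids), "jobIds": job_ids}
</code></pre>
<p>Each submitted job would run the same ECR container image; only the environment passed via <code>containerOverrides</code> differs between the 70 jobs, so the worker code stays country-agnostic.</p>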
<p><a href="https://aws.plainenglish.io/optimizing-aws-batch-workloads-concurrent-batch-jobs-execution-using-lambda-s3-and-ecr-71a2bccae435"><strong>Read More</strong></a></p>