AWS Batch Processing Goes Online
Amazon Web Services Inc. (AWS) yesterday announced that its AWS Batch service is now generally available.
"AWS Batch enables developers, scientists and engineers to easily and efficiently run hundreds of thousands of batch computing jobs on AWS," the company said. "AWS Batch dynamically provisions the optimal quantity and type of compute resources (e.g., CPU or memory optimized instances) based on the volume and specific resource requirements of the batch jobs submitted."
The service supports any job that can be executed as a command or shell script, supplied by the customer in a zip package or Docker image. Submitted jobs must specify their memory requirements and the number of virtual CPUs they need.
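As a sketch of what declaring those requirements might look like with the boto3 SDK (the queue name, job definition, and command below are hypothetical placeholders, not from the article), memory and vCPUs are passed in the containerOverrides structure of a job submission:

```python
# Sketch of assembling a Batch job submission with explicit memory/vCPU
# requirements. Queue and job-definition names are hypothetical placeholders.

def build_submission(job_name, queue, definition, vcpus, memory_mib, command):
    """Assemble keyword arguments for Batch's submit_job call.

    Memory is given in MiB and vCPUs as a whole number, matching the
    containerOverrides structure of the Batch API.
    """
    return {
        "jobName": job_name,
        "jobQueue": queue,
        "jobDefinition": definition,
        "containerOverrides": {
            "vcpus": vcpus,
            "memory": memory_mib,
            "command": command,
        },
    }

params = build_submission(
    job_name="nightly-report",
    queue="default-queue",        # hypothetical queue
    definition="report-job:1",    # hypothetical job definition
    vcpus=2,
    memory_mib=2048,
    command=["python", "report.py"],
)

# Against a real account, this dict would be passed to the SDK:
#   boto3.client("batch").submit_job(**params)
```

The call itself is left commented out so the sketch stays self-contained; only the parameter shape is the point here.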
Use cases the company cites for the Batch service span industries including financial services (high-performance computing, post-trade analytics and fraud surveillance); life sciences (drug screening and DNA sequencing); and digital media (rendering, transcoding and simplifying complex media supply chain workflows).
"With AWS Batch, there is no need to install and manage batch computing software or server clusters that you use to run your jobs, allowing you to focus on analyzing results and solving problems," the company said. "AWS Batch plans, schedules and executes your batch computing workloads across the full range of AWS compute services and features, such as Amazon EC2 and Spot Instances."
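The provisioning the company describes is configured through what Batch calls a compute environment. A minimal sketch of a Spot-backed managed environment, assuming the boto3 SDK, with the role ARNs, subnet, and security group as hypothetical placeholders:

```python
# Sketch of a managed, Spot-backed Batch compute environment definition.
# Role names, subnet and security group IDs are hypothetical placeholders.

compute_environment = {
    "computeEnvironmentName": "spot-env",
    "type": "MANAGED",                 # Batch manages the instance fleet
    "computeResources": {
        "type": "SPOT",                # bid on Spot Instances to cut cost
        "bidPercentage": 40,           # pay at most 40% of the On-Demand price
        "minvCpus": 0,                 # scale to zero when no jobs are queued
        "maxvCpus": 64,
        "instanceTypes": ["optimal"],  # let Batch pick CPU/memory-optimized types
        "subnets": ["subnet-0abc"],               # hypothetical subnet
        "securityGroupIds": ["sg-0abc"],          # hypothetical security group
        "instanceRole": "ecsInstanceRole",        # hypothetical instance role
        "spotIamFleetRole": "SpotFleetRole",      # hypothetical fleet role
    },
    "serviceRole": "AWSBatchServiceRole",         # hypothetical service role
}

# Against a real account:
#   boto3.client("batch").create_compute_environment(**compute_environment)
```

With minvCpus set to 0, the environment holds no instances while the queue is empty, which is what makes the pay-per-use model below work.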
The service itself is available at no extra charge; users pay only for the AWS resources, such as compute instances or Lambda functions, used to store and run their applications.
"You simply package the code for your batch jobs, specify their dependencies, and submit your batch job using the AWS Management Console, CLIs or SDKs," the company said. "AWS Batch allows you to specify execution parameters and job dependencies, and facilitates integration with a broad range of popular batch computing workflow engines and languages (e.g., Pegasus WMS, Cromwell and Luigi). AWS Batch efficiently and dynamically provisions and scales Amazon EC2 and Spot Instances based on the requirements of your jobs. AWS Batch provides default job queues and compute environment definitions that enable you to get started quickly."
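The job dependencies mentioned above are expressed at submission time through a dependsOn list of parent job IDs. A minimal sketch, assuming the boto3 SDK, with the job ID, queue, and definition names as hypothetical placeholders, of chaining one job after another:

```python
# Sketch of chaining Batch jobs: "transcode" runs only after the parent
# "download" job completes. IDs, queue and definition names are hypothetical.

def submission_with_dependency(job_name, queue, definition, parent_job_ids):
    """Build submit_job arguments that wait on the given parent job IDs."""
    return {
        "jobName": job_name,
        "jobQueue": queue,
        "jobDefinition": definition,
        "dependsOn": [{"jobId": jid} for jid in parent_job_ids],
    }

download_job_id = "11111111-aaaa"  # would come from a prior submit_job response
params = submission_with_dependency(
    job_name="transcode",
    queue="media-queue",           # hypothetical queue
    definition="transcode-job:3",  # hypothetical job definition
    parent_job_ids=[download_job_id],
)

# Against a real account:
#   boto3.client("batch").submit_job(**params)
```

The same dependsOn mechanism is what workflow engines such as the ones named above build on when they target Batch.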
AWS Batch is currently available only in the US East (N. Virginia) region.
David Ramel is an editor and writer for 1105 Media.