AWS Adds Compute Options for Wider Variety of Workloads
One of the AWS cloud's "superpowers," according to CEO Andy Jassy, is its speed of innovation.
Speaking during Wednesday's AWS re:Invent keynote in Las Vegas, attended by 32,000 people and streamed to an additional 50,000 viewers worldwide, Jassy noted that the cloud platform is on track to add over 1,000 new services and capabilities by year's end.
Many of these were unveiled during Jassy's keynote, including a slew of new Elastic Compute Cloud (EC2) instances to accommodate a wider range of workloads.
AWS is expanding its lineup of T2 general-purpose, burstable instances with the new t2.xlarge and t2.2xlarge, both generally available. The t2.xlarge has 16 GiB of memory and supports four virtual CPUs (vCPUs), while the t2.2xlarge has 32 GiB of memory and supports eight vCPUs.
Also generally available is the new memory-optimized R4 instance family. These instances range from the r4.large (which has 15.25 GiB of memory and two vCPUs) to the r4.16xlarge (488 GiB and 64 vCPUs). Running on Intel's Broadwell processors, the R4 instances have twice the memory and speed of the R3 instances, and are designed for business intelligence (BI), database and in-memory caching applications.
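The T2 and R4 specs above lend themselves to a simple sizing check. As a minimal sketch, the table and helper below are illustrative only (the `smallest_fit` function is not part of any AWS API), using the vCPU and memory figures quoted in the announcement:

```python
# Instance specs from the announcement: name -> (vCPUs, memory in GiB).
# Only the sizes mentioned in the article are listed here.
T2_AND_R4 = {
    "t2.xlarge":   (4,  16.0),
    "t2.2xlarge":  (8,  32.0),
    "r4.large":    (2,  15.25),
    "r4.16xlarge": (64, 488.0),
}

def smallest_fit(instances, vcpus_needed, mem_needed_gib):
    """Return the smallest instance (fewest vCPUs, then least memory)
    that satisfies both requirements, or None if nothing fits."""
    candidates = [
        (name, (vcpus, mem)) for name, (vcpus, mem) in instances.items()
        if vcpus >= vcpus_needed and mem >= mem_needed_gib
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda kv: kv[1])[0]

print(smallest_fit(T2_AND_R4, 4, 20))  # -> t2.2xlarge
```

In practice, instance selection also weighs price, network performance and burst-credit behavior, which this sketch ignores.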
Currently in preview, meanwhile, are EC2 Elastic GPUs, which enable users to bolt on GPU memory -- anywhere from 1 GiB to 8 GiB -- to their existing instances, ideal for workloads such as gaming, 3-D modeling or industrial design.
"It's super useful if you need some amount of GPU but you don't need the full GPU instances," Jassy said in the keynote.
Jassy also announced the developer preview of the new F1 instances with field-programmable gate arrays (FPGAs). Designed for workloads that require custom hardware acceleration, such as those in the fields of genomics and financial analysis, the F1 instances let users "build and write their own accelerations," Jassy said.
Currently, there are two F1 instances -- the f1.2xlarge (which has one FPGA, eight vCPUs and 122 GiB of memory) and the f1.16xlarge (eight FPGAs, 64 vCPUs, and 976 GiB of memory).
In addition to the F1 instance developer preview, AWS is also releasing an FPGA developer Amazon Machine Image (AMI) and an F1 hardware development kit (HDK).
The F1 instances are set to become generally available "in the coming weeks," Jassy said.
Finally, AWS is readying two new instance types for the first quarter of 2017 -- the I3, for I/O-intensive workloads, and the C5, based on Intel's Skylake processors, for compute-optimized workloads.
According to IDC Program Director Al Hilwa, the new EC2 instances show a sharpening focus on machine learning and "streaming computations."
"Developers will love having slices of GPU instead of paying for it all the time the app is not using it," Hilwa said in a research note. "The FPGA instances will typically be used for highly customized compute workloads typically using floating point numbers. Gaming and other types of testing applications are the biggest examples today, but the change versus a couple of years ago is the increase in image, video and audio stream processing, often done in the context of preparing data for machine learning."
Gladys Rama is the senior site producer for Redmondmag.com, RCPmag.com and MCPmag.com.