Flexible Engine Elastic Cloud Server introduces new PI2 instances for GPU-accelerated use cases

Elastic Cloud Server (ECS) is Flexible Engine’s virtual machine service, based on OpenStack, which offers a wide choice of flavors designed to run your workloads on high-performance instances at the best price.

ECS has received the following update:

Elastic Cloud Server is updated with new PI2 instances for GPU-accelerated use cases. They are equipped with NVIDIA Tesla T4 GPUs dedicated to real-time AI inference. These ECSs use the T4 INT8 compute units to deliver up to 130 TOPS of INT8 computing.

The PI2 ECS features are:

  • CPU: 2nd Generation Intel® Xeon® Scalable processors (Cascade Lake Xeon 6278 processors, 2.6 GHz base frequency and 3.5 GHz turbo frequency)
  • Up to 4 NVIDIA Tesla T4 GPUs on an ECS
  • GPU hardware passthrough
  • Up to 8.1 TFLOPS of single-precision computing on a single GPU
  • Up to 130 TOPS of INT8 computing on a single GPU
  • 16 GiB of GDDR6 GPU memory with a bandwidth of 320 GiB/s on a single GPU
  • One built-in NVENC engine and two NVDEC engines per GPU
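The per-GPU figures above scale with the number of GPUs in a flavor. As a minimal illustrative sketch (not part of any Flexible Engine API), the aggregate peak compute of a 4-GPU configuration works out as follows:

```python
# Illustrative arithmetic only: aggregate peak throughput for a PI2 ECS,
# based on the per-GPU Tesla T4 figures listed above.
SP_TFLOPS_PER_GPU = 8.1   # single-precision TFLOPS per T4
INT8_TOPS_PER_GPU = 130   # INT8 TOPS per T4
GPU_MEM_GIB_PER_GPU = 16  # GDDR6 memory per T4, in GiB

def aggregate_peaks(num_gpus):
    """Peak figures for an ECS with `num_gpus` T4 GPUs (ignoring any overhead)."""
    return {
        "sp_tflops": SP_TFLOPS_PER_GPU * num_gpus,
        "int8_tops": INT8_TOPS_PER_GPU * num_gpus,
        "gpu_mem_gib": GPU_MEM_GIB_PER_GPU * num_gpus,
    }

# The largest PI2 configuration has 4 GPUs:
print(aggregate_peaks(4))  # → 32.4 SP TFLOPS, 520 INT8 TOPS, 64 GiB GPU memory
```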

This new ECS type is available in 3 different flavors, with a vCPU-to-RAM ratio of 1:4 (GiB of RAM per vCPU).

| Flavor        | vCPUs | Memory (GiB) | Max/Assured Bandwidth (Gbit/s) | Max PPS (10,000) | NIC Multi-Queue | GPUs   | GPU Memory (GiB) |
|---------------|-------|--------------|--------------------------------|------------------|-----------------|--------|------------------|
| pi2.2xlarge.4 | 8     | 32           | 10/4                           | 50               | 4               | 1 × T4 | 1 × 16           |
| pi2.4xlarge.4 | 16    | 64           | 15/8                           | 100              | 8               | 2 × T4 | 2 × 16           |
| pi2.8xlarge.4 | 32    | 128          | 25/15                          | 200              | 16              | 4 × T4 | 4 × 16           |
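Since all three flavors scale their vCPUs, RAM, and GPUs together, choosing one is a simple lookup. The sketch below encodes the table above in a dictionary; the helper `smallest_pi2_flavor` is a hypothetical convenience for illustration, not part of the Flexible Engine API:

```python
# Flavor data taken from the PI2 table above. The helper is a hypothetical
# illustration of picking the smallest flavor that meets a workload's needs.
PI2_FLAVORS = {
    "pi2.2xlarge.4": {"vcpus": 8,  "ram_gib": 32,  "gpus": 1},
    "pi2.4xlarge.4": {"vcpus": 16, "ram_gib": 64,  "gpus": 2},
    "pi2.8xlarge.4": {"vcpus": 32, "ram_gib": 128, "gpus": 4},
}

def smallest_pi2_flavor(min_gpus=1, min_ram_gib=0):
    """Return the smallest PI2 flavor meeting the GPU and RAM requirements,
    or None if no flavor is large enough."""
    candidates = [
        name for name, spec in PI2_FLAVORS.items()
        if spec["gpus"] >= min_gpus and spec["ram_gib"] >= min_ram_gib
    ]
    # All resources scale together, so sorting by vCPU count finds the smallest.
    return min(candidates, key=lambda n: PI2_FLAVORS[n]["vcpus"]) if candidates else None

print(smallest_pi2_flavor(min_gpus=2))  # → pi2.4xlarge.4
```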

Use Cases

PI2 flavors are suited for GPU-based inference computing scenarios, such as image recognition, voice recognition, and natural language processing. The PI2 ECSs can also be used for light-load training.

Limitations

A PI2 ECS has a maximum of 4 NVIDIA Tesla T4 GPUs with 64 GiB of total GPU memory, 32 vCPUs, and 128 GiB of RAM.

PI2 ECSs support the following commonly used software:

  • Deep learning frameworks, such as TensorFlow, Caffe, PyTorch, and MXNet

Console / API and Resources

Like all other instance flavors, the new PI2 series is available through the console and the API.

More information on GPU-accelerated ECSs is available here
