NVIDIA Tesla V100S Volta-Based Graphics Card Features Higher GPU Clocks For Over 16 TFLOPs Compute, Over 1 TB/s Memory Bandwidth

Nov 26

NVIDIA has released a new variant of its Volta-based Tesla graphics card known as the Tesla V100S. The new server-aimed solution carries the same specifications as the full Volta GPU but offers much higher clock frequencies for both the GPU and memory, driving its performance beyond 16 TFLOPs in single-precision compute workloads.

NVIDIA Tesla V100S Volta GPU Brings 16+ TFLOPs and Over 1 TB/s Memory Bandwidth To Servers

In terms of configuration, the Tesla V100S uses the same GV100 GPU, which is based on the 12nm FinFET process node. The specifications include 5120 CUDA cores, 640 Tensor cores, and 32 GB of HBM2 memory. As you can tell, these are very similar specifications to the existing Tesla V100, but there are significant changes to both the GPU and memory clock speeds.


NVIDIA Tesla V100S offers higher compute and memory performance.

The Tesla V100S comes only in the PCIe form factor, yet delivers higher clocks than the 300W Tesla V100 SXM2 (NVLINK) solution. It comes with a GPU boost clock of 1601 MHz compared to 1530 MHz on the SXM2 variant, and also runs its HBM2 DRAM at a higher memory clock of around 1.1 GHz. The combined boost to memory and graphics clocks makes this Tesla variant the fastest HPC and server-aimed graphics solution.

At these clock speeds, the Tesla V100S is able to deliver a theoretical FP32 compute performance of 16.4 TFLOPs, FP64 compute performance of 8.2 TFLOPs, and DNN/DL (Tensor) compute of 130 TFLOPs. The card also pumps out over 1 TB/s of memory bandwidth (1134 GB/s) versus the 900 GB/s of the Tesla V100. The Tesla V100S comes in a 250W design and offers higher compute performance than AMD's Radeon Instinct MI60, which is based on the 7nm Vega 20 GPU architecture and delivers a maximum FP32 compute performance of 14.75 TFLOPs at a 300W TDP.
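For readers who want to sanity-check those figures, here is a minimal back-of-the-envelope sketch in Python. It assumes the usual peak-rate formulas (FP32 FLOPs = CUDA cores × 2 FLOPs per clock × boost clock; bandwidth = bus width × effective data rate), and the ~1.107 GHz HBM2 memory clock used below is inferred from the quoted 1134 GB/s rather than taken from an official NVIDIA spec sheet.

```python
# Back-of-the-envelope check of the Tesla V100S peak numbers quoted above.
# Assumes the standard peak-rate formulas; the ~1.107 GHz HBM2 clock is
# inferred from the 1134 GB/s figure, not an official NVIDIA spec.

cuda_cores  = 5120        # GV100 CUDA cores
boost_clock = 1.601e9     # Hz, Tesla V100S boost clock
bus_width   = 4096        # bits, HBM2 memory interface
hbm2_clock  = 1.107e9     # Hz, assumed HBM2 memory clock

# FP32: one fused multiply-add (2 FLOPs) per CUDA core per clock
fp32_tflops = cuda_cores * 2 * boost_clock / 1e12

# Bandwidth: bus width in bytes x two transfers per memory clock (DDR)
bandwidth_gbs = (bus_width / 8) * hbm2_clock * 2 / 1e9

print(f"Peak FP32 compute: {fp32_tflops:.1f} TFLOPs")   # ~16.4 TFLOPs
print(f"Memory bandwidth:  {bandwidth_gbs:.0f} GB/s")   # ~1134 GB/s
```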

NVIDIA Volta Tesla V100S Specs:

| NVIDIA Tesla Graphics Card | Tesla K40 (PCIe) | Tesla M40 (PCIe) | Tesla P100 (PCIe) | Tesla P100 (SXM2) | Tesla V100 (PCIe) | Tesla V100 (SXM2) | Tesla V100S (PCIe) |
|---|---|---|---|---|---|---|---|
| GPU | GK110 (Kepler) | GM200 (Maxwell) | GP100 (Pascal) | GP100 (Pascal) | GV100 (Volta) | GV100 (Volta) | GV100 (Volta) |
| Process Node | 28nm | 28nm | 16nm | 16nm | 12nm | 12nm | 12nm |
| Transistors | 7.1 Billion | 8 Billion | 15.3 Billion | 15.3 Billion | 21.1 Billion | 21.1 Billion | 21.1 Billion |
| GPU Die Size | 551 mm2 | 601 mm2 | 610 mm2 | 610 mm2 | 815 mm2 | 815 mm2 | 815 mm2 |
| SMs | 15 | 24 | 56 | 56 | 80 | 80 | 80 |
| TPCs | 15 | 24 | 28 | 28 | 40 | 40 | 40 |
| CUDA Cores Per SM | 192 | 128 | 64 | 64 | 64 | 64 | 64 |
| CUDA Cores (Total) | 2880 | 3072 | 3584 | 3584 | 5120 | 5120 | 5120 |
| Texture Units | 240 | 192 | 224 | 224 | 320 | 320 | 320 |
| FP64 CUDA Cores / SM | 64 | 4 | 32 | 32 | 32 | 32 | 32 |
| FP64 CUDA Cores / GPU | 960 | 96 | 1792 | 1792 | 2560 | 2560 | 2560 |
| Base Clock | 745 MHz | 948 MHz | 1190 MHz | 1328 MHz | 1230 MHz | 1297 MHz | TBD |
| Boost Clock | 875 MHz | 1114 MHz | 1329 MHz | 1480 MHz | 1380 MHz | 1530 MHz | 1601 MHz |
| FP16 Compute | N/A | N/A | 18.7 TFLOPs | 21.2 TFLOPs | 28.0 TFLOPs | 30.4 TFLOPs | 32.8 TFLOPs |
| FP32 Compute | 5.04 TFLOPs | 6.8 TFLOPs | 10.0 TFLOPs | 10.6 TFLOPs | 14.0 TFLOPs | 15.7 TFLOPs | 16.4 TFLOPs |
| FP64 Compute | 1.68 TFLOPs | 0.2 TFLOPs | 4.7 TFLOPs | 5.30 TFLOPs | 7.0 TFLOPs | 7.80 TFLOPs | 8.2 TFLOPs |
| Memory Interface | 384-bit GDDR5 | 384-bit GDDR5 | 4096-bit HBM2 | 4096-bit HBM2 | 4096-bit HBM2 | 4096-bit HBM2 | 4096-bit HBM2 |
| Memory Size | 12 GB GDDR5 @ 288 GB/s | 24 GB GDDR5 @ 288 GB/s | 16 GB HBM2 @ 732 GB/s / 12 GB HBM2 @ 549 GB/s | 16 GB HBM2 @ 732 GB/s | 16 GB HBM2 @ 900 GB/s | 16 GB HBM2 @ 900 GB/s | 32 GB HBM2 @ 1134 GB/s |
| L2 Cache Size | 1536 KB | 3072 KB | 4096 KB | 4096 KB | 6144 KB | 6144 KB | 6144 KB |
| TDP | 235W | 250W | 250W | 300W | 250W | 300W | 250W |

There's around a 17% increase in compute performance to be had from the Tesla V100S when you compare it with the Tesla V100 PCIe. That's a nice uplift, and the server audience would see it as a reason to upgrade. The only thing to consider here is that AMD's Instinct parts feature PCIe Gen 4.0 support, and with many major server players moving over to PCIe 4.0 platforms in 2020, NVIDIA needs to work on its own PCIe Gen 4.0 implementation, which I believe is where its Ampere GPUs come in. There's currently no word on the pricing or availability of the Tesla V100S, but expect it to cost over $6,000 US.
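As a quick illustration of where that ~17% figure comes from, a one-line calculation using the peak FP32 numbers from the spec table above:

```python
# Generational FP32 uplift of the Tesla V100S over the Tesla V100 PCIe,
# using the peak numbers from the spec table above.
v100_pcie_fp32_tflops = 14.0
v100s_fp32_tflops     = 16.4

uplift_pct = (v100s_fp32_tflops / v100_pcie_fp32_tflops - 1) * 100
print(f"FP32 uplift: {uplift_pct:.1f}%")   # ~17.1%
```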
