
NVIDIA GeForce GTX 1650 To Feature 4 GB GDDR5, 8 Gbps Memory – Priced at $179 US, Launches in April

Mar 1

NVIDIA is launching more GeForce 16 series graphics cards in the coming months, and while we covered them in a previous post, new information has surfaced on the most entry-level graphics card based on the Turing GPU architecture.

NVIDIA GeForce GTX 1650 To Feature 4 GB GDDR5 VRAM Spec’d at 8 Gbps – Priced at $179 US, Launches in April

So there are some interesting new details surrounding the GeForce GTX 1650, one of the GeForce 16 series cards to be introduced soon. The NVIDIA GeForce GTX 1650 will target the budget and low-tier segment with a price of $179 US. The sub-$200 US market is a crucial one for the green team and could help them offset the GeForce Gaming revenue losses incurred during the previous quarter.


According to the latest leak from TUM APISAK, the GeForce GTX 1650 does indeed feature the GDDR5 memory that was indicated before. The graphics card would feature 4 GB of GDDR5 VRAM operating across a 128-bit wide bus interface at speeds of 8 Gbps (2,000 MHz actual clock). This would result in a total bandwidth of 128 GB/s, a good boost over the 112 GB/s bandwidth of the GeForce GTX 1050 series graphics cards.

A 3DMark entry showing the NVIDIA GeForce GTX 1650 along with its specifications that include 4 GB GDDR5 8 Gbps memory. (Image Credits: TUM APISAK)
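The 128 GB/s figure follows directly from the bus width and the per-pin data rate. A minimal sketch of the arithmetic (the function name is our own, not from any leak):

```python
def gddr5_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8 bits-per-byte) * data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1650 per the leak: 128-bit bus at 8 Gbps
print(gddr5_bandwidth_gbps(128, 8))  # 128.0 GB/s

# GTX 1050 series for comparison: 128-bit bus at 7 Gbps
print(gddr5_bandwidth_gbps(128, 7))  # 112.0 GB/s
```

The same formula reproduces the 112 GB/s of the GTX 1050 series, so the uplift comes entirely from the faster 8 Gbps memory on the same bus width.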

Also, when it comes to clock speeds, the card reportedly operates at a 1395 MHz base and 1560 MHz boost clock. Earlier, it was reported that the base clock of the GeForce GTX 1650 would be 1485 MHz, but that could very well be a factory-overclocked variant. The card also seems to use the TU107 GPU core rather than the TU117, but that's not confirmed yet.

It will be interesting to see how many CUDA cores the card features. It is highly likely that we are looking at around 896-1024, since the GeForce GTX 1050 Ti already featured 768 CUDA cores and, with the Turing generation, each card has received a core count bump over its predecessor. The GeForce GTX 1650 should end up faster than the GeForce GTX 1050 Ti, but we will have to wait for the final specifications to provide a proper estimate that's close to the real product.

NVIDIA GeForce 16 Series Preliminary Specifications:

| | GeForce RTX 2060 FE | GeForce GTX 1660 Ti 6 GB | GeForce GTX 1660 6 GB | GeForce GTX 1660 3 GB * (TBC) | GeForce GTX 1650 4 GB |
|---|---|---|---|---|---|
| Architecture (GPU) | TU106 | TU116-400 | TU116-300 | TU116-300 | TU107/117? |
| CUDA Cores | 1920 | 1536 | 1280 * | 1280 * | 896-1024? |
| Tensor Cores | 240 | N/A | N/A | N/A | N/A |
| RT Cores | 30 | N/A | N/A | N/A | N/A |
| Texture Units | 120 | 96 | 80 * | 80 * | ? |
| Base Clock | 1365 MHz | 1500 MHz | 1530 MHz | 1530 MHz | 1395 MHz? |
| GPU Boost | 1680 MHz | 1770 MHz | 1785 MHz | 1785 MHz | 1560 MHz? |
| Memory Bus | 192-bit | 192-bit | 192-bit | 192-bit | 128-bit |
| Memory Clocks | 14 Gbps | 12 Gbps | ? | ? | 8 Gbps |
| ROPs | 48 | 48 | ? | ? | ? |
| L2 Cache | 3 MB | 1.5 MB | 1.5 MB | 1.5 MB | ? |
| PCB Number | PG160 | PG161 | PG165 | PG165 | ? |
| TDP | 160W | 120W | ? | ? | 75W? |
| Transistor Count | 10.8 billion | 6.6 billion | 6.6 billion | 6.6 billion | ? |
| Die Size | 445 mm² | 284 mm² | 284 mm² | 284 mm² | ? |
| NVLink | No | No | No | No | No |

* = To be confirmed

Talking specifically about shading performance, which will be the headline architectural feature of the Turing-based GeForce GTX cards, it looks like we are looking at an average 50% improvement in shading performance per core compared to Pascal. This is not the overall performance increase but rather the per-shader rate at which Turing improves upon its predecessor.


The new mainstream lineup should definitely let NVIDIA gain some ground in the budget market. Their GeForce RTX cards, although good products, weren't able to grab much attention in the high-end market due to higher prices and the limited support for RTX features in games at launch. The GeForce 16 series cards can, however, play a very positive role in sales on the gaming side of things for NVIDIA.

Do you think NVIDIA's GeForce GTX Turing cards could become more popular than the GeForce RTX cards in the mainstream market?