Rumour: NVIDIA’s Flagship RTX 3090 GPU Will Have A TGP Of 350W, Complete Breakdown Leaked




NVIDIA's next-generation RTX 3090 flagship GPU will have a significantly increased TGP of up to 350W. The information comes from Igor, who has been an incredibly reliable source of leaks in the past, and we have independently confirmed the same. In other words, the TGP and its breakdown are not a rumor, although we are still not 100% sure of the nomenclature (NVIDIA may decide to call it something other than the RTX 3090).

NVIDIA's Flagship GPU RTX 3090 Gets TGP Breakdown, GPU Core TDP Itself Is 230W

NVIDIA has usually played it safe with power consumption on its graphics cards for the sake of efficiency, but gamers don't usually care about efficiency. I don't know of any gamer who would rather have a power-efficient card than a more powerful one at the same price. This is why this particular leak is so interesting.

Before we go any further, however, let's contextualize this leak: the RTX 2080 Ti had a maximum TGP of 260W on some custom variants. The RTX 3090 (or whatever NVIDIA decides to call it) will have a TGP of 350W and a TDP (GPU only) of 230W. This is a massive increase in power consumption - one that will likely be accompanied by some truly phenomenal performance gains.

[Image: PG132_01.jpg]

This also explains the heatsink we saw in earlier leaks: you would need an absolutely mammoth cooling solution to tame this beast. It appears NVIDIA is planning to come out guns blazing with a chip designed to run extremely fast and output a ton of heat. Jensen is clearly missing his record quarters, and it shows. The complete breakdown of wattages (approximated to within a few watts) is given below:

 

Estimated Power Consumption / Losses
Total Graphics Power (TGP): 350 W
24 GB GDDR6X memory (GA_0180_P075_120X140, 2.5 W per module): -60 W
MOSFETs, inductors, caps, NVDD (GPU voltage): -26 W
MOSFETs, inductors, caps, FBVDDQ (framebuffer voltage): -6 W
MOSFETs, inductors, caps, PEXVDD (PCI Express voltage): -2 W
Other voltages, input section (AUX): -4 W
Fans, other power: -7 W
PCB losses: -15 W
GPU power: approx. 230 W

 

The RTX 3090's GA102 GPU core itself will draw roughly 230 watts, with the memory drawing another 60 watts. The MOSFETs and associated VRM components will consume a further 34 watts, and the fans come in at 7 watts. PCB losses are expected to be around 15 watts, and the input section will consume about 4 watts. All of this adds up to a total board power of 350 watts for the RTX 3090 - and keep in mind this is the FE. The custom boards are likely going to go even higher.
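The arithmetic in the breakdown can be sanity-checked directly. Here is a quick sketch in Python, with the figures copied from the leaked table (the labels are just shorthand for the rows above):

```python
# Sanity check of the leaked RTX 3090 power budget: TGP minus all the
# non-GPU losses should leave the ~230 W GPU core figure.
tgp = 350  # Total Graphics Power in watts

losses = {
    "24 GB GDDR6X memory (2.5 W per module)": 60,
    "NVDD VRM (GPU voltage)": 26,
    "FBVDDQ VRM (framebuffer voltage)": 6,
    "PEXVDD VRM (PCI Express voltage)": 2,
    "Other voltages, input section (AUX)": 4,
    "Fans, other power": 7,
    "PCB losses": 15,
}

gpu_power = tgp - sum(losses.values())
print(f"GPU core power: {gpu_power} W")  # 350 - 120 = 230 W
```

The losses total 120 W, which is exactly the gap between the 350 W board figure and the 230 W GPU-only figure.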

So what exactly is happening here? What you are seeing is NVIDIA unleashing all of its fury in a single product. Shifting to the 7nm node would actually lower power consumption, all things held constant, but NVIDIA clearly wants to trade away the efficiency gains from 7nm and instead roll out an absolute beast of a GPU. We have already heard rumors that they have up-specced the RTX 3080 (and pretty much the entire lineup), so users will be looking at double-digit performance gains (think 40-50%) going from the RTX 2000 series to the 3000 series.

I also have a very strong suspicion [opinion] that you are going to see NVIDIA improve performance per dollar across the board as well. The RTX 2000 series received a relatively muted reception compared to the GTX 1000 series, and I think NVIDIA will want to capitalize on the opportunity of the new process by increasing its Total Available Market (TAM).

Since Computex is canceled, NVIDIA will likely go with a virtual event again, although we expect the timeline to be unaffected (still around September). With the next-gen consoles rocking 10+ TFLOPs of compute, NVIDIA will also be feeling pressure not only to offer higher performance but to do so in a comparable price bracket. All of this is very good for the consumer. That said, I am still fairly certain the NVIDIA premium will remain higher than on Radeon counterparts - just not as high as with the RTX Turing series [/opinion].

 

 


23 minutes ago, Xbob42 said:

What kind of PSUs are you guys working with where this would affect you at all? You guys got like 90 fans hooked up something?

 

Well, actually, I do have 14 fans in my system, and 11 RGB LED components.  When I built my system I decided I was not going to do any overclocking that required voltage increases, and I was operating under the assumption that video cards would stay below 300W.  My other priority was to choose a power supply that delivered the best and most consistent voltage output, and being a thrifty person I look for bang-for-the-buck too.  So I went with a 550W Seasonic Gold unit, which can probably actually output 650W before its auto-shutdown engages or it goes out of spec.  At least it has a 10-year warranty.


RGB is some awful relic from the early 2000s, when The Fast and the Furious was a big deal. People would treat their shitty computer like it was a shitty Honda Civic and put lights all over the place. But rather than the fad dying out like it should have, LEDs just became cheaper and cheaper, and now PC builders find the same schmucks that buy Mountain Dew Game Fuel to sell their shitty RGB lighting gear and apps and shit to.


19 hours ago, Xbob42 said:

What kind of PSUs are you guys working with where this would affect you at all? You guys got like 90 fans hooked up something?


When you have two components, a CPU and a GPU, that take over 800 watts of power at full load, you need a large PSU. I’m just glad I got a great deal on a 1200-watt EVGA G2. I always raise an eyebrow when people buy PSUs that just barely make the cut for their current builds: build with headroom and upgrades in mind.

 

10 hours ago, ThreePi said:

RGB is some awful relic from early 2000s when Fast and The Furious was a big deal. People would treat their shitty computer like it was a shitty Honda Civic and put lights all over the place. But rather than the fad dying out like it should, LEDs just became cheaper and cheaper and now PC builders find the same schmucks that buy Mountain Dew Game Fuel to sell their shitty RGB lighting gear and apps and shit to.

 

2000s RGB was absolute trash, which is why it disappeared for so long. Modern RGB is vastly different and programmable: you can even create motion effects, like making the fans look like they're playing Pong with each other, or running a level of Pac-Man across them.
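As a toy illustration of that kind of programmable effect, here is a sketch of a "pong" animation on a virtual LED strip. The strip length and the text rendering are made up for the example; a real kit would push each frame through its controller's own SDK instead of printing it.

```python
# Toy sketch of a "pong" RGB effect: one lit pixel bouncing back and forth
# along a virtual LED strip, rendered as text frames for illustration.
NUM_LEDS = 10  # hypothetical strip length

def pong_frames(steps):
    """Return `steps` text frames of a pixel bouncing across the strip."""
    pos, direction = 0, 1
    frames = []
    for _ in range(steps):
        frames.append("".join("o" if i == pos else "." for i in range(NUM_LEDS)))
        pos += direction
        if pos in (0, NUM_LEDS - 1):  # reverse at either end of the strip
            direction = -direction
    return frames

for frame in pong_frames(19):
    print(frame)
```

Each frame shifts the "ball" one LED over and reverses direction at the ends; real fan-matrix effects are the same idea mapped onto a 2D grid of LEDs.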


13 minutes ago, Spork3245 said:


When you have 2 components, a CPU and GPU, that take over 800 watts of power at full load, you need large PSU. I’m just glad I got a great deal on a 1200watt EVGA G2. I always raise my brow when people buy PSUs that just barely make the cut for their current builds: build with headroom and upgrades in mind.

 

I always figured the bottom end for people building modern mid-to-high-end gaming PCs was around 750W; hearing that some folks have 550W is surprising to me. It's not that it wouldn't run most machines just fine, but your PSU doesn't draw its full rated capacity unless it's under load anyway, so there's no reason to skimp unless you're on a really tight budget.


4 hours ago, Xbob42 said:

 

I always figured the bottom end of people building modern mid-to-high-end gaming PCs was like 750w, hearing that some folks have 550w is surprising to me. Not like it wouldn't run most machines just fine, but it's also not like your PSU uses all that extra power potential if it's not under load anyway, so no reason to skimp unless you're on a real tight budget.


Even with a mid-range build, at 550W I’d be concerned even just plugging in my phone to charge if I had the CPU OCed :p

I guess if you never plan to OC it’s fine; OCing wreaks havoc on power draw. My 5960X jumps from around a 190W TDP to 300W+ with a 1GHz OC.
I think 650W is the “average” most people go for these days, but I personally wouldn’t get less than 800-850W. I agree that 750W should be the minimum for all but mini-ITX builds.
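The headroom advice above can be sketched as a rough sizing calculation. Everything here is a rule of thumb, not official guidance: the `recommend_psu` helper, the 1.3x headroom factor, and the wattage figures are all illustrative assumptions.

```python
# Rough PSU sizing: component draw, an overclock multiplier on CPU + GPU,
# plus a headroom factor, rounded up to a common retail wattage.
def recommend_psu(cpu_w, gpu_w, other_w=75, oc_factor=1.0, headroom=1.3):
    """Estimate a PSU rating from nameplate TDP/TGP figures (rule of thumb)."""
    load = (cpu_w + gpu_w) * oc_factor + other_w
    target = load * headroom
    common_sizes = [450, 550, 650, 750, 850, 1000, 1200, 1600]
    return next(size for size in common_sizes if size >= target)

# Stock 105 W CPU plus the rumored 350 W RTX 3090:
print(recommend_psu(105, 350))  # -> 750

# A heavy overclock, like the 5960X example above, changes the picture:
print(recommend_psu(140, 350, oc_factor=1.5))  # -> 1200
```

The point of the exercise is just that overclocking headroom, not idle draw, is what pushes builds out of 550-650W territory.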


If you don't overclock, 550W is more than enough.  Only the halo, top-tier video cards at stock would start to be worrisome, as they can reach 300W.   The second-rung and slower cards don't even come close.  There's no way I can afford a 3090 anyway, so I'm not really worried about my system.

 

But if you plan to do any overclocking at all, the bigger the PSU the better.


On 6/15/2020 at 5:27 AM, Spork3245 said:


When you have 2 components, a CPU and GPU, that take over 800 watts of power at full load, you need large PSU. I’m just glad I got a great deal on a 1200watt EVGA G2. I always raise my brow when people buy PSUs that just barely make the cut for their current builds: build with headroom and upgrades in mind.

 

 

2000s RGB was absolute trash, which is why it disappeared for so long. Modern RGB is vastly different and programmable: you can even make it look like things are moving such as simulating that the fans are playing pong with each other or making it look like a level of pacman is happening.

 

The problem isn't custom RGB kits where you can do stuff like what you're describing; the problem is that you have to pay a premium for gaming equipment that doesn't have gross, gaudy RGB shit built in.


8 hours ago, Jason said:

 

The problem isn't custom RGB kits where you can do stuff like what you're describing, the problem is how you have to pay a premium for gaming equipment that doesn't have gross gaudy RGB shit built in. 


Actually, the discussion was about RGB looking bad. Also, I’ve never seen something non-RGB be more expensive than the same thing with RGB. Crappy RGB is certainly cheaper than premium RGB, however. I could see a sale price on something with RGB being lower than the MSRP of a non-RGB equivalent, I guess? In which case, just buy the RGB version and turn the lighting off in software if you don’t like it.

