
CPU Voltage compared to Temperature


SubNauticaWaterWorld47 13 months ago

I've heard from a couple of YouTubers I watch that when a CPU runs cooler, it requires less voltage to run at a given frequency. Is this true, and if so, why?

Comments Sorted by:

mysticial 6 Builds 5 points 13 months ago

All the answers here so far are getting the cause-and-effect backwards. Yes, higher voltages will cause higher temperatures. But the question is whether a lower temperature will require less voltage to be stable.

I'm not an expert in this area, but I believe the physics behind this is the Temperature Coefficient.

The resistance of most materials increases with temperature. The notable exception here is semiconductors like silicon, though silicon isn't the only material in the chip. (Again, I'm unsure of the details.)

The higher the resistance, the higher the losses and the harder it is to deliver power to all the components on the chip, so a higher voltage is needed to "force it through". Lower temperatures mean lower resistance, which makes the chip more stable. Near absolute zero, some materials even reach superconductivity.

However, the differences here are unlikely to be observable with a mere 10-20 °C difference in temperature. But it does matter if you're comparing air cooling (60-80 °C) vs. liquid nitrogen (below -100 °C) or even helium (below -200 °C).
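The temperature-coefficient effect described above can be ballparked for a metal like copper with the standard linear model (a rough illustration only: the coefficient is copper's approximate value, and the reference resistance is an arbitrary made-up number, so only the ratios mean anything):

```python
# Linear temperature coefficient of resistance (TCR) model for a metal:
#   R(T) = R0 * (1 + alpha * (T - T0))
# alpha is copper's approximate coefficient; R0 is an arbitrary reference
# value, so only the relative change is meaningful. The linear model is a
# rough fit and breaks down near cryogenic temperatures.

ALPHA_COPPER = 0.0039  # per deg C, approximate
T0 = 20.0              # reference temperature, deg C
R0 = 1.0               # resistance at T0, arbitrary units

def resistance(temp_c):
    """Resistance of a copper trace at temp_c under the linear TCR model."""
    return R0 * (1 + ALPHA_COPPER * (temp_c - T0))

for t in (-196, -40, 20, 80):  # LN2 boiling point, chilled, room temp, hot CPU
    print(f"{t:>5} C: {resistance(t):.3f} (relative)")
```

Under this model a trace at 80 °C has roughly 23% more resistance than at room temperature, which is the kind of shift being gestured at here; actual on-die behavior (transistors, not just interconnect) is more complicated.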

If lower temperatures didn't increase stability, there would be no point to using helium at all since nitrogen is already overkill as far as keeping the chip from overheating.

This is separate from the so-called "cold bug" that the LN2 people run into a lot.

SubNauticaWaterWorld47 submitter 1 point 13 months ago

I think this is what I was looking for but wasn't sure how to phrase it. Thanks!

Radox-0 5 Builds 2 points 13 months ago

Just seeing this thread now thanks to Mysticial's post, and I very much agree there is truth to it. When you heat up the CPU, you increase the chance of instability. Now, with a stock CPU, the out-of-the-box speeds are low enough and the default supplied voltage is high enough that these two points rarely overlap, even when the CPU is maxed out and running hot. But it does happen when you start overclocking manually. We can see from Puget Systems' own comments: https://www.pugetsystems.com/labs/articles/Impact-of-Temperature-on-Intel-CPU-Performance-606/

With PC hardware, higher temperatures make both minor and major hardware faults much more likely. These hardware faults can result in anything from reduced performance due to minor errors needing to be corrected to data corruption or bluescreens due to more dramatic errors.

So in turn you will see people ramp up the voltages to try and overcome faults.

For myself, as an example: when overclocking, if I open the window in winter and drop the ambient temp nearly 20 degrees, and in turn have my water 15 degrees colder, I can achieve the same overclock with slightly less voltage. Not significant compared to LN2, but this is with a CPU that can easily pull 500 watts overclocked in certain loads, so minute changes can make a difference, compared with doing the same on my 8700K for example.

A nice example here, where a guy reported his findings for his CPU: with an 80-degree difference in ambient (-40 °C vs. 40 °C), he needed 1.06 V to be stable at -40 °C, while at 40 °C he needed 1.29 V in the same test: https://forums.anandtech.com/threads/relationship-between-cpu-core-temps-vcore-and-stability.263363/

Take it how you want, but others report similar things here: https://www.techpowerup.com/forums/threads/do-the-stability-of-clock-speed-of-cores-in-a-cpu-depend-on-cpu-temperature.221208/page-2


Overall, even if you do not always lower the voltage, you can also see an overall reduction in power leakage. Take Gamers Nexus's own review of Vega: same card, once with the blower cooler and once with an AIO attached, and we see nearly a 31 W difference (obviously not voltage, but it shows my point) in their testing once stock performance equalizes: https://www.gamersnexus.net/guides/2986-amd-vega-hybrid-mod-results-overclocking-liquid-vs-air/page-2
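The leakage reduction described above can be ballparked with the often-quoted rule of thumb that static leakage power roughly doubles for every ~10 °C rise in die temperature. That rule and the temperatures below are assumptions for illustration, not figures from the thread; the real behavior varies a lot by process node:

```python
# Rule-of-thumb leakage model (assumption, not from the thread): static
# leakage power roughly doubles for every ~10 C rise in die temperature.
def leakage_ratio(t_hot, t_cold, doubling_c=10.0):
    """Factor by which static leakage power grows going from t_cold to t_hot."""
    return 2 ** ((t_hot - t_cold) / doubling_c)

# Hypothetical GPU core: 85 C on a blower cooler vs. 60 C on an AIO.
print(f"{leakage_ratio(85, 60):.2f}x the leakage at 85 C vs 60 C")  # ~5.66x
```

Under that (aggressive) assumption, a die running 25 °C cooler leaks several times less static power, which is at least consistent in direction with the Vega air-vs-AIO power gap, even if the exact wattage depends on much more.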

Now, for your average user on here, I doubt you will see a difference tbh. You may see a drop if you delid an 8700K; I expect that is what the guy in the video saw, and I saw similar when my chip was delidded.

jonuk76 1 point 13 months ago

I'm not sure I'd put it like that; it sounds a bit backwards. Lower voltage should result in lower temperature. So if you have two otherwise identical CPUs, but due to the silicon lottery one operates at 4 GHz @ 1.25 V and the other runs the same 4 GHz @ 1.275 V, the former would be expected to run at a lower temperature. Simply put, at the higher voltage it is consuming more power, so more waste heat is produced.
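This point can be put in rough numbers with the standard dynamic-power relation P ∝ C·V²·f: at a fixed frequency and capacitance, the power ratio is just the square of the voltage ratio (the two voltages below are the hypothetical silicon-lottery values from the comment):

```python
# Dynamic CMOS power scales as P ~ C * V^2 * f, so at the same frequency
# and capacitance the power ratio is the square of the voltage ratio.
def relative_power(v_new, v_ref):
    """Power at v_new relative to v_ref at the same frequency (P ~ V^2)."""
    return (v_new / v_ref) ** 2

print(f"1.275 V vs 1.25 V: {relative_power(1.275, 1.25):.1%} of the power")
# -> about 4% more dynamic power (and waste heat) from the 0.025 V bump
```

This ignores static leakage, which grows faster than V², so the real-world gap between the two chips would likely be a bit larger.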

vagabond139 5 Builds 0 points 13 months ago

Is this true

No, are you sure you heard it right? If you did, then I wonder what they smoked before recording.

Temperature has absolutely nothing to do with how much voltage you need. If that were the case, then why do world-record overclocks on FX CPUs use around 2 V instead of something like 0.05 V? Temperatures do heavily depend on the voltage, though, since voltage affects the heat output exponentially.

By that logic, the CPU would be able to hit absolute zero, since less voltage means the CPU runs cooler, and since the CPU runs cooler it needs less voltage.

SubNauticaWaterWorld47 submitter 1 point 13 months ago

Here is the video I heard it from.

vagabond139 5 Builds 1 point 13 months ago

Yeah, he has no idea what he is talking about there.

Chillsabre 2 Builds 1 point 13 months ago

Lower temps just reduce the power leakage, don't they?

VivaLaPalmer 3 Builds 1 point 13 months ago

I think my 7700K has too high a voltage. I never changed anything; I think my ASUS Z270-A did it automatically. It actually turbos at 4.6, not 4.5. I think it's at 1.285 V or something. Would lowering the voltage make the temps THAT much lower?

vagabond139 5 Builds 1 point 13 months ago

Voltage affects heat exponentially, so it could make a big difference depending on how low you go.

Kakarotto 0 points 13 months ago

I've heard many people say that over the years but I never found it to be true at all.

I swap coolers a lot. For example, with my 2600K I used a Thermalright IFX-14, a Corsair H70, and a custom loop. No matter what I did, I could not go any lower than 1.345 V @ 4.8 GHz. Even with the Core i7-8700K, using a custom loop or a D14, at 4.8 GHz I need the voltage at 1.2 V. What ramps up the temps is the wattage the CPU is consuming.