Terrible optimization causing runaway CPU temps

It's not even overutilization; the game causes CPU temps that, if sustained, can lead to crashes or hardware damage.

I'm surprised a game that isn't even that graphically demanding can do this - developers are using hardware features as a crutch to avoid actually optimizing their games.

You should run the game with something that monitors CPU temps. It's not everyone, but some people, especially those who experience crashing or the game suddenly closing, are likely hitting temp spikes that force the game to shut down.
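
If you want something scriptable instead of a GUI tool, here's a minimal sketch of a temp logger you could leave running alongside the game. It assumes the psutil package is installed, and note that sensors_temperatures() only reports on Linux; on Windows you'd use something like HWMonitor instead (mentioned further down the thread). The 90º warning threshold is just my pick, not an official limit.

```python
# Minimal CPU temp logger to run alongside the game (a sketch, not a
# polished tool). Requires: pip install psutil. Temps are Linux-only;
# on Windows, use HWMonitor or similar instead.
import time
import psutil

POLL_SECONDS = 5
WARN_AT = 90  # ºC - arbitrary warning threshold, pick your own

while True:
    temps = psutil.sensors_temperatures()  # {sensor_name: [reading, ...]}
    for name, readings in temps.items():
        for r in readings:
            if r.current is None:
                continue
            flag = "  <-- SPIKE" if r.current >= WARN_AT else ""
            print(f"{name} {r.label or 'core'}: {r.current:.0f}ºC{flag}")
    time.sleep(POLL_SECONDS)
```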

Enable V-Sync. Cap your FPS at something like 60 (both background and foreground). Try changing the render API; for me, just switching from DX12 to Vulkan dropped temp spikes by about 20º. Go into Windows power management and change the maximum processor state from 100% to 99%, which disables turbo boost. This game has serious CPU issues for some users, and you could be damaging or wearing out your hardware by playing it.
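
For reference, the 99% tweak can also be done from a script instead of clicking through the Control Panel. A rough sketch, assuming Windows and an elevated prompt - PROCTHROTTLEMAX is powercfg's alias for "Maximum processor state" (you can confirm with `powercfg /aliases`):

```python
# Set "Maximum processor state" to 99% on the active power plan, which
# disables turbo boost. Sketch only: assumes Windows + admin rights.
import subprocess

for flag in ("/setacvalueindex", "/setdcvalueindex"):  # plugged in / battery
    subprocess.run(
        ["powercfg", flag, "SCHEME_CURRENT",
         "SUB_PROCESSOR", "PROCTHROTTLEMAX", "99"],
        check=True,
    )
subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)  # apply
```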

yeeeeeeeee
I'm thinking of getting a new laptop this year, and I will most definitely avoid Intel 13th and 14th gen i9 processors like the plague. They had notoriously high failure rates due to degradation from overheating. There were BIOS (UEFI now) updates which "helped", but it still wasn't fully resolved.

Currently I have a pretty old laptop with an Intel i7-9750H and a GTX 1660 Ti; it still just barely handles PoE 2 with screaming fans, but nothing fatal.
"
Currently i have pretty old laptop with Intel i7-9750H and GTX 1660 Ti, it still just barely handles poe2 with screaming fans, but nothing fatal.


Weird. I have that exact same combo, and PoE 2 ran mostly fine when I was playing, on a 3+ year old laptop that I never opened to clean out or repaste.

I don't remember now whether I was running 720p or 1080p, but I capped FPS at 45 and set almost everything to low (maybe some things to medium).

The game changer for me was creating a new power plan, capping processor usage at 95% and setting the Intel graphics option (within the power plan) to maximum battery life. I've been using this power plan for everything now (Elden Ring, etc.), because it keeps all temperatures well within the safe range for laptop components and it doesn't have any real impact on my FPS.
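
In case anyone wants to reproduce that kind of power plan without clicking through the Control Panel, here's a rough sketch using Windows' powercfg. Assumes an elevated prompt; the GUID parsing is the fragile part, and as far as I know the Intel graphics option has no documented alias, so set that one in the GUI.

```python
# Clone the built-in Balanced plan, cap the CPU at 95%, and activate it.
# Sketch only - assumes Windows and admin rights.
import subprocess

out = subprocess.run(
    ["powercfg", "/duplicatescheme", "SCHEME_BALANCED"],
    capture_output=True, text=True, check=True,
).stdout
# Output looks like: "Power Scheme GUID: xxxxxxxx-... (Balanced)"
guid = out.split()[3]  # position may vary across Windows versions

for flag in ("/setacvalueindex", "/setdcvalueindex"):  # plugged in / battery
    subprocess.run(
        ["powercfg", flag, guid, "SUB_PROCESSOR", "PROCTHROTTLEMAX", "95"],
        check=True,
    )
subprocess.run(["powercfg", "/setactive", guid], check=True)
```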

Using CPUID HWMonitor to check temperatures, maximum processor temps are around 77ºC (never above 80) and maximum GPU temps are around 82ºC.
Last edited by _rt_#4636 on Feb 23, 2025, 10:39:26 AM
Huh, maybe I will have to check this stuff. I was quite frustrated with CPU temperatures: the CPU was at 15% load and temps were hitting 85 degrees Celsius (with 0% GPU load). Then I changed the thermal paste and cleaned out the dust, and the temperature dropped from 85 to 75, which is still insanely bad for a 15% load. Maybe it's the power plan after all.

PoE runs almost OK, maybe lags in a few places, like in the Trial of the Sekhemas when mobs are popping up from the ground (I think it's the portal room, or maybe the survival room). It's just that the temperatures go nuts; I was starting to think my CPU was botched or fried. Changing power settings may still help.
Last edited by Andrius319#4787 on Feb 23, 2025, 1:43:33 PM
You might want to tweak your CPU Lite Load mode (LL), an MSI BIOS setting that controls how aggressively the board feeds voltage to the CPU. You could also go with manually setting clocks and voltages if you want to optimize fully. I've mentioned similar things in many comments on heat issues here. You'll have to do some research and testing to find what works on your setup.

A frame rate lock helps, of course. I prefer RivaTuner since it's global for all applications. It also has the fastest response time, and paired with G-Sync/FreeSync it prevents screen tearing without using V-Sync.

My 13600K at 54/40/45 (P-core/E-core/ring multipliers) and 1.3 V (a slight OC) hits 76ºC running Cinebench on air.

Both motherboard and CPU manufacturers only care that your hardware lasts for their warranty period, so they cram voltage into the components to look good on benchmarks.

As far as optimization goes, even if a modern game feels optimized, it's probably not. Developers inventing tricks and new methods of squeezing out performance ended around 2005 or so. When DLSS first came out it had two modes, upscale and downscale; the downscale mode was intended to take anti-aliasing load off the CUDA cores. It never happened. They'd rather just sell you another card.
"Never trust floating women." -Officer Kirac
"
Xzorn#7046 wrote:
You might want to tweak your CPU Lite Load mode (LL). You could also go with manually setting clocks and voltages if you want to optimize it fully. I've mentioned similar to many comments on heat issues here. You'll have to do some research and testing for what generally works on your setup.

Frame Rate lock helps of course. I prefer Riva Tuner since it's global for all applications. It also has the fastest response time without using V-Sync to prevent screen tearing with G/Free-Sync.

My 13600k at 54/40/45 1.3v (a slight OC) hits 76c running Cinnebench on air.

Both motherboard and CPU manufacturers only care that your hardware lasts for their warranty period so they cram voltage into the components to look good on benchmarks.

Far as optimization even if a modern game feels optimized it's probably not. Developers creating tricks and new methods of squeezing performance ended like 2005 or so. When DLSS first came out it had two modes. Upscale and Downscale. The downscale was intended to help take load off CUDA cores for AA purposes. It never happened. They'd rather just sell you another card.


The simple truth is, it's not about them having to develop new tips and tricks; they simply aren't optimizing games. Optimization can take tons of development time and money and delay the release of a game. Very few game companies are doing it anymore.

Usually, if someone wants to play a game like Cyberpunk and it's laggy, choppy, and otherwise an awful experience, the gamers themselves will just upgrade their system. It offloads the cost to the consumer.

It's one of the worst things about modern games these days. Plenty of them could easily get up to 50% more performance had the developers spent time and money optimizing. But they simply don't. And now Nvidia is incentivized to build AI frame generation to compensate, so that people can still have great experiences in these awfully optimized games.

There are a ton of people pointing this out over the last few years too, so it's becoming well known.
Last edited by Akedomo#3573 on Feb 23, 2025, 3:36:34 PM
"
Akedomo#3573 wrote:
The simple truth is. It's not about them having to develop new tips and tricks. They just simply aren't optimizing games. Optimization can take tons of development time, and money. Delay the release of a game. Very few game companies are doing it anymore.

Usually if someone wants to play a game like Cyberpunk, and it's laggy and choppy and otherwise an awful experience. Usually the gamers themselves will just update their system. It offloads the cost to the consumer.

It's one of the worst things about modern games these days. Plenty of them could easily get up to 50% more performance, had they spent time and money optimizing their games. But they just simply don't. And now Nvidia is incentivized to make AI frame generation to compensate, so that people can still have great experiences in these awfully optimized games.

There is a ton of people pointing this out over the last few years too. So it's becoming well known.


Well put. I honestly thought it went unnoticed by most people these days. I recall the generational improvements from the '90s, and they were just huge compared to now.

I remember early textured-polygon games doing tricks like mixing polygons with sprites to save load, and it was kinda hard to tell. Before Voodoo, most PC gamers hardly knew of 3D accelerators.

Devs simply had to squeeze out what they could because games wouldn't run well otherwise. Now they're just terribly inefficient. We had games like Duke Nukem 3D and then Unreal 1 running on the same specs.
"Never trust floating women." -Officer Kirac
