Hardware

My 360Hz monitor exposed the real bottleneck in my PC

At a glance:
- RTX 4090 paired with 360Hz monitor revealed CPU bottleneck
- Ryzen 9 5900X held GPU back at high refresh rates
- Switched to Ryzen 7 9800X3D for better balance

What happened

When I first got the RTX 4090 in 2022, I paired it with a 4K/160Hz IPS monitor because I knew the GPU was powerful enough to handle almost any game I threw at it. At that resolution, the GPU is usually the limiting factor, so I didn't have to worry about any other bottlenecks in my build. A couple of years later, though, I wanted a faster monitor purely for competitive gaming, so I picked up the Alienware AW2725DF 360Hz OLED. I went into this upgrade expecting near-perfect motion clarity and responsiveness, but I wasn't even close to maxing out the monitor's refresh rate, even in lighter titles like Valorant and Fortnite. You'd think the RTX 4090 would brute-force its way to 360 FPS at 1440p, yet my GPU usage wouldn't even stay above 80%. That's when I realized how much my Ryzen 9 5900X was holding the GPU back at higher refresh rates.

Why it matters

At very high refresh rates, your CPU matters just as much as the GPU. When you're gaming at 4K, your GPU is doing most of the heavy lifting anyway since the frame rates are generally lower, and your CPU doesn't have to work nearly as quickly to prepare frames. That's a big reason why my Ryzen 9 5900X never really stood out as a problem when I first paired it with the RTX 4090. I wasn't hitting triple-digit frame rates in most AAA titles unless I enabled DLSS upscaling or frame generation. In these scenarios, even a slightly older CPU like my 5900X is enough to keep the 4090 properly utilized. However, at 1440p, your GPU has plenty of headroom to push significantly higher frame rates, especially when you have something as powerful as the 4090. The higher the frame rate, the harder your CPU has to work to prepare and deliver frames quickly enough to keep the GPU fed. And that's when my 5900X became a problem. It couldn't prepare frames as quickly as my RTX 4090 could render them, so the GPU would end up waiting for the CPU to catch up. That's why my GPU usage wouldn't even stay above 80% despite having more performance left in the tank.
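The squeeze described above comes down to simple arithmetic: the per-frame time budget the CPU has to simulate and submit each frame shrinks as the target frame rate climbs. A minimal sketch (plain Python, illustrative numbers only):

```python
# Per-frame time budget: the window the CPU has to prepare and submit
# each frame before the GPU is left idle waiting for work.
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / target_fps

for fps in (60, 120, 160, 240, 360):
    print(f"{fps:>3} FPS -> {frame_budget_ms(fps):5.2f} ms per frame")
```

At 360 FPS the CPU has under 3 ms per frame, less than half the budget it had at 160 FPS, which is why a CPU that kept up fine at 4K can suddenly become the limiting factor at 1440p.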

CPU bottlenecks stand out more on OLED panels

One of the biggest reasons why OLED monitors are perfect for competitive gaming is their near-instant pixel response times. While some of the fastest LCD gaming monitors on the market struggle to deliver the 0.5ms response times they claim, OLEDs push numbers as low as 0.03ms without any overdrive tuning. It's why I didn't even hesitate to splurge on the AW2725DF when it came out in 2024. Motion clarity is unlike that of any IPS or TN monitor I've used because the pixels transition so quickly, making it much easier to track enemies in fast-paced titles. But OLED's unmatched motion clarity is actually a double-edged sword, because it also makes any inconsistency in performance easier to notice. There's no getting around frame pacing issues when every frame is presented with that much clarity, especially at 360Hz. On LCD monitors, a bit of motion blur and slower pixel transitions can sometimes smooth over those inconsistencies. OLEDs don't really do that, so uneven frame delivery immediately stands out. Not maxing out my GPU is one thing, but when the panel itself exposes every little hitch, it's impossible to ignore CPU bottlenecks.
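To put those response times in context, it helps to express them as a share of a single 360Hz frame. A quick sketch, using the figures quoted above (0.5ms for a fast LCD, 0.03ms for OLED) as illustrative inputs:

```python
# Share of each 360Hz frame spent with pixels still transitioning,
# using the response times quoted above (illustrative figures).
REFRESH_HZ = 360
frame_ms = 1000 / REFRESH_HZ  # ~2.78 ms per frame

for panel, response_ms in (("LCD (claimed)", 0.5), ("OLED", 0.03)):
    share = response_ms / frame_ms * 100
    print(f"{panel}: {response_ms} ms response = {share:.1f}% of a frame")
```

A 0.5ms LCD spends roughly 18% of every 360Hz frame mid-transition, while the OLED spends about 1%, which is why the panel shows each frame so cleanly that uneven frame delivery has nowhere to hide.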

You could argue I'm chasing diminishing returns

Sure, the jump from 160Hz to 360Hz isn't nearly as dramatic as going from 60Hz to 144Hz. Besides, I wouldn't have hit 200+ FPS natively in most AAA titles even if my CPU weren't a bottleneck. At some point, you're simply limited by how demanding modern games have become, and even the RTX 5090 won't max out my monitor's refresh rate at 1440p across the board. So I get why many of you see 360Hz as pointless when a 240Hz or 180Hz monitor would've delivered a similar experience. Then again, I bought this monitor for competitive gaming. I mostly play first-person shooters like Valorant, Battlefield 6, and Counter-Strike 2, where responsiveness and motion clarity matter far more to me than visual fidelity. A 360Hz monitor naturally raises the bar for what I consider smooth and responsive gameplay. And once you get used to OLED's motion clarity, microstutters and frame-time spikes start to annoy you enough to consider a new CPU.
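The diminishing returns are easy to quantify: what matters perceptually is the drop in frame time, not the jump in hertz. A short sketch comparing the two upgrades mentioned above:

```python
# Frame-time savings from each refresh-rate jump. A smaller delta
# means a less perceptible improvement per hertz gained.
def frame_ms(hz: float) -> float:
    """Frame time in milliseconds at a given refresh rate."""
    return 1000.0 / hz

for old, new in ((60, 144), (160, 360)):
    delta = frame_ms(old) - frame_ms(new)
    print(f"{old}Hz -> {new}Hz: frame time drops by {delta:.2f} ms")
```

Going from 60Hz to 144Hz shaves off roughly 9.7ms per frame, while 160Hz to 360Hz saves only about 3.5ms, despite being a larger jump in raw hertz.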

360Hz made me chase balance instead of faster GPUs

For the longest time, I treated my CPU as an afterthought whenever I upgraded my GPU, but that changed once I started gaming at ultra-high refresh rates. If there's anything I've learned, it's that you can have the fastest GPU on the market and still not get the performance you expect because something else is holding it back. Sure, you can get away with a slightly older CPU at 4K, but at higher frame rates, even your RAM can influence how consistent your performance is. Now that I've upgraded to the 9800X3D, my build is much more balanced than it used to be, but if it weren't for this monitor, I'd probably still be eyeing the 5090.

Editorial

SiliconFeed is an automated feed: facts are checked against sources; copy is normalized and lightly edited for readers.

FAQ

What caused the GPU usage to not stay above 80% at 360Hz?
The Ryzen 9 5900X was holding the GPU back at higher refresh rates because it couldn't prepare frames as quickly as the RTX 4090 could render them, leading to the GPU waiting for the CPU to catch up.
Why did the author switch to a Ryzen 7 9800X3D?
The Ryzen 7 9800X3D, with its 3D V-Cache technology, keeps far more game data in fast on-chip cache instead of fetching it from slower system RAM, creating a more balanced build that can keep up with the demands of a 360Hz monitor and high-refresh-rate gaming.
Is 360Hz worth it for competitive gaming?
The author argues that 360Hz is worth it for competitive gaming because it raises the bar for smooth and responsive gameplay, especially in first-person shooters where responsiveness and motion clarity matter more than visual fidelity.
