ASUS GeForce ROG STRIX RTX 4090 OC Edition
24 GB
The GeForce RTX 4090 is a monster in every respect. I’m surprised the graphics card is less power-hungry than its spec sheet suggests. Your wallet, on the other hand, it will absolutely devour.
Is that still a graphics card or already furniture? The ROG Strix GeForce RTX 4090 OC is the biggest GPU I’ve ever come across. Not only is its physique impressive, but its price and performance are as well. As my testing shows, it’s a quiet and not at all wasteful giant in use. Buying one, on the other hand, is another matter: at over 2000 francs, it’s damn expensive.
At 357.6 × 149.3 × 70.1 millimetres, the card takes up just under a quarter of the volume of my Meshlicious mini-ITX case. For that, it packs three 108-millimetre axial fans and a huge heatsink. Its shroud and backplate are slotted all over to ventilate the card even better. The card features two RGB lighting zones: the ROG lettering on the front and a light bar along the right edge. Overall, the card looks well made and the materials are solid. Personally, however, I’m not a fan of the ROG design. It seems too adolescent and «gamery» to me.
The RTX 4090 uses the PCIe 4.0 interface and offers two HDMI 2.1a and three DisplayPort 1.4a ports. Both connection standards support up to 7680 × 4320 pixels at 60 Hz. DisplayPort 2.1 would allow even more, but whether that would make sense is questionable: more than 60 FPS at 8K would be a challenge even for the RTX 4090. A 16-pin adapter cable and a support stand are included in the box. This keeps the card, which weighs almost 2.5 kilos, secure in your PC.
The Nvidia GeForce RTX 4090 is manufactured by TSMC on its N4 process. 16,384 CUDA cores are enabled on the AD102 chip of the RTX 4090. In addition, the card has 512 fourth-generation Tensor cores and 128 third-generation ray tracing cores. The RTX 4090 thus offers 56 per cent more cores than the RTX 3090. Its memory remains unchanged at 24 gigabytes of GDDR6X.
Nvidia still relies on PCIe Gen 4 for the interface. Power is delivered via a 12VHPWR connector. Take care to plug the cable in correctly; otherwise there’s a risk of it burning. The thermal design power (TDP) is 450 watts. Here are all the specs at a glance:
| | ASUS RTX 4090 STRIX OC | NVIDIA GeForce RTX 3090 |
|---|---|---|
| Architecture and process | Ada Lovelace AD102-300, TSMC N4 | Ampere GA102-300-A1, Samsung 8 nm |
| CUDA cores | 16,384 | 10,496 |
| TMUs / ROPs | 512 / 176 | 328 / 112 |
| Tensor / RT cores | 512 / 128 | 328 / 82 |
| Base clock | 2,230 MHz | 1,395 MHz |
| Boost clock | 2,520 MHz | 1,695 MHz |
| Memory | 24 GB GDDR6X | 24 GB GDDR6X |
| Memory bus | 384 bit | 384 bit |
| Memory bandwidth | 1,008 GB/s | 936.2 GB/s |
| TDP | 450 watts | 350 watts |
I used the following components for this review. They were provided to me by the manufacturers for testing:
The system runs on Windows 11 version 21H2 (22000.1098). I used BIOS version 0502 and enabled XMP. Otherwise I left everything at default; Resizable BAR was disabled. For the graphics card, I used driver version 526.47.
Here’s an overview of the different benchmarks:
I ran all benchmarks three times and took the best result. For the games, I used the highest possible presets. Otherwise, I left everything at default except for the resolution. I left ray tracing, DLSS and FSR deactivated. In this review, I’m looking at the rasterization performance of various games without additional tools.
Since we can’t display image galleries, I won’t list individual game results. You can download all the benchmarks here. The following charts show the arithmetic mean of the frames per second (FPS) for all nine benchmark games.
The RTX 4090 is a true beast. The difference between it and its predecessor, the RTX 3090, grows with resolution. The card delivers 15 per cent more FPS in 1080p, 28 per cent in 1440p and as much as 48 per cent in 2160p. This holds for most games; «Far Cry 6» is the outlier. Here, the 1080p difference between the RTX 4090 and RTX 3090 is just under 6 per cent. At higher resolutions, however, the gap widens to match the average. The RTX 4090 is predestined for gaming in 2160p. For lower resolutions, I’d recommend a cheaper card from a lower price range. At 1080p, the CPU is the bottleneck: even higher frame rates simply aren’t possible with the RTX 4090. Since the performance jump in 1440p is moderate compared to that in 2160p, the CPU seems to act as a bottleneck at this resolution as well.
The card’s performance advantage also shows in the percentile frametimes, although not as much as in the average FPS. Percentile values are frametimes measured in milliseconds: the intervals from one frame to the next, with statistical outliers ignored. The 99th percentile means that 99 per cent of all frames rendered faster than the stated value. At this percentile, the RTX 4090 is 23 per cent ahead of the RTX 3090 across all resolutions. At the 99.9th percentile, the figure is still just under 18 per cent.
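The relationship between frametimes, average FPS and percentiles can be sketched in a few lines of Python. The frametime values below are made up for illustration; they’re not taken from my measurements:

```python
# Illustrative frametime analysis. Benchmark tools log one frametime
# in milliseconds per rendered frame; these numbers are invented.

def percentile(values, pct):
    """Linear-interpolated percentile of a list of numbers."""
    s = sorted(values)
    k = (len(s) - 1) * pct / 100.0
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

frametimes_ms = [8.1, 8.3, 8.0, 9.2, 8.4, 15.0, 8.2, 8.5, 8.3, 8.1]

# Average FPS follows from the mean frametime: 1000 ms divided by the mean.
avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

# 99th-percentile frametime: 99 per cent of frames rendered faster than this.
p99_ms = percentile(frametimes_ms, 99)

print(f"avg FPS: {avg_fps:.0f}, p99 frametime: {p99_ms:.2f} ms")
```

Note how the single 15-millisecond stutter barely moves the average but dominates the 99th percentile. That’s why percentile values say more about perceived smoothness than average FPS does.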
3DMark’s synthetic game benchmarks render game-like scenarios. From these, the software calculates a score that indicates theoretical in-game performance. I’ll only be giving the graphics card score, as the differences in the overall score between test systems are simply too big. The chart shows the arithmetic mean of all four benchmarks. You can download the individual graphics here.
The synthetic game benchmarks paint a similar picture to 2160p gaming. The RTX 4090 scores 48 per cent higher than the RTX 3090 across all four tests. It’s one of the biggest jumps I’ve ever seen.
The Blender benchmark renders three scenes in the 3D graphics suite (version 3.3) and calculates a score for each. I’ve added these up into one final score per card.
The performance jump in Blender is enormous. My RTX 4090 achieves a 109 per cent higher score compared to the RTX 3090. The new Nvidia flagship is really flexing its muscles.
Using the Photoshop and Premiere Benchmark from Puget Systems, I ran different workloads. At the end, the benchmark calculates a score based on the reference workstation.
The RTX 4090 is lacking here. But I’m not surprised: Adobe is very slow when it comes to optimising its software for new hardware. If I ran the benchmark again in a month or two, I’d probably get a vastly different result. That’s exactly what I’ll do, so expect an update here!
The PCMark 10 benchmark tests diverse scenarios such as app loading times, spreadsheet calculations, web browsing and photo and video editing. All in all, it shows how the graphics card affects typical office work. The result is an overall score.
As expected, the graphics card has no effect on typical office work. The eight-point difference between the RTX 4090 and RTX 3090 really doesn’t matter.
The RTX 4090 rarely reaches its maximum thermal design power (TDP) of 450 watts. In games, I’ve only seen the value spike briefly. On average, it draws 340 watts in 2160p across the entire benchmark suite. The RTX 4090 is thus about as hungry as the RTX 3090 with its 335 watts. At lower resolutions, power draw should be below this value. The 1000-watt power supply recommended by Asus is therefore only necessary in exceptional cases. If you don’t overclock your system, a new 850-watt Platinum or Gold certified power supply should suffice.
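As a rough sanity check of that recommendation, here’s the back-of-the-envelope arithmetic. Only the 450-watt GPU figure comes from the spec sheet; the CPU and platform budgets are assumed round numbers for illustration:

```python
# Rough PSU headroom estimate. Only gpu_peak_w comes from the review;
# the other budgets are assumptions, not measured values.
gpu_peak_w = 450    # RTX 4090 TDP, rarely reached in practice
cpu_peak_w = 250    # assumed high-end gaming CPU under load
platform_w = 100    # assumed motherboard, RAM, SSDs, fans

total_peak_w = gpu_peak_w + cpu_peak_w + platform_w
headroom_w = 850 - total_peak_w  # against an 850-watt unit

print(f"peak draw: {total_peak_w} W, headroom on 850 W PSU: {headroom_w} W")
```

Even with every component at its peak simultaneously, which practically never happens, an 850-watt unit still has headroom under these assumptions. Overclocking or power spikes would eat into it, which is where Asus’s 1000-watt recommendation comes from.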
Temperatures in-game reach a maximum of 63 degrees. On average, I measured 53 degrees Celsius across all games on the open testbench.
The card draws more power in the Time Spy Extreme benchmark: an average of 425 watts during the GPU tests. Nevertheless, it remains relatively cool at a maximum of 61 degrees Celsius. Even at full load in FurMark for 20 minutes, the card doesn’t get warmer than 65 degrees Celsius. The fans also remain relatively quiet at a maximum of 40 dB, measured from a distance of 30 centimetres.
Speaking of noise: you won’t hear a peep from the card in idle mode, since the fans stand still. Idling, the card draws only 8 watts and stays cool at 30 degrees Celsius. When I’m browsing or watching Netflix, the card draws up to 35 watts, but doesn’t get hotter than 40 degrees Celsius. Most of the time, the fans don’t spin up.
Lastly, I measured performance during the Blender benchmark. Here, the RTX 4090 draws an average of 270 watts and reaches a maximum of 52 degrees Celsius.
It’s really fun watching my FPS counter while gaming with the RTX 4090. Even in 2160p, the numbers never get embarrassing. The card only really makes sense at this resolution anyway: it’s overpowered for lower resolutions, and the performance gain there isn’t as great as in 2160p. At this resolution, the benchmark games show a 48 per cent increase over the predecessor. In 1080p, it’s just 15 per cent.
The card can’t (yet) unleash its potential in all the tested applications. Premiere and Photoshop probably still have to be optimised; currently, my RTX 4090 system actually lags behind the RTX 3090 one. In return, Nvidia’s flagship pulverises everything seen so far in the Blender benchmark.
The test puts initial concerns about the 450-watt TDP into perspective, while the card still achieves amazing performance. The RTX 4090 can draw up to 450 watts, but rarely does. Compared to the RTX 3090, power consumption in games is about the same, averaging 340 watts in 2160p. That doesn’t change the fact that the card is quite demanding in absolute terms. Personally, I find 340 watts too much.
At launch, the ROG Strix GeForce RTX 4090 OC cost over 2300 francs. That’s about 1000 francs, or 77 per cent, more than a GeForce ROG STRIX RTX 3090 24G. A steep price hike, considering the performance increase of 48 per cent in 2160p gaming.
Of course, I have to put this result into perspective again. With DLSS 3.0, the card offers a feature that I haven’t even discussed in this review. I haven’t tested ray tracing either. With these features enabled, the performance jump compared to the predecessor would look different. I deliberately didn’t do this because I’m interested in the raw processing power in rasterized games.
The RTX 4090 isn’t a card for the faint of heart. It’s for people who need the best of the best. If you’ve got the necessary cash and enough space in your case, the ROG Strix GeForce RTX 4090 is currently one of the best gaming graphics cards out there. If you want to squeeze the very best out of hardware, I can recommend it.
From big data to big brother, cyborgs to sci-fi: all aspects of technology and society fascinate me.