I'm a fan of Intel+Nvidia, why do people opt for AMD+ATI? Is it because of the price or something else?
edit:
You also want RAM with a much lower CL; 30 is too high, especially if you're a gamer.
Older thread, I know, but some things just bother me. Came across this while researching the 7900X.
That statement is untrue; clearly the person knows nothing about how CL works. DDR4 at 3000 MT/s with CL15 has exactly the same true latency as DDR5 at 6000 MT/s with CL30: 10 nanoseconds. Here, go try it out yourself:
https://notkyon.moe/ram-latency.htm
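If you'd rather see the math than use the calculator, here's a quick sketch of it (the function name is mine, but the formula is the standard true-latency calculation):

```python
# True latency (ns) = CAS cycles / effective clock (MHz) * 1000,
# where the effective clock is half the transfer rate (DDR = two
# transfers per clock cycle).

def true_latency_ns(transfer_rate_mts: int, cas_latency: int) -> float:
    """CAS latency of a DDR kit expressed in nanoseconds."""
    clock_mhz = transfer_rate_mts / 2      # e.g. DDR5-6000 runs a 3000 MHz clock
    return cas_latency / clock_mhz * 1000  # cycles -> nanoseconds

print(true_latency_ns(3000, 15))  # DDR4-3000 CL15 -> 10.0
print(true_latency_ns(6000, 30))  # DDR5-6000 CL30 -> 10.0
```

Same 10 ns either way, which is the whole point.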
Secondly, I have not seen any DDR5 at 6000 MT/s rated under CL30. The only way you are getting lower than that is by manually overclocking, and even then you might just hit CL28. Barely any difference. The uplift from fully tuned RAM in games is generally around the 3-5% mark, and going from CL30 to CL28 is less than a 7% latency reduction. How much is 7% of, say, 5%? Firmly in "don't worry about it" territory. In other words, don't bother.
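The back-of-envelope math, with the 5% figure as an assumed best case for fully tuned RAM:

```python
# What does dropping from CL30 to CL28 actually buy you in games?
cl_gain = (30 - 28) / 30        # ~6.7% lower CAS latency
tuned_ram_uplift = 0.05         # assumption: ~5% FPS from fully tuned RAM
effective = cl_gain * tuned_ram_uplift

print(f"{effective:.2%}")       # about a third of one percent
```

A fraction of a percent, i.e. within run-to-run noise in any game benchmark.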
And some other comment mentioned Nvidia. Here is my experience.
In order to get any output from the card with the proprietary drivers, I had to modify the GRUB command line; otherwise, black screen. Not off to a great start. Then I had to modify two or three other files just to get it working in games. And I might add, two of the very few games I played crashed instantly after loading in. Wonderful experience... Not to mention Nvidia always skimps on VRAM, which led to one of my games running out of VRAM and shutting down, so I had to play it with FSR upscaling and lowered settings. The game looked jaggedy as... it was a bad experience.
What did I have to do when I swapped in AMD in place (without reinstalling my Linux distro)? I removed all the Nvidia-specific "fixes", installed AMD's open-source drivers, turned off the PC, and installed the card. Everything worked. It still does, a year later, without my modifying a single line anywhere. The one exception: to try overclocking I had to add the "featuremask" parameter to the GRUB command line. I got the card to pull 50 extra watts, it made almost zero difference in games, and I reverted to default clocks. That is where it has been sitting since.
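For anyone curious, the "featuremask" thing is the `amdgpu.ppfeaturemask` kernel parameter. Roughly what the edit looks like (the `quiet splash` part is just a typical default; check the amdgpu docs before enabling every bit):

```shell
# /etc/default/grub -- add the featuremask to the kernel command line.
# 0xffffffff enables all PowerPlay features, including manual OC controls
# exposed via sysfs.
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash amdgpu.ppfeaturemask=0xffffffff"

# Then regenerate the GRUB config and reboot:
#   sudo grub-mkconfig -o /boot/grub/grub.cfg
```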
OOTB experience can't be beat.
These articles about Nvidia open-sourcing their driver get it backwards. What Nvidia opened is their kernel modules, a small slice of the stack; the userspace driver is still proprietary, and much of the actual logic now lives in the closed GSP firmware blob. They could hardly have open-sourced less. And the open kernel modules existed a year ago too; I tried them on my Nvidia card, they were bad, so I never ran them. They might be better now; Nvidia was in the starting blocks then. The only thing that has changed is that Nvidia now defaults to them on all supported cards. Minimal change.
Still never buying Nvidia.