I ordered a new desktop PC - 2024

My current desktop PC was getting old by my standards, and it was already several generations behind when I bought it. I decided to buy a new desktop PC so that it would be more future-proof and I could keep upgrading it for longer, hopefully.

4000D AIRFLOW Tempered Glass Mid-Tower ATX Case
ROG STRIX 1000W Gold Aura Edition
ROG STRIX B650E-F GAMING WIFI
AMD Ryzen™ 9 7900X
ARCTIC Liquid Freezer II 420
VENGEANCE® 32GB (2x16GB) DDR5 DRAM 6000MT/S CL30 AMD EXPO Memory
980 PRO w/ Heatsink PCIe® 4.0 NVMe™ SSD 2TB
NITRO+ AMD Radeon™ RX 7900 XTX Vapor-X 24GB

What do you think of my new setup, will it be able to run Linux?

Don't know. Is the CPU Intel compatible?

If yes, then it's OK.
 


I do have an i7-13700K; it runs at over 87°C most of the time. It's easy to post high temps: you could put a computer under heavy load and call that the average temp. But trust me, I have had plenty of Intels run hot over the years.
Heh. Reminds me of my old Pentium 4, from 15 years ago! That was a non-HT 2.8 GHz with a 400 MHz FSB; a Tjmax of only around 67°C, and it seemed to run low-to-mid 60s most of the time. But those things were darned near indestructible... they seemed to thrive on heat; the hotter they got, the better they seemed to perform...

A-yup; happy days!


Mike. :p
 
Heh. Reminds me of my old Pentium 4, from 15 years ago! That was a non-HT 2.8 GHz with a 400 MHz FSB; a Tjmax of only around 67°C, and it seemed to run low-to-mid 60s most of the time. But those things were darned near indestructible... they seemed to thrive on heat; the hotter they got, the better they seemed to perform...

A-yup; happy days!


Mike. :p
I remember the Intel Pentium D 820 Smithfield.
Pentium D Smithfield processors ran so hot they were labeled flamethrowers.
They were two Pentium 4 processors fabricated on one die in an LGA 775 package.


A-yup; happy days! Agreed.
 
I remember the Intel Pentium D 820 Smithfield.
Intel were playing catch-up back then; AMD, with the Athlon X2 and the 64-bit Sempron, were miles ahead.
 
Intel were playing catch-up back then; AMD, with the Athlon X2 and the 64-bit Sempron, were miles ahead.
AMD Athlon X2 processors in the old days were the bee's knees, to borrow a @Brickwizard expression.

My first game tower used an unlocked AMD Athlon X2 Black Edition processor; I overclocked the hell out of it without any overheating concerns.

My apologies to @f33dm3bits for being a bit off topic.
 
Heh. Reminds me of my old Pentium 4, from 15 years ago! That was a non-HT 2.8 GHz with a 400 MHz FSB; a Tjmax of only around 67°C, and it seemed to run low-to-mid 60s most of the time. But those things were darned near indestructible... they seemed to thrive on heat; the hotter they got, the better they seemed to perform...

Interestingly enough, if you look at the specs for the i9 and the Ryzen 7:

Ryzen 7: Max. Operating Temperature (Tjmax) 89°C
i9: Tjmax 100°C

I guess that could be taken two ways. It runs hotter, but that's OK, because it "can" run hotter.
 
I guess that could be taken two ways. It runs hotter, but that's OK, because it "can" run hotter.
Tjmax is the temperature at which the CPU will start to throttle its performance in order to avoid damage.
It is not a measure of what temperature the CPU will run at.

In your example, the Ryzen 7 can handle heat up to 89°C before it starts taking performance hits, while the i9 can handle up to 100°C.
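
(If you want to see how close your own chip runs to its throttle point, here is a minimal sketch using Python's psutil on Linux; the Tjmax value is just a placeholder to replace with your CPU's documented figure.)

```python
# Minimal sketch: print current CPU temperatures and headroom to Tjmax.
# Assumes Linux with sensor readings exposed; requires the psutil package.
import psutil

TJMAX_C = 95.0  # placeholder: substitute your CPU's documented Tjmax

def report_cpu_temps(tjmax: float = TJMAX_C) -> None:
    temps = psutil.sensors_temperatures()
    if not temps:
        print("No temperature sensors found.")
        return
    for chip, readings in temps.items():
        for r in readings:
            label = r.label or chip
            print(f"{label}: {r.current:.1f}°C "
                  f"({tjmax - r.current:+.1f}°C of headroom to Tjmax)")

if __name__ == "__main__":
    report_cpu_temps()
```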
 
I'm a fan of Intel+Nvidia; why do people opt for AMD+ATI? Is it because of the price or something else?

edit:
You also want RAM with a much lower CL; 30 is too high, especially if you're a gamer.

Older thread, I know, but some things just bother me. I came across this while researching the 7900X.
This statement is untrue. Clearly the person knows nothing about CL. DDR4 at 3000 MT/s with CL 15 has the same latency as DDR5 at 6000 MT/s with CL 30: 10 nanoseconds. Here, go try it out yourself: https://notkyon.moe/ram-latency.htm

Secondly, I have not seen any DDR5 at 6000 MT/s that has under CL 30. The only way you are getting lower than that is by manually overclocking, and even then you might just hit CL 28. Barely any difference. Tuned RAM generally makes around a 3-5% difference in games. Now, going from 30 to 28 is less than a 7% change. How much is 7% of, let's say, 5%? About 0.35%, squarely in "don't worry about it" territory. In other words, don't bother.
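
(For anyone who wants to check that arithmetic: true CAS latency in nanoseconds is CL divided by the memory clock, and the clock in MHz is half the data rate in MT/s. A tiny sketch:)

```python
# CAS latency arithmetic: latency_ns = CL cycles / memory clock (MHz),
# where the clock is half the data rate because DDR transfers twice per cycle.
def cas_latency_ns(cl: int, data_rate_mts: int) -> float:
    return cl * 2000.0 / data_rate_mts

print(cas_latency_ns(15, 3000))  # DDR4-3000 CL15 -> 10.0 ns
print(cas_latency_ns(30, 6000))  # DDR5-6000 CL30 -> 10.0 ns
print(cas_latency_ns(28, 6000))  # DDR5-6000 CL28 -> ~9.3 ns, barely different
```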


And some other comment mentioned Nvidia. Here is my experience.
In order for me to get any output from the card with the proprietary drivers, I had to modify the GRUB command line; otherwise, black screen. Not off to a great start. Then I had to modify 2-3 other files just to get it working in games. And I might add, 2 of the very few games I played crashed instantly after loading into the game. Wonderful experience... Not to mention Nvidia always skimps on VRAM. This also led to one of my games running out of VRAM and shutting down. So I had to use FSR upscaling (rendering at a lower internal resolution) to play that game, and lowered settings. The game looked jaggedy as... it was a bad experience.

What did I have to do when I switched to AMD in place? (I did not reinstall the Linux distro.) I removed all the Nvidia-specific "fixes", installed AMD's open-source drivers, turned off the PC, installed the card, and everything worked. It still does, a year later. I did not have to modify a single line anywhere. I did try overclocking; for that I needed to add something to the GRUB command line, the "featuremask" thing. I got the card to pull 50 extra watts, it made almost zero difference in games, and I reverted to default clocks. And that is where it has been sitting since.
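
(For reference, the "featuremask" thing is the amdgpu.ppfeaturemask kernel parameter. Here is a rough Python sketch of how you might check whether the overdrive/overclocking bit is currently enabled; the sysfs path and the bit value are assumptions based on the upstream amdgpu driver, so verify against your kernel's headers.)

```python
# Rough sketch: read amdgpu's ppfeaturemask from sysfs and test the overdrive
# bit that gates manual overclocking. Path and bit value are assumptions based
# on the upstream amdgpu driver; verify against your kernel's amd_shared.h.
from pathlib import Path

MASK_PATH = Path("/sys/module/amdgpu/parameters/ppfeaturemask")
PP_OVERDRIVE_MASK = 0x4000  # assumed overdrive bit

def overdrive_enabled() -> bool:
    mask = int(MASK_PATH.read_text().strip(), 0)  # handles decimal or 0x-prefixed
    return bool(mask & PP_OVERDRIVE_MASK)

if __name__ == "__main__":
    if MASK_PATH.exists():
        print("overdrive unlocked:", overdrive_enabled())
    else:
        print("amdgpu module not loaded")
```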

OOTB experience can't be beat.

These articles about Nvidia open-sourcing their driver are just wrong. The only thing they open-sourced is their GPU firmware, which is like 5% of the code. They literally couldn't have open-sourced less. And it was open source a year ago too; I tried it on my Nvidia card. It was bad, so I never ran it. It might be better now; Nvidia was in the starting blocks then. The only thing that has changed is that Nvidia now does it on all cards. Open-source GPU firmware. Minimal change.

Still never buying Nvidia.
 
Older thread, I know, but some things just bother me. I came across this while researching the 7900X.
This statement is untrue. Clearly the person knows nothing about CL.
Secondly, I have not seen any DDR5 at 6000 MT/s that has under CL 30.
Yeah, I knew nothing about CL for DDR5 and I acknowledged that later in my post #16; thanks for not reading the thread further.
DDR5 is the latest standard and low-CL DDR5 RAM does not exist yet; that's why you can't find it.
My RAM is still DDR4, where CL matters, and I had no clue this is no longer the case for DDR5.

And some other comment mentioned Nvidia. Here is my experience.

I've read the rest of your post but I don't have these problems; I have an over-500GB games library and ALL games run just fine with my Nvidia GTX 1650, without any glitches.

Things have changed with Nvidia, btw. I'm not going to debate with you except to re-post a video which @f33dm3bits shared with me today. I highly suggest you watch it (it's not long); it shows how people in the Linux community can be stubborn and fail to keep track of how the tech progresses:



Still never buying Nvidia.
I was considering going for an AMD GPU one day, but after watching the video there is no way, as Nvidia is obviously superior. I still haven't tested these suggestions, though, because my PC is busy for the next 5 hours.
 
Things have changed with Nvidia, btw. I'm not going to debate with you except to re-post a video which @f33dm3bits shared with me today. I highly suggest you watch it (it's not long); it shows how people in the Linux community can be stubborn and fail to keep track of how the tech progresses:
 
Yeah, I knew nothing about CL for DDR5 and I acknowledged that later in my post #16; thanks for not reading the thread further.
DDR5 is the latest standard and low-CL DDR5 RAM does not exist yet; that's why you can't find it.
My RAM is still DDR4, where CL matters, and I had no clue this is no longer the case for DDR5.
I was researching something else, so of course I was skim-reading. Usually that means a couple of the first comments and a couple of the last. So I am sorry I missed the post.
But CL still matters. I don't want CL 36 or 40 at 6000 MT/s; that is crap-tier RAM. The tables have turned: Samsung does NOT make the best RAM anymore, they make some of the worst DDR5. I think it's all Hynix A-die these days, possibly M-die, if you care even a little about performance and an optimized system on AMD.
On that topic, my brother got CL 36 RAM... OMG!!! In my head I was thinking: "Why didn't you ask me?" He went for the color of the RAM and the RGB instead. Luckily it doesn't matter as much, since he has an X3D chip, which is not that reliant on RAM; the huge L3 cache works wonders in gaming.

I've read the rest of your post but I don't have these problems; I have an over-500GB games library and ALL games run just fine with my Nvidia GTX 1650, without any glitches.

The 2 games that would crash for me on a 2080 were Forza Horizon 5 and Cyberpunk 2077. I tried all the switches I could find on ProtonDB; same result regardless. 500 gigs these days isn't much. I just looked at 3 games I have installed, ESO, Sniper Elite 5 and Starfield: total size of the 3 games is 340 gigs. So I could fit one more game and I would be right there at 500 gigs. The thing is, I don't play many games, but the games I do play, I play for 500-1500 hours. So if even one doesn't work, it is a huge blow to me. I buy like 1 game a year, sometimes 2.

Things have changed with Nvidia, btw. I'm not going to debate with you except to re-post a video which @f33dm3bits shared with me today. I highly suggest you watch it (it's not long); it shows how people in the Linux community can be stubborn and fail to keep track of how the tech progresses:




I was considering going for an AMD GPU one day, but after watching the video there is no way, as Nvidia is obviously superior. I still haven't tested these suggestions, though, because my PC is busy for the next 5 hours.

I have to be honest, I skimmed the video and did not see a single graph or number. "Superfast" means nothing to me without comparisons. Yeah, CUDA is everywhere; everyone knows that. I also don't use DaVinci, so it is not of any relevance to me.
I hope Nvidia gets better, I do. But I don't see that happening before they open-source the drivers. We'll see what NVK brings.
But I also don't like to tinker with my system every time I launch a new game. I do enough of that almost daily; gaming is my time away from tinkering. I need a break, and AMD provides that. I don't have to use a single launch command. Sometimes I do, just for MangoHud, but that is not AMD-specific.
 
@CaffeineAddict

Nvidia works great in Linux if you own a newer Nvidia graphics card.

With an older Nvidia graphics card that requires the Nvidia 340.108 proprietary graphics driver, Nvidia doesn't work.

The last Linux kernel that supports the Nvidia 340.108 graphics driver is kernel 5.4; any Linux kernel newer than 5.4 will not support the 340.108 driver.

Therefore Nvidia doesn't always work, as some of us don't use the latest and newest hardware and prefer to use what we already have working.
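
(If you want a quick way to see whether your running kernel falls within that range, here is a rough sketch; the 5.4 cutoff is just the figure quoted above and the version parsing is simplified.)

```python
# Rough check: is the running kernel no newer than the 5.4 series that the
# legacy 340.108 driver is said (above) to support?
import platform

LAST_SUPPORTED = (5, 4)

def legacy_340_kernel_ok() -> bool:
    major, minor = (int(p) for p in platform.release().split(".")[:2])
    return (major, minor) <= LAST_SUPPORTED

print(platform.release(), "-> 340.108-era kernel:", legacy_340_kernel_ok())
```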

Nouveau, the accelerated open-source driver for Nvidia cards, has come a long way but still ain't the Nvidia proprietary graphics driver.

Windows OSs still support the older Nvidia graphics drivers, which is why Linux will never be on top, and that's fine by me.

I like that Linux is low on the totem pole; that way Linux remains unpolluted.

I'm not trying to start any arguments with anyone.

My apologies to @f33dm3bits for my ranting.
 
500 gigs these days isn't much. I just looked at 3 games I have installed, ESO, Sniper Elite 5 and Starfield: total size of the 3 games is 340 gigs.
The 2 games that would crash for me on a 2080 were Forza Horizon 5 and Cyberpunk 2077.
We play very different games; there are 72 games in 427GB here, installers only.

I no longer play AAA shooters like in my 20s; those are all big nowadays and very demanding. I play strategy games, which are at most 15GB each, or as small as 1GB.

Nvidia works great in Linux if you own a newer Nvidia graphics card.

With an older Nvidia graphics card that requires the Nvidia 340.108 proprietary graphics driver, Nvidia doesn't work.

The last Linux kernel that supports the Nvidia 340.108 graphics driver is kernel 5.4; any Linux kernel newer than 5.4 will not support the 340.108 driver.

Therefore Nvidia doesn't always work, as some of us don't use the latest and newest hardware and prefer to use what we already have working.
Understood, making old hardware useful, but the 340 driver, that's really old.
 
Understood, making old hardware useful, but the 340 driver, that's really old.
So are my Nvidia cards that were purchased for Windows XP and Vista.

The cards actually worked great on Linux when the Linux kernel still supported the Nvidia 340.108 driver, so I'm blaming the Linux developers.

Nvidia still offers the 340.108 driver, although Linux kernels above 5.4 no longer support it.

People can say what they want; it's the Linux developers who are at fault.
 
People can say what they want; it's the Linux developers who are at fault.
Not fully. After a certain amount of time Nvidia GPUs aren't supported by Nvidia anymore, so the newer drivers won't work with those older cards, and because the older drivers aren't updated for those cards anymore, they can't be used with new kernels either. So the older cards can only be used with older kernels and old Nvidia drivers, or with the open-source Nouveau driver.

The YouTuber whose videos I linked is a gamer and he hangs around Linux communities where other people game on Linux as well. Generally speaking, most gamers aren't into keeping GPUs around for more than 5-6 years, which is around how long Nvidia supports GPUs, depending on the GPU. So his audience is mostly people who don't care about hardware preservation but about having a good gaming experience on Linux, and when it comes to gaming in the Linux communities there is still a trend going around that Nvidia is a bad choice for gaming on Linux and for Wayland.

The opposite is true with recent Nvidia GPUs and the most recent driver updates (555.x) that have come out, but a lot of people still get crapped on for mentioning anything with an Nvidia GPU. Even with my previous Nvidia GPU, gaming was already good except for some Wayland problems that remained an issue before the 555 driver. The trend is still just to blame Nvidia, because Nvidia has been getting the blame ever since the famous quote of Linus Torvalds, which I won't repeat here. However, things are changing and Nvidia is making steps towards being more open source, which we should be grateful for.

Yes, not everything is open source yet, but we should be grateful they are making steps towards becoming more open source, and it will take time. Can we just be positive for once instead of cracking down on everything that isn't open source enough? Yes, I am even guilty of that at times and I need to do the same.
So I think we should make an effort to be more open-minded and positive towards the steps Nvidia is taking to become more open source and to improve the driver experience on Linux and Wayland, instead of always defaulting to "Nvidia is crap on Linux". Times are changing and so should our attitude, because a negative attitude helps no one, not even new or potential new Linux users, and it doesn't help the bad/toxic reputation that the Linux community already has.
 
I don't have any problem with Linux or Nvidia.
I'm a cheap old bastard and don't want to buy new hardware.

As long as Windows still supports my old Nvidia graphics cards, my flight simulators work fine.

I can do anything I need to do using Linux on any old junk / spare parts / thrift store / garage sale / desktop with Intel inside or AMD APUs.

As long as the Windows XP desktops work I'll continue using them for Flight Simulators.

I just like flying airplanes, and since I'm too old to fly real airplanes, Flight Simulator X is as close as it gets. :D
 
For me it's all about reference. In the old days, on the PC market, we only had Intel, so to speak, and these days Intel still exists and is still sort of king... so why even bother trying something else? OK, Intel is more expensive... it usually is, for better quality. If I count the time I used my, whatever, 1000-euro CPU, it's probably 0.00005 cents per minute. That would be 0.00004 if using AMD.

Similar with Nvidia: it was the most used for so long, and still is... why bother? In the old days, when games came in boxes, we would look at the graphics specs, and back then too Nvidia was the best reference. So, same thing: why go for something less, to save almost nothing? It doesn't seem to be worth the effort.

Surely both Intel and Nvidia make bad things and do bad things. AMD doesn't? I now hear about that issue with 13th and 14th gen Intel... so I know what not to buy.
 
Yes, I only use Linux in a corporate context; we don't do anything but upgrade kernels as distributed by the vendor, nothing else.
 
