I ordered a new desktop PC - 2024

I decided I would buy a new desktop PC so that it would be more future-proof and I'd be able to keep upgrading it for longer.

Very close to mine.
 


What motherboard and CPU does yours have?

MB: MSI X670E Gaming Plus WiFi

I'm not really much of a gamer, but it seems all the motherboards have "gaming" in their names.

Edit: You said "and CPU"

CPU: AMD Ryzen 7 7800X3D
 
I'm not really much of a gamer, but it seems all the motherboards have "gaming" in their names.
My server system has a gaming motherboard as well.

CPU: AMD Ryzen 7 7800X3D
I thought you didn't game. From what I've read, the 3D versions of the Ryzen CPUs are officially gaming CPUs.
 
Don't forget to slather it in RGB!

That looks like a nice system. I should probably upgrade but I just can't justify it. Everything I have works and is reasonably modern - meaning also reasonably efficient. Efficiency is important to me as I quite like this rock we're stranded on. Even my cheap refurbished experimental computer does more than I ask of it without any notable lag. Though, I should cram an NVME M.2 SSD into it.
 
I thought you didn't game. From what I've read, the 3D versions of the Ryzen CPUs are officially gaming CPUs.

They are the fastest CPUs, period.


...and about $250 cheaper than an i9-14900.

I was hoping to get more cores. I got this as a Newegg bundle, so I could have been more patient, but I feel I got a good price on everything.

On top of that, Intel may not be as competitive as some benchmarks suggest; they were recently caught "cheating".

 
I have a similar reason for going with Intel: AMD CPUs were getting hotter than Intel's. I don't know if this is still true, but it was true when I was using AMD, and I like a cool PC so much that I never went back to AMD; I've only had one once in my life.
@CaffeineAddict :-

I gotta confess, this is one statement I've never understood. For me, the opposite was true.

For years, I ran a very elderly Compaq Presario desktop PC, which had been assembled from a final batch of components by Compaq's own staff just prior to the HP buyout in 2003/4. It originally came with a single-core Athlon 64 3200+, which I later upgraded to an Athlon 64 X2 3800+ dual-core (these would run in the same Socket 939, along with a newer BIOS to accommodate them).

Even in the house, during the winter with the central heating running, at boot I would often briefly see 19°C on the temp indicator before it steadily climbed into the mid-to-high 20s. After a couple of hours normal activity - mostly browsing - it would stabilise at around the mid-to-high 30s.

I've always done a fair bit of video-editing. On one particularly memorable occasion, she hit 52°C after a very long render at 1080p under OpenShot.....which if memory serves, was the hottest I ever saw her go. Even during a really hot UK summer, she usually averaged mid-40s.....

It didn't really surprise me, given the enormous size of the stock K8 cooler. Still, horses for courses, I guess; we ALL have different experiences. Mine must have just been a good one.

@Brickwizard :-

Just outta curiosity, how much did you spend on your 6000+? I was trying to find one, but at the time I bought my X2 3800+ off eBay (for all of £8, some nine years ago now), what few X2 6000+'s I could find were going for 3 figures plus.....silly money, given they were already pretty elderly chips at that time.

@atanere :-

Oh that's summat I never could get my head around. This current HP Pavilion desktop of mine has a combination BlueTooth/wireless chip. Wireless.....on a desktop? Like, what's THAT all about? o_O


Mike. ;)
 
@MikeWalsh

From what I hear in discussions with people in real life, they agree with me, but this AMD-hotter-than-Intel issue applies only to specific CPUs from some 10 years or so back. I think it applies to CPUs from before Ryzen came out; the issue was fixed with the Ryzen CPUs and it appears to no longer be present, but I can't confirm this because I haven't used AMD for more than 10 years.
 
I gotta confess, this is one statement I've never understood. For me, the opposite was true.

Same here. Although, I confess, I am water-cooling my two AMD systems.

(screenshot: system temperatures)
 
Easy to get into temp wars here. But I will concede that some run cooler than others.

(screenshot: system temperatures)
 
I must confess, I'd heard so much about Intel chips running hot enough to fry an egg, but this Pentium 'Gold' G5400 - running @ 3.7 GHz - has quite surprised me. Here are the current temps in gKrellM.....in a centrally-heated room at a comfortable 23°C:-

(screenshot: gKrellM temperature readout)


Which I'm quite happy with.....and that 44°C for the GPU is with the in-kernel nouveau driver. And this is a "passive-cooler"....even this has rarely - if ever - topped 50°C. Even when rendering with OpenShot, which can offload to the GPU if one's available and set-up with the official drivers.
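If anyone wants to watch the same numbers without a monitor app, here's a minimal Python sketch that reads the kernel's hwmon interface (the same sensor data gKrellM and lm-sensors display). It's only a sketch: it assumes the standard /sys/class/hwmon layout on a reasonably modern kernel, and the chip names and labels will vary from board to board.

Code:
#!/usr/bin/env python3
"""List every hwmon temperature sensor the kernel exposes."""
from pathlib import Path

def read_temps():
    """Yield (chip, label, degrees Celsius) for each temp*_input file found."""
    for hwmon in sorted(Path("/sys/class/hwmon").glob("hwmon*")):
        chip = (hwmon / "name").read_text().strip()          # e.g. "coretemp", "nouveau"
        for temp_input in sorted(hwmon.glob("temp*_input")):
            label_file = hwmon / temp_input.name.replace("_input", "_label")
            label = (label_file.read_text().strip()
                     if label_file.exists()
                     else temp_input.name.replace("_input", ""))
            millideg = int(temp_input.read_text().strip())    # kernel reports millidegrees C
            yield chip, label, millideg / 1000.0

if __name__ == "__main__":
    for chip, label, celsius in read_temps():
        print(f"{chip:12} {label:16} {celsius:5.1f} °C")

The divide by 1000 is just because the kernel reports the values in millidegrees Celsius.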

(shrug...)


Mike. ;)
 
But I will concede that some run cooler than others.
Not bad; your i5, when converted, runs at about 30°C, which is even better than mine. But it all depends on how long it's been running: if you've just turned your PC on, 30°C is normal, but it should gain another 5 or 10 degrees at idle.

I also recently re-pasted my CPU; that's why it's cooler than it was. It used to run at around 40°C.

No need for any temp wars; honestly, I don't know what temperatures AMD CPUs run at today.
 
@CaffeineAddict :-

@MikeWalsh

From what I hear in discussions with people in real life, they agree with me, but this AMD-hotter-than-Intel issue applies only to specific CPUs from some 10 years or so back. I think it applies to CPUs from before Ryzen came out; the issue was fixed with the Ryzen CPUs and it appears to no longer be present, but I can't confirm this because I haven't used AMD for more than 10 years.
Yah, figures; I think you're probably talking about the Bulldozer/Piledriver/Excavator architectures.....from what are often referred to as the AMD "wilderness" years. I knew somebody who used to run an octa-core FX 8300+; even with a massive Noctua air-cooler, he was always moaning about how hot the thing ran.....well over 80°C, most of the time.


Mike. ;)
 
I knew somebody who used to run an octa-core FX 8300+; even with a massive Noctua air-cooler, he was always moaning about how hot the thing ran.....well over 80°C, most of the time.

I do have an i7-13700K; it runs at over 87°C most of the time. It's easy to post high temps: you could put a computer under heavy load and call that its average temp. But trust me, I have had plenty of Intels run hot over the years.
 
Why do people opt for AMD+ATI? Is it because of the price, or something else?
Why do people buy Chevrolet?
Why do people buy Chrysler?
Why do people buy Dodge?
Why do people buy Ford?

Because it's their choice to buy what they like and want.
 
I'm a fan of Intel+Nvidia. Why do people opt for AMD+ATI? Is it because of the price, or something else?

I have always preferred AMD to Intel. It goes back a long way, to when Intel started making their own chips and they were unreliable [before this, AMD and others made them under contract or licence for Intel]. I was also using ATI graphics long before the AMD takeover; in those days they were not the cheapest, but IMO they were, again, the most reliable.

But it's horses for courses; you say toma'toe and I say tomar'toe.
In the olden days of single-core processors and overclocking, the choice was always AMD.
AMD was preferred because its chips overclocked better without the additional heat that came with Intel's NetBurst architecture.

I never had any issues with ATI Radeon / AMD graphics cards or Nvidia graphics cards back in my Windows OS days of gaming.

Nowadays, using Linux, I run from Nvidia and stay as far away from it as I can.
In my experience, Nvidia and Linux are a bad combination.

Too many problems with new Linux kernel releases supporting the Nvidia proprietary driver.

These days I use OEM desktops with all Intel inside and have no problems.

@f33dm3bits, that looks to be a great box for any purpose.
 
Too many problems with new Linux kernel releases supporting the Nvidia proprietary driver.
I don't find any of these problems to be an obstacle to using Nvidia.

Honestly, I'm new to permanent use of Linux; it's only been a month since I dumped Windows, but I've managed to solve a ton of problems already, including Nvidia issues.

The first of the only two real issues I had with Nvidia was that I had to install a newer kernel, which wasn't that hard thanks to the backports repo in Debian.
I'm now using the latest driver and it all works fine. I'm pretty much happy with Nvidia and see no valid reason to switch to ATI other than the open-source driver, but IMO GPU performance and quality are more important than an open-source driver; one buys a GPU for its stats rather than its driver.

The other real issue I had with Nvidia on Linux was playing 32-bit games; it took me some time to figure out that the 32-bit libc needs to be installed in order to install the 32-bit driver libraries.
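For anyone who hits the same 32-bit wall, here's a small Python check I'd run before pulling in the 32-bit driver libraries. It's only a sketch for a Debian-style multiarch setup: the dpkg call and the /lib/ld-linux.so.2 loader path are my assumptions, and on Debian you'd typically enable 32-bit support with dpkg --add-architecture i386 and then install libc6:i386.

Code:
#!/usr/bin/env python3
"""Sketch: is this Debian-style system ready for 32-bit (i386) libraries?"""
import shutil
import subprocess
from pathlib import Path

def i386_multiarch_enabled() -> bool:
    """True if i386 is registered as a foreign dpkg architecture."""
    if shutil.which("dpkg") is None:
        return False                                  # not a dpkg-based system
    out = subprocess.run(
        ["dpkg", "--print-foreign-architectures"],
        capture_output=True, text=True, check=True,
    ).stdout
    return "i386" in out.split()

def libc32_installed() -> bool:
    """True if the 32-bit libc dynamic linker is present (assumed path)."""
    return Path("/lib/ld-linux.so.2").exists()

if __name__ == "__main__":
    print("i386 multiarch enabled:", i386_multiarch_enabled())
    print("32-bit libc installed: ", libc32_installed())

If both come back True, the 32-bit driver libraries should install without the wall I ran into.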

Other than this, no issues and no reason to dump Nvidia.
If you're dumping Nvidia solely because the driver isn't open source, then it's a bad decision, because, as I said, you're buying a GPU for its name and stats, not because of the driver.

And by the way, I bought yet another brand-new Nvidia card two weeks ago, even though I'm now a Linux user, and despite any issues, because there are none.
 
