Has my (what used to be top-of-the-line) PC really gotten THAT old?

BigBadBeef

The most recent news from the enthusiast end of the hardware market is quite depressing - horrible pricing, abysmal power efficiency, coolers so big they look downright disgusting, and you need to spend as much on aftermarket cooling as you did on the hardware itself.

This is a shameful display! I am quite convinced that I am going to skip one generation.


Back in 2015 I built myself a PC that was a work of art in terms of balancing the performance - a Xeon E3-1241 v3 paired with a GTX 980. I remember that back then I made several vendors blush, since what I asked for was in the top 10 percent of available hardware. They couldn't get it. I mean they could, but at a substantial price markup, since they had to order it from Germany. So I ordered it myself.

Now it's 2022, and the problem is that the two are an ideal pairing for each other. Whatever GPU I upgrade to will be bottlenecked by the CPU... very badly! I checked the newest hardware that could fit a PCIe 3.0 slot, and even the midrange options are bottlenecked by the CPU at over 50% in some cases.

So has my PC really gotten THAT OLD?

I probably wouldn't have asked myself that question if I had kept playing the newest titles throughout the years... yet I find my interests drifting toward the indie market, where hardware requirements are on the gentler side.
 


2015 is, to my mind, middle-aged. Your E3-1241 v3 is still a pretty quick 4-core, 8-thread processor, though a little heavy on the wattage compared to the latest ones. If you hunt around, you could get an increase in speed by picking up an E3-1281 v3. Unless you're gaming, for a Linux box I would personally stick with it for a few more years.

 
2015 is, to my mind, middle-aged. Your E3-1241 v3 is still a pretty quick 4-core, 8-thread processor, though a little heavy on the wattage compared to the latest ones. If you hunt around, you could get an increase in speed by picking up an E3-1281 v3. Unless you're gaming, for a Linux box I would personally stick with it for a few more years.

Agreed, but in order to remain relevant for those years, or at the very least to wait out one more hardware generation, I would need to pair it with an RTX 2060 Super... or at least the regular RTX 2060; anything less would not be a meaningful upgrade. Yet even in that pairing, the bottleneck is calculated at over 50% across a wide range of applications, including games. (A sketch of where such a number could come from follows below.)
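For the curious, here is one naive way a number like that could be produced - this is my assumption about how those online calculators work, not their actual method, and the numbers are hypothetical:

```python
# Naive bottleneck model (an assumption, with hypothetical numbers).
# The CPU caps how many frames per second it can prepare; the GPU
# caps how many it can render. Actual fps is the smaller cap, and
# the "bottleneck" is the share of the faster part left idle.
def bottleneck(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    return 1 - min(cpu_fps_cap, gpu_fps_cap) / max(cpu_fps_cap, gpu_fps_cap)

# Hypothetical numbers: an aging quad-core preparing ~80 fps of draw
# calls for a GPU that could render ~170 fps on its own.
print(f"{bottleneck(80, 170):.0%} of the GPU sits idle")  # ~53%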

Would 200 MHz on the boost clock make that much of a difference? A new one sells for the devil's price, and I would not wager on buying used, even if Xeons are by design more robust than a Core i7.
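Napkin math says no. Assuming the usual spec-sheet clocks for both chips (3.5/3.9 GHz for the E3-1241 v3, 3.7/4.1 GHz for the E3-1281 v3), the best case is just the clock ratio, since the core count and architecture are identical:

```python
# Best-case uplift from swapping an E3-1241 v3 for an E3-1281 v3.
# Clocks are the spec-sheet values (assumed; check Intel ARK) --
# same Haswell core, same 4C/8T, so the clock ratio is the ceiling.
base_gain = 3.7 / 3.5 - 1    # base clock: ~5.7%
boost_gain = 4.1 / 3.9 - 1   # boost clock: ~5.1%
print(f"base: +{base_gain:.1%}, boost: +{boost_gain:.1%}")
```

Roughly a 5% ceiling, before considering that real workloads rarely scale even with clock speed.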
 
those specs can still run a lot of software, including games
 
those specs can still run a lot of software, including games
Even back in the day, it was considered a futureproof 1080p 60 Hz card that could also do 4K back when that was new tech. While this still applies, I now have a 1440p 144 Hz monitor.

I bought it as a prelude to splurging big bucks on new hardware. Imagine my disappointment when the new generation that was put forth turned out to be such an embarrassment.

Yet, to my own detriment, I am too good at my job. My computer is perfectly balanced: however I upgrade it, the CPU becomes the bottleneck, and its only upgrade option is the one listed above, barely a poke in the right direction in terms of performance.

I'll think of something... eventually.
 
Since when have computer parts been cheap... there's always an excuse to put the price up.
 
Since when have computer parts been cheap... there's always an excuse to put the price up.
Around the year 2000 I put one together for about 300 USD -- with a whopping 512 MB of Corsair RAM! Just to play GTA3.
 
My first writable CD drive cost about a grand in the nineties. A few weeks later, it was on sale for a few hundred bucks (and mechanically quite different) which made me a bit annoyed. Prices have dropped considerably since then.
 
My first writable CD drive cost about a grand in the nineties. A few weeks later, it was on sale for a few hundred bucks (and mechanically quite different) which made me a bit annoyed. Prices have dropped considerably since then.
I'd bet the write speeds were pretty atrociously slow? Prices are better in terms of how ubiquitous computing power is, though I'd argue the disappointment factor is about the same as it's always been.
 
I'd bet the write speeds were pretty atrociously slow? Prices are better in terms of how ubiquitous computing power is, though I'd argue the disappointment factor is about the same as it's always been.

Oh, I think it was 1x, or maybe 2x? I really don't recall the details. It used heat to erase and write to the CD; I remember that much.

(I've been an early adopter for all sorts of tech. This is usually a bad idea.)
 
(I've been an early adopter for all sorts of tech. This is usually a bad idea.)
That all depends on the size of your bank account and level of enthusiasm... new technology in general tends to be sold at rip-off prices; 1 TB+ SSDs used to sell for thousands of dollars. It was stupid of me, IMO, to buy a C64 sometime between 2015 and 2019. At the time I was kind of an ignorant computer enthusiast, but at least I learned a lot about buying antiques off the internet...
 
That all depends on the size of your bank account and level of enthusiasm...

Oh, the real reason it's a bad idea is that you end up with obsolete tech, because you're buying it before you know if others will buy it - thus providing a market for it. Think Betamax. It was superior technology to its rivals, but ended up losing the format war with VHS. If you bought it, just a few years later they'd stop releasing new/good movies on the format, leaving you with a fairly useless device if you wanted to remain current. It hasn't really got anything to do with how much money you have, 'cause you can't really force them to make more Betamax content.
 
Oh, the real reason it's a bad idea is that you end up with obsolete tech, because you're buying it before you know if others will buy it - thus providing a market for it. Think Betamax. It was superior technology to its rivals, but ended up losing the format war with VHS. If you bought it, just a few years later they'd stop releasing new/good movies on the format, leaving you with a fairly useless device if you wanted to remain current. It hasn't really got anything to do with how much money you have, 'cause you can't really force them to make more Betamax content.
But nonetheless, Betamax was supposedly better than VHS, and at least it now has value as an antique.
 
and at least it now has value as an antique.

LOL, I just checked eBay and they're pretty much worthless compared to what they cost. So, they lost significant value. If you factor in inflation, they cost (I got mine on sale) about $2700 in today's money. You can take your pick of players for $100 on eBay. Never mind the hassle of keeping one safe for 45 years.
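For anyone who wants to redo that math, it's just a CPI ratio. A minimal sketch, assuming approximate BLS CPI-U annual averages and a sale price picked to land near my $2700 estimate (the actual nominal price is my assumption):

```python
# Inflation adjustment by CPI ratio: nominal * (CPI_now / CPI_then).
# CPI-U annual averages are approximate (BLS); the $560 sale price
# is an assumed figure chosen to land near the $2700 quoted above.
cpi_1977 = 60.6
cpi_2022 = 292.7
nominal = 560
adjusted = nominal * cpi_2022 / cpi_1977
print(f"${nominal} in 1977 is about ${adjusted:,.0f} in 2022 dollars")
```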
 
LOL, I just checked eBay and they're pretty much worthless compared to what they cost. So, they lost significant value. If you factor in inflation, they cost (I got mine on sale) about $2700 in today's money. You can take your pick of players for $100 on eBay. Never mind the hassle of keeping one safe for 45 years.
I guess I'm just trying to say that value is always subjective. I can tell myself I'm dumb for buying a C64 in the hyper-computer age, but at least I learned something from a terrible decision...
 
I can tell myself I'm dumb for buying a C64 in the hyper-computer age, but at least I learned something from a terrible decision...

I'm pretty sure that's the exact opposite of 'early adopter'. It's fun from a tech/geek view, but often a bad financial decision. The field of tech is littered with the bodies of failed formats and bad decisions.

There's more abandonware than there is maintained software, at least as evidenced by GitHub's stats. Something like 70% of all the code on GitHub hasn't been updated in three years - and that stat is a few years old.

These days, not for financial reasons but for headache reasons, I wait for tech to be adopted by enough people that I think I can use it for some years to come. I still sometimes do the early-adopter thing, but I'm much more selective. I used SSDs early on, for example.
 
I'm pretty sure that's the exact opposite of 'early adopter'. It's fun from a tech/geek view, but often a bad financial decision. The field of tech is littered with the bodies of failed formats and bad decisions.

There's more abandonware than there is maintained software, at least as evidenced by GitHub's stats. Something like 70% of all the code on GitHub hasn't been updated in three years - and that stat is a few years old.

These days, not for financial reasons but for headache reasons, I wait for tech to be adopted by enough people that I think I can use it for some years to come. I still sometimes do the early-adopter thing, but I'm much more selective. I used SSDs early on, for example.
I do agree that a strictly early-adopter approach will leave you unable to pay for the stuff you need, which is why I aggressively avoid it. The disappointment of buying a new iPad pretty much ended my enthusiasm for new technology.
 
The disappointment of buying a new iPad pretty much ended my enthusiasm for new technology.

I had the first gen. It was mostly only used for watching videos. Then it no longer got updates and the YouTube app stopped working. Then YouTube via the browser stopped working. Eventually, I could still watch videos - if I found them in something like 3gp ("filmed on a potato") quality.

So, I avoid that sort of stuff now. That took place over just a few years. Planned obsolescence sucks. I hate that my phone stopped receiving updates. It only got updates for a few years, and that's probably because I bought it new. I should probably replace it, but it still works and I don't do anything all that secure on it. I figure anything I do on that phone is public knowledge as I'm sure someone's scraping my data.
 
I still like new tech; I just got disgusted by the latest generation of nVidia graphics cards.

However, it just occurred to me that there's a paradox in asking the Linux community about new tech!
 
