If you think of buying nvidia 4000 series - don't, better wait

rado84 (Well-Known Member, joined Feb 25, 2019)
If you're thinking of buying an Nvidia 4000-series card - don't; better to wait for the 5000 series. Or, if you don't want to wait, go back to the high end of the GTX series or early RTX (2060/2070/2080).

 


I follow a Linux YouTuber who's huge on gaming, video editing, etc. He prefers Nvidia as well and has never complained about this. The only difference is he uses an RTX 4070. The most recent Nvidia GPU I've used is an RTX 3090, and this wasn't an issue then either, so maybe try the 555 driver - I heard that one was still good.

It seems your post was deleted - it's not viewable anymore?
 
 


Idk, my RTX 4090 24GB does me just fine? People are so quick to point the finger, seldom considering that, well... they might be the problem. But I'm sure they're a very smart individual, and it's completely out of the realm of reality that they could have misconfigured something - and because they misconfigured it repeatedly, it couldn't possibly be their fault.

I get that this sounds totally "I use Arch....BTW" of me, but I literally had to lol at this one.

People buy NVIDIA, then when they have to configure Wayland, they cry about it. Instead of crying about it, install X11 and just use your freakin' machine 0_o. X11 did the job, does the job, and will still do the job as it's intended to be done.

The Arch Wiki literally has everything dude would need....

People want to OC all their hardware and then get reaaaaaaaaally upset when it doesn't go as well as they'd like it to. I have to giggle.

OC? But why? For an extra fly's fart of render speed?
Again, the 4090 works perfectly fine, and performs very well; unless you break it.

My Steam game library is MASSIVE, and my 4090 chews it up like hotcakes.
I can edit and render using DaVinci Resolve in a timely manner without issue.
I can use any desktop environment that I wish without issue.
 
Personally, I've used X11 because I'm ignorant and don't like change. But out of curiosity I looked up an article on configuring NVIDIA + Wayland, and it's really not that complicated.

 
@AlphaObeisance
I checked your CPU/GPU combo and it seems your CPU bottleneck is a bit high:

My combo gives me 15% CPU bottleneck so I should upgrade my CPU.

I guess @rado84's combo is even worse, but until he shares his system info we can only speculate.

Yeah, I found this out after the fact lol - I was a bit more excited to buy new hardware than to do my homework at the time lol! It gets me by.

I run two 3440x1440 displays @ 100 Hz, a Wacom Cintiq 22" drawing tablet, and a 4K TV off it all lol. I can see the bottleneck here and there, but it's negligible at best in my experience. Still pulling 200+ FPS in most games on ultra settings. REALLY eager for S.T.A.L.K.E.R. 2 to release!
:p

 
Just for funzies I played around with Wayland and got it working without much issue. I guess it's supposed to be "more secure" and "perform better than X11", but at this time I really don't see what all the fuss is about. Depends on how much I wanna dive under the hood, I suppose.

The splash screen animates more smoothly, but that's about all I've noticed lol.
 
You should experience no issues, since Arch delivers the latest software, so major issues get resolved quickly.
The story is different for me, since I'm stuck with Debian's stale packages.

It also depends on which programs you use; not all play well with Wayland.
In my case my password safe doesn't work and VSCode also has issues under Wayland. Those two are essential to me, so Wayland is not an option right now.
 

I opened up VS Code and it seemed to do fine. Honestly, I've used X11 for the past 3 years I've been on Arch and it's "just worked" out of the box - literally no configuration needed. I'll use Wayland a bit, but if I don't see any substantial difference I'll likely swap back over to X11, cuz, well, change lol.

The only issue I had was getting the kernel modules to load properly; after that was fixed (literally just editing a single configuration file), it seemingly works fine now. Though I did notice that even though my resolution is back where it's supposed to be, nvidia-settings yields minimal options, whereas on X11 it gave me 100% control over all my displays, performance mode, etc. All of those options are seemingly gone despite having DRM enabled. Probably just some willy-nilly config I've missed. Doesn't matter too much - it just means my Wacom tablet has a bit more contrast than necessary.
 
I opened up VS Code and it seemed to do fine.
Try scrolling in the diff editor and you should see screen tearing. It happens randomly, so you'll need a large diff to reproduce it.
Other than that, VSCode is doing fine.
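For what it's worth, VSCode normally runs through XWayland; if the tearing is XWayland-related, forcing Electron's native Wayland backend sometimes helps. Treat this as a sketch - exact flag support varies by VSCode/Electron version:

```shell
# Launch VS Code with the native Wayland backend instead of XWayland
# (supported in recent Electron builds; availability varies by version):
code --ozone-platform=wayland

# Or let Electron pick the backend based on the session type:
code --ozone-platform-hint=auto
```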

The only issue I had was getting the kernel modules to load properly; after that was fixed (literally just editing a single configuration file), it seemingly works fine now.
Yes, you need to enable DRM modeset, otherwise the session might not start.
It took me a while to figure out why I couldn't log into Wayland.
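For anyone else hitting this, a sketch of the usual way to enable DRM modeset (file path and initramfs command vary by distro; this assumes the proprietary driver):

```shell
# Tell the Nvidia driver to enable DRM kernel mode setting:
echo "options nvidia-drm modeset=1" | sudo tee /etc/modprobe.d/nvidia-drm.conf

# If your distro bakes the module into the initramfs, regenerate it
# (e.g. mkinitcpio -P on Arch, update-initramfs -u on Debian), then reboot.

# Verify after reboot; "Y" means modeset is active:
cat /sys/module/nvidia_drm/parameters/modeset
```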

I'd also suggest using the proprietary Nvidia driver, because the open driver isn't fully complete yet - it's a work in progress.
 

I'd also suggest using the proprietary Nvidia driver, because the open driver isn't fully complete yet - it's a work in progress.
I'm a lazy man, so I use the proprietary drivers by default - with X11 they "just work". I'm sure admitting this will have me burned at the stake, but 'Murica.

I did discover that the Jellyfin media player doesn't work right on Wayland; I haven't bothered looking for a Wayland package (if there is one).

EDIT: Seems to be a standing issue (2021): https://github.com/jellyfin/jellyfin-media-player/issues/9
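Since jellyfin-media-player is a Qt app, one common workaround (untested here, and the binary name may differ per distro) is forcing Qt's X11 backend so the player runs through XWayland instead:

```shell
# Force the Qt xcb (X11) platform plugin so the player runs via XWayland:
QT_QPA_PLATFORM=xcb jellyfinmediaplayer
```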
 
It wasn't bottlenecking, cuz this teleportation problem only happened on movies and TV shows.
I bought a GTX 1080 Ti 11GB, which according to all the sites SHOULD bottleneck the CPU even more, but it doesn't. Both run in perfect sync on games and TV shows. The conclusion is simple: the 4060 sucks. Hell, I even just reread the recommended requirements for a new game that was released 8 days ago - it requires a 2060.
 

@rado84
I'm just trying to think this through with you: GPUs have gotten bigger over the years and draw more power, so this generation of GPU generates more heat than your GTX 1660 and previous generations. The first thing I would try is getting a new PC case with better airflow.

I haven't heard @AlphaObeisance complaining about idle temps, and he has a 4090, which has the same max operating temps. I wonder which PC case he is using for his setup - it would be interesting to see what idle temps he is getting.

My RX 7900 XTX idles at around 45C on average, and under load during gaming (e.g. Cyberpunk 2077, Dead Space, Borderlands) it goes to around 65-75C depending on the game. From what I can find, the normal operating temperature for my GPU is 70-75C, and it can handle higher temps as well. I have a full tower case for good airflow - it might be worth looking into a new PC case with better airflow, to see if that brings down both the idle temps and the operating temps during gaming.

This is the full tower pc case I have.
 
I haven't heard @AlphaObeisance complaining about idle temps, and he has a 4090, which has the same max operating temps. I wonder which PC case he is using for his setup - it would be interesting to see what idle temps he is getting.

Did someone say temps? Lemme ask my shell
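(For the curious, "asking the shell" on an Nvidia card is presumably something like a quick `nvidia-smi` query - a sketch, not necessarily the exact command used here:)

```shell
# One-shot temperature / utilization / power readout on Nvidia cards:
nvidia-smi --query-gpu=temperature.gpu,utilization.gpu,power.draw --format=csv

# Or keep an eye on it while gaming or rendering:
watch -n 2 nvidia-smi
```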

I'm running a bunch of crap right now and she's chilling contentedly at 51 degrees Celsius, and if I were to jump into a game and run Cyberpunk 2077, I'd be cranking 100+ FPS (100 Hz monitor) on ultra settings and MIGHT break 68 degrees Celsius.

Add on the fact that I pretty much never leave this machine powered off. I may restart it, but it runs pretty much 24/7. It's my daily driver, my workhorse, my gaming goddess. She treats me well.

I have 10 fans: 3 up front, 3 on the GPU, 3 up top on the water cooler, and the back exhaust fan. I also keep it CLEAN (looks at all the dust collected on the front/top)... cough... I mean... usually.

EDIT: I've been running Stable Diffusion generating 2.5K-res images for like 12 hours straight now... forgot to mention that lol. One would think my GPU would be screaaaaaaaaaming, and while it's hot in my studio, the GPU temps are vibin' lol. #AllBoutDatAirflow

 
Did someone say temps? Lemme ask my shell
Yes!

I'm running a bunch of crap right now and she's chilling contentedly at 51 degrees Celsius, and if I were to jump into a game and run Cyberpunk 2077, I'd be cranking 100+ FPS (100 Hz monitor) on ultra settings and MIGHT break 68 degrees Celsius.
At what resolution - I'm guessing 4K? But it seems that if you can hold an operating temperature of around 51C in games, then @rado84 should be able to as well. So that would be my first thought: rado84 should try getting a case with better airflow for the Nvidia RTX 3060, to see if that improves the operating temperature during gaming.

I have 10 fans: 3 up front, 3 on the GPU, 3 up top on the water cooler, and the back exhaust fan. I also keep it CLEAN (looks at all the dust collected on the front/top)... cough... I mean... usually.
I took out all the fans in my case and am only using the AIO water cooler and the GPU's own fans, since my case has good airflow - so it's nice and quiet.
 
At what resolution - I'm guessing 4K? But it seems that if you can hold an operating temperature of around 51C in games, then @rado84 should be able to as well. So that would be my first thought: rado84 should try getting a case with better airflow for the Nvidia RTX 3060, to see if that improves the operating temperature during gaming.


I took out all the fans in my case and am only using the AIO water cooler and the GPU's own fans, since my case has good airflow - so it's nice and quiet.

My primary display is 3440x1440 (2.5K "?"). But I have two 3440x1440 monitors, a 4K TV, and a Wacom Cintiq 22" tablet all running simultaneously; not sure if that makes a difference, but I'm going to assume it does - even if you're only gaming on one monitor, the GPU is still rendering stuff on the other screens (because I multitask even when gaming ;) ).

100% on the case with better airflow. My wife has my old 3080; I can check temps on that under load later today.
 
I was thinking of buying a 4xxx card, but my old computer (with a GeForce 2080) keeps working great. OK, I can't get 160 or 240 FPS, but it's still more than enough for me, so that's OK. I could spend some money on it, so I would go for the fastest 4xxx, but I... just need a new machine along with it.
It was some time ago, I guess back when I was still watching Linus Tech Tips, that he was - rightfully so - complaining about the high prices of the 4xxx cards.
But no worry: a 4090 or a good 5xxx card - either will cost a lot and either will do. There's no discussion about Nvidia for me.
 

Yeah, I think at the time I bought my 4090 I paid $2,200 USD. Insanity. But it works lol. Passed the 3080 to the wife's rig, and her old 2060 Super to my boy.
 

