GPUs & CPUs: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

My new rig was making a little annoying coil whine, so I thought for a minute about whether I'd done anything wrong with the build, and then it hit me. I put a 6+2/6+2 y-splitter on the damn 3090 FE, putting all of its load on a single cord. Ah, shit. That's too much wattage for one cable. So, I swapped it out for two separate cords running from the PSU to the card's adapter. Much better.

3090 FEs have an adapter that turns two 8-pin inputs into a single 12-pin coming into the side of the card, like so.

[Attached photo of the 2x8-pin to 12-pin adapter]

When I saw the waterfall effect scrolling across the Corsair RGB RAM, I felt stupid that I hadn’t picked up a windowed case. So, I got a window for it. And a lighting kit for the 5v addressable header coming off the mobo, because at that point, why not go all in?

The performance is everything I wanted and then some. The RTX 3090 can run RDR2 with everything on Ultra at 5120x1440 on the 32:9 ultrawide at an average of 90 frames per second. It makes SLI utterly pointless.
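For what it's worth, 5120x1440 is actually fewer pixels than 4K, which helps explain the framerate. Quick sanity check on the numbers:

```python
# Pixel load at the post's 32:9 ultrawide resolution vs. 4K UHD.
uw = 5120 * 1440               # ultrawide: 7,372,800 pixels
uhd = 3840 * 2160              # 4K UHD:    8,294,400 pixels
ratio = uw / uhd               # ~0.89 -> the ultrawide pushes ~11% fewer pixels
pixels_per_second = uw * 90    # pixel throughput at the reported 90 fps average

print(ratio, pixels_per_second)
```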

On the CPU, I noticed my idle temps were 38 to 45C, 60 to 70C under load, and 86C max. I had to run the fan on that poor Noctua NH-U12S on a turbo profile to keep it cool. At first, I thought I'd picked too small a cooler, but as it turns out, Ryzen 9 5900Xs run hot and there's basically nothing you can do to stop them. It doesn't matter if you have a low-profile cooler or a 360mm AIO. It will run hot. Everyone posts temps in the same general range no matter what heat sink they have. So the NH-U12S with the iPPC fan is actually plenty to run at stock clocks. Still have to fiddle with the fan profiles some more to try and tame the noise, though.
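A fan profile is really just a piecewise-linear map from temperature to fan duty, which is what the BIOS curve editor is doing under the hood. A rough sketch of the idea (the curve points here are made up, not my actual profile):

```python
# Hypothetical fan curve: (temperature in C, fan duty in %) points.
# Between points the duty is linearly interpolated; outside them it clamps.
CURVE = [(40, 30), (60, 50), (75, 80), (85, 100)]

def fan_duty(temp_c, curve=CURVE):
    """Return the fan duty (%) for a given CPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum duty
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: full blast
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

Taming the noise mostly comes down to flattening the middle of the curve so short temperature spikes don't spin the fan up.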


Next up, to try Crysis and Star Shitizen.

My 3090 FE has quite a bit of coil whine as well, and it's gotten more noticeable since I put a waterblock on it. It isn't that loud, but it is kind of annoying. Also, what do you think of the Samsung G9? I've thought about getting one, but I'm not sure if it would be worth the price.
 

Drain Todger

Unhinged Doomsayer
True & Honest Fan
My 3090 FE has quite a bit of coil whine as well, and it's gotten more noticeable since I put a waterblock on it. It isn't that loud, but it is kind of annoying. Also, what do you think of the Samsung G9? I've thought about getting one, but I'm not sure if it would be worth the price.
+Amazing picture in games that support ultrawide resolutions.
+Good multi-tasking potential in picture-by-picture mode.
+Looks amazing; built-in lighting in the back and a built-in wireway for a clean look.
+Built-in USB hub for your mouse and keyboard.
-Large and heavy; needs substantial desk space.
-VA panel with mildly grainy text and so-so color reproduction; not the best thing for word processing or extensive reading unless you blow your text up to at least 150%.

I prefer IPS panels for their really precise picture, to be honest.
 

RightToBearBlarms

The Red Lobster Cheddar Biscuit of People
My new rig was making a little annoying coil whine, so I thought for a minute about whether I'd done anything wrong with the build, and then it hit me. I put a 6+2/6+2 y-splitter on the damn 3090 FE, putting all of its load on a single cord. Ah, shit. That's too much wattage for one cable. So, I swapped it out for two separate cords running from the PSU to the card's adapter. Much better.

On the CPU, I noticed my idle temps were 38 to 45C, 60 to 70C under load, and 86C max. I had to run the fan on that poor Noctua NH-U12S on a turbo profile to keep it cool. At first, I thought I'd picked too small a cooler, but as it turns out, Ryzen 9 5900Xs run hot and there's basically nothing you can do to stop them. It doesn't matter if you have a low-profile cooler or a 360mm AIO. It will run hot. Everyone posts temps in the same general range no matter what heat sink they have. So the NH-U12S with the iPPC fan is actually plenty to run at stock clocks. Still have to fiddle with the fan profiles some more to try and tame the noise, though.

I have my Asus TUF 3090 and 5950X in an open loop with two 360mm radiators and that little motherfucker will hit 80C in no time flat while the goddamn GPU never passes 55. The first few days I couldn't get over the feeling that I was doing something wrong but from all my searching it looks like that's just how those CPUs are.

As for the 3090: The coil whine on those things is truly out of this world. This card just uses two 8-pin connectors. I tried removing my Cablemod extensions and running with the cables directly from the PSU but it didn't make a lick of difference. I ran it on air for a couple of weeks before the new CPU came in and didn't really notice it until now that it has a waterblock on it.
 

The Mass Shooter Ron Soye

🖩🍸🔫 :) :) :) 😷
AMD crowd-sources evidence for the fast-emerging 500-series chipset USB issue

Will Alder & Meteor Lake finally bring Intel back as the performance king? I don't know if TSMC will have enough fab space and yield for Zen 4's 5nm chips.

Alder Lake will compete well against Zen 3, probably with a decent single-threaded lead and with the 8-big/8-small chip beating the 12-core 5900X in multi-threaded workloads.

Zen 4 will demolish Alder Lake with up to 40% single-threaded improvement, and probably an increase to 24 cores at the top.

I don't know about Meteor Lake, but if it comes out first, Zen 5 will probably wreck it within 6 months.

TSMC has already gotten 5nm out the door for Apple. By the time Zen 4 manufacturing ramps up, there should be no problem. Also remember that most of AMD's 7nm capacity at TSMC is used to make Xbox S/X and PS5 SoCs, not CPUs and GPUs. 5nm will not have that problem.
 

Secret Asshole

Expert in things that never, ever happened
Local Moderator
True & Honest Fan
Beware on the 3080/3090s: VRAM heat is fucking through the roof and can reach temperatures of 100-110C. So use HWiNFO to make sure that junction temp stays down. My card was at like 50C but the junction was at 104. There's inadequate cooling on the GDDR6X memory. No idea how it will affect the life of the card, but 100C on anything for long periods makes me leery.
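If you want to log the core temp from a script, nvidia-smi can do it (note it does NOT expose the GDDR6X junction temp, so you still need HWiNFO for that). A rough sketch; the parsing is split out so it works without a GPU present:

```python
# Sketch of polling GPU core temperature via nvidia-smi's CSV output.
# The junction/memory temp is not available through nvidia-smi.
import subprocess

def parse_temps(csv_text):
    """Parse 'temperature.gpu' rows (csv,noheader,nounits) into a list of ints."""
    return [int(line.strip()) for line in csv_text.splitlines() if line.strip()]

def query_gpu_temps():
    """Ask nvidia-smi for the core temp of every GPU, in degrees C."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_temps(out)

if __name__ == "__main__":
    print(query_gpu_temps())
```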

EDIT:
And since I need money and I can earn $10 a day, I've resorted to being a mining fag, both GPU and CPU, while keeping everything as cool as possible. Which is why I noticed. I do want the card to last several years and I'm not really serious about it, but I figure 80 degree VRAM with 60 degree CPU and GPU temps won't do any harm. Luckily my case has magnetic doors I can open, so I'm buying a portable laptop cooler and just aiming it into the case.
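The math on whether mining is worth it is simple: gross revenue minus 24 hours of power at your electricity rate. A sketch with made-up numbers (not my actual figures):

```python
# Back-of-the-envelope mining profitability. All inputs are hypothetical.
def daily_profit(revenue_per_day, watts, usd_per_kwh):
    """Net USD/day after electricity: revenue minus 24h of power draw."""
    power_cost = watts / 1000 * 24 * usd_per_kwh   # kW * hours * $/kWh
    return revenue_per_day - power_cost

# e.g. $12/day gross on a 300 W card at $0.12/kWh:
# power cost = 0.3 * 24 * 0.12 = $0.864, so ~$11.14/day net
print(daily_profit(12, 300, 0.12))
```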

Even during normal gaming, that backplate is fucking HOT.
 

An Avenging Bird

Lean in snorted
I don't mind at all sitting here with my 1080 Ti just waiting for the used prices to hit a point where I can make back the amount I spent for the thing years and years ago.
Demand is only going up.
If the price doesn't get to that level, oh well I still get to keep my card.
 
Will Alder & Meteor Lake finally bring Intel back as the performance king? I don't know if TSMC will have enough fab space and yield for Zen 4's 5nm chips.
It really doesn't matter if Alder Lake is better. Rocket Lake is already shaping up to be a decent arch performance-wise. As long as Intel is stuck on 14nm they will keep hitting a performance wall, and we can see that with the 11900K.

I remember saying on many a tech site that the 10900K had pushed 14nm to its absolute limit. Its power consumption was out of control relative to its performance gains, and it required extensive cooling to maintain full performance. The 11900K backing down to 8 cores yet pulling even MORE power than a 10900K honestly backs this up: 14nm is faltering. If Intel can't get 10nm performing the way it should, they'll be in trouble in the server market as AMD continues to pull ahead.

There are also rumors that Zen 4 will bring even higher clock speeds and a 20-25% IPC bump as well.

If Intel can't get 10nm running and TSMC can't increase production, we'll be in a pickle.
 
When do people think we'll see a new generation of professional cards? From my superficial knowledge it looks like everything right now is about the gaming cards.
 

Smaug's Smokey Hole

Sweeney did nothing wrong.
When do people think we'll see a new generation of professional cards? From my superficial knowledge it looks like everything right now is about the gaming cards.
They have been talking about the A6000 for a while. A 48GB card, and priced as such. You would think they would skim some chips off the top to build up and put out some of their premium-priced Quadros. Looking at Dell, they have some Turing Quadros left, while HP only seems to have the lower-end ones. Lenovo doesn't sell standalone cards, but they sell systems with the 8GB Quadro RTX 4000 (the RTX 5000 is the high-end one before going over into whatever crazy machine-learning cards they built at the time).
 

Ginger Piglet

Burglar of Jess Phillips MP
True & Honest Fan
Bot privilege is being able to buy all the RTX 3060s in 5 nanoseconds.

And this is why AMD really fucked the dog. They had an open goal in the form of Nvidia being terminally short on stock. They had a product that was competitive (RTX still being a meme) and competitively priced. And what do they do? TRIP OVER THEIR OWN FEET by having even less stock at launch than Nvidia did.

The Radeon 6700 XT is coming soon. They need to delay launch by about 4-6 weeks just to build up stock levels.
 

The Mass Shooter Ron Soye

🖩🍸🔫 :) :) :) 😷
The Radeon 6700 XT is coming soon. They need to delay launch by about 4-6 weeks just to build up stock levels.

Maybe the delay already happened? 6800 XT came out in November. Navi 22 is a different die, but they could have been ready to go earlier, and waiting to see what Nvidia does just like before.

There won't be any availability miracles from the 6700 XT launch, but I predict that AMD will make money.
 

Smaug's Smokey Hole

Sweeney did nothing wrong.
This was somewhat interesting: someone tested an RTX 2080 Ti with 18 months of mining under its belt and compared it to a new 2080 Ti. There is a difference in framerate, but that's not what I found interesting (marked in red).
[Attached benchmark screenshots: gpumining.JPG, gpumining2.JPG]

Some are speculating that it's the thermal compound and that it would be fine if reapplied. That could be true, but all in all I don't think there are many people who would go full Steve and dismantle all that shit to clean up and reapply it, or even know that it could dry out. I don't know enough about electrical engineering to say anything about other components or VRMs seeing some wear and tear, so who knows. It also ties into an old suspicion that people have had for a looong time: that newer drivers make cards slower over time. Phil's Computer Lab did a video on that ("Nvidia drivers slowing down - Retrospective Win98 GPU and driver benchmarks").
 
Modern graphics memory can degrade with use. Similar to, but much slower than, flash memory. I have no idea how that actually manifests, but having less memory available, slowing down, and needing more power to refresh all sound plausible.
 