GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

cummytummies

kiwifarms.net
Joined
Jul 16, 2021
The combination of the two: pulling air in, pushing air out. The purpose is to get the hot air out as fast as possible.

I don't like Linus Tech Tips, but sometimes he does things right. He did a one-year airflow experiment on three computers.

That was a nice video, but I also have a Fractal R5, and those filters do some serious work. They're not just a gimmick "feature"; they keep out maybe ~95% of dust (just a ballpark estimate). I vacuum the filters every few weeks while vacuuming the floors, and whenever I open my PC it's nearly dust-free just from keeping the filters clean.
 

Smaug's Smokey Hole

Closed for summer
kiwifarms.net
Joined
Sep 7, 2016
GT 730? You can also buy adapters as well
Unfortunately the low-end consumer cards have a dumb configuration of display outputs; the 730 and 1030 usually have 1x VGA, 1x HDMI, 1x DVI. You might be able to buy a splitter from Matrox to drive more than one monitor through that HDMI port, but that feels expensive.

I recommended the K1200 to a poster in another thread who was in a similar situation (no games, just screens) and he was very pleased with it. It's got 4x MiniDP and is functionally a 1030 iirc; I also think it can drive four 5K monitors, but I don't remember the specifics. (MiniDP can be converted to HDMI, of course.)
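
If anyone wants to sanity-check a multi-screen setup from a script, here's a minimal Python sketch. It assumes the third-party screeninfo package (not in the standard library) and just lists what the OS currently sees:

# pip install screeninfo  (third-party package, assumed available)
from screeninfo import get_monitors

# Print every detected display with its resolution and desktop position.
for m in get_monitors():
    print(f"{m.name}: {m.width}x{m.height} at ({m.x}, {m.y})")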
 

Grand Fucktard

kiwifarms.net
Joined
Mar 22, 2020
Outputs are determined by the specific brand and model of the card. Even the same GPU (RTX 3070 or whatever) can have wildly varying jacks depending on the manufacturer.

You're just going to have to buckle down and read the item description for each card, unless you're looking for someone to do your job for you, which I'm not going to do. The first search result seems to have some 3x HDMI cards, but I don't know how conclusive that list is, so maybe just try searching for "gpu with 3 hdmi" like I did. Newegg also seems to have a "3 x HDMI 2.0b" filter option if you scroll down the left side of the page.

As for performance, any mid-range card should handle three monitors just fine if you're not using it for anything intensive. You can also drive an HDMI monitor from a DP port with an adapter if it's just regular monitor usage.
 

fuckidunno

kiwifarms.net
Joined
May 24, 2019
eh, I just mentioned the HDMI because that's my use case.
mainly just looking for some advice (and that's what I got -- thanks guys) because it's the modern problem... the problem isn't working to find info, it's data overload with a lot of arbitrary product and technology designations.
there's also the use case that's not exactly left-of-field, but the focus of the industry seems to be on gaming.

anyway -- got some good leads - thanks guys

I can dig it; often we don't know what we don't know, so asking about hidden gotchas or things you might not have thought about is important.
And there appears to be a good amount of expert knowledge here to help, which could save some tears of frustration.
It's a nightmare to hear "I wrote my own patent application" or "I got a contract template off the internet".
My doc shakes her head when she talks about people consulting "Doctor Google"
 

Fursei

kiwifarms.net
Joined
Jul 14, 2019
Anyway, my situation is a little different - I need a graphics card to support 3 HDMI monitors.
I'm using it for productivity (a couple of the monitors are input devices, which is why they're separate), but no high-load realtime 3D or anything, so I don't really need huge horsepower (compared to gamers);
it's a little more about connectivity and good sub-4K resolution.

any thoughts?

There's an ASUS GT 710 with four HDMI outputs: https://www.newegg.com/asus-geforce-gt-710-gt710-4h-sl-2gd5/p/N82E16814126433

There are some other places that apparently sell it.
 

The Mass Shooter Ron Soye

Do it for Null
kiwifarms.net
Joined
Aug 28, 2019

Vyse Inglebard

Someday, I'll go beyond that sunset.....
kiwifarms.net
Joined
Nov 29, 2020
Alchemist-HQ-full-length.jpg

It looks like an incredibly faggy Star Wars character. And trust me, considering that franchise, that's saying a LOT. You don't need characters to sell GPUs. Just show Fortnite or COD: Warzone or whatever TF the game of the month is running silky smooth and you're golden.
 

TheRedChair

Ultimate Chaos, Ultimate Confort.
kiwifarms.net
Joined
Jan 20, 2019
My brothers, I went full-on autistic. I could not help myself. I found the holy grail of current DIY desktop building. We will see if it does indeed work.

I purchased an AMD 5900 OEM. This is the non-X 5900. This CPU is currently only sold to OEM companies such as HP, Dell, etc., so you can't get one easily. All I've seen are system pulls at dubious prices.

An AMD 5900X is a 12 core/24 thread CPU with a 105 W TDP; base/boost = 3.7 GHz/4.8 GHz.

A 5900 OEM is a 12 core/24 thread CPU with a 65 W TDP; base/boost = 3.0 GHz/4.7 GHz.
This chip came from an Alienware pull, so it's only a few months old.

This is a perfect CPU for my rig. It will use the same wattage, 65 watts (or very close), as my AMD 3600, which is a good CPU, while doubling its core/thread count.

The all-around performance difference is around 5% less than an AMD 5900X.

Why am I doing this? The CPU will keep my rig relevant for 3+ years, and I need a video editing/graphic arts/gaming hybrid. Since my rig has already been tuned for peak airflow management, this is going to be mostly plug and play. Like the AMD 3600, the low wattage of the AMD 5900 OEM will keep my motherboard cool overall, which means the rest of my components will stay cool as well: less heat created in use.

Another good chip, perhaps the best for general desktop/gaming if you can get it, is the AMD 5800 OEM.

I was actually going to go with the AMD 3700X because it's an 8 core/16 thread CPU at 65 watts.

And yes, the wattage matters a great deal in a desktop when it comes to wear and tear plus cooling.
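
A quick back-of-the-envelope comparison in Python, treating cores x base clock as a crude worst-case throughput proxy (a big assumption; real chips boost well above base, which is why the actual gap is closer to that ~5%):

# Published specs: 12 cores each, TDP in watts, base clock in GHz.
specs = {
    "5900X":    {"tdp": 105, "base": 3.7},
    "5900 OEM": {"tdp": 65,  "base": 3.0},
}
CORES = 12

for name, s in specs.items():
    throughput = CORES * s["base"]    # crude sustained core-GHz proxy
    per_watt = throughput / s["tdp"]  # efficiency under rated TDP
    print(f"{name}: {throughput:.1f} core-GHz, {per_watt:.3f} core-GHz/W")

On that crude proxy the OEM chip gives up some sustained clock but comes out roughly 30% ahead on performance per watt, which is the whole point of a 65 W part.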
 

Barbigny

kiwifarms.net
Joined
Feb 26, 2019
I'm installing my AIO for my new build (Corsair H100x), and I saw this in the manual:
20210907_175958_copy_3024x1701_1.jpg

Every other source that I've read says to install the pump tach to CPU_OPT and the radiator fans to CPU_FAN. My motherboard's manual seems to agree with my initial assumption. What should I do?
 

Vyse Inglebard

Someday, I'll go beyond that sunset.....
kiwifarms.net
Joined
Nov 29, 2020
I'm installing my AIO for my new build (Corsair H100x), and I saw this in the manual:
View attachment 2519555
Every other source that I've read says to install the pump tach to CPU_OPT and the radiator fans to CPU_FAN. My motherboard's manual seems to agree with my initial assumption. What should I do?
Edited with more detail:
If you want to connect fans from your CPU cooler or radiator, use the CPU_FAN header (with a splitter/hub if you have a lot of fans). If you only have a two-fan cooler, for example, you can just use CPU_FAN for one fan and CPU_OPT for the other. For a radiator with a larger number of fans, or just a lot of fans in general, look into a fan splitter or fan hub: you can use one to plug all three fans (or six, if you have fans on both sides) of a 360mm radiator into the CPU_FAN header. Or just get a Y splitter (two female connectors, one male), plug one of the fans into CPU_FAN, and run the other two through the splitter into CPU_OPT. You can do the same thing with your case fans as well: case fan into hub/splitter, hub/splitter into case fan header.
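
One extra thing worth checking before ganging fans on a single header: the header's current budget. A minimal sketch of the arithmetic, assuming a typical 1 A header rating and made-up per-fan draws (check your motherboard manual and the fans' labels for the real numbers):

HEADER_RATING_A = 1.0  # typical motherboard fan header rating (assumption)

# Nameplate current draw per fan in amps, hypothetical three-fan radiator.
fans = {"rad fan 1": 0.25, "rad fan 2": 0.25, "rad fan 3": 0.25}

total = sum(fans.values())
status = "OK" if total <= HEADER_RATING_A else "over budget, use a powered hub"
print(f"Total draw: {total:.2f} A of {HEADER_RATING_A:.1f} A ({status})")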
 

Barbigny

kiwifarms.net
Joined
Feb 26, 2019
If you want to connect fans from your CPU cooler or radiator, use the CPU_FAN header (with a splitter/hub if you have a lot of fans). If you only have a two-fan cooler, for example, you can just use CPU_FAN for one fan and CPU_OPT for the other. For a radiator with a larger number of fans, or just a lot of fans in general, look into a fan splitter or fan hub: you can use one to plug all three fans (or six, if you have fans on both sides) of a 360mm radiator into the CPU_FAN header. Or just get a Y splitter (two female connectors, one male), plug one of the fans into CPU_FAN, and run the other two through the splitter into CPU_OPT. You can do the same thing with your case fans as well: case fan into hub/splitter, hub/splitter into case fan header.
The radiator fans actually came with a Y splitter. I have it plugged into CPU_FAN right now. It's the three-pin pump tach that I'm worried about. Every source that I've read besides the AIO manual says to plug it into CPU_OPT. It just feels wrong to directly contradict the manual it came with.
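
For what it's worth, the tach lead is just an RPM signal so the board can see the pump spinning and won't scream about a dead CPU fan. On Linux you can check what each header is actually reporting with psutil (a sketch; sensors_fans() is Linux-only, and the chip/label names vary by board):

import psutil

# Dump every fan/pump tach reading the kernel exposes, grouped by sensor chip.
# A healthy AIO pump should show a steady nonzero RPM on whichever header
# (CPU_FAN, CPU_OPT, ...) its tach lead is plugged into.
for chip, readings in psutil.sensors_fans().items():
    for fan in readings:
        print(f"{chip}/{fan.label or 'fan'}: {fan.current} RPM")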
 

Smaug's Smokey Hole

Closed for summer
kiwifarms.net
Joined
Sep 7, 2016

Maybe I was right and the 12900K will be $600.
If leaked/early benchmarks are anything to go by, that price would make it very compelling compared to the 5950X. This is why competition is so good: if AMD had nothing to offer and Intel were the only game in town, the CPU space might at this point be nothing but 6-core i7s and 4/4 i5s for us regular plebs.

The Intel graphics card is said to fall short of the 3070, but if it performs better than a 3060 and is priced competitively, then that plus Alder Lake would put Intel in a very good position.
 

The Mass Shooter Ron Soye

Do it for Null
kiwifarms.net
Joined
Aug 28, 2019
If leaked/early benchmarks are anything to go by, that price would make it very compelling compared to the 5950X. This is why competition is so good: if AMD had nothing to offer and Intel were the only game in town, the CPU space might at this point be nothing but 6-core i7s and 4/4 i5s for us regular plebs.

The Intel graphics card is said to fall short of the 3070, but if it performs better than a 3060 and is priced competitively, then that plus Alder Lake would put Intel in a very good position.
I heard that, but it could be based on early results. I'm sure they want it at 3070 level. The top DG2 should have up to 16 GB of VRAM, so it could beat the 3070 and 3070 Ti in some games on that alone.

Alder Lake can use both DDR5 and DDR4, so that is going to affect system prices and benchmarks. How much of the ass kicking is from DDR5? It's AMD's fault for not using it first, but it also means you could add a big gain from DDR5 to Zen 4 performance. AMD's first response will be Zen 3 with triple the L3 cache, for a claimed 15% average gain in games.

Edit: https://videocardz.com/newz/leaked-...te-with-geforce-rtx-3070-and-radeon-rx-6700xt
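
The DDR5 question is mostly arithmetic on paper: peak bandwidth is transfer rate times bus width. A quick Python sketch (theoretical peaks for assumed kit speeds; real gains in games are far smaller, and early DDR5 latencies are worse):

def peak_bw_gb_s(mt_s: int, channels: int = 2, bytes_per_channel: int = 8) -> float:
    # Peak GB/s = megatransfers/s * 8 bytes per 64-bit channel * channels / 1000.
    return mt_s * bytes_per_channel * channels / 1000

# Assumed representative speeds; actual kits vary widely.
for name, speed in [("DDR4-3200", 3200), ("DDR5-4800", 4800), ("DDR5-6000", 6000)]:
    print(f"{name}: {peak_bw_gb_s(speed):.1f} GB/s peak (dual channel)")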
 

Smaug's Smokey Hole

Closed for summer
kiwifarms.net
Joined
Sep 7, 2016
@The Mass Shooter Ron Soye 16GB could become a marketing flex on their side if the card isn't fast enough to do much with that amount of memory (other than allocating it). Nvidia put 2GB of VRAM on the 740M almost ten years ago, and that was largely pointless for the same reason: faster cards with way less memory ran circles around it, even at higher resolutions. 2GB looked good though, like putting a spoiler on a cheap car and calling it the sport version.

Early results from Intel might also be severely hampered by their current drivers, so who knows how much things will improve in the ~6 months until the rumored release.
Their driver support is something I'm looking forward to seeing; they would have to hire and assign a lot of people for that, since they can't just dump it into the lap of the existing IGP driver department.

It's a shame that they can't manufacture it themselves. If they didn't have to pay TSMC, I imagine they could really pressure AMD and Nvidia on price alone just to get a foothold.
 

Smaug's Smokey Hole

Closed for summer
kiwifarms.net
Joined
Sep 7, 2016
@Smaug's Smokey Hole Some games do run out of VRAM at 8 GB and FPS dips, at 1440p and 4K. The RTX 3060 is probably where the extra VRAM is more pointless. But in related news: RTX 2060 12 GB?
Huh. That could have something to do with chip availability; production of 1GB modules might be ramping down, or maybe it's become less of a priority to pump those out right now. They replaced the 4x512MB 1050 with the 3x1024MB 1050 3GB (or whatever it was called) for that reason: 512MB modules were becoming scarcer.

I don't know, it's weird, but I think it was over a year ago that I predicted things would get dumb when it comes to memory configurations. The 3060 has 50% more VRAM than the 3070 Ti, and even 20% more memory than the 3080, but the latter cards will smoke it in something like RDR2 maxed out at 4K.
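
That coupling is baked into the arithmetic: each GDDR chip is 32 bits wide, so capacity and bus width scale together and you get silly outcomes like the 3060's. A Python sketch using the commonly reported chip configurations:

# (chip count, MB per chip); each GDDR5/GDDR6 chip adds 32 bits of bus width.
cards = {
    "GTX 1050 2GB": (4, 512),
    "GTX 1050 3GB": (3, 1024),
    "RTX 3060":     (6, 2048),
    "RTX 3070 Ti":  (8, 1024),
    "RTX 3080":     (10, 1024),
}

for name, (chips, mb) in cards.items():
    print(f"{name}: {chips * mb / 1024:.0f} GB on a {chips * 32}-bit bus")

The 3060 ends up with the most memory but the narrowest bus of the three Ampere cards, which is why the 3080 smokes it despite "less" VRAM.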