GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Just Some Other Guy

kiwifarms.net
Joined
Mar 25, 2018
I'm a Radeon guy. They've consistently offered better performance per dosh, and it upsets me when people say 'Radeon software is lacking'.

Meanwhile, Nvidia is stuck in XP dreams with their XP-like UI, and they dropped Kepler support hard, to the point that a 7970 handles modern titles better than a 780. Hell, even an R9 290 still outperforms a Titan to this day.

OpenGL is not bad on AMD with Navi, though it could be better.
Oh, no doubt that gets annoying. It's pretty well known that Radeon cards usually age better, but at the same time, day-one performance does matter. Also, some people are going to have niche uses. If someone primarily uses OpenGL, I'd still say go Nvidia.

Me personally? I just want my FreeSync back, and I usually hang onto cards for a few years or so. Here's to hoping Navi keeps up the fine wine analogy.
 

Glitched_Humanity

Far worse than you.
kiwifarms.net
Joined
Oct 25, 2019
I prefer Nvidia over AMD.

My timeline of GPUs
>Voodoo Rage II (used in my Windows ME laptop; mostly for DOS games and Windows Media Player visualizers)
>8600 GT (overpriced POS; the driver crashed a lot)
>9800 GT (kept as a backup; was a great card for the time)
>HD 5970 (overheated a lot; sold it for $200.00 in 2010)
>GTX 550 (a piece of shit; the DVI port snapped off)
>HD 6870 X2 (used in my dying XP desktop; a power hog, always at 80°C, but it chugs along)
>GTX 780 (best video card I've owned; died after being in the oven 3 times; lasted 6+ years; got it at a Goodwill NIB for $40.00)

Current:
>GTX 980 Ti (will probably last me until 2020). Paid $150.00 for it in 2017. It's a great card and runs most AAA games at 60 FPS at 4K.

Looking for a new GTX 780 Ti for my Windows XP computer.

My GTX 980 Ti seems to be bottlenecked by my Intel 3570K, though.
 

Gustav Schuchardt

Forum Staff
True & Honest Fan
kiwifarms.net
Joined
Aug 26, 2018
The GPU market is fascinating to me because all the big players make major mistakes:

1) NVidia was the performance leader, but they pissed away their advantage by devoting half the die to ray tracing and DLSS hardware with, as you point out, 'the support of five games'.

2) AMD launched a GPU with HBM2 memory, which made it hard to undercut NVidia's GDDR-based cards and stay profitable.

3) Intel have terrible integrated graphics and have tried and failed multiple times to produce a discrete GPU.

On the other hand, there's some good news. You can buy a 16xx card from NVidia if you don't care about ray tracing in this generation. AMD has some pretty decent laptop chips with decent integrated graphics now. Intel poached that AMD guy and might actually do a fairly normal discrete GPU rather than some weirdo HPC part disguised as a GPU like Larrabee/Knights Landing. Larrabee/Knights Landing is actually a very interesting bit of hardware even if it's not a GPU. 'Lots of Atom cores rather than a few Core ones' is a bit radical, but it might pay off in HPC or servers. And maybe in the long run games will use thread pools and be scalable to this sort of hardware.
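To make the thread-pool bit concrete, here's a minimal C++ sketch, purely illustrative (the task count and the update_entity stub are my inventions): chop a frame's work into small tasks and let however many cores the chip has, four big ones or dozens of Atom-class ones, drain the same queue.

// Minimal task-queue sketch: the same code scales from a quad-core to a
// many-core Atom-style part, since workers just grab the next task index.
#include <algorithm>
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const int kTasks = 1024;    // e.g. entities to update this frame (made up)
    std::atomic<int> next{0};   // shared index into the task list
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    auto drain = [&] {
        while (true) {
            int i = next.fetch_add(1);  // claim the next task
            if (i >= kTasks) break;
            // update_entity(i);        // hypothetical per-task game work
        }
    };

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) pool.emplace_back(drain);
    for (auto& t : pool) t.join();
    std::printf("%d tasks drained by %u workers\n", kTasks, workers);
}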

What makes this stuff interesting, even if you're too busy to actually play games, is that you've got a bunch of massive companies hiring very smart people and all competing hard. And still, they quite often miss what looks like an open goal. It's not like CPUs, where both AMD and Intel have their strengths and both produce 'good enough' chips for a couple of hundred bucks.

>GTX 980 Ti (will probably last me until 2020). Paid $150.00 for it in 2017. It's a great card and runs most AAA games at 60 FPS at 4K.

I've got a GTX 980 and it's actually pretty fine for the minimal gaming I do. I don't do enough to justify replacing it.
 

CreamyHerman’s

(Worrying intensified)
kiwifarms.net
Joined
Dec 15, 2017
I mean, it will still do it, but apparently there have been long-standing optimization issues with GCN. I did some light searching and it appears Navi is still somewhat hampered. It could (and should) be better.

Here's the thing: OpenGL on Linux is faster than OpenGL on Windows. And on Windows, when it comes to following the OpenGL spec, AMD follows it to an ANAL degree, whereas Nvidia breaks the spec. Which doesn't surprise me, coming from the company known for threatening game makers not to use Vulkan or DX12 back when Maxwell was a thing.
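To illustrate the gap, here's a hypothetical repro in C++ (my sketch, not anything from this thread; it assumes a GL 2.0+ context and a loader like GLAD are already set up): GLSL 1.10 has no implicit int-to-float conversion (that arrived in 1.20), so a strict compiler has to reject the shader below, while a lenient one waves it through.

#include <cstdio>
#include <glad/glad.h>  // or GLEW; assumes a GL 2.0+ context is already current

const char* kFrag =
    "#version 110\n"
    "void main() {\n"
    "    float s = 1 / 2;         // int math assigned to a float:\n"
    "    gl_FragColor = vec4(s);  // invalid under a strict 1.10 compiler\n"
    "}\n";

void probeCompilerStrictness() {
    GLuint sh = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(sh, 1, &kFrag, nullptr);
    glCompileShader(sh);
    GLint ok = 0;
    glGetShaderiv(sh, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(sh, sizeof log, nullptr, log);
        std::printf("strict driver rejects it:\n%s", log);   // AMD-style behavior
    } else {
        std::printf("lenient driver compiled it anyway\n");  // Nvidia-style behavior
    }
    glDeleteShader(sh);
}

Run it against both vendors' Windows drivers and you can watch the difference in spec compliance for yourself.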
 

Just Some Other Guy

kiwifarms.net
Joined
Mar 25, 2018
Alrighty, honestly I personally don't care about OpenGL performance. The guy asked if it still performs worse on Windows with Radeon. It does. At the end of the day, most people don't care why something is faster or slower, just what the numbers say when all is said and done.

TBH, the 5700 probably has enough horsepower to power through any lack of optimization. But who knows; YMMV depending on use. Also, they're not shoving them into laptops like the one that person uses.
 

Smaug's Smokey Hole

Excuse me, I currently have some brain damage.
kiwifarms.net
Joined
Sep 7, 2016
Larrabee/Knights Landing is actually a very interesting bit of hardware even if it's not a GPU. 'Lots of Atom cores rather than a few Core ones' is a bit radical, but it might pay off in HPC or servers. And maybe in the long run games will use thread pools and be scalable to this sort of hardware.

What makes this stuff interesting, even if you're too busy to actually play games, is that you've got a bunch of massive companies hiring very smart people and all competing hard. And still, they quite often miss what looks like an open goal. It's not like CPUs, where both AMD and Intel have their strengths and both produce 'good enough' chips for a couple of hundred bucks.

I would have liked to see what could have come out of Larrabee; a bunch of x86 cores with some ROPs/TMUs at either end sounds like what Kutaragi imagined Cell would be like.

Talisman is another old GPU I'm still curious about. It seemed to be some kind of tile-based (deferred) renderer that Microsoft was developing at one point, cancelled in 1999 I think. That was back when they had big plans for how they would push 3D on Windows. WinG was memoryholed, the wet dogshit that was Direct3D was drying up nicely, and they had DirectEngine in the works at MonoLith, so why not a GPU? Their advantage would be the API and drivers: native D3D from day one, instead of Glide, PVGL, S3 MeTaL, and all the other ones that existed during those 2-3 years. They were aiming to provide the platform (Windows), API (DirectX), software (VS/DX SDK/DirectEngine), and hardware (Talisman).

With all that in mind, the Xbox wasn't much of a surprise; it was in line with their current and previous efforts.

Later they were kicking around "DirectPhysics" to get a default/optional physics integration into DX; that mostly solved itself.

The 10-series chips with Iris Plus graphics are better than the previous ones, but they won't hit 60 FPS at 1080p on low settings in Rise of the Tomb Raider or Far Cry 5, which would irritate the hell out of me if I wanted to play those games.

https://www.pcmag.com/article/369884/intel-ice-lake-benchmarked-how-10nm-cpus-could-bring-majo

Iris is decent enough for what it is, but it's part of Intel's premium-priced chips, meaning that a user willing to pay that price, even in the desktop space, will end up with something they won't use because it's not competitive with a $100 card.
As far as I know, it largely exists because Intel had tons of fabrication capacity at a previous node and knew they could crank out tons of eDRAM and use it as cache on a CPU/iGPU made at a different process node.
 

Fluke

kiwifarms.net
Joined
Sep 23, 2019
My logic is: buy Intel and Nvidia for personal use, and buy stock in AMD.
Nvidia has always worked, looked, and performed better for me. Same with Intel.
It's gonna take a few years for me to buy any AMD hardware, because it always seemed like they had lower QA & QC standards.
 

Smaug's Smokey Hole

Excuse me, I currently have some brain damage.
kiwifarms.net
Joined
Sep 7, 2016
What if Nvidia brought back the 3dfx brand along with Voodoo?

NES/SNES minis and replica arcade cabinets are popular. They could make a "Win95/98" mini PC and really milk the nostalgia by sticking a bunch of logos from defunct companies on it: an Orchid 3dfx graphics card, a Diamond Multimedia A3D sound card, a Cyrix CPU, an ABit motherboard, a network card with a fake coaxial connector to keep it real. The computer is, of course, beige.

It is astonishing how fast things happened during that time. 3dfx released the first Voodoo in October 1996; GLQuake came out a couple of months later, in January 1997, and became the killer app. They were kings of the world. Almost exactly four years later, in October of 2000, they released their last card, the Voodoo4 4500. It was a Direct3D 6.0 card, released just four months before the first D3D 8.0-compatible card, the GeForce 3, which introduced pixel and vertex shaders. And what did shaders give us at first? Well, there was the water in Morrowind, a game that was released a year later.
RTX is in a similar situation now to shaders back then. I think the Xbox really helped shaders along, and I think the next-gen consoles will help ray tracing on the PC in the same way.

For your mild amusement, the first pictures of the 3dfx/Gigapixel card.
[Attached images: 3dfxgigapixel1.jpg, 3dfxgigapixel2.jpg]
 

SmileyTimeDayCare

This is pleasure!
kiwifarms.net
Joined
Dec 3, 2018
I've looked pretty hard at moving to AMD but just can't move away from Intel/nVidia.

My laptop is an i7/980M that has been hanging in there pretty well. My desktop is an older i5/760 that by some miracle is still running. It runs pretty much 24/7 and has since, I want to say, 2012. The card is definitely showing signs of falling apart, but you'll have that.

I don't have anything against AMD, I had a K6-2 back in the day and it was great, but they aren't doing anything that is going to get me to move.

I'm hoping my desktop limps along to Tiger Lake's release, at which point I will drop a ridiculous sum on a new build.
 

Just Some Other Guy

kiwifarms.net
Joined
Mar 25, 2018
I've looked pretty hard at moving to AMD but just can't move away from Intel/nVidia.

My laptop is an i7/980M that has been hanging in there pretty well. My desktop is an older i5/760 that by some miracle is still running. It runs pretty much 24/7 and has since, I want to say, 2012. The card is definitely showing signs of falling apart, but you'll have that.

I don't have anything against AMD, I had a K6-2 back in the day and it was great, but they aren't doing anything that is going to get me to move.

I'm hoping my desktop limps along to Tiger Lake's release, at which point I will drop a ridiculous sum on a new build.
Zen 2 is a brilliant CPU platform. What happens if Tiger Lake continues the 10nm trend, a.k.a. failing?

I'm just curious: what do you look for in a CPU?
 

SmileyTimeDayCare

This is pleasure!
kiwifarms.net
Joined
Dec 3, 2018
Zen 2 is a brilliant CPU platform. What happens if Tiger Lake continues the 10nm trend, a.k.a. failing?

I'm just curious: what do you look for in a CPU?

Performance, mixed with factors like not pulling 200W+ at load.

Oh, and I realize how the wording looks, but I probably wouldn't go for a TL. That would just be when I started deciding what I was going to do in terms of building a new system.
 

Just Some Other Guy

kiwifarms.net
Joined
Mar 25, 2018
Performance, mixed with factors like not pulling 200W+ at load.

Oh, and I realize how the wording looks, but I probably wouldn't go for a TL. That would just be when I started deciding what I was going to do in terms of building a new system.
Ahhhh, gotcha. It just made me curious, was all. Though 200W of power draw is hard to hit on most new CPUs. For instance, my 3600X with PBO on only pulls 70W or so.
 

Smaug's Smokey Hole

Excuse me, I currently have some brain damage.
kiwifarms.net
Joined
Sep 7, 2016
Ahhhh, gotcha. It just made me curious, was all. Though 200W of power draw is hard to hit on most new CPUs. For instance, my 3600X with PBO on only pulls 70W or so.

The 3600X is a 65W or 70W part, I think. At stock, most high-end Intel and AMD CPUs are rated at a 95W TDP; some sit a bit above that, but those are usually the 'gotta go fast!' benchmarking-territory parts and priced as such. It's when you overclock that the power curve reaches for the sky: a 30% boost in base frequency might lead to a 150% increase in power consumption.
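A back-of-the-envelope for that last claim, assuming the classic dynamic-power model P ~ C * V^2 * f and the rough rule of thumb that the voltage you need rises about as fast as the frequency you're chasing (both assumptions, obviously):

// Rough dynamic-power estimate: P scales with V^2 * f, and we assume the
// required voltage bump tracks the frequency bump. Numbers are illustrative.
#include <cstdio>

int main() {
    double f = 1.30;        // +30% frequency
    double v = 1.30;        // assumed matching voltage increase
    double p = v * v * f;   // power ratio vs stock under P ~ C * V^2 * f
    std::printf("power hits %.0f%% of stock, a +%.0f%% increase\n",
                p * 100.0, (p - 1.0) * 100.0);
    // prints ~220% / +120%; leakage and VRM losses could close the rest
    // of the gap to the +150% figure above
}

That lands at roughly +120% power for a +30% clock bump, which is the same ballpark as the 150% figure once leakage and VRM losses pile on.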