GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.


Smaug's Smokey Hole

Excuse me, I currently have some brain damage.
kiwifarms.net
Joined
Sep 7, 2016
What's going on right now:
AMD finally have something price-competitive in one segment, a lucrative segment at that.
Nvidia sells $1000+ cards with raytracing supported by five games.
Intel takes another stab at a discrete graphics card (don't forget the i740) and stole the Navi designer from AMD in the process, making the AMD nerds very salty; they lost their Luke Skywalker. Will it be a hugely expensive card with disappointing performance? Will they say fuck it and repeat what they did with Larrabee and turn it into a Knights Ferry?

APUs might get to the point where they're perfectly fine, at least on AMD's side, proving Tim Sweeney right even if it took 20 years. Prediction: in 20 years people will agree with Tim Sweeney that the Epic Games Store was the right move.
The closest thing Nvidia have is Tegra, because they don't make desktop/laptop/x86 CPUs.
Intel will keep putting their terrible IGPs into their CPUs so they don't cannibalize their potential discrete GPU market; it seems like something they would do.

None of that will matter after 2025 when PowerVR returns with Matrox, now in a powerful Lich-form, as their board partner.


I don't know, what do you people think about the former, current and future state of GPUs?
 

Vecr

"nanoposts with 90° spatial rotational symmetries"
kiwifarms.net
Joined
Jun 24, 2019
AMD has somewhat better drivers for Linux, so I guess that's something.
 

dreamworks face

Model bugman
True & Honest Fan
kiwifarms.net
Joined
Sep 24, 2018
AMD hasn't been competitive with Nvidia in price/performance, raw performance, or driver quality since about 2015, unless you go with junk sub-$100 GPUs. The only time in recent memory I went with AMD was during the bitcoin mining craze, when I needed a GPU and even shit cards were ridiculously expensive, and I regretted it. I really wish AMD were more competitive - I remember having an Athlon 64 X2 and a Radeon 9800 XT back in my college days, and it kicked total ass, but as a boring adult I always go Intel and Nvidia.
 

Smaug's Smokey Hole

Excuse me, I currently have some brain damage.
kiwifarms.net
Joined
Sep 7, 2016
AMD hasn't been competitive with Nvidia in price/performance, raw performance, or driver quality since about 2015, unless you go with junk sub-$100 GPUs. The only time in recent memory I went with AMD was during the bitcoin mining craze, when I needed a GPU and even shit cards were ridiculously expensive, and I regretted it. I really wish AMD were more competitive - I remember having an Athlon 64 X2 and a Radeon 9800 XT back in my college days, and it kicked total ass, but as a boring adult I always go Intel and Nvidia.

Nah, there's maybe a $30 difference between the cheapest 6GB RTX 2060 and the cheapest 8GB RX 5700, and the RX 5700 is by far the better purchase. On the ~$100 front they have actually done poorly; they didn't have anything performance-wise to match the 1030 or 1050 at the same price point. Their $100 card was Nvidia's $70 card, their $150 the equivalent of a $100 Nvidia. At retail pricing, that is - the manic miner craze led to some strange pricing.
 

Dizzydent

kiwifarms.net
Joined
Oct 22, 2016
Got an RX 580 used a month ago for 150 Canadian cash. While it's an older, now-midrange card, it still runs Borderlands 3 with almost all settings on ultra, and maybe a couple turned down to high, at a solid 55fps. You don't have to spend a lot for good gaming if you look around in used markets. It actually handles every latest triple-A title on high to ultra settings just fine, so don't let people convince you that you need to spend a grand to get good gaming.

Oh, and it can run Crysis...
 

saddingodog

kiwifarms.net
Joined
Nov 13, 2019
Vega 56 is the best value card right now. Especially if you look at the used markets, it's amazing for the money, and the blower is pretty quiet. Linux support is a plus too.
 

byuu

Non-binary they/them
kiwifarms.net
Joined
Aug 17, 2018
And it only took well over 20 years. The old ATi drivers were horrid.
AMD at least cooperates with the FOSS community and shares specs with them.
Nvidia doesn't and I've been boycotting them for years because of it.

I'm not even a freetard, I was just very pissed off when Nvidia decided to drop support for my old GPU and I had to either use really shitty FOSS drivers, run an outdated, insecure system, or throw the card away.
 

He Who Points And Laughs

Flavortown Refugee
kiwifarms.net
Joined
Sep 18, 2017
AMD at least cooperates with the FOSS community and shares specs with them.
Nvidia doesn't and I've been boycotting them for years because of it.

I'm not even a freetard, I was just very pissed off when Nvidia decided to drop support for my old GPU and I had to either use really shitty FOSS drivers, run an outdated, insecure system, or throw the card away.

I used Nvidia only, for years, because their Linux drivers were vastly superior. It's only been in the past few years that the AMD drivers have caught up.

The Nvidia blob is hated by purists, including the kernel devs.
 

thx1138

Are you now, or have you ever been?
kiwifarms.net
Joined
Jun 7, 2018
The driver situation with AMD/ATI cards kept me away from them for years. Nvidia just "did it right". I could download one driver package and get everything from the NV1 chipset up through whatever card I was using (the last time, I think, was when the 6800 series was all the rage). No questions, no chasing down "legacy" drivers, and what have you. It was very nice.

Also, I could afford to be trailing-yet-still-high-performing. One could spend around $150-$170 and still get decent performance out of a card. The mining craze shot prices to shit and they still haven't recovered. I love my GTX 780 and it holds its own, but 3GB of VRAM isn't cutting it, and the bottom hasn't dropped out of 1000-series cards price-wise. I want a decent 4GB Nvidia card, but goddamn those prices.
 

Coffee Anon

kiwifarms.net
Joined
Nov 8, 2019
I think I finally got Wattman to stop crashing with my Vega 64 by setting it to 'Turbo' mode from 'Balanced', and also making sure the power cables from the PSU to the card were clean and snug (I also switched which sockets they used on the PSU).
No crashes so far today. If it stays like this for the next two weeks, I'll consider the problem solved and install the third-party heatsink I bought for it.
 

a_lurker

kiwifarms.net
Joined
Jun 16, 2019
AMD has somewhat better drivers for Linux, so I guess that's something.
Who'd have thought that if you release specs to the open source community, they'd make your proprietary drivers look like a half-assed joke in about two years flat?

I had a 5850 a few years ago, about the time their drivers started getting good. I honestly hope AMD was embarrassed. The proprietary fglrx driver only supported OpenGL 2 and couldn't run BioShock Infinite; fast forward a couple of months and the open source driver actually ran BioShock Infinite, with a big performance lead over the stuff that did run on the proprietary driver.

Even the Windows drivers lately seem to be a lot better than they were in the past.


AMD hasn't been competitive with Nvidia in price/performance, raw performance, or driver quality since about 2015, unless you go with junk sub-$100 GPUs. The only time in recent memory I went with AMD was during the bitcoin mining craze, when I needed a GPU and even shit cards were ridiculously expensive, and I regretted it. I really wish AMD were more competitive - I remember having an Athlon 64 X2 and a Radeon 9800 XT back in my college days, and it kicked total ass, but as a boring adult I always go Intel and Nvidia.

Eh, I think the Nvidia/AMD issue is a little more complex than that.

In the past, ATI/AMD's solution seemed to be to throw hardware at the problem. Of course Nvidia does that too, but they understand software is also important, as a GPU (or any computer hardware) is fucking useless without software.

The old 5850 I used to have, despite being somewhat older, was still fairly valuable because it was really fucking good at crypto mining. It had assloads of raw compute power for the time compared to Nvidia.

Nvidia spends a lot of money on developers, pushing their "features" and giving them cash to ensure they support HairWorks/PhysX/RTX/etc. They were smart back around 2012-2015, partnering up with Epic and getting a lot of their tech baked into UE3, as a lot of the big titles around that time (BioShock, Borderlands, etc.) used Unreal Engine.


I think AMD is getting competitive (definitely in CPUs, possibly in GPUs), but more importantly right now, Nvidia is fucking up.

They currently sell three goddamned variants of the 1660: the regular 1660, the 1660 Super, and the 1660 Ti. They're binning the chips and breaking them out into slices of the same price bracket between their own models, and are going to wind up competing with themselves.
They missed out on being the GPU in the current-gen PlayStation and Xbox (both AMD); they're in the Switch, though.
They're getting comfortable like Intel on the whole "fastest" thing, and bet a little too hard on the RTX thing.

Yeah, realtime raytracing has been a dream for years, but it's not fully baked. Another generation or two and it may be amazing.

(Disclaimer: I'm not 100% positive on all of this. I know a bit about graphics stuff but am by no means an authority, so this could be entirely wrong, as it's based on my very basic understanding of graphics tech.)
In modern games it's almost like FXAA: a fast approximation of raytracing for the most part (a small number of rays, plus a noise filter to fill the gaps and average things out into something that looks fairly good without tanking performance), with traditional rendering tech doing the vast majority of the work and then a raytracing pass on top.
The reflections (particularly the off-screen/non-screen-space reflections) hit hard, because you essentially have to factor in, and at least partially draw, stuff that would normally be culled for not being visible.
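To make the "small number of rays, plus averaging" part concrete, here's a minimal C sketch; trace_ray() is a hypothetical stand-in that just fakes the noisy one-sample result a real raytrace pass would produce, so don't read it as any engine's actual API:

```c
/* "Few rays per pixel, then average": each sample is noisy on its
   own; averaging a handful tames it. Real denoisers are far smarter
   (spatial + temporal filtering), but this is the gist. */
#include <stdio.h>
#include <stdlib.h>

typedef struct { float r, g, b; } Color;

/* Hypothetical stand-in for one raytraced sample: a base color
   plus per-ray noise. */
static Color trace_ray(void)
{
    float noise = (float)rand() / RAND_MAX - 0.5f;
    Color c = { 0.4f + noise, 0.5f + noise, 0.7f + noise };
    return c;
}

int main(void)
{
    const int samples = 4;  /* the "small number of rays" */
    Color sum = { 0, 0, 0 };

    for (int i = 0; i < samples; i++) {
        Color c = trace_ray();
        sum.r += c.r; sum.g += c.g; sum.b += c.b;
    }
    printf("pixel: (%.2f, %.2f, %.2f)\n",
           sum.r / samples, sum.g / samples, sum.b / samples);
    return 0;
}
```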

From Nvidia's marketing:

"Rays are cast towards reflective surfaces visible from the player’s camera, rather than from the player’s camera across the entire scene. This greatly reduces the number of rays that have to be cast, which increases performance."

This isn't ray tracing, but raycasting and PBR.

Raycasting is a performance hack that sort of works backwards from raytracing. Raytracing essentially simulates light: starting at a light source, hitting stuff in the scene, illuminating it and bouncing around, possibly bouncing off away from the screen and ultimately doing a lot of work that's never seen.
Raycasting starts at the camera/eye/each pixel on the screen and travels outward into the scene, ensuring every bit of work done is actually put to use.
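A minimal sketch of that camera-outward idea, assuming the simplest possible scene (one sphere, eye at the origin; all numbers are made up): every ray starts at the eye and travels through a pixel into the scene, so no work is spent on light that never reaches the camera.

```c
/* Minimal camera-outward raycaster: the eye at the origin looks down
   +z at a single sphere, one ray per pixel, so every intersection
   test serves a visible pixel. Prints an ASCII-shaded sphere. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const int W = 60, H = 30;
    const float cz = 4.0f;   /* sphere center at (0, 0, cz) */
    const float rad = 1.5f;  /* sphere radius */

    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            /* Direction from the eye through this pixel, normalized. */
            float dx = (x - W / 2) / (float)H;
            float dy = (y - H / 2) / (float)H;
            float dz = 1.0f;
            float len = sqrtf(dx * dx + dy * dy + dz * dz);
            dx /= len; dy /= len; dz /= len;

            /* Ray/sphere test: solve |t*d - c|^2 = rad^2 for t. */
            float b = dz * cz;  /* d . c (center lies on the z axis) */
            float disc = b * b - (cz * cz - rad * rad);
            if (disc > 0.0f) {
                float t = b - sqrtf(disc);       /* nearest hit */
                float nz = (t * dz - cz) / rad;  /* normal's z part */
                /* Darker where the surface faces the camera. */
                putchar(nz < -0.6f ? '@' : nz < -0.2f ? '+' : '.');
            } else {
                putchar(' ');  /* ray hit nothing: no wasted shading */
            }
        }
        putchar('\n');
    }
    return 0;
}
```

Builds with any C compiler (link with -lm); the point is just the direction of travel: eye first, scene second.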

Raycasting scales, I think, linearly with screen resolution in terms of compute; raytracing scales (in addition to resolution) with scene complexity, in terms of geometry and the properties of the stuff in the scene (reflective, semitransparent, subsurface scattering, etc.).
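A rough back-of-envelope version of that scaling claim (symbols are mine, and it's simplified; real raytracers use acceleration structures like BVHs that make the scene term closer to logarithmic in triangle count):

```latex
% W, H = resolution; s = rays per pixel; b = average bounce count;
% f(scene) = geometry/material complexity term.
\[
  C_{\mathrm{raycast}} \;\propto\; W \cdot H \cdot s
\]
\[
  C_{\mathrm{raytrace}} \;\propto\; W \cdot H \cdot s \cdot b \cdot f(\mathrm{scene})
\]
```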

From the above, it sounds like they're going through the surfaces in the scene; for anything reflective, it fires rays (more or less depending on the reflectiveness of the surface) from the camera back toward the surface and out into the scene itself, copying what it hits as a reflection.
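The "back toward the surface and out into the scene" step comes down to the standard reflection formula about the surface normal; a small C sketch (names and numbers are mine, not anything from Nvidia's implementation):

```c
/* Reflection direction: r = d - 2 (d . n) n, where d is the incoming
   ray direction and n is the unit surface normal. The reflection ray
   then continues from the hit point along r. */
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static Vec3 reflect(Vec3 d, Vec3 n)
{
    float k = 2.0f * dot(d, n);
    Vec3 r = { d.x - k * n.x, d.y - k * n.y, d.z - k * n.z };
    return r;
}

int main(void)
{
    Vec3 view = { 0.0f, -0.7071f, 0.7071f };  /* heading down-forward */
    Vec3 up   = { 0.0f, 1.0f, 0.0f };         /* floor normal */
    Vec3 r = reflect(view, up);
    /* A ray angled down at a floor should bounce up-forward. */
    printf("reflected: (%g, %g, %g)\n", r.x, r.y, r.z);
    return 0;
}
```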


I remember reading bits about 2016 Doom; one of the engine devs mentioned using a lot of clever caching tricks to speed things up. I'm speculating, but it could be something like keeping lower-LOD information about the scene cached for quick reference, for stuff like reflections, without having to fully redraw.

One thing I learned from playing UT3, Doom 2016 and, to a lesser degree, Rage 1 (in a bad way compared to the other two): if your game is entertaining and fairly fast paced, you can hide texture pop-in/LOD streaming well enough that the user won't notice or care much, if you're smart about it.

I think Quake II RTX is essentially fully raytraced, but it still uses the noise filter to fill in the gaps. It's low-poly enough that it's doable with a fairly high number of rays while still running and looking good.

I could be wrong, but the concurrent int/float calculation in the RTX-series cards may, I think, wind up being more useful for devs, though that's not nearly as shiny as raytracing.

I know most games are going to be doing mostly floating-point calculations, but being able to do floating-point and integer work in the same cycle, if used cleverly, can be leveraged to do a LOT.
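As a toy illustration of what that mix looks like in shading-style code (plain C, nothing vendor-specific; whether the int and float work actually overlaps is up to the hardware's scheduler): the integer texel-address math and the float shading math below are independent of each other until the final load, which is the kind of pattern separate int and float pipes could run side by side.

```c
/* Loop body mixing independent integer and floating-point work,
   the pattern that concurrent int/float pipes can overlap. */
#include <stdio.h>

#define W 256

int main(void)
{
    static float tex[W * W];  /* fake texture */
    float acc = 0.0f;

    for (int i = 0; i < W * W; i++)
        tex[i] = (i % 97) / 97.0f;  /* fill with something */

    for (int y = 0; y < W; y++) {
        for (int x = 0; x < W; x++) {
            /* Integer work: texel address arithmetic with wrap-around
               (W is a power of two, so & (W - 1) wraps cheaply). */
            int tx = (x * 3 + 7) & (W - 1);
            int ty = (y * 5 + 1) & (W - 1);
            int idx = ty * W + tx;
            /* Float work: shading-ish math, independent of the
               address calc until the final load below. */
            float shade = 0.25f + 0.75f * (x / (float)W);
            acc += tex[idx] * shade;
        }
    }
    printf("checksum: %f\n", acc);
    return 0;
}
```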

It's cool that AMD contributes to open standards (FreeSync, OpenCL, etc.) instead of Nvidia-branded shit tied to licensing fees, but they probably need to start putting together some sort of initiative with game devs. Given that console games are ultimately the bread and butter of most big game companies, and are eventually ported to PC, AMD should have a good working relationship with the devs (especially since the consoles are mostly running AMD hardware), but it still seems like, in the PC market, Nvidia targets the developers better.

I know Nvidia has done stuff with the demoscene, and regularly gives presentations/talks on new graphics and rendering techniques. I'm sure AMD has done some of this as well, but in my eyes they've not been as visible about it.

AMD seems to be making graphics hardware; Nvidia makes graphics hardware but seems to put just as much into pushing graphics software forward.
 

Smaug's Smokey Hole

Excuse me, I currently have some brain damage.
kiwifarms.net
Joined
Sep 7, 2016
In the past, ATI/AMD's solution seemed to be to throw hardware at the problem. Of course Nvidia does that too, but they understand software is also important, as a GPU (or any computer hardware) is fucking useless without software.

In the past, ATi was a much larger company than Nvidia, so they had that monolithic "eh, fuck it" attitude, a bit like Intel, where buying new hardware would solve old software problems or bugs. Nvidia was forced to have better support and be more nimble than the lumbering giant that put the fucking Mach64 in millions of shitboxes (and servers).

They missed out on being the GPU in the current-gen PlayStation and Xbox (both AMD); they're in the Switch, though.

Now that Nvidia is the larger company, they have grown into the lumbering giant that expects people to follow wherever they lead; they are set in their ways. They don't sell chip designs, they sell chips - that's what made Microsoft sour on them with the original Xbox, and the exact same thing made them sour on Intel. AMD/ATi started to license their work in a way similar to ARM, MIPS and IMG - all the stuff you find in tablets, smartphones and also consoles. Nvidia themselves licensed ARM to put their GPU tech into what is now Tegra, but they don't want to sell a Turing/Pascal/whatever license the same way they bought the ARM license.

At the same time, AMD have started pouring resources into the driver situation. "Our drivers aren't crap (anymore)" is part of their marketing, in the form of the tools they show off for overclocking, handling voltages, fan speeds, and general user freedom.


ATi made one potential mistake in the past, when it comes to chip design and market share. They bought Bitboys and brought their GPU into the Radeon mobile group. Then they sold that mobile group off, and the buyers rebranded it "Adreno", an anagram of Radeon, and it is pretty popular these days.
 

Just Some Other Guy

kiwifarms.net
Joined
Mar 25, 2018
Managed to score a 5700 XT for $325 off Reddit to replace an ageing 980 Ti. Waiting on a Bykski block from Ali to take it as far as I can. Navi looks like a significant step forward, finally catching up to Nvidia in perf per transistor. The 5500 is going to do nasty things to the 1650 if it lands in the same price band - it has the exact same number of cores, and almost as many transistors, as the 1660S.
 

Duncan Hills Coffee

Sewn back together wrong
kiwifarms.net
Joined
Dec 13, 2016
I've never really used an actual GPU before (I'm building my first PC with a dedicated one soon; all of my systems have used onboard graphics), but I'm ultimately going with Nvidia for a few key reasons.

Firstly, the AMD laptop I have just doesn't offer a great selection of graphics options. Admittedly, a lot of that has to do with the fact that it's technically a lower-tier graphics processor, but even then I've noticed that a lot of user guides on how to get older games running properly tend to lean towards Nvidia, since most of the options don't seem to be available on even higher tiers of AMD cards. I'm pretty uneducated in this department, so I might be getting it totally wrong, but to me Nvidia seems easier to work with when it comes to messing around with graphics settings. Half the time the guides recommend you download some third-party software to get AMD to do what you want.

Secondly, AMD doesn't support OpenGL that well, at least not according to the research I've done and my personal experience. A huge majority of my PC game catalogue consists of older titles, most of which run on OpenGL, and most of the time they run badly on my laptop, which is annoying because I had an older Intel laptop that ran those games just fine. To my extremely limited knowledge, Nvidia has better OpenGL driver support, and I like that.

Like I said, I'm not really experienced in this field, just thought I'd give my two cents on it.
 

Just Some Other Guy

kiwifarms.net
Joined
Mar 25, 2018
I've never really used an actual GPU before (I'm building my first PC with a dedicated one soon; all of my systems have used onboard graphics), but I'm ultimately going with Nvidia for a few key reasons.

Firstly, the AMD laptop I have just doesn't offer a great selection of graphics options. Admittedly, a lot of that has to do with the fact that it's technically a lower-tier graphics processor, but even then I've noticed that a lot of user guides on how to get older games running properly tend to lean towards Nvidia, since most of the options don't seem to be available on even higher tiers of AMD cards. I'm pretty uneducated in this department, so I might be getting it totally wrong, but to me Nvidia seems easier to work with when it comes to messing around with graphics settings. Half the time the guides recommend you download some third-party software to get AMD to do what you want.

Secondly, AMD doesn't support OpenGL that well, at least not according to the research I've done and my personal experience. A huge majority of my PC game catalogue consists of older titles, most of which run on OpenGL, and most of the time they run badly on my laptop, which is annoying because I had an older Intel laptop that ran those games just fine. To my extremely limited knowledge, Nvidia has better OpenGL driver support, and I like that.

Like I said, I'm not really experienced in this field, just thought I'd give my two cents on it.
After doing a little reading, it looks like Navi is still plagued by poor OpenGL support on Windows. What a shame.
 

Duncan Hills Coffee

Sewn back together wrong
kiwifarms.net
Joined
Dec 13, 2016
After doing a little reading, it looks like Navi is still plagued by poor OpenGL support on Windows. What a shame.
That's a huge dealbreaker for me, honestly. Like I said, most of my catalogue runs on OpenGL. There are a few games that run okay, but even those have some bad stuttering. The most I can do is turn on triple buffering, which tends to make it a little better, but even so it's not ideal.
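For what it's worth, my understanding of why triple buffering helps: with three buffers the GPU always has a spare one to render into while one is on screen and another waits for the flip, so rendering doesn't stall the way double buffering with vsync can. A bookkeeping-only sketch (no real graphics API, just the buffer rotation):

```c
/* Triple-buffer rotation: one buffer is being scanned out, one is
   queued for the next flip, one is free for the GPU to draw into,
   so rendering never blocks waiting for the display. */
#include <stdio.h>

int main(void)
{
    int display = 0, queued = 1, drawing = 2;

    for (int frame = 0; frame < 6; frame++) {
        printf("scanout buf %d, queued buf %d, rendering into buf %d\n",
               display, queued, drawing);
        /* On vblank: the queued buffer goes on screen, the freshly
           drawn one is queued, and the old display buffer is freed
           for the GPU to draw into. */
        int old_display = display;
        display = queued;
        queued  = drawing;
        drawing = old_display;
    }
    return 0;
}
```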
 

Just Some Other Guy

kiwifarms.net
Joined
Mar 25, 2018
That's a huge dealbreaker for me honestly. Like I said, most of my catalogue runs on OpenGL. There are a few games that run okay, but even those have some bad stuttering. The most I can do is turn on Triple Buffering, which tends to make it a little better but even so it's not ideal.
Yeah, I hear ya. Navi is a great-looking chip, especially on water, but the truth is that for certain applications an equivalent Nvidia card is going to function better.
 

CreamyHerman’s

(Worrying intensified)
kiwifarms.net
Joined
Dec 15, 2017
I'm a Radeon guy. They've consistently offered better performance per dosh, and it upsets me when people say 'Radeon software is lacking'.

Meanwhile, Nvidia is stuck on XP dreams with their XP-like UI, and they dropped Kepler support hard - even a 7970 supports modern titles better than a 780 now. Hell, even an R9 290 still outperforms a Titan to this day.

OpenGL is not bad on AMD with Navi, though it could be better.