GPU Prices - Damn those Miners and their imaginary shekels!


ColtWalker1847

kiwifarms.net
Actually, manufacturers are very wary about this bubble, because they know that miners will ultimately dump their used cards on eBay/Craigslist, which will fuck up the market.
It's the 22lr bubble all over again. Idiots who barely understand the economic forces at play jumping all over something that is in perceived shortage, and the manufacturers are like "nah, man. we ain't expanding our factories for a fad". It'll eventually work itself out, but not before a whole lot of rubes get swindled.
 

neger psykolog

moon goons for communism detection
True & Honest Fan
kiwifarms.net
It's the 22lr bubble all over again. Idiots who barely understand the economic forces at play jumping all over something that is in perceived shortage, and the manufacturers are like "nah, man. we ain't expanding our factories for a fad". It'll eventually work itself out, but not before a whole lot of rubes get swindled.
You'll never make the "gaming crowd" happy and at the end of the day it largely doesn't matter because they'll buy anything you put in front of them even if they say they won't.
  • Not enough of the latest Nintendo/Sony/Microsoft product = fuck Nintendo/Sony/Microsoft I'm never buying another product from them again (proceeds to buy all their products for eternity)
  • Shelves stocked with some other product = ha ha these guys are going broke and their product is a 100% failure!
  • Day 1 console release = wow this is so cool I bought my favorite console on day 1! (then console inevitably breaks from early manufacturing defects which are largely expected) fuck this shit i'm never buying another product from these faggots again (proceeds to buy every product from them for the next 6 decades)
  • Can't buy a GPU I want because there's more demand than supply = fuck the people buying these, I want one! This is bullshit!
  • Microsoft/Sony/Nintendo are restrictive about what games can/cannot be released on their platforms = This is fucking bullshit! (continues to pay for all their shit) This quality control stuff is bullshit because the independent game developers can't release their games without making quality games!
  • Microsoft/Sony/Nintendo force paid online access = This is fucking bullshit! No way I'm going to pay for this shit! (continues to pay for that shit)
  • Steam opens the doors for basically any developer to release games regardless of content = This is fucking bullshit! There is no quality control!
  • These CDs and DVDs are fucking bullshit! I want digital downloads of all my games! (buys digital downloads and then proceeds to cry because they can't resell their license keys)
 

carltondanks

kiwifarms.net
If I'm not mistaken, they're mining Ethereum, which is a cryptocurrency that was designed to be resistant to dedicated ASIC miners.
(For those who don't know, an ASIC miner is a piece of hardware dedicated to mining a specific cryptocurrency, or more precisely a specific hashing algorithm such as SHA-256 or scrypt.)

The people who make ASIC miners have this system where you can pre-order months in advance to get the specific ASIC miner you want. More often than not, they'll only take pre-orders and will never, and I mean NEVER, offer the option to just buy one on launch day. They do this to avoid this exact scenario, and also so they know exactly how many units to produce.

I heard that the Ethereum developers are making it so that their currency is a little more ASIC-friendly. They're probably doing this because of all the flak they're getting.
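For anyone wondering why miners buy gaming GPUs for this in particular: Ethereum's proof-of-work (Ethash) was deliberately made memory-hard, so every hash attempt has to pull data from a multi-gigabyte dataset, which suits graphics cards with lots of fast VRAM and blunts the advantage of a fixed-function ASIC. Here's a rough, hypothetical Python sketch of the idea; the real Ethash/DAG scheme is far more involved, and the dataset size, round count, and target below are made up purely for illustration:

```python
import hashlib
import os

# Toy stand-in for Ethereum's DAG: a big block of pseudorandom data that every
# hash attempt has to read from (sizes here are tiny compared to the real thing).
DATASET_SIZE = 64 * 1024 * 1024  # 64 MB for the sketch; the real DAG is gigabytes
dataset = os.urandom(DATASET_SIZE)

def toy_memory_hard_hash(header: bytes, nonce: int, rounds: int = 64) -> bytes:
    """Mix the header/nonce with data fetched from pseudo-random spots in the dataset.
    Throughput ends up bounded by memory accesses rather than raw arithmetic, which is
    the property that (at the time) made fixed-function ASICs a poor fit for Ethash."""
    mix = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
    for _ in range(rounds):
        offset = int.from_bytes(mix[:8], "little") % (DATASET_SIZE - 64)
        mix = hashlib.sha256(mix + dataset[offset:offset + 64]).digest()
    return mix

def mine(header: bytes, target: int, max_nonce: int = 50_000) -> int | None:
    """Brute-force a nonce whose hash falls below the target (smaller target = harder)."""
    for nonce in range(max_nonce):
        if int.from_bytes(toy_memory_hard_hash(header, nonce), "big") < target:
            return nonce
    return None

print(mine(b"toy block header", target=2**250))
```

The takeaway is that the bottleneck is memory bandwidth, not raw hashing speed, which is why the hardware being bought up is gaming cards rather than Bitcoin-style ASICs.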
 

Done

True & Honest Fan
kiwifarms.net
I understand what they're getting at but I disagree.

Inevitably, without mining and industrial/commercial applications (such as machine learning and AI projects), they're going to run into the same problem the PC industry has as a whole. It's the same problem smartphone manufacturers are having, and the same one car manufacturers have had for years.
I agree on diversification being a must for both NVIDIA & AMD, but around 60% of NVIDIA's revenue comes from consumer-grade graphics cards. So gaming is still a hugely significant market for them.

If you look at the Steam Survey results, the 1060 became the most popular GPU on Steam last year. This means that a not-insignificant number of people are upgrading their rigs as they go.

Aside from massive population increases and more people gaining disposable income and internet access, once you release a good product there is less of a reason to upgrade. Once 2- and 4-core processors became standard, your average PC user (not gamers) had very little reason to seek an upgrade because their computer could do practically everything they wanted.

Same with smartphones: once you have the iPhone 634 and the iPhone 635 comes out, what reason do you have to upgrade? (That's why Apple hates external storage; it'd give people a great reason to never upgrade.)
But here there is a reason to upgrade: PC games keep getting more and more graphically intensive. This is basically planned obsolescence that is inherent in the value proposition.

Remember in the past when you had to buy a soundcard because motherboard soundcards were universally shit? Turns out that the rise of USB sound as well as decent motherboard sound means that companies like Creative (who tried very hard to control the entire soundcard industry and succeeded for many years: https://en.wikipedia.org/wiki/Creative_Technology) ended up being hit hard once soundcards weren't of that much use anymore.
I agree on this part. Creative used to be really scummy back then, but I don't know how it figures into the GPU discussion we're having now; if you have a sound card or external DAC/amp, then you're set for the next 5 years, minimum. There's no real need to upgrade it because Dolby Digital 5.1 has been the standard of choice for years now.

Top-end graphics cards cater to a small market. Most games aren't even able to utilize more than 4 cores. SLI? Barely any games even support it. 4K gaming? Barely any games can effectively use the most powerful graphics cards, let alone two of them (source: I have 2 x 1080 Tis and there are almost no games that can fully use one, let alone both; the game will cap out at a certain point because of its own limitations rather than scale any further). The money/investment required to make games work for massive GPU setups is probably not worth it, because almost no one has them and the majority of "hardcore gamers" generally don't even have much disposable income to buy stupid shit with in the first place.

Game developers know this and nowadays spend increasingly less time catering to the "top end" of the PC market, because it's fucking tiny and, alongside sales of PCs, has been on a downward trend for years. Developers and publishers care so little about "hardcore gamers" that even spending a few development dollars to put an FOV slider in the game isn't worth the effort when you can just shovel out shit and most people won't complain. I'll concede that in the near future games may take a drastic turn when it comes to scaling with hardware, but so far it doesn't look like that's going to happen anytime soon.
The GTX 1070, 1070 Ti, 1080, 1080 Ti and the AMD Vega range are all top-end cards with a price tag equivalent to an entire low-to-midrange PC; I don't disagree on that.

However, the AMD 570/580 range and the GTX 1060 and 1050 Ti are all midrange offerings, and a +50% hike in street prices is a big deal for people who want to build their 1st PC, which is a surprisingly large group; I've had 5 different people of differing age groups and requirements ask me to help them build a gaming PC in the past 2-3 months.

As you mentioned, SLI is increasingly a non-factor now; NVIDIA has already removed SLI fingers from the 1060 and below, and has effectively taken 3- and 4-way SLI away from gamers by limiting it to only a few applications.
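(To put rough numbers on why a second card so often disappoints, here's a hypothetical, back-of-the-envelope Amdahl's-law-style estimate in Python. The "scalable fraction" values are made up for illustration; actual SLI/CrossFire scaling varies wildly per game and driver.)

```python
# Illustrative only: Amdahl's-law-style estimate of multi-GPU frame-rate speedup.
# "parallel_fraction" is the share of frame time that actually splits across GPUs;
# the rest (CPU/driver work, synchronisation, non-scaling render passes) stays serial.
def multi_gpu_speedup(parallel_fraction: float, num_gpus: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / num_gpus)

for frac in (0.95, 0.80, 0.60):
    print(f"{frac:.0%} of the frame scales -> {multi_gpu_speedup(frac, 2):.2f}x with 2 GPUs")
# Prints roughly 1.90x, 1.67x and 1.43x -- and a game with no SLI profile stays at 1.00x.
```

Which is roughly the experience described above: once a game hits its own internal cap, the second 1080 Ti mostly sits there.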

So the crying people are doing about cryptocurrency mining and its effect on graphics card sales? It's actually crying about something that has already happened and has already made nVidia huge sums of money:

[Attached charts: discrete desktop GPU market share trends]
https://www.anandtech.com/show/10864/discrete-desktop-gpu-market-trends-q3-2016/4

With the effect mining and other applications have on the GPU market, nVidia can probably stop treating the gaming market as a primary interest for quite some time. Gamers will get their cards eventually.
Partly answered in the above sections.

Also, this isn't the 1st time this has happened, and it won't be the last; it's just the 1st time the price surges reached cards like the 1080/1080 Ti, which held their prices the last time crypto surged. I fully expect this to happen with Volta as well, depending on hash rate and pricing of course.

And yeah, salty gamers are this thread's fuel lol.

It's also important to realize the distinction between chip designers like nVidia and the board partners who actually build the cards (MSI etc.). I'm not exactly sure how the money changes hands in that relationship, but I assume nVidia gets paid for its chips whether or not the finished cards sell. So I'm guessing nVidia cares even less.

It's a weird situation where GPUs only really got developed as much as they did because of gaming (that's debatable) and now the actual use and demand for them comes from other sources.

nVidia doesn't just make gaming GPUs, though, which is probably why they're going to take full advantage of cryptocurrency morons taking out 20 credit cards to buy graphics cards. I'd guess that their long-term bets are more in providing general-purpose computing power for phones/consoles and AI/machine learning applications. Sales of the latest and greatest GPUs for actual gaming purposes are tiny.
NVIDIA has already highlighted crypto miners as an opportunity in their earnings reports, but right now they are limiting GPU sales to 1 GPU per customer in their own store and are instructing retailers to do the same, which are clearly moves to prioritize gamers; not sure how much difference it makes, though.

EDIT:
Also now that Samsung has entered into the cryptoshekel industry (unclear if they'll also expand to GPUs) you can expect the current fire to only increase in size:
https://www.gamersnexus.net/industry/3225-samsung-confirms-asic-miner-production
What's also interesting is that Intel has just hired Raja Koduri, who used to run AMD's Radeon Technologies Group, so they may be planning to enter the GPU arena as well.

I assume Samsung has experience making GPUs, considering that they make them for their own phones, but I don't know how much overlap there is with desktop graphics, or if they will even be interested in that market.

There's another, more logical reason too IMO: GPU makers have to reserve some supply for all the computer manufacturers out there. Those companies buy up GPUs in bulk and slap them in the gaming computers you can find at Micro Center. At this point it's cheaper to buy one of those than it is to build your own because of the GPU markup from places like Newegg (which have a "1 per person" limit on top of high prices to try to slow the influx of miners).
That's a reason that totally slipped my mind. Thanks for your comment.

There's something else that happened too: consoles became a thing. Game developers found out that more people will buy a game on a platform with no piracy that's far more accessible than a PC (you'd be shocked how many PC gamers lose it if they have to edit an INI file or download a mod).

The Xbox got PC developers onto consoles, and they learned that more people buy games on consoles.

Also, used GPUs that are worthless for mining (for example the GeForce 7xx series) can still be found for low prices on eBay; I got a friend one of those and all the games he plays run fine on it.
I think PC gaming is getting more and more prominent as time passes; you'll notice that Steam is now a baseline platform in nearly all game announcements, and even Microsoft has allowed all of their premium 1st-party titles to appear on PC, though most of them are exclusive to the Windows Store.

Ultimately, I don't completely believe the PCMR propaganda that PC will eventually make consoles obsolete, but their work has borne fruit in pushing PC as a worthwhile platform.

Also only relatively recent consoles make use of GPUs (or at least what we call a GPU today).

I think the first major console with a brand-name GPU (i.e. a GPU from a company that specifically makes GPUs and doesn't just shovel out the odd one as an experiment) was the original Xbox in 2001. The first PlayStation to use one was the PS3 in 2006.

All of the consoles starting to use nVidia/ATI hardware, or at least hardware that is very similar, seems to have made for a bit more of an even playing field.

It also seems like nVidia is starting to try to sell products to the mobile phone market too, which will be interesting if it ever takes off at a larger scale.
All current home consoles use AMD hardware (CPU+GPU), while the Nintendo Switch uses an Nvidia GM20B. I'm not sure how much of the mobile market Nvidia controls, given that most phones now have Qualcomm stuff in them.
 

Arse Biscuit

Horrible Bastard
kiwifarms.net
Bitcoin is probably the best laugh I've had in 5 years. It's like watching goldbugs, if the goldbugs were suffering from fetal alcohol syndrome and maybe trying to land a crippled jumbo jet.

You know they're doomed, but you can't help cheering them on.
 
  • Feels
Reactions: Done

neger psykolog

moon goons for communism detection
True & Honest Fan
kiwifarms.net
I agree on this part. Creative used to be really scummy back then, but I don't know how it figures into the GPU discussion we're having now; if you have a sound card or external DAC/amp, then you're set for the next 5 years, minimum.
Well, there's a slow shift happening toward making computers less component-based (by soldering CPUs to motherboards etc.), and also GPUs are increasingly being aimed at being GPGPUs. There are obviously a lot of technical barriers and we're not that close, so it's unclear exactly when we'll ever get a "desktop computer" that doesn't have a separate CPU, but we do already have computers that don't have separate GPUs.

The point where a computer becomes powerful enough without needing an upgrade path is probably the point where the discrete GPU will cease to exist. Graphics cards nowadays are exceptionally powerful, and it likely won't be too long until we reach the point of diminishing returns (where upgrading your GPU more than every 5 years would be pointless).

A lot of the stuff we're seeing now, like smart TVs, Chromecast dongles, and Steam Link, wasn't even possible just a few years ago because of how expensive the hardware was. It'll be interesting to see how processing power changes in the future when you consider stuff like edge computing (meaning someday it may just be a giant box/server in your house/apartment that handles every user's needs, including running games at 8K resolution).

(As you mentioned with Intel poaching an AMD guy) Intel is already aiming to get into the GPU market and will eventually succeed. Intel needs to do relatively little to absolutely decimate the current GPU market and put both AMD and nVidia out of business very quickly. Once Intel gets its way, it can likely influence the nature of motherboards etc. as we know them, which may mark the point at which discrete GPUs go away.

So it'll basically end up being a race, and what I was implying by that is that although nVidia is known today for GPUs, the concept of a GPU may not always exist as we currently know it. Knowing how quickly technology changes, that might happen extremely fast too.

A lot of the discussion surrounding GPUs and gaming is better put into perspective when you look at the userbase. Steam's peak concurrent userbase is 18 million (and that's a record set only a few days ago). Even if you assume the highest-selling games like Minecraft (100 million) are an indication of how large the market is, companies like Intel ship 300-400 million processors a year (according to some Google search I just did).

So the gaming market is still tiny any way you look at it; whoever provides the faster/cheaper option for general-purpose computing that does away with the concept of GPUs will be the market winner (I'm talking about a decade-or-more timescale when I say that).
 

CIA Nigger

"IT'S NOT A FETISH MOM"
Supervisor
True & Honest Fan
kiwifarms.net
I think PC gaming is getting more and more prominent as time passes; you'll notice that Steam is now a baseline platform in nearly all game announcements, and even Microsoft has allowed all of their premium 1st-party titles to appear on PC, though most of them are exclusive to the Windows Store.

Ultimately, I don't completely believe the PCMR propaganda that PC will eventually make consoles obsolete, but their work has borne fruit in pushing PC as a worthwhile platform.


All current home consoles use AMD hardware (CPU+GPU), while the Nintendo Switch uses an Nvidia GM20B. I'm not sure how much of the mobile market Nvidia controls, given that most phones now have Qualcomm stuff in them.
Oh for sure, even if many of the games are console ports or are on par with console versions instead of being vastly better/different, digital distribution has helped PC make a comeback somewhat, along with allowing anyone to self-publish games. Even Asian developers are porting their games to Steam, and for the longest time the PC was treated as a platform for hentai games only in Japan.

I don't think consoles will be replaced, at least for this gen and maybe even the next, for 2 good reasons: they're stupidly easy to plug into a TV and use, and they're cheap.

Well, there's a slow shift happening toward making computers less component-based (by soldering CPUs to motherboards etc.), and also GPUs are increasingly being aimed at being GPGPUs. There are obviously a lot of technical barriers and we're not that close, so it's unclear exactly when we'll ever get a "desktop computer" that doesn't have a separate CPU, but we do already have computers that don't have separate GPUs.

The point where a computer becomes powerful enough without needing an upgrade path is probably the point where the discrete GPU will cease to exist. Graphics cards nowadays are exceptionally powerful, and it likely won't be too long until we reach the point of diminishing returns (where upgrading your GPU more than every 5 years would be pointless).

A lot of the stuff we're seeing now, like smart TVs, Chromecast dongles, and Steam Link, wasn't even possible just a few years ago because of how expensive the hardware was. It'll be interesting to see how processing power changes in the future when you consider stuff like edge computing (meaning someday it may just be a giant box/server in your house/apartment that handles every user's needs, including running games at 8K resolution).

(As you mentioned with Intel poaching an AMD guy) Intel is already aiming to get into the GPU market and will eventually succeed. Intel needs to do relatively little to absolutely decimate the current GPU market and put both AMD and nVidia out of business very quickly. Once Intel gets its way, it can likely influence the nature of motherboards etc. as we know them, which may mark the point at which discrete GPUs go away.

So it'll basically end up being a race, and what I was implying by that is that although nVidia is known today for GPUs, the concept of a GPU may not always exist as we currently know it. Knowing how quickly technology changes, that might happen extremely fast too.

A lot of the discussion surrounding GPUs and gaming is better put into perspective when you look at the userbase. Steam's peak concurrent userbase is 18 million (and that's a record set only a few days ago). Even if you assume the highest-selling games like Minecraft (100 million) are an indication of how large the market is, companies like Intel ship 300-400 million processors a year (according to some Google search I just did).

So the gaming market is still tiny any way you look at it; whoever provides the faster/cheaper option for general-purpose computing that does away with the concept of GPUs will be the market winner (I'm talking about a decade-or-more timescale when I say that).
There's a major flaw with that though: AMD is similar to Intel in the sense that they're also a CPU maker doing the exact same thing Intel is doing: a CPU with a GPU on the same chip. AMD's APU line is a good example of this, and with Raven Ridge their integrated graphics performance is absolutely destroying Intel's current solutions. Nvidia, on the other hand, is going to be the one getting fucked hard, especially since their Tegra line has done pretty poorly outside the Nintendo Switch.

AMD will likely hold up while Nvidia gets blown out, especially since both AMD and Intel did away with third-party chipsets. Discrete GPUs will likely still be in production for GPGPU tasks or "pro" software that is accelerated by the GPU. For example, the Radeon Pro SSG has 16GB of VRAM plus 2TB of SSD storage used as virtual memory.
 

Quijibo69

Mr. Robot
kiwifarms.net
I hope these people feel like morons in a few years when Bitcoin isn't worth shit. It could just close up shop and run off with your money like BitConnect.
 

CIA Nigger

"IT'S NOT A FETISH MOM"
Supervisor
True & Honest Fan
kiwifarms.net

Done

True & Honest Fan
kiwifarms.net
Every time Bitcoin drops they're like "FINALLY I CAN GET A GPU NOW", only for the price to rise again and the salt to continue. Bitcoin has rebounded to $8,887 as of now.
It will likely take a couple of months.

I think it's a good idea to save your money anyway, considering that Ampere is rumored to be on the way. From the rumors, it seems like an incremental step rather than a huge one, which is really to be expected given the trash fire AMD's Vega series turned out to be.

Tomorrow, I'll try to see about harvesting more salt, because this thread is leaning far too much on the informative side lol.
 

DNA_JACKED

kiwifarms.net
Well, there's a slow shift happening toward making computers less component-based (by soldering CPUs to motherboards etc.), and also GPUs are increasingly being aimed at being GPGPUs. There are obviously a lot of technical barriers and we're not that close, so it's unclear exactly when we'll ever get a "desktop computer" that doesn't have a separate CPU, but we do already have computers that don't have separate GPUs.

The point where a computer becomes powerful enough without needing an upgrade path is probably the point where the discrete GPU will cease to exist. Graphics cards nowadays are exceptionally powerful, and it likely won't be too long until we reach the point of diminishing returns (where upgrading your GPU more than every 5 years would be pointless).

A lot of the stuff we're seeing now, like smart TVs, Chromecast dongles, and Steam Link, wasn't even possible just a few years ago because of how expensive the hardware was. It'll be interesting to see how processing power changes in the future when you consider stuff like edge computing (meaning someday it may just be a giant box/server in your house/apartment that handles every user's needs, including running games at 8K resolution).

(As you mentioned with Intel poaching an AMD guy) Intel is already aiming to get into the GPU market and will eventually succeed. Intel needs to do relatively little to absolutely decimate the current GPU market and put both AMD and nVidia out of business very quickly. Once Intel gets its way, it can likely influence the nature of motherboards etc. as we know them, which may mark the point at which discrete GPUs go away.

So it'll basically end up being a race, and what I was implying by that is that although nVidia is known today for GPUs, the concept of a GPU may not always exist as we currently know it. Knowing how quickly technology changes, that might happen extremely fast too.

A lot of the discussion surrounding GPUs and gaming is better put into perspective when you look at the userbase. Steam's peak concurrent userbase is 18 million (and that's a record set only a few days ago). Even if you assume the highest-selling games like Minecraft (100 million) are an indication of how large the market is, companies like Intel ship 300-400 million processors a year (according to some Google search I just did).

So the gaming market is still tiny any way you look at it; whoever provides the faster/cheaper option for general-purpose computing that does away with the concept of GPUs will be the market winner (I'm talking about a decade-or-more timescale when I say that).
People were saying that we would hit the point of diminishing returns on GPUs 5 years ago with the Nvidia 600 series, that GPUs couldn't possibly keep getting more powerful forever. Many in the industry were CONVINCED that going below 65nm would be impossible to do at any profitable scale. Many were convinced that CPUs with more than 4 cores would never exist, or if they did, they would be too expensive to produce and buy.

I distinctly remember many in the industry clamoring that 8 GB drives were enough, that 10 GB hard drives were just too unstable and would never reach mass production. Then it was that going over 80 GB was ridiculous, then that 1TB was the largest any mass-production drive would hit. Now you can buy 8TB drives, and 16TB drives are coming soon.

If there is one thing to take away from this, it is that, no, we will not hit those diminishing returns. Someone will find a way to use that extra capability and create more demand. "When a computer becomes powerful enough without having an upgrade path" has been predicted since the 90s, for both servers and home PCs, and it has never happened. It may slow down a bit, but progress won't stop. Unless video games die out as a hobby, you will always have some people who want more than just baseline graphical processing power.

Never bet against technological progress; you will always lose.
 