At least craigslist is here to warn people
I kinda wish I could go on Craigslist and vote this post
It's the 22lr bubble all over again. Idiots who barely understand the economic forces at play jumping all over something that is in perceived shortage, and the manufacturers are like "nah, man. we ain't expanding our factories for a fad". It'll eventually work itself out, but not before a whole lot of rubes get swindled.

Actually, manufacturers are very wary about this bubble, because they know that miners will ultimately dump their used cards on eBay/Craigslist, which will fuck up the market.
>implying miners pay market rate for electricity

It's infuriating because it's virtually worthless to mine with GPUs at the current difficulty.
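For anyone wondering why "current difficulty" matters so much here, a rough back-of-the-envelope profitability calculation looks something like the sketch below. Every number in it (hash rate, network hash rate, block reward, coin price, power draw, electricity price) is a placeholder assumption for illustration, not real market data.

```python
# Back-of-the-envelope GPU mining profitability. All numbers below are
# placeholder assumptions for illustration only.

def daily_profit_usd(gpu_hashrate, network_hashrate, block_reward,
                     blocks_per_day, coin_price_usd, gpu_watts,
                     electricity_usd_per_kwh):
    """Expected profit per day, in USD, for a single GPU."""
    # Your expected share of the coins minted each day is simply your
    # share of the total network hash rate.
    share = gpu_hashrate / network_hashrate
    coins_per_day = share * block_reward * blocks_per_day
    revenue = coins_per_day * coin_price_usd

    # Electricity: watts -> kWh per day -> dollars per day.
    power_cost = (gpu_watts / 1000.0) * 24 * electricity_usd_per_kwh
    return revenue - power_cost

# Hypothetical numbers for one mid-range card on a large network:
print(daily_profit_usd(
    gpu_hashrate=30e6,            # 30 MH/s
    network_hashrate=250e12,      # 250 TH/s across the whole network
    block_reward=3.0,
    blocks_per_day=6000,
    coin_price_usd=800.0,
    gpu_watts=150,
    electricity_usd_per_kwh=0.12, # "market rate" residential power
))
# Set electricity_usd_per_kwh near zero (subsidised or stolen power) and
# the same card suddenly looks attractive, which is the point of the
# ">implying" jab above: difficulty squeezes revenue, and the electricity
# bill decides whether anything is left over.
```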
These people are fucking exceptional.
I agree on diversification being a must for both NVIDIA & AMD, but around 60% of NVIDIA's revenue comes from consumer-grade graphics cards. So gaming is still a hugely significant market for them.

I understand what they're getting at, but I disagree.
Inevitably, without mining and industrial/commercial applications (such as machine learning and AI projects), they're going to run into the same problem the PC industry has as a whole. It's the same problem smartphone manufacturers are having, and it's the same problem car manufacturers have had for years.
But here there is a reason to upgrade: PC games keep getting more and more graphically intensive. This is basically planned obsolescence that is inherent in the value proposition.

Besides massive population increases and more humans getting access to disposable income and internet access, once you release a good product there is less of a reason to upgrade. Once 2- and 4-core processors became standard, your average PC user (not gamers) had very little reason to seek an upgrade, because their computer could do practically everything they wanted.
Same with smartphones: once you have the iPhone 634 and the iPhone 635 comes out, what reason do you have to upgrade? (That's why Apple hates external storage: it'd give people a great reason to never upgrade.)
I agree on this part. Creative used to be really scummy in their ways way back then, but I don't know how it figures into the GPU discussion we have now: if you have a sound card or external DAC/amp, then you're set for the next 5 years, minimum. There's no real need to upgrade it because Dolby Digital 5.1 has been the standard of choice for years now.

Remember in the past when you had to buy a soundcard because motherboard soundcards were universally shit? Turns out that the rise of USB sound as well as decent motherboard sound means that companies like Creative (who tried very hard to control the entire soundcard industry and succeeded for many years: https://en.wikipedia.org/wiki/Creative_Technology) ended up being hit hard once soundcards weren't of that much use anymore.
GTX 1070, 1070 Ti, 1080, 1080 Ti and the AMD Vega range are all top-end cards with a price tag equivalent to an entire lower- to mid-range PC; I don't disagree on that.

Top-end graphics cards cater to a market that is small. Most games aren't even able to utilize more than 4 cores. SLI? Barely any games even support it. 4K gaming? Barely any games can effectively use the most powerful graphics cards, let alone two of them. (Source: I have 2x 1080 Tis and there are almost no games that can fully use one, let alone both; the game will cap out at a certain point because of its own limitations rather than scale any further. The amount of money/investment required to make games work for massive GPUs is probably worthless because almost no one has them, and the majority of "hardcore gamers" generally don't even have much disposable income to buy stupid shit with in the first place.)
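On the SLI point: a crude way to see why a second (or fourth) card adds so little is to treat the non-GPU-bound part of each frame (CPU, engine, driver overhead) as a fixed cost that extra GPUs can't touch. The fractions below are invented for illustration; this is a sketch, not benchmark data.

```python
# Amdahl's-law style sketch of multi-GPU scaling. The "GPU-bound fraction"
# of frame time is an assumption; real SLI also adds its own overhead,
# so actual scaling is worse than this upper bound.

def max_speedup(n_gpus, gpu_bound_fraction):
    serial = 1.0 - gpu_bound_fraction   # work extra GPUs can't help with
    return 1.0 / (serial + gpu_bound_fraction / n_gpus)

for frac in (0.95, 0.80, 0.60):
    print(f"{frac:.0%} GPU-bound: "
          f"2 cards -> {max_speedup(2, frac):.2f}x, "
          f"4 cards -> {max_speedup(4, frac):.2f}x")
```

Even in the generous 95%-GPU-bound case a second card tops out below 2x, and that's before a game's own frame-rate caps or missing SLI support come into play.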
Game developers know this and nowadays spend increasingly less time on catering to the "top end" of the PC market, because it's fucking tiny and, alongside PC sales as a whole, has been on a downward trend for years. Developers and publishers care so little about "hardcore gamers" that even spending a few development dollars to put an FOV slider in the game isn't worth the effort when you can just shovel out shit and most people won't complain. I'll concede that in the near future games may take a drastic turn when it comes to scaling with hardware, but so far it doesn't look like that's going to happen anytime soon.
Partly answered in the above sections.

So the crying people are doing about cryptocurrency mining and its effect on graphics card sales? It's actually crying about something that has already happened and has already made nvidia huge sums of money:
https://www.anandtech.com/show/10864/discrete-desktop-gpu-market-trends-q3-2016/4
With the effect mining and other applications have on the GPU market, nVidia can probably stop treating the gaming market as a primary interest for quite some time. Gamers will get their cards eventually.
NVIDIA has already highlighted crypto miners as an opportunity in their earnings reports, but right now they are limiting GPU sales to 1 GPU/customer in their own store and are instructing retailers to do the same. Those are clearly moves to prioritize gamers, though I'm not sure how much difference they make.

It's also important to realize the distinction between chip designers like nVidia and the board partners who actually build the cards (MSI etc.). I'm not exactly sure how the money changes hands in that relationship, but I assume nVidia makes money whether the finished cards sell or not. So I'm guessing nVidia cares even less.
It's a weird situation where GPUs only really got developed as much as they did because of gaming (that's debatable) and now the actual use and demand for them comes from other sources.
nVidia doesn't just make gaming GPUs though, which is probably why they're going to take full advantage of cryptocurrency morons taking out 20 credit cards to buy graphics cards. I'd guess that their long-term bets are more in providing general-purpose computing power for phones/consoles and AI/machine learning applications. Sales of the latest and greatest GPUs for actual gaming purposes are tiny.
What's also interesting is that Intel has just hired Raja Koduri, who used to run AMD's Radeon Technologies Group, so they may be planning to enter the GPU arena as well.

EDIT:
Also now that Samsung has entered into the cryptoshekel industry (unclear if they'll also expand to GPUs) you can expect the current fire to only increase in size:
https://www.gamersnexus.net/industry/3225-samsung-confirms-asic-miner-production
That's a reason that totally slipped my mind. Thanks for your comment.

There's another, more logical reason too IMO: GPU makers have to keep stock back for all the computer manufacturers out there. Those buy up GPUs in bulk and slap them in the gaming computers you can find at Micro Center. At this point it's cheaper to buy one of those than it is to build your own, because of the GPU markup from places like Newegg (which have a "1 per person" limit on top of high prices to try to slow the influx of miners).
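To put the markup argument in concrete numbers: every price below is an invented placeholder, but it shows roughly how OEM bulk pricing can undercut a DIY build while GPU street prices are inflated.

```python
# Toy DIY-vs-prebuilt comparison. Every price is an invented placeholder;
# the point is only that OEM bulk pricing dodges the retail GPU markup.

gpu_street_price = 750    # marked-up retail price a DIY builder pays today
other_parts      = 700    # CPU, motherboard, RAM, PSU, case, storage
prebuilt_price   = 1200   # OEM machine with the same class of GPU inside

diy_total = gpu_street_price + other_parts
print(f"DIY build: ${diy_total}")
print(f"Prebuilt:  ${prebuilt_price}")
print(f"Premium for building it yourself: ${diy_total - prebuilt_price}")
```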
There's something else that happened too: consoles became a thing. Game developers found out that more people will buy a game on a platform with no piracy and that's far more accessible than a PC. (You'd be shocked how many PC gamers lose it if they have to edit an INI file or download a mod.)
The Xbox got PC developers onto consoles, and they learned that more people buy games on consoles.
Also, used GPUs that are worthless for mining (for example the GeForce 7xx series) can still be found for low prices on eBay; I got a friend one of those and all the games he plays run fine on it.
All home consoles use AMD hardware (CPU+GPU); the Nintendo Switch uses an Nvidia GM20B. I am unsure how much of the mobile market Nvidia controls, given that most phones now have Qualcomm stuff in them.

Also, only relatively recent consoles make use of GPUs (or at least what we call a GPU today).
I think the first major console with a brand-name GPU (i.e. a GPU from a company that specifically makes GPUs and doesn't just shovel out the odd one as an experiment) was the original Xbox in 2001. The first PlayStation to use one was the PS3, in 2006.
All of the consoles moving to nVidia/ATI hardware, or at least hardware that is very similar, seems to have made for a bit more of an even playing field.

It also seems like nVidia is starting to try and sell products to the mobile phone market too, which will be interesting if it ever takes off at a larger scale.
Oh, for sure. Even if many of the games are console ports or are on par with console versions instead of being vastly better/different, digital distribution has helped PC make a comeback somewhat, along with allowing anyone to self-publish games. Even Japanese developers are porting their games to Steam, and for the longest time the PC was treated as a platform for hentai games only in Japan.

I think PC gaming is getting more and more prominent as time passes; you notice that Steam is now becoming a base platform in nearly all game announcements, and even Microsoft has allowed all of their premium first-party titles to appear on PC, most of them being exclusive to the Windows App Store.
Ultimately, I don't completely believe the PCMR propaganda that PC will eventually make consoles obsolete, but their efforts have borne fruit in pushing the PC as a worthwhile platform.
There's a major flaw with that though: AMD is similar to Intel in the sense that they're also a CPU maker doing the exact same thing Intel is doing: a CPU with a GPU on the same chip. AMD's APU line is a good example of this, and with Raven Ridge their performance is absolutely destroying Intel's current solutions. Nvidia, on the other hand, is going to be the one getting fucked hard, especially since their Tegra line has done pretty poorly outside the Nintendo Switch.

Well, there's a shift happening slowly in trying to make computers less component-based (by soldering CPUs to motherboards etc.) and also because GPUs are increasingly being aimed at being GPGPUs. There are obviously a lot of technical barriers and we're not that close, so it's unclear exactly when we'll ever get a "desktop computer" that doesn't have a separate CPU, but we do already have computers which don't have separate GPUs.
When a computer becomes powerful enough without needing an upgrade path is probably the point where the discrete GPU will cease to exist. Graphics cards nowadays are exceptionally powerful, and it won't be too long until they likely reach the point of diminishing returns (where upgrading your GPU more than every 5 years would be pointless).
A lot of the stuff we're seeing now, like smart TVs, Chromecast dongles, and Steam Link, wasn't even possible just a few years ago because of how expensive the hardware was. It'll be interesting to see how processing power changes in the future when you consider stuff like edge computing (meaning someday it may just be a giant box/server in your house/apartment that processes every user's needs, including running games at 8K resolution).
As you mentioned with Intel poaching an AMD guy, Intel is already aiming to get into the GPU market and will eventually succeed. Intel needs to do relatively little to absolutely decimate the current GPU market and put both AMD and nVidia out of business very quickly. Once Intel gets its way, it can likely influence the nature of motherboards etc. as we know them, which may mark the point at which discrete GPUs go away.
So it'll basically end up being a race, and what I was implying by that is that although nVidia is known today for GPUs, the concept of a GPU may not exist as we currently know it. Knowing how technology changes, that might happen extremely quickly too.
A lot of the discussion surrounding GPUs and gaming is better put into perspective when you look at the userbase. Steam's peak concurrent user count is 18 million (and that's a record set only a few days ago). Even if you assume the highest-selling games like Minecraft (100 million copies) are an indication of how large the market is, companies like Intel ship 300-400 million processors a year (according to some google search I just did).
So the gaming market is still tiny any way you look at it; whoever provides the faster/cheaper option for general-purpose computing that does away with the concept of GPUs will be the market winner (I'm talking about a decade or more when I say that).
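Taking the two figures above at face value (18 million peak concurrent Steam users, roughly 300-400 million CPUs shipped per year), the ratio works out as below. Peak concurrency undercounts total players, so treat this as an order-of-magnitude sketch only.

```python
# Order-of-magnitude comparison of the figures cited above.
steam_peak_concurrent = 18e6    # record peak concurrent Steam users
cpus_shipped_per_year = 350e6   # midpoint of the "300-400 million" figure

print(f"{steam_peak_concurrent / cpus_shipped_per_year:.1%}")
# ~5%: even with a generous multiplier for non-concurrent players,
# gaming stays a small slice of general-purpose compute shipments.
```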
AFAIK, Bitcoin is decentralized, no?

I hope these people feel like morons in a few years when Bitcoin isn't worth shit. It could just close shop and run off with your money like Bit Connect.
Every time Bitcoin drops they're like "FINALLY I CAN GET A GPU NOW" only for the price to rise again and salt to continue. Bitcoin rebounded again to $8887 as of now.

Looks like this is gonna be an ongoing salt mine:
http://www.guru3d.com/news-story/ru...eforce-gtx-2070-and-2080-on-april-12th,2.html
It will likely take a couple of months.
Fuck these drooling, slobbering idiots. They need to kill themselves.
They're driving up the prices of GPUs for no fucking reason at all, because they're goddamn morons.
This all just illustrates the sheer beauty of the free market.
In communism, the GTX 1080 TI would cost only 5 dollars.
People were saying that we would hit the point of diminishing returns on GPUs 5 years ago with the nvidia 600 series, that GPUs couldn't possibly keep getting more powerful forever. Many in the industry were CONVINCED that going below 65nm would be impossible to do on any profitable scale. Many were convinced that CPUs with more than 4 cores would never exist, or if they did, they would be too expensive to produce and buy.