GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

  • The site is having difficulties because our bandwidth is totally overextended. Our 1Gbps line is at 100% even when there aren't 8000 people on the site. We were supposed to get a second Gbps line months ago but I'm struggling to get technicians scheduled to set it up.

The Mass Shooter Ron Soye

Do it for Null
kiwifarms.net
Joined
Aug 28, 2019

Maybe I was right and the 12900K will be $600.
i9-12900K = $589

Intel’s Innovation 2021 conference is scheduled to kick off at 9 a.m. PT with an opening keynote from CEO Pat Gelsinger. Interested viewers are advised to register for a free account on Intel’s Innovation portal, where all sessions of the virtual conference, including the keynote, will be available to stream starting at 9 a.m. PT on October 27.

While Intel will showcase its latest innovation and work in fields such as A.I. and cloud computing, PC fans and gamers will want to keep an eye out for the client computing track. The track is expected to commence at 10:40 a.m. PT on October 27, and Intel will announce the next era of PC innovation.
 
Last edited:

Stasi

kiwifarms.net
Joined
Jan 6, 2019
I've given up on getting a decently priced GPU in today's market. @Smaug's Smokey Hole was talking up Quadros a while back so I thought I'd check out the workstation cards and picked up an AMD WX 3100 for 70 Bongland bux. From what I can tell it's roughly equivalent to an RX 550, which I've seen go for twice as much on eBay. Pretty weak by today's standards but it should do alright with older titles and emulation, which is all I'm really after.
2860403-a.jpg
I haven't tested it much but it seems to run Kingdom Come Deliverance at a respectable 30fps at 1080p, so maybe I'll finally give that game a try. It's an improvement on the 2GB GTX 745 that was in the pre-built anyway. There didn't seem to be much info on the card out there other than this random Slav on YouTube. Looks like you can squeeze a bit more juice out of the card by fucking around with the BIOS but I'm too scared to brick it.
 

Smaug's Smokey Hole

Closed for summer
kiwifarms.net
Joined
Sep 7, 2016
I've given up on getting a decently priced GPU in today's market. @Smaug's Smokey Hole was talking up Quadros a while back so I thought I'd check out the workstation cards and picked up an AMD WX 3100 for 70 Bongland bux. From what I can tell it's roughly equivalent to an RX 550, which I've seen go for twice as much on eBay. Pretty weak by today's standards but it should do alright with older titles and emulation, which is all I'm really after.
View attachment 2674592
I haven't tested it much but it seems to run Kingdom Come Deliverance at a respectable 30fps at 1080p, so maybe I'll finally give that game a try. It's an improvement on the 2GB GTX 745 that was in the pre-built anyway. There didn't seem to be much info on the card out there other than this random Slav on YouTube. Looks like you can squeeze a bit more juice out of the card by fucking around with the BIOS but I'm too scared to brick it.
Hope it works out well, I haven't had any experience with the AMD side of professional cards since they were called ATi FireGl so I leaned into Quadro. I think it's a viable option for those that need something right now that isn't total ass and completely overpriced. Plus, more display ports for free.

And that slav video is top notch. He even shot part of it without a shirt, for no reason at all.
slavgpu.JPG

The BIOS thing was interesting and I was not aware of it in the slightest. It allowed him to tweak the upper limits of what the card would tolerate before shutting down, which got the memory from 12,000 to 15,800. It should mean the card will still boot into Windows after a crash so the numbers can be tweaked again, or everything can be restored by flashing the original BIOS. He did have to hook a weird device up to the BIOS chip at one point, so it's not exactly foolproof, though.
 

Smaug's Smokey Hole

Closed for summer
kiwifarms.net
Joined
Sep 7, 2016
The weakest 128 execution unit Intel Alchemist GPUs could really help the market if they can flood them out there. inb4 OEM only
HP Omen exclusive or some shit like that. If they manage to get anything out that is above an HTPC level they will be able to sell everything they have. I just looked at our largest retailer: they have the 1030 and the 6900XT in stock, and that's it. The 1030 now costs the same as the 1050 Ti did around the time it launched almost exactly five years ago.
I am popping in to ask a stupid question: is the GPU market at all improved over last year?
Nvidia stopped production of their current 3000-series chip, supposedly to limit supply and make the current prices the new normal prices for their upcoming super refresh in 2022. Shit is fucked.
 

HarryHowler

The Amazing Goatman
kiwifarms.net
Joined
Jan 10, 2014
So, basically it's what everyone thought - the i9 is sorta competitive with the 5950X, at the cost of abysmal power efficiency and needing you to spend a fuckton on DDR5 modules to let the chip really strut its stuff. Except for games, where it's got a more solid lead, and it's pretty much a wash as to whether DDR4 or DDR5 works best.

Either way, it's pretty hilarious to see the Alder Lake i5 completely wrecking the entire Rocket Lake line-up.
 

The Mass Shooter Ron Soye

Do it for Null
kiwifarms.net
Joined
Aug 28, 2019
So, basically it's what everyone thought - the i9 is sorta competitive with the 5950X, at the cost of abysmal power efficiency and needing you to spend a fuckton on DDR5 modules to let the chip really strut its stuff. Except for games, where it's got a more solid lead, and it's pretty much a wash as to whether DDR4 or DDR5 works best.

Either way, it's pretty hilarious to see the Alder Lake i5 completely wrecking the entire Rocket Lake line-up.
Intel did what it had to do. Beating the 5950X in multithreading as often as it does exceeds earlier expectations.

i5-12600K + DDR4 might be a good choice for some people, and in a couple months there will be i5-12400 and lower.

AMD will come back with Zen 3 + VCache as an upgrade for existing Zen 2/3 owners, possibly eliminating Intel's Alder Lake lead. AMD promised +15% average, but what if game developers optimized for it?
 

Smaug's Smokey Hole

Closed for summer
kiwifarms.net
Joined
Sep 7, 2016
and it's pretty much a wash as to whether DDR4 or DDR5 works best.
This has always been the case in my experience. Right when a new standard arrives there's not much need to upgrade to it. I think it was DDR2 or DDR3 where the first-gen timings were so bad that the more affordable, and slower, previous-gen kits could go toe to toe with them.
It's always better to wait for better kits at better prices, plus a more affordable platform (the Intel DDR5 motherboards are currently expensive from what I've seen).
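The "first-gen timings are bad" point falls straight out of the arithmetic: actual first-word latency is CAS cycles divided by the real memory clock, so a high-CL early kit on the new standard can be slower to respond than a mature kit on the old one. A quick sketch (the kit numbers below are typical retail examples picked for illustration, not benchmarks of any specific module):

```python
def cas_latency_ns(transfer_rate_mts: float, cas_cycles: int) -> float:
    """First-word CAS latency in nanoseconds.

    DDR transfers twice per clock, so the memory clock in MHz is half
    the MT/s rating; one clock cycle lasts 1000/clock_mhz nanoseconds.
    """
    clock_mhz = transfer_rate_mts / 2
    return cas_cycles * 1000 / clock_mhz

# Mature previous-gen kit vs. a loose-timing launch kit:
ddr4 = cas_latency_ns(3200, 16)   # DDR4-3200 CL16 -> 10.0 ns
ddr5 = cas_latency_ns(4800, 40)   # DDR5-4800 CL40 -> ~16.7 ns
print(f"DDR4-3200 CL16: {ddr4:.1f} ns, DDR5-4800 CL40: {ddr5:.1f} ns")
```

The newer kit wins on bandwidth, but until timings tighten up the older kit can answer individual requests faster, which is why early adopters often see a wash.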

edit:
Just some advice for people that might have the same problem I had.

My CPU didn't clock itself down (or boost) as it should and it annoyed me: it was pegged at its base speed, plus it probably consumed more power than it should when idling. It's a Ryzen, and after googling it I found lots of people with the same problem. At some point Windows 10 was fucked up and AMD created a special performance/power profile to solve the problem. That didn't work for me and a lot of other people. Some people said that Win10 was now fixed and using the standard power plan would solve the issue. It didn't.

Forums suggested setting the minimum CPU state to 1% in the power settings. That did not work either.

This was the culprit, a setting in the menu for turning off the screen:
powerplans1.JPG
Knocking it down from Best Performance was the solution; it now clocks itself up and down as it should. I can't remember ever touching it, and I don't think I knew it was even there. Why is there even a slider that overrides the power plan settings outside of the power plan menu?
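For anyone who'd rather script the same fix than hunt through the Settings UI, the relevant knobs can be poked from an elevated command prompt with `powercfg`. Treat this as a sketch: the `/setacvalueindex` aliases are standard, but the overlay alias for that slider is an assumption about your Windows 10/11 build, so check `powercfg /?` for what yours accepts.

```shell
:: Show which power plan is currently active.
powercfg /getactivescheme

:: Set the minimum processor state to 1% on AC power
:: (the forum-suggested fix that did NOT work on its own for me).
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMIN 1
powercfg /setactive SCHEME_CURRENT

:: The power mode slider is a separate overlay on top of the plan.
:: Clearing it (stepping down from the max-performance overlay) is what
:: let my Ryzen idle down again. OVERLAY_SCHEME_NONE is an assumed alias;
:: it may differ or be absent on older builds.
powercfg /overlaysetactivescheme OVERLAY_SCHEME_NONE
```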
 
Last edited:

HarryHowler

The Amazing Goatman
kiwifarms.net
Joined
Jan 10, 2014
AMD will come back with Zen 3 + VCache as an upgrade for existing Zen 2/3 owners, possibly eliminating Intel's Alder Lake lead. AMD promised +15% average, but what if game developers optimized for it?
It should give AMD the lead back at the high-end. The problem they're gonna have is more in the mid-range section, where the Alder Lake i5's lead over the Ryzen 5 is just too big for the V-Cache to realistically make up the gap.

As for game developers actually optimizing for the V-Cache, well, that would assume they give a shit about optimizing for anything beyond the Zen 2s that the PS5 and Series X/S use. A few of them might, but I wouldn't hold my breath.
 

BootlegPopeye

kiwifarms.net
Joined
Jan 18, 2021
Nvidia stopped production of their current 3000-series chip, supposedly to limit supply and make the current prices the new normal prices for their upcoming super refresh in 2022. Shit is fucked.
I can't see it happening yet, but I hope this is a bubble that blows up in the manufacturers' faces. The market is totally distorted now; it's nearly impossible to get a mid-range card new for less than 650 bucks. Some people may just conclude PC gaming is for a rich, elite crowd and not bother getting into it.

I would kill for a midrange Radeon, something that used to sit at the 250 USD price point before all this bullshit started. That tier seems to be dead (for now).
 

AmpleApricots

kiwifarms.net
Joined
Jan 28, 2018
Everything that kills the mess that's AAAAAAAAAAAAAA+++ gaming is a net positive in my book. That market needs a good collapse.

Tons of stuff runs just fine on the more modern iGPUs, especially indie stuff, which sucked immensely for a while but is pretty good right now. I'll just welcome developers optimizing into my niche. The last powerful graphics card I had was an R9 390, which still sells for a crazy 200-300 euro on the used market in my corner of the world right now, almost as much as I paid when I got it. I literally only used it to play 2-3 of the same titles once in a blue moon. There's IMHO just nothing interesting in that tier of the game market that requires such cards or would justify the investment. It's almost all games I played already 20 years ago, just better looking.
 

BootlegPopeye

kiwifarms.net
Joined
Jan 18, 2021
Yeah, I agree. I could probably put my last Nvidia card in this thing and would barely notice it on the few newish 'AAA' games I own. But Radeon plays nicer with Linux, and since I mostly do old games anyway I decided that at some point I want to go to Radeons for good.

Probably what I am going to do is just buy a card that's a few years old, even if it's a step down from what I have now (any recommends? Think sub-250 price point).

That's maybe one good tech trend that happened in the last 10-12 years - desktops last a LOT longer than they used to. I remember the days of a total PC replacement every couple of years. Now that Moore's Law has slowed, I could probably keep this desktop for 6-7 years and barely notice. Hopefully video cards end up like that.
 

Smaug's Smokey Hole

Closed for summer
kiwifarms.net
Joined
Sep 7, 2016
Yeah, I agree. I could probably put my last Nvidia card in this thing and would barely notice it on the few newish 'AAA' games I own. But Radeon plays nicer with Linux, and since I mostly do old games anyway I decided that at some point I want to go to Radeons for good.

Probably what I am going to do is just buy a card that's a few years old, even if it's a step down from what I have now (any recommends? Think sub-250 price point).

That's maybe one good tech trend that happened in the last 10-12 years - desktops last a LOT longer than they used to. I remember the days of a total PC replacement every couple of years. Now that Moore's Law has slowed, I could probably keep this desktop for 6-7 years and barely notice. Hopefully video cards end up like that.
Every few years? Jeez, I had the brand new Pentium MMX running at 166MHz, released in January of 1997. Three years later, in March of 2000, the 1GHz Athlon was released. Things moved fast. Same with graphics accelerators: in 1997 the Voodoo 1 was the best thing ever, and that was basically just a texture filtering device with hardware clipping. Three years later that thing was beyond useless and the first GeForce was out. The next year the GeForce 3 showed up, the first with programmable shaders and arguably the first thing that could really be called a GPU.

During that same span of time the RAM standards went from EDO to SDRAM to DDR1.
 

HarryHowler

The Amazing Goatman
kiwifarms.net
Joined
Jan 10, 2014
How fucking sad is it that Intel - a company that's been producing graphics processors in one form or another for nearly a quarter-century, whose best showing in that entire period was some versions of Haswell having an iGPU that was sorta decent for the time - is the last great hope for resolving the current GPU market clusterfuck?
 

Judge Dredd

Senior Layout Artist
kiwifarms.net
Joined
Aug 23, 2018
I am popping in to ask a stupid question: is the GPU market at all improved over last year?
I saw a compelling argument that GPU prices would start to drop at the end of November, but now it seems the most common prediction is we might see prices start to come back down at the end of 2022.

A crazy person on Discord argued that the current GPU prices are a good thing because PC games had become too bloated and graphics-focused, and that gaming peaked by 2008. The implication being that keeping GPUs this expensive going forward will cap gaming at that level forever.

No, but for a short time I thought it was going down; then prices just stagnated.
Like the housing market, there's no reason to lower the price unless supply increases. Since that's not going to happen any time soon, it's better for them to just sit on the cards and wait for a four-digit payout.