GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

SeniorFuckFace

Fucking Fuck Fuck
kiwifarms.net
So I have actually followed along a little with UE5 and Nanite. It's exciting technology. It's not my area - system architecture, infrastructure, DBs, I'm great at that stuff. Graphics isn't my area, though I follow it.

This video talks a bit about development for UE5 and Nanite. https://www.youtube.com/watch?v=roMYi7BU1YY&t=656s

So this video says that Nanite isn't very CPU heavy and relies mostly on the GPU (makes sense). For this reason, though I'd love to go Threadripper, I'm leaning Ryzen. You're simply going to need every penny you can for the GPU. :/

I managed to find UE's developer recommendations for hardware for version 4. They're surprisingly low and I think they're out of date, but here they are: https://docs.unrealengine.com/4.26/en-US/Basics/RecommendedSpecifications/ As you can see, not very high. I'm guessing because you don't need high detail textures to do basic development. For their Early Access requirements for v.5 they have this:
https://docs.unrealengine.com/5.0/en-US/Welcome/ where they say requirements are the same but have upgraded their GPU recommendation to GTX 1080 or Vega 64 upwards. That still seems low to me but sharing anyway.

So I've started with a rough build because it's easier to start from somewhere than from nowhere. I would love to go for a Threadripper build as I said, but I've gone Ryzen because I think that will be fast enough on the CPU for what you need and I'd rather leave as much money as possible for the GPU. I decided it was really worth going for PCIe 4.0 over 3.0 because although it doesn't make that much difference right now, I think content creation is one of those scenarios where it's going to help. Again, if you went down to the previous gen that wouldn't be bad. But 4.0 gets you faster storage for the future and current gen GPUs can make some use of it.
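For a rough sense of the 3.0 vs 4.0 gap, here's a back-of-envelope sketch of the per-direction bandwidth of an x16 slot. These numbers come from the PCIe spec itself (8 GT/s vs 16 GT/s per lane, 128b/130b encoding), not from anything specific to this build; the function name is just mine.

```python
# Back-of-envelope PCIe bandwidth, per direction, for a 16-lane GPU slot.
# PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding; PCIe 4.0 doubles the rate.

def pcie_x16_bandwidth_gbs(gen: int) -> float:
    """Approximate usable bandwidth in GB/s for a 16-lane link."""
    gigatransfers = {3: 8.0, 4: 16.0}[gen]  # GT/s per lane
    encoding = 128 / 130                    # 128b/130b line-coding overhead
    bits_to_bytes = 1 / 8                   # 1 bit moves per transfer
    return 16 * gigatransfers * encoding * bits_to_bytes

print(round(pcie_x16_bandwidth_gbs(3), 2))  # ~15.75 GB/s
print(round(pcie_x16_bandwidth_gbs(4), 2))  # ~31.51 GB/s
```

So roughly double the headroom, which today mostly matters for NVMe drives and big asset streaming rather than raw game performance.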

The killer is the GPU. Miners have devastated the market and that doesn't look like it's stopped. It's become genuinely hard to buy a modern GPU and when you can the prices are double or more what they ought to be. So this is a Ryzen based build sans GPU and monitor which I'll treat separately.

PCPartPicker Part List

CPU: AMD Ryzen 7 5800X 3.8 GHz 8-Core Processor ($423.88 @ Amazon)
CPU Cooler: Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler ($39.99 @ Amazon)
Motherboard: MSI MAG X570 TOMAHAWK WIFI ATX AM4 Motherboard ($257.99 @ Newegg)
Memory: Corsair Vengeance LPX 32 GB (2 x 16 GB) DDR4-3600 CL18 Memory ($183.99 @ Amazon)
Memory: Corsair Vengeance LPX 32 GB (2 x 16 GB) DDR4-3600 CL18 Memory ($183.99 @ Amazon)
Storage: Western Digital Black SN850 1 TB M.2-2280 NVME Solid State Drive ($199.99 @ Western Digital)
Storage: Western Digital Black 4 TB 3.5" 7200RPM Internal Hard Drive ($149.99 @ Newegg)
Case: Phanteks Eclipse P300A Mesh ATX Mid Tower Case ($49.99 @ Newegg)
Power Supply: Corsair RM (2019) 750 W 80+ Gold Certified Fully Modular ATX Power Supply ($116.82 @ Amazon)
Total: $1606.63
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2021-06-06 06:03 EDT-0400
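If you want to double-check my arithmetic before you start swapping parts in and out, the listed prices do sum to the stated total. A trivial sketch (part labels shortened by me):

```python
# Re-total the PCPartPicker list above to sanity-check the quoted sum.
parts = {
    "Ryzen 7 5800X": 423.88,
    "Hyper 212 EVO cooler": 39.99,
    "MSI MAG X570 TOMAHAWK WIFI": 257.99,
    "Vengeance LPX 32 GB kit #1": 183.99,
    "Vengeance LPX 32 GB kit #2": 183.99,
    "WD Black SN850 1 TB NVMe": 199.99,
    "WD Black 4 TB HDD": 149.99,
    "Phanteks P300A Mesh": 49.99,
    "Corsair RM 750 W": 116.82,
}
total = round(sum(parts.values()), 2)
print(total)  # 1606.63, matching the PCPartPicker total
```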

As you can see, it's a starter build as far as storage goes but I think that's okay - it has a strong performer for its main OS drive and with 1TB you should be able to fit most of the software you'd want on there as well. I've added a 4TB traditional drive for storing the assets, project files, etc. Maybe you could swap it for an SSD but based on your description I think you want the space. For RAM, you've got 64GB which is a lot. It's 3600 as well so pretty fast. DDR5 is coming but unless you're willing to put this off for a year I think you have to take the plunge knowing it's going to be slightly outdated in a couple of years. But a poor workman blames his tools - the goal here is to get your nephew what he needs to learn and work productively. Slight downside is that if he does want to go higher with RAM there are no spare slots, so it's replace the memory sticks wholesale. But to avoid that scenario you'd really need to go Threadripper, and that jump adds around $500.

So now the two big omissions. First up the monitor. I haven't used this personally but good brand and good reviews and good specs:
View attachment 2236204

It's got lots of real estate for having IDE, blender, whatever on one side and reference webpages on the other. Good colour reproduction, 4K resolution for video editing.

You could also look for an ultrawide and maybe drop the cost a little with something like this and have a view to replace it in the future. Or just stick with what he has if need be as I don't know what he has.

A key thing to look for in monitors is "IPS". That's the panel technology. You don't want to do any photo or video work on a "TN" screen as the colour will be awful.

And now the thing that dominates everything else, cost-wise. The graphics card. Because he's wedded to iRay technology it needs to be Nvidia. As @Smaug's Smokey Hole says, you need a minimum of 8GB VRAM on it. Ideally you'd go for something like the 3080 with 10GB but that can be over a thousand right now due to miners. It honestly grieves me that I can't point you at something more reasonably priced. Crypto mining has utterly wrecked this to the point only the well-off can comfortably afford it. I myself have been waiting a year to upgrade my GPU because of the situation and will probably sit this generation out entirely because of it. Maybe get a 1080 or a 2080. They won't be bad, they're impressive cards. They're just not as good as you should be able to get for the money.
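If you want to verify a card actually clears that 8GB floor once it's in the machine, `nvidia-smi --query-gpu=memory.total --format=csv,noheader` spits out the total VRAM. A hypothetical helper to parse that output (the function and the sample values are mine; the 10GB figure is the 3080's):

```python
# Hypothetical helper: check a card against the 8 GB VRAM minimum,
# given one line of `nvidia-smi --query-gpu=memory.total --format=csv,noheader`.

def meets_vram_minimum(memory_total_line: str, minimum_mib: int = 8192) -> bool:
    """Parse a line like '10240 MiB' and compare it to the minimum."""
    value, unit = memory_total_line.split()
    if unit != "MiB":
        raise ValueError("expected nvidia-smi to report memory.total in MiB")
    return int(value) >= minimum_mib

print(meets_vram_minimum("10240 MiB"))  # RTX 3080 (10 GB) -> True
print(meets_vram_minimum("2048 MiB"))   # a 2 GB card -> False
```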

I might do up a Threadripper based system for comparison, but my feeling is you'd be adding on $500 and I don't feel you'd lose too much, based on my reading of the requirements from UE. You can play around on PCPartPicker and it will warn you about incompatibilities. We might want to check the CPU / motherboard combo in the sample spec I put together as well, as it might require a BIOS update to work. So don't click buy on that just yet! It's a starting point. I'd really like to get some input from others on what they'd change.
Thank you again for all of this, I have read through this once and I am going to go through it several more times making sure I understand it in full.

A great starting point for my end, for sure....thanks again! :)
 

Ginger Piglet

Burglar of Jess Phillips MP
True & Honest Fan
kiwifarms.net
Example. The old Intel stock CPU coolers had a solid copper core, then they went to a copper-aluminum hybrid, and finally an all-aluminum core... at an increase in cost.

This was one of the reasons why AMD was so compelling with Ryzen 1000. It might have only matched (better in multi thread, worse in single thread) the Intel offerings of the time, but the standard coolers they came with were genuinely good. The Wraith Max / Wraith Prism could even do some fairly significant overclocking on a suitable board.

Ideally you'd go for something like the 3080 with 10GB but that can be over a thousand right now due to miners.

More like a grand and a half. And the new RTX 3080 Ti is knocking on two grand.
 

Car Won't Crank

My cars won't crank
kiwifarms.net
it's not like my active scythe is that loud to begin with, my keyboard with brown switches is louder.
Which Scythe cooler do you have? I'm currently running the Ninja 5 and it more than adequately keeps my 3800X cool even with high ambient temps. My HDDs are louder than the cooler, it's hilarious.
 

SeniorFuckFace

Fucking Fuck Fuck
kiwifarms.net
More like a grand and a half. And the new RTX 3080 Ti is knocking on two grand.
What about the RTX 3080?


And if we think of it in terms of upgrading from an Nvidia GeForce GTX 770, I am thinking that is a significant boost in performance, especially when working in a program such as After Effects?

Thank you for contributing to the conversation.
 

Overly Serious

kiwifarms.net
What about the RTX 3080?


And if we think of it in terms of upgrading from an Nvidia GeForce GTX 770, I am thinking that is a significant boost in performance, especially when working in a program such as After Effects?

Thank you for contributing to the conversation.

Coming from a GTX770 that is beyond "significant boost" and into space rocket vs. paper plane territory. GTX 770 is what, eight years old? Nine? It has fucking DVI ports! (that's a really old thing).

But note that your link says "Out of Stock". If you can find a 3080 for the price on that page, grab it. But it would be a small miracle if you do. And watch out for scams if you resort to eBay for one. I really hate having to tell you all this, but you're about to see why everyone who wanted a graphics card in the past however many months has come to hate crypto miners. The AMD 6xxx series launched, what, 4 months back? And I've never seen one in real life, nor do I know anyone who got one.
 

Smaug's Smokey Hole

Sweeney did nothing wrong.
kiwifarms.net
What about the RTX 3080?


And if we think of it in terms of upgrading from an Nvidia GeForce GTX 770, I am thinking that is a significant boost in performance, especially when working in a program such as After Effects?

Thank you for contributing to the conversation.
The jump will be absolutely huge for After Effects and it would still be a massive leap if he got a last-gen 5700XT or a last-last-gen GTX 1080. Heavy CUDA and OpenCL processes really benefit from more VRAM and he's got almost none.
 

ArnoldPalmer

kiwifarms.net
AMD has somewhat better drivers for Linux, so I guess that's something.

Not somewhat, hand-over-fist superior. Nvidia's drivers barely work and can't advance because they're closed source. There's nouveau, which is open source, but with no technical data on any of the GPUs it can't go very far, either.
 

Deadwaste

my password is ballsdeepnpussy69 honest
kiwifarms.net
Coming from a GTX770 that is beyond "significant boost" and into space rocket vs. paper plane territory. GTX 770 is what, eight years old? Nine? It has fucking DVI ports! (that's a really old thing).
the 770 is still a decent card. it might not play all the new games at the highest settings and at the highest resolutions, but if people are fine with having their settings down to 720p low, or for older games, more modest settings, then it'd serve perfectly well.
but yeah the 3080 would blow that thing out of the water by a couple hundred miles and then some
 

Overly Serious

kiwifarms.net
the 770 is still a decent card. it might not play all the new games at the highest settings and at the highest resolutions, but if people are fine with having their settings down to 720p low, or for older games, more modest settings, then it'd serve perfectly well.
but yeah the 3080 would blow that thing out of the water by a couple hundred miles and then some

Agreed. Wasn't meaning to write off 720p gaming - honestly, I'm only really interested in cheap strategy games when I can even find the time to game so it's all good with me. Just highlighting what nearly a decade of intense research and development has achieved compared to back then.

Anyway, somewhat more helpfully to the OP, I'd say a 1080 would be sufficient. I mean, not nearly as nice as a 3080 if you could get one at a reasonable price, but you can't. Not that getting hold of a 1080 is easy or cheap either but if it's all that can be got it's still a good card.
 

TheRedChair

Ultimate Chaos, Ultimate Confort.
kiwifarms.net
Coming from a GTX770 that is beyond "significant boost" and into space rocket vs. paper plane territory. GTX 770 is what, eight years old? Nine? It has fucking DVI ports! (that's a really old thing).
You know that if you use a CRT, you are going to have an awesome experience.

The main reasons we left the CRT field:

1. Weight.
2. Fucking Greed.

As stated before, I had access to a refurbishment company, so I know exactly why monitor technology went the way it did.


Let me give you a hint. It was not out of the goodness of their heart.

So if you want to screw around a bit and go retro, keep your card and build a Win2000/XP machine and have fun.

Or just say fuck it and do what you want.
 

Smaug's Smokey Hole

Sweeney did nothing wrong.
kiwifarms.net
The main reasons we left the CRT field:

1. Weight.
2. Fucking Greed.
I would add power consumption and regulations to that. Look at plasma, it went the way of incandescent lightbulbs.

SED was the true successor to CRT and it looked really spectacular (never seen one myself, just read the specs) but it was dropped when the other pipe-dream, OLED, looked like it could finally happen.
 

TheRedChair

Ultimate Chaos, Ultimate Confort.
kiwifarms.net
I would add power consumption and regulations to that. Look at plasma, it went the way of incandescent lightbulbs.

SED was the true successor to CRT and it looked really spectacular (never seen one myself, just read the specs) but it was dropped when the other pipe-dream, OLED, looked like it could finally happen.
Heh, the CRT hybrid was great, however around 2007 the bean counters got together and decided the following: 1920x1080 was going to be the de facto size. The excuse was, quote, "for best entertainment value". Everything at a higher resolution was considered commercial/business/artistic.

By doing this they increased the prices of everything above 1080p by 50 to 100%.

This meant they were going to sell shit, literal poor quality shit, at whatever prices they could get, because marketing said so. From late 2008 to 2014 most monitors had a life expectancy of 2 to 3 years and were stuck at 1080p @ 60Hz. Anything outside this spec and you got your ass handed to you.

Man, I was there when that happened, at the repair factory talking to the owner. I bought a Soyo 1920x1200 for $250. A few months later it was stupidly expensive. That Soyo lasted me 6 years.

https://bjorn3d.com/2008/07/soyo-24-pearl-series-lcd/

Then came the Korean wave of cheap monitors at high refresh rates... and they were good monitors overall.

I purchased the 27" Qnix 2710 in late 2014, which lasted me until 2019. I got it for $320.

From there, my current monitor is the 32" Pixio PX329. It's a nice monitor but it'll probably only last 1 or 2 more years. That should put it in the 2023 range, however your transformer will probably give you issues sooner.

Because things these days are made to be broken... so that consuoooomers will buy the greatest and bestest things that they really don't need.

And finally, I won't spend more than around $350 for a monitor. I just find that nothing is really made well anymore. It's just easier to slap a stamp of quality on something than actually do R&D.
 

Overly Serious

kiwifarms.net
And finally, I won't spend more than around $350 for a monitor. I just find that nothing is really made well anymore. It's just easier to slap a stamp of quality on something than actually do R&D.

I find the historical background interesting and I agree with most of what you said but this last I do not. It's true that most electronic goods are designed to fail, these days. Fridges and washing machines fail within literal months of the warranty expiring with remarkable frequency. But high-end monitors are well worth it if you can afford it. There are 5k2k ones now which cost a LOT but to work on them is a joy compared to something in the $350 region.
 

TheRedChair

Ultimate Chaos, Ultimate Confort.
kiwifarms.net
I find the historical background interesting and I agree with most of what you said but this last I do not. It's true that most electronic goods are designed to fail, these days. Fridges and washing machines fail within literal months of the warranty expiring with remarkable frequency. But high-end monitors are well worth it if you can afford it. There are 5k2k ones now which cost a LOT but to work on them is a joy compared to something in the $350 region.
That depends on your parameters. IMHO, and what I tell my clients, is to look at the 1440p range @ 144Hz at around the $350 mark and wait 2 or more years until the price of 4K comes down. I'm not sold on it, the same way I was not sold on past gimmicks such as G-Sync from Ngreedia.

I saw no real degradation playing the usual host of video games (on a borrowed monitor) that would justify the 100+ price increase at the time. The same goes for Ray Tracing. Neat concept, except 80% of the world does not have bleeding edge tech to make full use of it, nor do all the video game companies make full use of it. But it sure as hell brought the prices up on the video cards.

1440p at high refresh rates was incredibly expensive prior to 2014, just like 4K is now. I'll wait a few more years until the price comes down enough to be affordable.

My 32" Pixio PX329 @ 165Hz got good reviews and works well with my 5700XT. But as I said before, what is going to go out is probably the transformer block. I already have a spare and will be replacing it as a matter of course at the end of this year.

Oh, that is another thing. You really have to pair the monitor with what your video card can do. Of course, if you're willing to sell a kidney (or you really do have the money) and got a 3070 or higher... then okay, I see where you are coming from. It would be better to take advantage of your video card and get a high performance monitor.

In my case, by the time the Pixio monitor hits 4 years I'll be looking for a replacement, and I'll retire my old ViewSonic (that I got free) that is on my backup computer.
 

CreamyHerman’s

I FUCKING HATE TRUMP, I LOVE BIDEN’S COCK
kiwifarms.net
1. Weight.
2. Fucking Greed.
There's more to it. I know @Smaug's Smokey Hole mentions power consumption, but the biggest issue with CRTs is that they are environmental landmines due to how much lead is in a single CRT's glass. Recycling warehouses have to take proper precautions to keep it from leaching out.
 

Smaug's Smokey Hole

Sweeney did nothing wrong.
kiwifarms.net

Thieves Hit Internet Cafe and Make Off With GPU Stash​

Chinese news outlet 浙样红TV has reported that thieves have stolen over $7,000 worth of high-end graphics cards from an Internet cafe in the city of Hangzhou.


I'm surprised that we haven't heard about this happening more often. Unlike other relatively small, high-value items like laptops, a graphics card won't be geofenced or brick itself or phone home or any of that stuff. Phones can be IMEI-blocked or locked as well. A functioning graphics card will continue to be a functioning graphics card, and it can be used to bolster someone's personal mining efforts or sold at the current ridiculous scalper prices. From previous pictures it looks like some miners rent an empty house and just place everything on the floor in the empty rooms, and I'm surprised that someone isn't raiding them. Maybe they booby-trap the place and thieves don't want to risk it? The Chinese collectors who gather and sell birds' nests supposedly booby-trap the caves they take them from to keep the competition out, a hazard for tourists who like to explore off the beaten path, and I've heard that houses rented as grow houses for weed can be booby-trapped.
 

Ginger Piglet

Burglar of Jess Phillips MP
True & Honest Fan
kiwifarms.net

Thieves Hit Internet Cafe and Make Off With GPU Stash​

Chinese news outlet 浙样红TV has reported that thieves have stolen over $7,000 worth of high-end graphics cards from an Internet cafe in the city of Hangzhou.


I'm surprised that we haven't heard about this happening more often. Unlike other relatively small, high-value items like laptops, a graphics card won't be geofenced or brick itself or phone home or any of that stuff. Phones can be IMEI-blocked or locked as well. A functioning graphics card will continue to be a functioning graphics card, and it can be used to bolster someone's personal mining efforts or sold at the current ridiculous scalper prices. From previous pictures it looks like some miners rent an empty house and just place everything on the floor in the empty rooms, and I'm surprised that someone isn't raiding them. Maybe they booby-trap the place and thieves don't want to risk it? The Chinese collectors who gather and sell birds' nests supposedly booby-trap the caves they take them from to keep the competition out, a hazard for tourists who like to explore off the beaten path, and I've heard that houses rented as grow houses for weed can be booby-trapped.

I heard about scalpers being robbed for them in Toronto. And in Scotland, a PS5 scalper was conned into driving 120 miles at his own expense before the purported buyer revealed he'd done it to waste the scalper's time (cash on delivery, you see). Scalper had a shit fit and said "I hope your kids are disappointed." Chappie replied, "lol they already have one fuckface."

But yeah, these are reasons not to scalp, frankly. If I didn't have such a finely tuned moral compass ("lol, what moral compass, Piglet's a lawyer") I would be sorely tempted to sell my 3080 on greedbay, show pictures of it with its original box and suchlike, and when the money came in, send the box with a bag of sand carefully filled to the same weight as the card. And then torch my account and run for the hills. I'm also surprised that scalpers haven't been jumped, and the money retrieved, when they gave out their address for cash-on-collection card sales.
 

Sexual Chocolate

kiwifarms.net
I heard about scalpers being robbed for them in Toronto. And in Scotland, a PS5 scalper was conned into driving 120 miles at his own expense before the purported buyer revealed he'd done it to waste the scalper's time (cash on delivery, you see). Scalper had a shit fit and said "I hope your kids are disappointed." Chappie replied, "lol they already have one fuckface."

But yeah, these are reasons not to scalp, frankly. If I didn't have such a finely tuned moral compass ("lol, what moral compass, Piglet's a lawyer") I would be sorely tempted to sell my 3080 on greedbay, show pictures of it with its original box and suchlike, and when the money came in, send the box with a bag of sand carefully filled to the same weight as the card. And then torch my account and run for the hills. I'm also surprised that scalpers haven't been jumped, and the money retrieved, when they gave out their address for cash-on-collection card sales.

Scalpers are annoying but I read an article about them and it said they were mostly college kids. Or at least the ones who aren't part of organized groups are.

Idk how mad I can get at some 19 year old who's already financially fucked by student loans and the generally shitty state of the economy tryna make a few bucks off PS5s.
 

Smaug's Smokey Hole

Sweeney did nothing wrong.
kiwifarms.net
Idk how mad I can get at some 19 year old who's already financially fucked by student loans and the generally shitty state of the economy tryna make a few bucks off PS5s.
Great moments in scalper history: the PS3 launch. In a sea of listings with insane prices, the scalpers started desperately pimping out their wives, girlfriends and sisters to get people to click on their listings because people weren't buying. Some had bought multiple systems on credit.
New PS3s were soon selling below retail on eBay.
 

Overly Serious

kiwifarms.net
Well, nice stuff from AMD if you can get it. They've just announced their new line-up of RDNA 2 based Pro graphics cards. Their W6800 has a very comfy 32GB of VRAM. Supposed to be a lot faster than last gen as well.


They've got a W6600 which has 8GB. Supposed to be about $600, so I'd be up for one if you could actually get one for that. Maybe with the big crypto slump of the past few weeks we might actually be able to get cards soon.

They've got their new mobile versions out too, which I don't care so much about, but if you need a workstation laptop I guess it's pretty swell.

EDIT: Anyone discussed FSR yet? What I'm hearing is it's not quite competitive with Nvidia's DLSS, but it's a lot easier to implement and has big studios already on board. It's also backwards compatible with previous GPU generations and it's an open standard, so it looks like they are trying to (and very likely will) pull a FreeSync vs. G-Sync move all over again.

 