GPUs & CPUs: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Ginger Piglet

Burglar of Jess Phillips MP
True & Honest Fan
kiwifarms.net
Launched on November 20th, 2000, the Intel Pentium 4 turns 20 today.


I... I had one.

There I was at the Old Pigsty, trucking along nicely since 1998 with my Pentium 200MMX which, despite the memes, was a good CPU right up until the year 2000. Could play Unreal Tournament, Tiberian Sun, all the Quakes, System Shock 2, and Baldur's Gate no problem. Then, we got an upgrade! We trucked out to PC World ('member PC World?) in mid 2001 and got ourselves a lovely new prebuilt from Packard Bell in a rather swish blue case. In fact, it looked just like this. GeForce 2 GPU, 256 MB RAM, 40 GB hard drive... and a Pentium 4 at 1.5 GHz.

So, bragged about it the next day to school friends. One of them told me he had an Athlon 900. This was a CPU that was a year older and went up against the Pentium III at the time, but he reckoned it was faster. But me, being the brainless proto-consoomer I was, said, "is not! Yours is only 900 MHz while mine's 1,500 MHz! S'there!" So we agreed to go away and benchmark it.

It was ogre. The P4 was humiliated.

Turns out that the Pentium 4 was really, really fast at number crunching and performing loads of mathematical and logical operations, so long as it didn't have to rely upon other things or carry out too many if/then/else type operations. This is because it had a yuge pipeline, which was how they got it to its, for the time, dizzying clock speeds. But if it predicted the wrong branch, it had to stop, flush the pipeline, and start again.
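The cost of those flushes is easy to sketch. As a back-of-envelope model (the branch fraction, misprediction rate, and base CPI below are illustrative round numbers, not measured figures; only the pipeline depths are the commonly cited ones), charge roughly one pipeline's worth of stall cycles per mispredicted branch:

```python
def effective_cpi(depth, branch_frac=0.2, mispredict=0.1, base_cpi=1.0):
    """Cycles per instruction once branch-flush stalls are charged.

    The flush penalty is approximated as the pipeline depth: every
    mispredicted branch wastes roughly one pipeline's worth of cycles.
    """
    return base_cpi + branch_frac * mispredict * depth

for name, depth in [("Pentium III", 10), ("Willamette P4", 20), ("Prescott P4", 31)]:
    cpi = effective_cpi(depth)
    lost = 1 - 1 / cpi  # fraction of peak throughput lost to flushes
    print(f"{name:13s} depth={depth:2d}  CPI={cpi:.2f}  throughput lost={lost:.0%}")
```

Under these toy numbers the 20-stage pipeline gives back nearly twice as much of its peak throughput to flushes as a 10-stage one, which is roughly why the clock speed advantage evaporated on branchy code.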

Despite this, when I went off to university I acquired a Pentium 4 based laptop (which oddly enough had a desktop class P4 at 2.8 GHz) and so never learned. Its battery life was shit and there was a torrent of hot air permanently emerging from the side vents.

But yeah, the Pentium 4 was an architecture that started out poorly, became adequate two years too late with the Northwood core, and then took a hard turn into awfulness again with Prescott and Cedar Mill, which had the added fun bonus of cooking themselves. Yet despite this, there was worse. The Pentium 4-based Celerons, for instance. In 2010 I started my training contract and the PCs at the firm were all old XP-based boxes running Celeron D processors. Despite the name, these were single-core, gimped versions of Prescott Pentium 4s. They are possibly the worst CPUs I have ever, ever used. Literally having Word, Excel, our case management software, and Firefox open all at once would bog them down to rage-inducing levels, and having more than three Firefox tabs open at once would make them lock up totally for about two or three minutes.

EDIT: If you want to experience the agony of the Celeron D, you can buy TWO for literally less than the cost of a pint.
 

Allakazam223

We wuz Orkz n shit
kiwifarms.net
I know, but the 3950X is still a capable CPU
Seeing CDPR recommend a 3600 (which I have) and a 3080 for ultra 4k RT, I am perfectly fine being a waitfag and getting a used 5950x+6800xt when zen4/hopper/yuge Navi comes out and consoomers can't help themselves.
 

Ginger Piglet

Burglar of Jess Phillips MP
True & Honest Fan
kiwifarms.net
Seeing CDPR recommend a 3600 (which I have) and a 3080 for ultra 4k RT, I am perfectly fine being a waitfag and getting a used 5950x+6800xt when zen4/hopper/yuge Navi comes out and consoomers can't help themselves.

Probably more like THICC Navi, given that the only true and honest dual slot cards this generation are the 6800 non-XT and the Founders Edition RTX 3070 and RTX 3080. All the others range from 2.2 to 2.7 slots. I suspect that the 6900XT may also be a 2.7 slot card when it appears. At which point you might as well just admit it's a triple slot and have done with it.
 

DNA_JACKED

kiwifarms.net
Launched on November 20th, 2000, the Intel Pentium 4 turns 20 today.


What a dog of a CPU. Terrible microarchitecture and a space heater. Are these traded as "retro" yet?

I... I had one.

There I was at the Old Pigsty, trucking along nicely since 1998 with my Pentium 200MMX which, despite the memes, was a good CPU right up until the year 2000. Could play Unreal Tournament, Tiberian Sun, all the Quakes, System Shock 2, and Baldur's Gate no problem. Then, we got an upgrade! We trucked out to PC World ('member PC World?) in mid 2001 and got ourselves a lovely new prebuilt from Packard Bell in a rather swish blue case. In fact, it looked just like this. GeForce 2 GPU, 256 MB RAM, 40 GB hard drive... and a Pentium 4 at 1.5 GHz.

So, bragged about it the next day to school friends. One of them told me he had an Athlon 900. This was a CPU that was a year older and went up against the Pentium III at the time, but he reckoned it was faster. But me, being the brainless proto-consoomer I was, said, "is not! Yours is only 900 MHz while mine's 1,500 MHz! S'there!" So we agreed to go away and benchmark it.

It was ogre. The P4 was humiliated.

Turns out that the Pentium 4 was really, really fast at number crunching and performing loads of mathematical and logical operations, so long as it didn't have to rely upon other things or carry out too many if/then/else type operations. This is because it had a yuge pipeline, which was how they got it to its, for the time, dizzying clock speeds. But if it predicted the wrong branch, it had to stop, flush the pipeline, and start again.

Despite this, when I went off to university I acquired a Pentium 4 based laptop (which oddly enough had a desktop class P4 at 2.8 GHz) and so never learned. Its battery life was shit and there was a torrent of hot air permanently emerging from the side vents.

But yeah, the Pentium 4 was an architecture that started out poorly, became adequate two years too late with the Northwood core, and then took a hard turn into awfulness again with Prescott and Cedar Mill, which had the added fun bonus of cooking themselves. Yet despite this, there was worse. The Pentium 4-based Celerons, for instance. In 2010 I started my training contract and the PCs at the firm were all old XP-based boxes running Celeron D processors. Despite the name, these were single-core, gimped versions of Prescott Pentium 4s. They are possibly the worst CPUs I have ever, ever used. Literally having Word, Excel, our case management software, and Firefox open all at once would bog them down to rage-inducing levels, and having more than three Firefox tabs open at once would make them lock up totally for about two or three minutes.

EDIT: If you want to experience the agony of the Celeron D, you can buy TWO for literally less than the cost of a pint.
Ahh, NetBurst: a massive pancake of shit on Intel's face, and something that makes their current 14nm conundrum look tame. The pursuit of clock speed above all else tanked the design and made it totally worthless. Also worth noting: the only reason it was any good at number crunching was that each core had two ALUs instead of the typical one, in a desperate attempt to keep data flowing through it (this is also why power consumption was so horrid).

The first PC I ever had was a Dell Dimension 2400. This thing was total pig shit. It had the dreaded Celeron D inside. For those who don't know why these things were so bad:

The OG Pentium 4, known as "Willamette", had a 20-stage pipeline, twice the 10 stages of the Pentium III. This long pipeline meant that more complex calculations could easily clog it, leading to the processor stalling out. Any mispredicted branch forced the pipeline to flush and start over from the beginning, and the longer the pipeline, the longer that takes. In addition, even with out-of-order execution, longer pipelines take more cycles to push a calculation through, meaning those higher clock speeds did not translate into performance.

The double-ALU design was part of what Intel called the "Rapid Execution Engine", which was supposed to reduce branch prediction error time by up to 33%. These ALUs were also double-pumped, meaning if your CPU was clocked at 3.8 GHz like the infamously hot-running Irwindale chips, your ALUs would be hitting a staggering effective 7.6 GHz. Reading between the lines, the NetBurst arch was so fucking slow Intel jammed in additional circuitry trying to keep the latency down to make the chips viable.
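The double-pumping arithmetic is simple enough to sanity-check; a tiny sketch (the clock values are the ones quoted above, and the rest is just multiplication):

```python
# A double-pumped ALU latches work on both clock edges, so simple integer
# ops retire at twice the core clock.
def effective_alu_ghz(core_ghz, pumps_per_cycle=2):
    return core_ghz * pumps_per_cycle

print(effective_alu_ghz(3.8))  # 7.6, as quoted for the 3.8 GHz parts
```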

What Intel didn't seem to figure out the first time was that this design is also very cache-intensive, and the 256 KB of L2 cache on Willamette was simply insufficient, forcing the CPU to go out to system memory for even small tasks. That imposed yet another major latency penalty, plus a bandwidth penalty, since cache supplies memory bandwidth orders of magnitude higher than system memory. The result was constant stalling: the latency of the NetBurst core caused instructions to pile up whenever there was a branch misprediction (which was often), seemingly freezing the computer while the CPU chewed through the backlogged requests.
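The cache starvation described here is the classic average-memory-access-time (AMAT) effect. A hedged sketch, using illustrative round numbers rather than measured Willamette latencies or miss rates:

```python
def amat_cycles(l2_hit, l2_miss_rate, mem_penalty):
    """Average memory access time: hit cost plus miss rate times miss cost."""
    return l2_hit + l2_miss_rate * mem_penalty

# Hypothetical miss rates for a small vs. a doubled L2 on the same workload.
small_l2 = amat_cycles(l2_hit=10, l2_miss_rate=0.15, mem_penalty=200)  # 256 KB-ish
large_l2 = amat_cycles(l2_hit=10, l2_miss_rate=0.08, mem_penalty=200)  # 512 KB-ish
print(small_l2, large_l2)  # 40.0 vs 26.0 cycles on average
```

In this toy model, roughly halving the miss rate cuts the average access cost by about a third, which is broadly why doubling the cache (as Northwood later did) mattered so much.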

Now, the second generation of Pentium 4 chips was known as "Northwood". Intel made some tweaks to branch prediction to lower the misprediction rate, and doubled the L2 cache to 512 KB to reduce reliance on system memory for small tasks. They also moved to DDR-400 memory instead of the RAMBUS that Willamette launched with, as RAMBUS had a major latency problem (imagine my shock), and despite the much lower clock speed and bandwidth, DDR-400 made for a far more responsive system. Intel then decided they needed a cheaper chip to sell to system integrators, and came up with the marvelous idea of cutting the cache to 128 KB, 1/4 of Northwood's and 1/2 of Willamette's, and giving the chips only a 266 MHz bus instead of 400 MHz.

This disastrous idea became the infamous first generation of Celeron D.

In my case, Dell went even cheaper, using a SINGLE CHANNEL memory controller on top of a Celeron D. The result was a system that couldn't even handle Microsoft Word without stalling, and Windows XP would produce laggy menus and lock up if you sneezed near it. Northwood was a godsend by comparison, to say nothing of the Athlon 64s that were eating Intel alive. A 2.2 GHz Athlon 64 could easily beat a 3.4 GHz Northwood in games, and a 2.0 GHz one could beat it in productivity.

For specifics: I was a huge Star Wars fan as a kid, and enjoyed three games: Empire at War, Battlefront, and Knights of the Old Republic. EaW ran at about 5-8 FPS at all times, dipping to 1-2 in larger battles. Battlefront did about 10-15 FPS on smaller maps indoors, but outside the FPS dropped back to single digits, and some maps with vehicles, like Dune Sea, were totally unplayable. KotOR ran in the single digits at all times and was nearly unplayable, with movement often locking into a certain direction for upwards of 30 seconds while the CPU tried to figure out what was going on; in particular, the movement controls would fail altogether in Taris' underworld, meaning I never saw past Taris until years later. This computer had a GeForce 8400 GS in it, the infamous 8-core version, by the time I got it.

For comparison, a few years later a family friend gifted me his old Windows 2000 PC. It had a 1.4 GHz Pentium III Tualatin-S chip, regarded as the best Pentium III ever made, 266 MHz DDR via a VIA 266 motherboard, and a GeForce4 Ti 4200. This computer could run all those games at 30-40 FPS without much issue.

Intel would make several other worthless Celeron D CPUs a few years later, based on Prescott (or PressHot, as it was known), which had a 31-stage (!) pipeline and was only slightly better than the first generation. NetBurst allowed AMD to basically print money for years, giving them enough revenue to buy ATI with debt and start GlobalFoundries. Stunningly, Intel had a 50-stage (!) pipeline CPU in the works before finally scrapping it and going with the Core design that their Israeli team was cooking up.

For those interested, this guy does a pretty good rundown of the last-generation Celeron D. Keep in mind this is notably faster than what I had:
He also does a good rundown on the hideous 8400 GS
 

Ginger Piglet

Burglar of Jess Phillips MP
True & Honest Fan
kiwifarms.net
In my case, Dell went even cheaper, using a SINGLE CHANNEL memory controller on top of a Celeron D.

Lol, the work PCs I referred to weren't even the 3.06 GHz Celeron D that Budget Builds tests, but 2.53 GHz ones. And they weren't Dells but HPs. And given I remember the IT man trying to squeeze them further by adding a second RAM stick, they were probably also on single channel memory.

But, yeah. Celeron. Not even once. And to think that back in 1998 the Celeron 300A was an absolutely legendary piece of kit: a 50 percent overclock on the stock cooler with a BX chipset. Do they even still make them? And if so, who buys them? Because even the mass-produced office boxes have i5s (usually Haswell or first-gen Skylake) these days.
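For anyone who never saw it firsthand, the 300A trick was pure front-side-bus math: the multiplier was locked at 4.5x, and BX boards ran a 100 MHz bus instead of the stock ~66 MHz. A quick sketch:

```python
MULTIPLIER = 4.5                    # locked on the Celeron 300A
FSB_STOCK, FSB_BX = 66.66, 100.0    # MHz; BX boards ran the 100 MHz bus

stock = FSB_STOCK * MULTIPLIER  # ~300 MHz as shipped
oc    = FSB_BX * MULTIPLIER     # 450 MHz on a BX board
print(round(stock), round(oc))  # 300 450
```

Same chip, same multiplier, 50 percent more clock just from the bus jump.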
 

AmpleApricots

kiwifarms.net
It's more of a marketing naming thing; these Celerons are SoCs that have nothing in common with the earlier CPUs. Some of them are quite fast. The big selling point is high integration: cheaper production with fewer components for mainboard manufacturers, and easier cooling thanks to the low power draw, which again saves parts in designs. Engineering-wise it's also a big deal whether you need a fan or can solve any heat problem by sticking the chip to the case or any piece of metal.
 
  • Agree
Reactions: DNA_JACKED

Ponchik

Death.
kiwifarms.net
What a dog of a CPU. Terrible microarchitecture and a space heater. Are these traded as "retro" yet?
BRESCOTT.png

p4s are in this weird position like a lot of early 2000s hardware where they aren't ancient enough to be considered "retro", but they're too old and useless to actually be used for much anymore... i have seen them used in plenty of winxp period piece rigs, but i'm guessing they're still fairly cheap and plentiful
 

DNA_JACKED

kiwifarms.net
View attachment 1742020
p4s are in this weird position like a lot of early 2000s hardware where they aren't ancient enough to be considered "retro", but they're too old and useless to actually be used for much anymore... i have seen them used in plenty of winxp period piece rigs, but i'm guessing they're still fairly cheap and plentiful
They are remarkably cheap, usually under $10. And once again I am jealous of how cheap retro hardware in the UK appears to be vs. the US.

Their one big advantage is motherboard support. AMD motherboards from this era are Fucking Fucking Fucking Bad Bad Bad Bad. AMD didn't even have their own chipset at the time, instead relying on the likes of VIA and Nvidia for that role. AMD systems never had the stability of Intel systems, and if one is building an XP retro PC today and wants proper 32-bit compatibility for 16-bit hybrid games, the Pentium 4 is a LOT easier to build around with the limited information available today.

That and AMD never bothered to release Athlon XPs faster than 2.23 GHz, when the arch could easily hit 3 GHz and at that speed would have crushed every Northwood Pentium 4 ever made.
 
  • Informative
Reactions: Rozzy

Ponchik

Death.
kiwifarms.net
They are remarkably cheap, usually under $10. And once again I am jealous of how cheap retro hardware in the UK appears to be vs. the US.

Their one big advantage is motherboard support. AMD motherboards from this era are Fucking Fucking Fucking Bad Bad Bad Bad. AMD didn't even have their own chipset at the time, instead relying on the likes of VIA and Nvidia for that role. AMD systems never had the stability of Intel systems, and if one is building an XP retro PC today and wants proper 32-bit compatibility for 16-bit hybrid games, the Pentium 4 is a LOT easier to build around with the limited information available today.

That and AMD never bothered to release Athlon XPs faster than 2.23 GHz, when the arch could easily hit 3 GHz and at that speed would have crushed every Northwood Pentium 4 ever made.
i've never heard of amd's xp-era motherboards being egregiously awful - it is fun to remember tho that there was once a world in which via, nvidia and ati actually fucking made mobo chipsets
 
  • Like
Reactions: Smaug's Smokey Hole

Smaug's Smokey Hole

Sweeney did nothing wrong.
kiwifarms.net
i've never heard of amd's xp-era motherboards being egregiously awful - it is fun to remember tho that there was once a world in which via, nvidia and ati actually fucking made mobo chipsets
I knew I had an old dusty box sitting in storage. Look at that Firewire 800.
gigaforce1.jpg

nvFirewall could be a nightmare that would BSOD the system (I think torrent traffic absolutely wrecked it), but there were two ethernet ports, so it was easy to switch (after some angry googling).

15 years later I'm still buying Gigabyte but now they're too cheap to put shitty donut spaceships flying into a donut space station on the box. Palit probably abandoned the happy frog as well.
gigaaorus.jpg

What a dismal future this turned out to be.

One thing I think I remember about nForce4 was that they had a less sketchy SATA controller and that's why I went with the nForce instead of other chipsets. nForce used a sketchy PATA controller instead.
SATA, USB and WiFi could be a nightmare and as a consumer you never knew which makes and models of chips you would get on a motherboard so a larger brand umbrella like nForce(or Intel putting their foot down with Centrino) inspired confidence. Having one driver package for everything was a relief.
 
  • Feels
Reactions: Just Some Other Guy

Just Some Other Guy

kiwifarms.net
I knew I had an old dusty box sitting in storage. Look at that Firewire 800.
View attachment 1742567
nvFirewall could be a nightmare that would BSOD the system (I think torrent traffic absolutely wrecked it), but there were two ethernet ports, so it was easy to switch (after some angry googling).

15 years later I'm still buying Gigabyte but now they're too cheap to put shitty donut spaceships flying into a donut space station on the box. Palit probably abandoned the happy frog as well.
View attachment 1742528
What a dismal future this turned out to be.

One thing I think I remember about nForce4 was that they had a less sketchy SATA controller and that's why I went with the nForce instead of other chipsets. nForce used a sketchy PATA controller instead.
SATA, USB and WiFi could be a nightmare and as a consumer you never knew which makes and models of chips you would get on a motherboard so a larger brand umbrella like nForce(or Intel putting their foot down with Centrino) inspired confidence. Having one driver package for everything was a relief.
Fellow Gigabyte user as well, though I went to ASRock this time around because I couldn't pass up a great deal on an X570 Taichi.

Still, I've never had an issue with a Gigabyte PC.
 

Ginger Piglet

Burglar of Jess Phillips MP
True & Honest Fan
kiwifarms.net
i've never heard of amd's xp-era motherboards being egregiously awful - it is fun to remember tho that there was once a world in which via, nvidia and ati actually fucking made mobo chipsets

If you go back to the 1990s there were loads of chipset and motherboard manufacturers. ALi, SiS, and UMC for chipsets, and as well as the usual group of Asus, Gigabyte, and MSI (ASRock didn't exist yet) there were also Elitegroup, Biostar, Soyo (who are now a shitty AliExpress brand but used to do reliable but inexpensive boards), Chaintech, FIC, Abit, AOpen, and the dreaded PC Chips (who merged with Elitegroup; ECS, the company that resulted, is responsible for a lot of OEM and bargain-basement boards).
 

AmpleApricots

kiwifarms.net
I remember having a Chaintech board that let me set the CPU speed multiplier etc. in the BIOS (as opposed to jumpering), the first time I'd ever seen that back then. It didn't work well. That was a Pentium.

ALi chipsets were in a lot of embedded x86 and could often be pushed to the absolute limit. SiS used to be good value for the money, and they also made an x86 SoC, the Vortex86. Cyrix was also pretty good value in general if you needed a cheap upgrade for a system you already had. AMD based their later SoCs on their MediaGX line. Intel chipsets could be surprisingly dodgy sometimes, and expensive, and often they weren't great value for the price because Intel just loved to nickel-and-dime features (the Intel BX chipset was killer, though).

If you go further into the past, things get even more interesting. AMD used to be a big electronics parts supplier, programmable logic for example. Western Digital used to make graphics chips, quite decent ones even. This stuff used to be very interesting, and you had a plethora of hardware that all had slightly different features; shopping around and reading up on it was sometimes a science in itself. The chipset could also totally affect your computer's speed, sometimes quite dramatically, at the same specs. Today the selection just isn't that big: you buy the bigger number, you get a faster computer. More reliable, but also kinda more boring. Does VIA even still make CPUs?

Anybody remember Intel's fuckery with the Coppermine P3s that were actually unstable above 1 GHz? They were also the first to implement software-readable hardware serial numbers, on the P3 in 1999, which caused quite the uproar. (Funny and almost quaint considering what a nightmare that stuff is now.) Intel was always shady AF.
 

Ginger Piglet

Burglar of Jess Phillips MP
True & Honest Fan
kiwifarms.net
I remember having a Chaintech board that let me set the CPU speed multiplier etc. in the BIOS (as opposed to jumpering), the first time I'd ever seen that back then. It didn't work well. That was a Pentium.

ALi chipsets were in a lot of embedded x86 and could often be pushed to the absolute limit. SiS used to be good value for the money, and they also made an x86 SoC, the Vortex86. Cyrix was also pretty good value in general if you needed a cheap upgrade for a system you already had. AMD based their later SoCs on their MediaGX line. Intel chipsets could be surprisingly dodgy sometimes, and expensive, and often they weren't great value for the price because Intel just loved to nickel-and-dime features (the Intel BX chipset was killer, though).

If you go further into the past, things get even more interesting. AMD used to be a big electronics parts supplier, programmable logic for example. Western Digital used to make graphics chips, quite decent ones even. This stuff used to be very interesting, and you had a plethora of hardware that all had slightly different features; shopping around and reading up on it was sometimes a science in itself. The chipset could also totally affect your computer's speed, sometimes quite dramatically, at the same specs. Today the selection just isn't that big: you buy the bigger number, you get a faster computer. More reliable, but also kinda more boring. Does VIA even still make CPUs?

Anybody remember Intel's fuckery with the Coppermine P3s that were actually unstable above 1 GHz? They were also the first to implement software-readable hardware serial numbers, on the P3 in 1999, which caused quite the uproar. (Funny and almost quaint considering what a nightmare that stuff is now.) Intel was always shady AF.

Welp, for the record, back in 1998 we upgraded to a Pentium 200MMX from the Pentium 60 we had beforehand.

(The Pentium 60 was basically an obvious beta: thrust it onto the public and let them work the kinks out. It trashed almost all 486s other than the AMD DX5/133 in certain workloads, but it ran extremely hot. On ours, a single wonky case fan was enough to make it deeply unstable.)

The motherboard we had it on was called the TOPGUN TX PRO. You thought belt-hitching hardware names like the MSI Godlike Extreme Gaming and Acer Predator were only a thing in current year? Nope. I remember the manual even came with a picture of an F-14 fighter jet on the front. It was the only good board that PC Chips ever made. The chipset was not an Intel TX but a rebranded ALi. Still good, though.

That was the best system I ever had from "retro" days. Totally stable. Could run most things up until the year 2000. I remember playing Quake on it and actually being amazed at how smooth it was compared to the Pentium 60. Ditto Hexen II, which I preferred to Quake on account that it was less brown.

PC Chips, however, were generally shit. Theirs was the 486 motherboard with fake cache chips: basically plastic blocks glued over traces that looped back on themselves, fraudulently engraved with the part numbers of true and honest cache chips. Theirs also was the VX Pro / VX Two motherboard with a paper-thin PCB and a shit-tier VIA chipset claiming to be an Intel VX.

Thankfully there are no shit motherboards on that level any more, though personally I would avoid Foxconn and ECS like the coof. You can probably get an AliExpress special from Soyo or COLORFUL and it'll work with the CPUs it claims to be compatible with; just don't try to push it too hard. However, there are still shit power supplies. Maplins used to sell power supplies under the "G7" brand which were "EXTREME GAMING" and up to 780W. They were a nameless, faceless grey box with a lil' 80mm fan at the back and no modular cables. They had a distinct smell of fire hazard about them. Seriously, just lever open your wallet and get a good power supply. Corsair, Seasonic, or EVGA for ATX; Silverstone for SFX. Silverstone even do Titanium-rated SFX power supplies these days, so the ATX power supply makers have no excuse.
 

DNA_JACKED

kiwifarms.net
i've never heard of amd's xp-era motherboards being egregiously awful - it is fun to remember tho that there was once a world in which via, nvidia and ati actually fucking made mobo chipsets
Maybe I'm a bit biased. They were not the worst thing ever, but they did suffer from the lack of a definitive bottom line. Some were unstable, some had horrible drivers, some had limited CPU support, etc. There was no one board that was overall good like Intel's, where you could get a stable board with good drivers and good CPU support.
 