
PC Gaming Week: How does the ultimate PC of 10 years ago compare to the best today?


Introduction, CPU and GPU


Yes, you may have a brand new PC with a Titan on graphics duty, gigabytes of RAM sitting spare, and both a hard drive with near-unlimited storage and a big SSD – one that makes everything fly. Could it be, though, that a seemingly humble machine from 2005 could keep up… or even, just maybe, out-power it?

Of course not – what a ridiculous idea! But how far have we come in just 10 years? This was the year both the Xbox 360 and PlayStation 3 were unveiled, and the former actually came out. It wasn't a great gaming year for the PC, though we did get a few notable releases like Psychonauts and Fahrenheit/Indigo Prophecy – and Carol Vorderman's Sudoku – along with the almost instantly forgotten damp squib that was Quake 4.

If you wanted a game to show your rig at its best, chances are your game of choice was either Far Cry, to see how well it could render a gorgeous tropical island, or Doom 3, where systems competed to render the blandest of horror.

[Image: Intel CPU, 2005]


Engine of the beast


Let's start our retro system with the processor. 2005 was the year dual-core processors arrived, with AMD's Socket 939 and Intel's Socket 775 the platforms of choice. AMD's Athlon 64 was popular with gamers at the time, but for simplicity's sake, we'll compare one of Intel's higher-spec CPUs with today. The Pentium 4 Extreme Edition, running on a Prescott core, was a 64-bit CPU – though 32-bit was still the home standard – running at 3.73GHz with a 2MB cache.

Back to today, and let's look at the recommended specs for one of the year's most demanding games, The Witcher 3 – an Intel Core i7 3770. Perhaps surprisingly, this is only a quad-core chip, running at 3.4GHz with an 8MB cache – a big improvement to be sure, but not one that necessarily feels like a 10-year upgrade.

In the nineties, for instance, we had the far more impressive jump from the Intel 386 through the 486, with graphics going from simple 2D images to the likes of Doom and then full 3D experiences. Even if we raise our level to CPUs like the Core i7 5960X and 5820K, which offer eight and six cores respectively – far more than most people are using – there isn't the same raw feel of a generational leap.

Instead, while of course every task has its own requirements, in non-specialist environments processors and architectures have recently been more notable for other factors, like low power usage. Home applications, at least, haven't seen a really dramatic jump in years, with heavy-lifting tasks increasingly a job for graphics cards.

[Image: Graphics card, 2005]


Graphic detail


That seems like a good next stop. Your card of choice in 2005 was likely the GeForce 7800 GTX – 512MB of throbbing graphical muscle, compared with the 4GB of Nvidia's current top-end (excluding the Titan X), the GTX 980. Needless to say, that card doesn't so much crush its decade-old predecessor as atomise it. To pick just one stat, the 7800 GTX had a memory bandwidth of 54.4GB/sec, while the GTX 980 boasts 224GB/sec.
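
For the curious, those figures fall straight out of the cards' reference memory specs – effective memory clock multiplied by bus width – using Nvidia's published numbers rather than anything quoted in the article:

    bandwidth (GB/sec) = effective memory clock (GHz) × bus width (bits) ÷ 8
    7800 GTX: 1.7 × 256 ÷ 8 = 54.4GB/sec
    GTX 980:  7.0 × 256 ÷ 8 = 224GB/sec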

What really makes the difference though is how much more modern cards do. In 2005, for at least a while, there was talk of graphics cards being joined by another heavy-hitter, the 'physics' card. At the time even basic cloth simulation remained a showpiece technology (though given that Tomb Raider's next-gen console release is still making a big deal out of one of its characters having something approaching realistic hair, we probably shouldn't tut too much at that), to say nothing of throwing around debris and fancy effects.

[Image: Far Cry]


Cue the Ageia PhysX card, which was seen around this time as destined to be either the next 3dfx – the PC's most successful pioneering GPU maker – or a complete bust. In the end, it found a middle ground, with Nvidia buying the company and integrating the physics support into the 3D cards everyone now needed. It's a little old now, but a PhysX comparison demo of Batman: Arkham City shows the kind of difference this can make.

Updated GPUs have also added many new strings to their bows, including better shaders in games, and Nvidia's CUDA (Compute Unified Device Architecture), which allows its GPUs to take some of the weight off the CPU even while not throwing around 3D graphics – rendering video footage for instance, or crunching numbers for cryptography.
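
The article itself includes no code, but a minimal sketch shows the idea: with CUDA, a routine that would loop over a million numbers on a couple of CPU cores instead runs as a million tiny GPU threads. Everything below – names, sizes, values – is illustrative, not taken from any particular application:

    // saxpy.cu – a hypothetical, minimal example of offloading arithmetic
    // to the GPU with CUDA: compute y = a*x + y over a million floats.
    #include <cstdio>

    __global__ void saxpy(int n, float a, const float *x, float *y) {
        // Each GPU thread handles exactly one array element.
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;  // one million elements
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));  // memory visible to CPU and GPU
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // Launch enough 256-thread blocks to cover all n elements at once.
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);  // expect 4.0
        cudaFree(x); cudaFree(y);
        return 0;
    }

The same pattern underpins the video rendering and cryptography examples above: lots of independent arithmetic spread across the GPU's hundreds of cores, rather than queued up on the CPU's handful.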

Memory lane


As for the rest of a 2005 PC, 512MB of RAM was still considered ample for most purposes. The technical showpiece of the day, Doom 3, requested it, but would tolerate 256MB. It wasn't long before that began rising though, most notably with the horror FPS F.E.A.R. that same year recommending 1GB of RAM (along with a 256MB 3D card and a 3.0GHz Pentium 4 – high, high specs at the time).

Even so, it's only this year that we're really starting to see games asking for more than 4GB – our old friend The Witcher 3 for instance demanding a minimum of 6GB and ideally wanting 8GB to play with. Cue an upgrade even for many hardcore fans.

The reason for this is that in 2005, and for most of the decade that followed, developers had to work on the assumption that gamers would be running a 32-bit operating system like Windows XP. Under 32-bit, it's ordinarily (though not always) impossible to make full use of even 4GB, never mind anything more.
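
The maths behind that ceiling: a 32-bit pointer can distinguish at most 2^32 addresses, and

    2^32 bytes = 4,294,967,296 bytes = 4GB of addressable space

Because 32-bit Windows reserves part of that space for devices – the graphics card's memory chief among them – a machine typically exposes more like 3-3.5GB of RAM, with a single application ordinarily limited to 2GB of it.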

It's only now that 64-bit is well enough embedded for companies to expect it – and even so, most high-end PC games are still slumming it as 32-bit apps. Last year's Dragon Age: Inquisition was one of the first major mainstream games to finally cut that cord. Regular applications – albeit typically professional-grade ones – have been quicker off the mark, but not by much. Adobe's Premiere Pro CC? 64-bit only. The consumer Premiere Elements? Installs a 32-bit version as needed.

[Image: Game in 4:3]


From the past to the future


So what can we say of our plucky PC from 2005? It's certainly less of a jump from it to today than you'd see from 1995 to 2005. Not only did that decade take us from the death-throes of DOS to a world where most folks would never even need to know that CD stood for 'change directory', it was also the period when bulky CRTs were consigned to history in favour of flat LCDs, when 3D graphics grew up and took over the gaming world, and of course, when internet access finished its shift from modems to broadband – for most lucky souls, at least.

As for the next 10 years? Who can say. While the PC of 2025 will inevitably be far more powerful and capable of wonders we've yet to fully appreciate (in much the same way that good-looking games in 2005 seemed near photo-realistic at the time but now look like crayon scribbles), it seems unlikely that its basics will be radically different.

Popular as phones and tablets and at least some of the ideas behind technologies like VR, Google Glass and HoloLens are, the good old monitor, keyboard, mouse and tower still have plenty of life left in them.

That said, it's impossible to predict with any real certainty. Luckily, being surprised is always part of the fun.


