Hardware Confusion 2009

[Page 10] The Finished System & The Future



The Finished System


Below are some pictures of the finished system, and in terms of aesthetics I feel I've achieved my aim of making the entire system look coordinated, like something that might be sold as a package. It's reasonably understated and mature without looking completely plain. While the Cooler Master Stacker 832 does come with a blue LED-lit case fan at the front, it's not as bright when the front door is closed. The ASUS P6T Deluxe motherboard also has a couple of small blue lights which are visible from the side through the mesh of the case, but nothing annoying or bright. All up I'd like to think that the case lighting is minimal and relatively tasteful, and doesn't look like a UFO on crack.



[Four pictures of the finished system]

Note that I'm using the mouse pad which came free with the Cooler Master Stacker 832 SE case. It was a pleasant surprise to find it sitting in the same plastic bag as the instruction manual, as it's not mentioned anywhere; it's a nice touch by Cooler Master to include it. It's a sturdy metal-backed piece with a firm non-padded surface for greater accuracy, which makes it great for gaming. It seems to complement this system quite well.


Using this system can only be described as an immensely pleasurable experience, at least if you're a computer geek like me. Everything just works; there are no crashes or quirks, no stuttering or periods when the system will momentarily freeze or slow down. In particular I love the lack of noise, as this was a key priority for me, and one I wasn't sure I'd be able to achieve. This is the first time I can truly say my system is quiet. From the moment it starts up, there's no fan spinup noise. At idle the system gives off a soft hum/whirr; at full load, it is no lie to say that the sound is only barely above that of idle. This makes sense, since the CPU and case fans run at a fixed speed regardless of load, the PSU only activates its spare fan when it becomes very hot, and the motherboard is passively cooled with no fans whatsoever. However the GTX 285 is also hands down the quietest graphics card I've ever owned. The fact that it barely raises its noise level while blasting through games at 1920x1200 with everything maxed and lots of AA/AF thrown in as well means I'm doubly happy that I didn't go for a dual-GPU solution like the HD4870 X2 or GTX 295. Of course having a case that breathes so well probably helps the GTX 285 stay so quiet, since there's no heat buildup. This is also the same reason why max temps when gaming are 62C for the CPU and 76C for the GPU, even on components known to run quite hot, during summer and using only stock cooling and two case fans. It's all worked out the way I had hoped it would.


Speaking of gaming, it truly is a joy. The smoothness of the system is genuinely surprising, and this is coming from someone who's had a relatively smooth system in the past few years. The reason it feels strange is the way in which framerates are so consistent: even during the initial period after a new level loads, there are no small loading pauses or stutters. Combined with the rich visuals, it's almost like watching a movie rather than just playing a game. To give you a taste of this, as well as an idea of what onboard audio is like, below is a 720P HD YouTube video showing a few minutes of gameplay and combat, first from Crysis Warhead, then Fallout 3. Both games are running at 1920x1200; Fallout 3 is at maximum settings along with 8xAA/16xAF, while Crysis Warhead is at 'Cheap' Very High in DX10 mode using the 64-bit executable, with Texture Streaming disabled (r_TexturesStreaming=0), resulting in image quality almost identical to the highest Enthusiast setting - see the screenshots under the Graphics component to more closely examine the image quality. You can see in the video that I began recording only a few seconds after the game was launched from the desktop, and you can then see a saved game being loaded up. Even immediately after loading the saved game, everything is completely smooth, even with the overhead of the Fraps video recording utility, which both halves the framerate and increases the potential for stuttering:





Note: To view the larger version of the HD Video, go here. If you don't want or can't watch the full HD quality video, you can watch the low resolution Standard Version or the medium resolution High Quality Version.
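As a side note on the Texture Streaming tweak used in the video above, console variables like this can be set persistently in a custom config file rather than typed into the console each session. Below is a minimal sketch; the autoexec.cfg filename, its location and the '--' comment syntax are assumptions based on typical CryENGINE 2 setups, so verify them against your own installation:

```
-- autoexec.cfg (assumed location: the game's root install folder)
-- Disable texture streaming for Enthusiast-like texture quality, as in the video
r_TexturesStreaming = 0
```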


Of course without Fraps the framerates are much higher and the game even smoother, but equally desirable is the fact that no game I've tried so far has had any crashes, lockups or glitches, making everything that much more enjoyable. Games like Crysis Warhead and Fallout 3 are known to be temperamental in their performance and stability, but not on this system, nor indeed on my last system either. My number one priority of stability and hassle-free functionality has been achieved again it seems, and I'm quite relieved.


The overwhelming feeling with this system is that there's a massive amount of headroom, that it can withstand anything thrown at it for the next couple of years and still have power to spare. This obviously aligns with my other top priority of longevity. A Core i7 and 6GB of RAM aren't going to become fully utilized for quite a while, but by the same token they also provide tangible benefits right now, especially in strenuous games when combined with the fast GPU and a 64-bit OS. System tweaking using the TweakGuides Tweaking Companion and the relevant Game Guides has made its contribution as well I'm sure, so just buying the right components is not the end of the story. Knowing how to correctly configure and optimize them is imperative for getting the most out of them and ensuring 100% stable operation, as my initial issue with AHCI proved.


However before you rush out to buy a similar new Core i7-based system, for anyone currently using a recent very fast dual-core, or particularly a good quad-core CPU, the jump to Core i7 won't be amazing in normal desktop usage, and may not be worth the investment. While it's noticeably faster even in single-threaded and basic multi-threaded apps, the Core i7 really shines when doing very CPU-intensive tasks, especially when multi-tasking. The Core i7/X58 platform also needs to be paired with the right components to achieve excellent results; on its own it isn't going to miraculously transform a system into something amazing unless you get the right GPU, PSU, monitor and hard drive, for example. In any case the decision to upgrade to a system similar to mine should be based on a range of considerations and not just the urge to have the latest and greatest. Remember that I upgraded to this system from one that was built over three years ago, and I did so with a set of priorities which may not match yours.


Anyway, the best way I can end my assessment of this new rig is simply to say I'm extremely happy with it. If you can build a new machine and honestly conclude the same thing at the end, then you've achieved all your aims as well.



The Future


For many people, upgrade decisions are based on worrying about what will happen in the future as much as what is available today. I certainly don't claim to know for certain what the future holds with regards to tech developments, but I can provide some thoughts in that regard if you're interested in hearing them. Don't think of this as anything more than the musings of one person with a moderate degree of tech knowledge; it's not designed to be a definitive study of future developments.



To predict the future, we first need to examine the past. In my previous Hardware Confusion article, I mentioned the following technologies in my introduction: Dual-core CPUs; 64-bit CPUs; 64-bit Operating System; Serial ATA 2 (SATA-2) devices; PCI Express (PCI-E) interface; Scalable Link Technology (SLI) graphics; Crossfire graphics; and Double Data Rate 2 (DDR2) RAM. It's interesting to note that most of these technologies were totally new back in 2005, and yet here we are four years later, only to find that the adoption of some of them hasn't been as fast as expected. Dual-Core CPU, SATA2, PCI-E and DDR2 technology eventually found their place in most mainstream systems. However 64-bit, SLI and CrossFire have had more mixed results. To add to this list, now we have SSDs, CPUs with more than two cores, Blu-Ray discs, and other assorted developments. How will these technologies pan out?


64-bit CPUs and 64-bit Operating Systems have long been available, yet it hasn't really been until the past year or so that 64-bit usage has started to gain momentum. Figures are hard to come by, but in this article on the Windows Blog, based on Windows Update data it was noted that between March and June 2008 alone the proportion of new Vista PCs which were 64-bit went up from 3% to 20%. The most recent (January 2009) Steam Hardware Survey shows that around 9% of all Windows users with Steam are using a 64-bit OS, the majority being on Vista 64-bit. Vista 64-bit users are also the fastest growing group of Windows users. So it seems clear that 64-bit adoption has grown rapidly and will continue to grow. Although it might take some time before most people are using a 64-bit OS, I have no doubt that given that many systems already have a 64-bit CPU and will soon have more than 4GB of RAM as standard, the move towards 64-bit OSes is inevitable. Software developers have known for quite a while that the memory ceiling in 32-bit OSes is a handicap which cannot be overcome without a shift to 64-bit, and it's quite noticeable that more strenuous programs such as games, which are more likely to hit that ceiling, are now coming with 64-bit binaries as standard. As such, I would strongly recommend that anyone upgrading now and into the future consider purchasing and installing the 64-bit version of Windows, whether Windows Vista 64-bit or Windows 7 64-bit when it's released.
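The 'memory ceiling' mentioned above comes straight from pointer width: an N-bit address can refer to at most 2^N bytes. A minimal sketch of that arithmetic (the function name is just for illustration):

```python
def max_addressable_gib(pointer_bits):
    """Maximum directly addressable memory, in GiB, for a given pointer width."""
    return (2 ** pointer_bits) // (1024 ** 3)

# A 32-bit OS can address at most 4GiB in total (and in practice part of that
# is reserved for devices, which is why only ~3-3.5GB ends up usable),
# while the 64-bit address space is effectively unlimited for now.
print(f"32-bit ceiling: {max_addressable_gib(32)} GiB")
print(f"64-bit ceiling: {max_addressable_gib(64):,} GiB")
```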


SLI and CrossFire multi-GPU technologies are a bit more dubious. The Steam Hardware Survey should be quite representative of the major users of this type of technology given the diversity of games on Steam, and it shows that only 1.57% of all Steam users are utilizing SLI configurations, the bulk of them with 2-card SLI (1.53%). Even fewer users are employing ATI's CrossFire multi-GPU solution, at 0.21%. So in total 1.78% of all Steam users are currently gaming on SLI or CrossFire-based multi-GPU systems, and they seem to be in decline. As such, SLI and CrossFire can really only be deemed niche technologies, even four years after their introduction, and I don't believe they will grow significantly in the future. Both ATI and Nvidia have recognized this, and to make things easier and more attractive, have built single-card multi-GPU solutions in the form of the HD4870 X2, the HD4850 X2 and the GeForce GTX 295. A quick look at the proportion of Steam users with these cards shows that the GTX 295, released less than a month ago, is used by 0.03% of those surveyed, while things are muddied a bit on the ATI side, as there's only a single line item labeled '4800 series', which would include the single-GPU 4870 and 4850 as well as the multi-GPU cards. In any case it makes up 2.31%. Once again, not very high, but they're showing growth, and I suspect given the popularity of these cards in most reviews, more high-end users may opt for these solutions in the coming months. Whether they need them or not is another matter.


I have serious reservations about both multi-card SLI and CrossFire, and multi-GPU single-card solutions based on the same technology. I just don't see them as being the direction for the future. It seems that this type of technology is currently a temporary brute-force solution, similar to the way Intel initially pretty much glued two CPU cores together to form the underwhelming dual-core Pentium D, before eventually releasing the more efficient Core 2. Just like the Pentium D, these multi-GPU solutions use large amounts of power and shed lots of heat and noise, sometimes for relatively small performance gains in return. I also think that SLI and CrossFire still have a fair way to go before they can be deemed sufficiently problem free. Right now a high-end enthusiast who craves the highest performance might put up with crashes, microstuttering or other odd behavior associated with multi-GPU, but the average user won't put up with these headaches given the costs involved. Driver improvements will resolve this, but then I also believe that software in general is the future for GPUs - the hardware will continue to evolve, but specialized software which utilizes the massive potential of the GPU for various parallel-processed computational tasks is where the major gains lie. See for example the GPGPU (General Purpose Computation on GPUs) site for more details. This is one of the reasons I currently favor Nvidia over ATI, not because of brand loyalty, but because I see Nvidia forging ahead with making better use of the GPU through software, such as PhysX processing - in effect killing off the standalone PhysX PPU card - and CUDA. I only hope for the sake of competition that ATI can beef up the software side of things, especially its drivers, and compete with Nvidia in the future. If not ATI, then perhaps Intel will, but regardless, someone has to for our sake, otherwise prices will go up and progress will slow down.


On a related note, I'll discuss gaming as it relates to graphics hardware development. I've covered this topic in more detail in my PC Game Piracy Examined article, but basically I don't believe PC games will push graphical boundaries the way they did in the past, at least not in the next couple of years. This isn't because the hardware is not capable, it's because the market is not conducive to developers plowing vast sums of money into creating graphically intensive games which most PC gamers will either complain about not being able to play, or simply pirate instead of paying for them. In short it's not viable for PC games to push beyond the boundaries created by the ageing graphics hardware on current consoles. In fact the opposite seems to be occurring; many games are purposely being scaled down to run on more systems so as to cater to the broader mainstream market. I believe it will be at least around 2011-12, when the new generation of consoles arrives, that games will again forge ahead with new levels of realism and graphical intensity. The only exceptions might be in cases where Nvidia or ATI directly sponsor a particular developer to incorporate advanced graphical features designed to require high-end graphics hardware. For example Nvidia sponsored Crytek during the creation of Crysis, and upon its release, simultaneously released the 8800GT graphics card to complement it. Given that games now cost upwards of $20-30 million to develop, and the scorn which gamers poured on Crysis because of their disappointment with being unable to max it out on their systems, I don't see this occurring too frequently though. Most developers will stick with the safe route of either creating their game for consoles and porting it across to PCs with minor enhancements, or making mainstream games like massively multiplayer games, or lowest common denominator games (e.g. Quake Live, Battlefield Heroes).


Hard drive technology is definitely going to move towards a non-mechanical future. Current hard drives are severely bound by the physical limitations of their moving parts, and so it was inevitable that something like the SSD would be developed as a consumer solution. Given the tremendous potential speeds that an SSD can achieve, it seems logical that SSDs will start replacing hard drives as their price falls and their storage capacity rises, and more importantly, their write speeds start to match their read speeds. Any remaining quirks can be ironed out through firmware improvements, and I'd say within a year many more enthusiast systems will start having SSDs as their main drive. It's about time too, because mechanical hard drives really are a lumbering relic of the past and still the slowest component in any system.


In terms of optical drives, there are developments afoot in that field as well. In the short term Blu-Ray has won the next-gen battle against HD-DVD and will slowly replace DVDs as the mainstream movie media, and likely become commonplace on PCs within two or three years. DVDs still have a lot of life left in them though, so I don't see them being completely replaced for quite a while. Most software can comfortably fit on one or two DVDs at the moment, so given the cost, I can't imagine manufacturers switching to Blu-Ray to distribute a game for example, especially given the cost effectiveness and popularity of digital distribution models. However in the longer term, even Blu-Ray will become redundant as technology such as Holographic Drives becomes viable for consumer use. At 300GB - 100TB of maximum storage space on a single disc (vs. 50GB on Blu-Ray and 8.5GB on DVD) and with current write speeds of almost 30 times that of Blu-Ray or DVD, it's obvious that unlike CDs, which lasted almost 15 years as the dominant media, current optical media cycles are shortening, with new technologies emerging whether we actually need them or not.
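To put those capacity figures in perspective, here's the simple ratio arithmetic using the numbers quoted in the paragraph above (300GB being the low end of the holographic range):

```python
# Per-disc capacities in GB, as quoted in the article
dvd_gb, bluray_gb, holo_low_gb = 8.5, 50.0, 300.0

# Even the smallest holographic disc holds several Blu-Rays' worth of data,
# and dozens of DVDs' worth.
print(f"Holographic vs Blu-Ray: {holo_low_gb / bluray_gb:.0f}x")
print(f"Holographic vs DVD:     {holo_low_gb / dvd_gb:.1f}x")
print(f"Blu-Ray vs DVD:         {bluray_gb / dvd_gb:.1f}x")
```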


Last but not least, the future of the CPU appears to be more and more cores, not GHz. Dual-core CPUs, the first of which were only just emerging as I wrote my last Hardware Confusion article in July 2005, are well and truly mainstream by now. In fact some applications and games are now finally utilizing the processing power of dual cores to the maximum, making quad core CPUs a necessity in some cases. However although quads will likely be sufficient for mainstream usage in the next couple of years, it might interest you to know that the trend won't stop there - Intel for example is developing an 80-core CPU called Polaris which it plans to release by 2011. Clearly the CPU is going to reign supreme in the PC, and although GPUs may add general processing power for some tasks, the CPU will have plenty of headroom to crunch data at astronomical rates if required. Whether CPUs will make GPUs redundant is unclear, the issue coming down to software development once again, and again it will be up to developers to utilize this massive amount of processing power. Techniques such as Ray Tracing for example can use the power of multi-core CPUs to deliver ultra realistic graphics in the future. Last year Intel demonstrated a ray-traced Enemy Territory: Quake Wars running in real-time at 14-29FPS at 1280x720 on a 16-core 2.93GHz CPU, so the technology is already reasonably well-developed. This makes it important for consumers to buy the right number of cores on an efficient architecture; too many cores are wasteful, as are too few. Right now I believe a quad core CPU is the right choice for a multi-core system, because duals are becoming maxed out in some cases, and because, as I've noted earlier, more and more games are being designed for a minimum tri-core console CPU. It doesn't have to be Core i7 though, a Core 2 quad, Core i5 quad or even a Phenom II quad can be enough for the immediate future and yield benefits right away.
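Part of the reason ray tracing scales so well with core count is that each pixel (or row) of the image can be computed independently, so the work divides cleanly across cores. A minimal sketch of that kind of split in Python - shade_row here is a dummy stand-in for real per-row shading work, not actual ray tracing:

```python
from multiprocessing import Pool
import os

def shade_row(y):
    # Dummy stand-in for the per-row work a ray tracer does: each row
    # depends only on its own coordinates, so rows need no coordination.
    return [(x * y) % 256 for x in range(64)]

if __name__ == "__main__":
    # Hand rows out to all available cores; with independent rows the
    # speedup is close to linear in the number of cores.
    with Pool(processes=os.cpu_count()) as pool:
        frame = pool.map(shade_row, range(48))
    print(f"Rendered {len(frame)} rows of {len(frame[0])} pixels each")
```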


After all that speculation, who knows if some unpublicized development will suddenly come out of left field and totally take over the PC market. If the past has taught us anything, it's that even the wisest tech gurus can get it wrong. For example Bill Gates predicted in 2004 that spam would be a thing of the past by 2006. I'm willing to bet he'll admit he made a mistake.



Conclusion


With that, the article comes to a close. I hope you found it mildly interesting, and I hope you'll forgive the sometimes self-indulgent nature of it, given that I often talk about my system and my choices almost as though I'm a proud parent rather than a PC enthusiast. Remember, the key message behind this article is that you should research widely and always think for yourself, so please don't follow any of my advice religiously, and please don't email me asking for purchasing advice or opinions as I can provide none - you are the best source of purchasing advice for yourself. Thanks again for reading.