AMD's Phenom Unveiled: A Somber Farewell to K8
by Anand Lal Shimpi on November 19, 2007 1:25 AM EST
Overclocking
Given the launch frequencies, you can expect that Phenom isn't a tremendously overclockable chip.
While we were able to run our 2.4GHz chip at 3.0GHz, we couldn't get it stable. Even 2.8GHz wasn't entirely stable, but 2.6GHz was attainable for benchmarks.
AMD's OverDrive utility. Note that it reads the memory controller as single-channel: Phenom actually has two independent 64-bit memory controllers rather than a single 128-bit controller, and current BIOSes and AMD's utility incorrectly report this configuration as single-channel.
All of our overclocking was done using AMD's nifty new OverDrive utility, a Windows application that can control virtually every BIOS option from within your OS. You can overclock individual cores, change memory timings and voltages, and most importantly: it all works.
The application is a little slow to respond when making changes, and it would be nice if there were a hotkey to bypass the application loading its last settings, but it's truly a beauty to work with and one of the best aspects of today's launch.
The Test
CPU: AMD Phenom 9900 (2.6GHz), AMD Phenom 9700 (2.4GHz), AMD Phenom 9600 (2.3GHz), AMD Phenom 9500 (2.2GHz), Intel Core 2 Quad Q9450 (2.66GHz/1333MHz), Intel Core 2 Quad Q6700 (2.66GHz/1066MHz), Intel Core 2 Quad Q6600 (2.40GHz/1066MHz)
Motherboard: ASUS P5E3 Deluxe (X38), MSI K9A2 Platinum (790FX)
Chipset: Intel X38, AMD 790FX
Chipset Drivers: Intel 8.1.1.1010, AMD 790FX launch drivers
Hard Disk: Western Digital Raptor 150GB
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2), Corsair XMS3 DDR3-1066 7-7-7-20 (1GB x 2)
Video Card: NVIDIA GeForce 8800 GTX
Video Drivers: NVIDIA ForceWare 169.09
Desktop Resolution: 1920 x 1200
OS: Windows Vista Ultimate 32-bit
124 Comments
B166ER - Monday, November 19, 2007 - link
Had to reply and clarify. The phrase refers to current setups, in which quad-core gaming is not a primary reason to purchase said processors. Alan Wake, Crysis, and others, while able to take advantage of quad-core setups, are not out yet, and I would guess that less than 3% of games out there now are quad-core capable, and scaling in such games is probably hardly optimized even where it exists. You speak of a future in which these games will be available; even then, an abundance of them will still be a long way off.

Nonetheless, Anand, that might be your best review to date in my book. It speaks honestly and depicts a seemingly structureless company that only has time on its side to pull itself up from potential disaster. AMD has not shown too many positive strides as a company lately, and mindless spending on what should have been a direct "let 'em have at it" approach only shows what was speculated previously: the company has a vacuum in leadership that needs to be filled by capable hands. And we all know proper administration starts from the very top; Hector's step-down will not be mourned by me. Dirk has his cup filled, but his past credentials speak highly of him and show him capable. I can only wish him well.
Phenom needs to be priced competitively, simple enough, and it needs higher-quality yields for overclocking. It's amazing how they manage to stay just one step behind at almost every turn right now. I hope the 7-series motherboards bring about better competition vs. Intel and NVIDIA boards. We as a community need this to happen.
leexgx - Monday, November 19, 2007 - link
There are about 2-3 games, I think, that use quad cores.

wingless - Monday, November 19, 2007 - link
That's such a mixed bag it makes me sick. Phenoms are mediocre at best at almost everything but somehow magically rape Intel in Crysis. I'll have nightmares. WTF is going on internally in those four cores to make this happen? I hope software manufacturers code well for AMD so they can shine. The pro-Intel software market is huge and that's where the fight is. Unfortunately it doesn't look good for AMD there either, because programmers hate having to learn new code.

defter - Monday, November 19, 2007 - link
Rape Intel in Crysis?? Crysis was one of the few benchmarks where the fastest Phenom was faster than the slowest Core 2 Quad. Still, the Phenom's advantage was less than a percent.
I think it's better to say that Crysis is the benchmark where Phenom doesn't utterly suck (it just sucks a lot).
eye smite - Tuesday, November 20, 2007 - link
You really think those shining numbers are realistic from pre-production sample CPUs? I think you should all wait until sites have full production motherboards and CPUs and can give real data with all their tests; then you can decide it's a steaming sack of buffalo droppings. Until then, you'll just sound like a raucous bunch of squabbling crows.

JumpingJack - Monday, November 19, 2007 - link
Ohhhh, here we go again with the "it's not coded well for AMD" conspiracy theories.

wingless - Monday, November 19, 2007 - link
AMD recently released new software libraries for these processors....

JumpingJack - Monday, November 19, 2007 - link
Yeah, great for the FPU library, which they already compiled into their PGI for the SPEC.org runs, which consequently are slowly getting the non-compliant branding pasted all over them.

TSS - Monday, November 19, 2007 - link
"It turns out that gaming performance is really a mixed bag; there are a couple of benchmarks where AMD really falls behind (e.g. Half Life 2 and Unreal Tournament 3), while in other tests AMD is actually quite competitive (Oblivion & Crysis)."the UT series and halflife 2 are both very CPU intensive, while oblivion and especially crysis are videocard killers. it's hard to say but it sucks in games as well. you'd wanna see a difference in crysis though, make it scale to about 640x480. it's just too demanding on the graphics card to compare at 1024x768. in lament terms, in HL2 the graphics card is usually picking his nose waiting for the CPU, so a stronger CPU will make a dig difference. in crysis especially it's exactly the other way around, so that's why scores are closer together.
Why is this true? In Half-Life 2, there are about 50 frames per second between the best and worst of the lineup. In UT3, there are also about 50 frames per second between the best and the worst, regardless of clock speed or architecture, and the frame rates are measured in the hundreds even in a game as new and graphically intensive as UT3 (though it should be noted that, as far as I know, the beta demo did NOT ship with the highest-resolution textures, to keep the file size down). Crysis, on the other hand, shows about a 9 frames per second difference between a 2.2GHz Phenom and a 2.66GHz Intel processor, and Oblivion manages about 16. Same system, different game: it can only be concluded that these games are much more graphics card intensive, which shouldn't be hard to imagine since Crysis is well known as the graphics card killer of the moment... and Oblivion was of the last generation (HDR and AA, anybody?).
AMD is 10-30% slower in every other test, and they are in gaming as well. I believe the difference would have shown more if an SLI or CrossFire setup had been used, though I understand that's not possible with the chipsets and drivers available at the moment.
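A minimal sketch of the CPU-bound vs. GPU-bound point the comment above makes, assuming per-frame cost is roughly the larger of the CPU and GPU times; the max() model and all millisecond figures are simplifying assumptions for illustration, not measurements from this review:

    # Rough model: if CPU and GPU work overlap each frame, frame time is
    # approximately max(cpu_ms, gpu_ms), so a faster CPU only shows up
    # when the GPU isn't the bottleneck. Figures below are invented.

    def fps(cpu_ms: float, gpu_ms: float) -> float:
        """Approximate frames per second from one frame's CPU and GPU cost."""
        return 1000.0 / max(cpu_ms, gpu_ms)

    # Hypothetical CPU-bound title (HL2/UT3-like): the GPU finishes early.
    print(f"CPU-bound, slower CPU: {fps(8.0, 3.0):.0f} fps")   # 125 fps
    print(f"CPU-bound, faster CPU: {fps(6.0, 3.0):.0f} fps")   # ~167 fps -> big gap

    # Hypothetical GPU-bound title (Crysis/Oblivion-like): the GPU dominates.
    print(f"GPU-bound, slower CPU: {fps(8.0, 30.0):.0f} fps")  # ~33 fps
    print(f"GPU-bound, faster CPU: {fps(6.0, 30.0):.0f} fps")  # ~33 fps -> no gap

Under this toy model, a faster CPU only moves the frame rate when the CPU term dominates, which matches the pattern described above for HL2/UT3 versus Crysis/Oblivion.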
MDme - Monday, November 19, 2007 - link
Time to upgrade.....to the dark side.