Fusion
Remember Fusion? The whole reason for the ATI acquisition? Well, AMD gave us a little more information on its plans for the first Fusion CPUs.

The first Fusion CPUs belong to a family of chips codenamed Falcon; note that Falcon refers to the Fusion CPU family and not the CPU or GPU cores themselves. Contrary to popular belief, the first Fusion CPUs will be built on a single die. On that die you will find the following components: a shared memory controller, Bulldozer or Bobcat based CPU cores, a DirectX GPU core with UVD support, a cache shared between the CPU and GPU, and a PCIe controller.
For a one-die solution, the feature list for Falcon is pretty impressive. Let's discuss what we know:
The shared memory controller will most likely support DDR3 given the 2009 - 2010 launch timeframe for Fusion, and it will obviously be used by both the CPU and GPU portions of the die. We've already discussed the Bulldozer and Bobcat cores; you can expect the desktop/notebook Falcon chips to use Bulldozer cores, while the smallest Ultra Mobile PCs, high-performance smartphones and CE devices will use Bobcat based Falcon processors.
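To put the shared controller in perspective, here's a back-of-the-envelope calculation of the peak bandwidth the CPU and GPU would have to split. AMD hasn't confirmed memory speeds or channel counts, so dual-channel DDR3-1333 is purely our assumption for illustration:

    /* Peak bandwidth of the shared memory controller.
     * Dual-channel DDR3-1333 is assumed, not confirmed by AMD. */
    #include <stdio.h>

    int main(void)
    {
        const double transfers_per_sec = 1333e6; /* DDR3-1333: 1333 MT/s */
        const int bytes_per_transfer   = 8;      /* 64-bit channel */
        const int channels             = 2;      /* assumed dual channel */

        double peak_gb_s = transfers_per_sec * bytes_per_transfer
                           * channels / 1e9;
        printf("Peak shared bandwidth: %.1f GB/s\n", peak_gb_s); /* ~21.3 */
        return 0;
    }

Unlike a discrete card with its own dedicated GDDR, both the x86 cores and the GPU draw from this single pool, so any bandwidth the GPU consumes comes directly out of what the CPU cores can use.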
AMD just lists the graphics core as being a "Full DirectX GPU", but fails to attach any DX revision to the support sheet. AMD did mention that the GPU core would be a unified shader architecture, but we suspect that lower end Falcon CPUs may not support everything required by DX9/DX10.
The integrated UVD support will eliminate the need for an external graphics card just to decode high bitrate H.264 video. UVD occupies only around 4.7 mm^2 of today's 65nm GPU die, yet it is several orders of magnitude more efficient than an x86 CPU core at decoding H.264, highlighting the importance of integrating it onto the CPU die itself. Given how powerful and efficient UVD is, we can't help but wonder how long it will take for AMD to include it in all of its CPUs. We may have to wait for a unified CPU/GPU instruction set before we get that sort of granular integration though.
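To illustrate why on-die integration matters, below is a minimal sketch of the kind of decode dispatch a player or driver could perform: probe once for the fixed-function decoder, use it when present, and fall back to software decode on the x86 cores. Every name here is invented for illustration; this is not AMD's actual driver interface:

    #include <stdio.h>

    /* All types and functions below are hypothetical stand-ins,
     * not AMD's API. The point is the dispatch pattern. */
    typedef struct { const unsigned char *data; unsigned len; } bitstream_t;
    typedef struct { int width, height; } frame_t;

    static int uvd_present(void) { return 1; }                       /* stub */
    static int uvd_decode(bitstream_t *b, frame_t *f)
        { (void)b; (void)f; return 0; }                              /* stub */
    static int sw_decode(bitstream_t *b, frame_t *f)
        { (void)b; (void)f; return 0; }                              /* stub */

    int decode_frame(bitstream_t *bs, frame_t *out)
    {
        static int have_uvd = -1;
        if (have_uvd < 0)
            have_uvd = uvd_present();    /* probe the decoder block once */

        if (have_uvd && uvd_decode(bs, out) == 0)
            return 0;                    /* offloaded to fixed-function UVD */

        return sw_decode(bs, out);       /* burn CPU cycles instead */
    }

    int main(void)
    {
        bitstream_t bs = { 0, 0 };
        frame_t f;
        printf("decode: %s\n", decode_frame(&bs, &f) == 0 ? "ok" : "failed");
        return 0;
    }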
The last item on the M-Space stack is the on-die PCIe 2.0 controller, which AMD said would support a minimum of 16 lanes externally. With an integrated PCIe controller, the only other chip needed is an external South Bridge that can connect via PCIe to the CPU itself.
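The bandwidth math for those lanes is straightforward: PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, which works out to 500 MB/s of payload per lane per direction. A quick sketch of the arithmetic:

    /* Peak bandwidth of the on-die PCIe 2.0 controller's
     * 16 external lanes. */
    #include <stdio.h>

    int main(void)
    {
        const double gt_per_sec   = 5e9;        /* PCIe 2.0 signaling rate */
        const double encoding_eff = 8.0 / 10.0; /* 8b/10b line coding */
        const int    lanes        = 16;         /* AMD's stated minimum */

        double per_lane_mb = gt_per_sec * encoding_eff / 8 / 1e6;
        double total_gb    = per_lane_mb * lanes / 1000.0;
        printf("%.0f MB/s per lane, %.0f GB/s per direction for x16\n",
               per_lane_mb, total_gb);          /* 500 MB/s, 8 GB/s */
        return 0;
    }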
The on-die PCIe controller won't kill the add-in GPU market; you will still be able to pop in an external graphics card if necessary, and then either disable the on-die graphics or switch between the two as your usage demands change. In notebooks, AMD expects systems with discrete graphics to switch between the discrete card and the on-die GPU on the fly depending on usage, along the lines of the sketch below.
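As a rough sketch of what such a switching policy could look like (the type and policy function below are our own invention, not AMD's driver logic):

    #include <stdio.h>

    /* Hypothetical notebook GPU selection: use the discrete GPU
     * under load, drop back to the on-die GPU to save power. */
    typedef enum { GPU_ON_DIE, GPU_DISCRETE } gpu_t;

    gpu_t select_gpu(int on_ac_power, int demanding_3d_workload)
    {
        /* Discrete graphics only when plugged in and the workload needs it */
        if (on_ac_power && demanding_3d_workload)
            return GPU_DISCRETE;
        return GPU_ON_DIE;   /* otherwise stay on the power-efficient IGP */
    }

    int main(void)
    {
        printf("plugged in + game: %s\n",
               select_gpu(1, 1) == GPU_DISCRETE ? "discrete" : "on-die");
        printf("on battery + video: %s\n",
               select_gpu(0, 0) == GPU_DISCRETE ? "discrete" : "on-die");
        return 0;
    }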
Comments
Lord Evermore - Sunday, July 29, 2007 - link
What the heck are RDDR and UDDR? My only guess is the U might stand for the UMA design, but I don't know if that would be preferred for the server or workstation.

Anand Lal Shimpi - Tuesday, July 31, 2007 - link
RDDR = Registered DDR
UDDR = Unbuffered DDR
Take care,
Anand
Martimus - Thursday, August 2, 2007 - link
Ok, what is OoO? I couldn't find it with a search on Google.

Spartan Niner - Saturday, August 4, 2007 - link
OoO is "out-of-order", referring to OoOE, "out-of-order execution": http://en.wikipedia.org/wiki/Out_of_order_executio...
Martimus - Monday, August 6, 2007 - link
Thanks.

xpose - Saturday, July 28, 2007 - link
This is the best future roadmap article I have ever read. I am actually excited. No really.

najames - Friday, July 27, 2007 - link
I am an AMD fanboy; of the 7 computers I have at home, only the 5 year old laptop has an Intel chip now. Dual cores are actually likely all I REALLY need. That said, I am sick of a bunch of hype and no new products. It's all blow and no show. I don't care about years down the road because it could all change between now and then.

AMD/ATI could be a good thing too if they make good, polished drivers that work 100% as promised. How about throwing people a bone to make them switch - maybe even make some kick butt Linux drivers too.
We were all on an AMD bus and nobody has been driving since the X2 chip. They taunted Intel and handed out huge bonuses, but forgot about any new development. I have to credit Intel, they kicked butt with Core 2, and seem to be doing more butt kicking going forward.
I watched Hector on CNBC last night and he didn't look like he had a clue what was going on. Granted they weren't asking him details of any processors, but he was dodging basic business questions. Why do I have several hundred shares of AMD?
Regs - Monday, July 30, 2007 - link
Because those relatively cheap shares, compared to Intel's, might be worth hundreds of times more one day from that stuff you call blow. Blow = prospects in business terms.
I would have said the same thing as you at first, though. It's obvious AMD's and ATi's pipelines dried up, unfortunately back to back. You can argue that the 2900XT is a good card, performs well, etc., but that doesn't explain why AMD offers crapware for the mainstream (where the real money is). As for AMD's CPU lineup... well... you can only sell old parts for so long in the technology sector without taking a hit.
kilkennycat - Friday, July 27, 2007 - link
.... dump ATi. The marriage made in hell. New products unable to meet schedule and with inferior performance, thus no way of rapidly recovering development costs by pricing for performance.

Dave Orton sure did a neat sell-job on AMD, walking away with $$millions when AMD paid a 20% premium for a chronically non-performing company barely managing to eke out some tiny profits during the last couple of years. No wonder Mr. Orton was finally shown the door.
kleinwl - Friday, July 27, 2007 - link
What is the problem with AMD? Did they not receive enough feedback that UVD is a "must have" on high end units? I don't want to have to choose between good gaming performance and movie performance... I am paying a ridiculous premium already for hardware... the least they could do is make sure it has all the bells and whistles.