At this year’s Consumer Electronics Show, NVIDIA had several things going on. In a public press conference they announced 3D Vision Surround and Tegra 2, while on the show floor they had products o’plenty, including a GF100 setup showcasing 3D Vision Surround.
But if you’re here, then what you’re most interested in is what wasn’t talked about in public, and that was GF100. With the Fermi-based GF100 GPU finally in full production, NVIDIA was ready to talk to the press about the rest of GF100, and at the tail end of CES we got our first look at GF100’s gaming abilities, along with a hands-on look at some as-yet-unnamed GF100 products in action. The message NVIDIA was trying to send: GF100 is going to be here soon, and it’s going to be fast.
Fermi/GF100 as announced in September of 2009
Before we get too far ahead of ourselves though, let’s talk about what we know and what we don’t know.
During CES, NVIDIA held deep dive sessions for the hardware press. At these deep dives, NVIDIA focused on three things: discussing GF100’s architecture as it’s relevant to a gaming/consumer GPU, discussing their developer relations program (including the infamous Batman: Arkham Asylum anti-aliasing situation), and finally demonstrating GF100 in action on some games and some productivity applications.
Many of you have likely already seen the demos, as videos of what we saw have been on YouTube for a few days now. What you haven’t seen, and what we’ll be focusing on today, is what we’ve learned about GF100 as a gaming GPU. We now know everything about what makes GF100 tick, and we’re going to share it all with you.
With that said, while NVIDIA is showing off GF100, they aren’t showing off the final products. As such we can talk about the GPU, but we don’t know anything about the final cards. All of that will be announced at a later time – and no, we don’t know when that will be, either. In short, here’s what we still don’t know and will not be able to cover today:
- Die size
- What cards will be made from the GF100
- Clock speeds
- Power usage (we only know that it’s more than GT200’s)
- Pricing
- Performance
At this point the final products and pricing are going to depend heavily on what the final GF100 chips are like. The clockspeeds NVIDIA can get away with will determine power usage and performance, and by extension, pricing. Make no mistake though: NVIDIA is clearly aiming to be faster than AMD’s Radeon HD 5870, so form your expectations accordingly.
For performance in particular, we have seen one benchmark: Far Cry 2’s Ranch Small demo, which NVIDIA ran on both their unnamed GF100 card and a GTX 285. The GF100 card was faster (84fps vs. 50fps), but as Ranch Small is a semi-randomized benchmark (certain objects appear in some runs and not others) and we’ve seen Far Cry 2 be CPU-limited in other situations, we don’t put much faith in this specific benchmark. When it comes to performance, we’re content to wait until we can test GF100 cards ourselves.
With that out of the way, let’s get started on GF100.
115 Comments
marc1000 - Tuesday, January 19, 2010 - link
hey, Banshee was fine! I had one because at that time the 3dfx API was better than DirectX. But suddenly everything became DX compatible, and that was one thing 3dfx GPUs could not do... then I replaced that Banshee with a Radeon 9200, later a Radeon X300 (or something), then a Radeon 3850, and now a Radeon 5770. I'm always in for the mainstream, not the top of the line, and NVIDIA has not been paying enough attention to the mainstream since the GeForce FX series...
Zool - Monday, January 18, 2010 - link
The question is when they will come out with midrange variants. The GF100 seems to be a 448SP variant, and the 512SP card will only come after the A4 revision, or who knows. http://www.semiconductor.net/article/438968-Nvidia...
The interesting part of the article is the graph, which shows the exponential increase in leakage power at 40nm and below (which of course hurts more if you have a big chip and different clocks to maintain).
They will have even more problems now that DX11 cards will be GT300 architecture only, so there are no rebrand choices for midrange and lower.
For consumers GF100 will be great if they can buy it somewhere down the line, but NVIDIA will bleed more on it than on the GT200.
QChronoD - Monday, January 18, 2010 - link
Maybe I'm missing something, but it seems like PC gaming has lost most of its value in the last few years. I know that you can run games at higher resolutions and probably faster framerates than you can on consoles, but it will end up costing more than all 3 consoles combined to do so. It just seems to have gotten too expensive for the marginal performance advantage.
That being said, I bet that one of these would really crank through Collatz or GPUGRID.
GourdFreeMan - Monday, January 18, 2010 - link
I certainly share that sentiment. The last major graphical showcase we had was Crysis in 2007. There have been nice-looking PC-exclusive titles (Crysis Warhead, Arma 2, the Stalker franchise) since then, but no significant new IP with new rendering engines to take advantage of new technology.
If software publishers want our money, they are going to have to do better. Without significant GPGPU applications for the mainstream consumer, GPU manufacturers will eventually suffer as well.
dukeariochofchaos - Monday, January 18, 2010 - link
no, I think you're totally correct, from a certain point of view.
I had the thought that DX9 support is probably more than enough for console games, and why would developers pump money into DX11 support for a product that generates most of its profits on consoles?
Obviously, there is some money to be made in the PC game sphere, but is it really enough to drive game developers to sink money into extra quality just for us?
At least NV has made a product that can be marketed now, and into the future, for design/enterprise solutions. That should help them extract more of the value out of their R&D if there are very few DX11 games during the lifespan of Fermi.
Calin - Monday, January 18, 2010 - link
If Fermi works well, NVIDIA is in a great place for the development of their next GPU - they'll only need to update some things here and there, based mostly on where the card's performance lags (improve this, improve that, reduce this, reduce that). Also, they are in a very good place for making lower-end cards based on Fermi (cut everything in two or four, no need to redesign the previously fixed-function blocks).
As for AMD... their current design is in the works and probably too far along for big changes, so their real Fermi-killer won't come for a year or so (that is, if Fermi proves to be as great a success as NVIDIA wants it to be).
toyota - Monday, January 18, 2010 - link
what I have saved on games this year has more than paid for the difference between the price of a console and my PC.
Stas - Tuesday, January 19, 2010 - link
that ^^^^^^^
Besides, with Steam/D2D/Impulse there is new life in PC gaming: constant sales on great games, automatic updates, active support, forums full of people, all integrated with a virtual community (profiles, chats, etc.), and a place to release demos, trailers, and so on. I was worried about PC gaming 2-3 years ago, but I'm absolutely confident that it's coming back better than ever.
deeceefar2 - Monday, January 18, 2010 - link
Are the screenshots from Left 4 Dead 2 missing at the end of page 5?
[quote]
As a consequence of this change, TMAA’s tendency to have fake geometry on billboards pop in and out of existence is also solved. Here we have a set of screenshots from Left 4 Dead 2 showcasing this in action. The GF100 with TMAA generates softer edges on the vertical bars in this picture, which is what stops the popping seen on the GT200.
[/quote]
Ryan Smith - Monday, January 18, 2010 - link
Whoops. Fixed.