Original Link: https://www.anandtech.com/show/197



Eight different chipsets, 6 processors, 5 tests, 3 resolutions, 1 winner...
...prepare for the visual ride of your life.

Out with the old and in with the new, it is this spring cleaning attitude that drives the markets we so dearly scrutinize during every minute of our dedicated hobby time.  When referring to the newest automobiles, the latest fashion trends, or even the most popular music, parting with the old and heading towards the world of the new is relatively easy and painless (unless, of course, you have that pair of 70's style socks you just can't bring yourself to get rid of).   Unfortunately, many of us are blessed with a hobby that not only changes at a rate barely faster than the market can handle, but at the same time, a hobby whose maintenance costs prohibit the mindless process of trial and error when it comes to which parts to add to your collection.  

The hobby in question is none other than that associated with being a PC hardware enthusiast.  While the title itself seems good enough to qualify as a previous employer entry on a job application, a hardware enthusiast usually has to face the facts and utter the phrase "out with the old and in with the new" every now and then; and when that time comes, the wallet is the first to suffer.   In looking towards the new, we usually forget about the differences between the new and the old, and since the budget of the average hardware enthusiast doesn't include the cost of every single product on the market, a roundup of all products up to the current generation is necessary to keep in touch with the past while concentrating on the present.

In this particular case, one of the most rapidly evolving factions of PC hardware, Video Accelerators, has become the topic of much discussion among enthusiasts and casual users alike.  What card is best for me?   What are the real differences between all of the next generation chipsets?   And the most frequently asked question: what kind of performance improvements can I expect over my current video card? 

In order to answer all of those questions, and more, AnandTech traversed the paths of performance starting with the original 3Dfx Voodoo chipset and ending with the latest offerings from S3 and nVidia, among others.  A bumpy road it may have been, but the trip was well worth it, and to find out why, keep your attention focused as AnandTech unravels the universe of the video accelerators of today and yesterday with the most in-depth comparison ever performed in its labs:

Eight different chipsets, 6 processors, 5 tests, 3 resolutions, 1 winner...prepare for the visual ride of your life.



With time comes change, and with change comes adjustment; however, what adjustments must we make now that the video accelerator market has somewhat stabilized itself?  In the past we were always waiting for the elusive "what if?"  Now that the question has been virtually eliminated by the release of a number of new graphics chipsets, what should we look for in a next generation graphics chipset?

What to look for in a Graphics Accelerator

  • Acceleration Strengths - What sort of acceleration do you primarily need?   While a card may offer excellent 3D acceleration there are some out there that need more than the ability to run Quake 2 at unbelievable speeds. 

  • API Support - Glide, Direct3D, and OpenGL.  Those are the three major Application Programming Interfaces (APIs) you'll see present in the 3D world.  While only 3Dfx cards support the least used Glide API, Direct3D and OpenGL support are provided by virtually any card/chipset.   If OpenGL is something you're looking forward to having outstanding performance under, then make sure that the card you're after has full OpenGL support now, with an OpenGL Installable Client Driver (ICD) available for download. 

  • Drivers & 3DNow! - Make sure that the manufacturer of your next-generation graphics accelerator won't leave you in the dark when the time to upgrade your drivers comes around.  Keep track of all driver updates made to manufacturer websites and be sure to keep communication lines open between yourself and the manufacturer (that's what email is for).  For you K6-2 users, you may want to lean towards a graphics accelerator chipset that has either current or planned support for AMD's 3DNow! instructions in their drivers.  Among others, 3Dfx, 3DLabs, Matrox and nVidia have either announced or currently have drivers out that support the AMD 3DNow! instructions.

  • Interface & Card Length - Two very important factors in purchasing a graphics accelerator are the Bus Interface and the physical length of the card.  For PCI cards, you must make sure that you have at least one open PCI slot that can accommodate the physical length of the card.  Voodoo2 accelerators, for example, require full-length PCI slots due to their incredible length.   Unfortunately, those are luxuries denied to most AT-Super7 motherboard owners, in which case an AGP accelerator becomes more attractive, especially since free PCI slots are quickly becoming scarce among upgraders.

  • Refresh Rates & Integrated RAMDAC - If you have a 14" monitor and don't have any plans on upgrading your monitor in the near future, then you should probably skip this section.  For those of you that have either taken advantage of or are planning to take advantage of the rock bottom prices on 17" monitors, or for those of you that simply have the budget for a 21" monitor, you will want to pay close attention to the supported Refresh Rates and Integrated RAMDAC of any graphics accelerator you purchase.  The rule of thumb here: the higher the better, it is as simple as that.  Remember that when outputting video you must take the digital data stored in your video memory, or RAM, and send it down your VGA cable to your monitor.  However, monitors, in spite of what you may think, do not receive information in digital bits since they are analog devices.   In order to convert the digital signal from the RAM to an analog signal the monitor can use, a RAMDAC (Random Access Memory Digital-to-Analog Converter) is present on the video card itself.  The faster the RAMDAC, the better the 2D image quality you see on your monitor will be.  Expect most RAMDACs to fall in the 200MHz - 250MHz range, with 230MHz as the sweet spot for most users.

  • Rendering Capabilities - Need to be able to render in a window rather than a full screen application?  Look into the specifications of the graphics chipset you're considering and ask yourself: does it allow for a 32-bit internal accuracy for rendering calculations?  Does it support 32-bpp rendering?  Is color expansion optional in the future with this chipset, and what sort of performance hit will result if such an expansion is initiated?

  • Resolutions & Video Memory - If you have an ideal resolution in mind, one you would like to run all of your games at, as well as another you wish to keep your Windows desktop at, you need to make sure that the video card you're purchasing has enough memory (or a large enough frame buffer, in the case of 3D accelerators) to accommodate that resolution.  The once sought after 640 x 480 gaming resolution is now a thing of the past; you should accept no 3D accelerator that doesn't allow for 800 x 600 support, and provided that the performance is decent, support for higher 3D resolutions such as 1024 x 768 and 1280 x 1024 can be desired as well.   Just remember that your monitor must also be able to handle the resolutions you're aiming for.  800 x 600 and 1024 x 768 are supported on virtually all monitors (even 14" monitors have 1024 x 768 support), however once you break 1024 x 768 you may want to start looking for a 17" or larger screen. 

  • Texture Units - A key to continued levels of high performance in future gaming titles will lie in the ability of a card to process textures in an efficient and effective manner.  Currently, the most popular way of achieving this efficiency is by offering two separate texture units, each of which is capable of processing a single texture - paving the way for high performance under multi-textured gaming situations where a single texture is overlapped by the presence of another one.

  • TV-Output - The world of TV-Output has matured tremendously since the days when ATI's 3DXpression+ dominated the boards with its "crisp and clear" TV-Output.  Companies have already begun pushing the limits of TV-Output to previously unheard of degrees; Matrox's Mystique G200, for example, supports a TV-Out resolution of up to 1024 x 768.  If you have a large enough TV in the area where your computer will reside, then you may want to give TV-Output another look, for the first time.

  • Finally, there's price, and without further ado let's get to the roundup...
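Two of the guidelines above, RAMDAC speed and video memory, boil down to simple arithmetic.  The sketch below (Python, purely illustrative; the ~1.32 blanking overhead factor and the buffer counts are assumed typical values, not vendor figures) estimates the pixel clock a display mode demands of the RAMDAC and the frame buffer memory a 3D resolution consumes:

```python
# Rough, back-of-the-envelope sizing for two of the checklist items above.
# Assumption: roughly 32% of each refresh cycle is lost to horizontal and
# vertical blanking on a CRT, hence the 1.32 overhead factor.

def required_ramdac_mhz(width, height, refresh_hz, blanking_overhead=1.32):
    """Approximate pixel clock (MHz) the RAMDAC must sustain."""
    return width * height * refresh_hz * blanking_overhead / 1_000_000

def framebuffer_mb(width, height, bits_per_pixel, color_buffers=2, z_bits=16):
    """Video memory (MB) for double-buffered 3D rendering plus a Z-buffer."""
    color = width * height * bits_per_pixel // 8 * color_buffers
    z = width * height * z_bits // 8
    return (color + z) / 2**20

print(round(required_ramdac_mhz(1600, 1200, 85)))  # ~215 MHz
print(framebuffer_mb(1024, 768, 16))               # 4.5 MB
```

By this estimate, running 1600 x 1200 at a flicker-free 85Hz already calls for a RAMDAC in the 200MHz+ class, and a double-buffered 16-bit game at 1024 x 768 with a Z-buffer eats roughly 4.5MB of frame buffer before a single texture is stored.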



The reigning 3D champ, 3Dfx consistently finds itself trying harder and harder to keep its number one position at the top of the graphics accelerator market among gamers.  Times have changed considerably since the days when 3Dfx had the only chipset on the market capable of decent performance at 640 x 480.  The company has grown tremendously, and has ventured into previously sparsely journeyed territories with the advent of a 24MB dedicated 3D-only graphics accelerator running at a higher clock speed than ever thought possible from 3Dfx.  

With the amount of competition 3Dfx attracts as a company leading the pack, it is obvious that 3Dfx cannot rely on a single product alone to bring them the success and recognition they need to stand tall as the number one 3D graphics accelerator manufacturer on the planet.  For this reason, 3Dfx's currently living and breathing chipset triumvirate consists of the 3Dfx Banshee for the mid-range PC, the 3Dfx Voodoo for the low end gamer, and the fierce monster, the Voodoo2, for the high end speed demon.  Covering all ends of the spectrum, let's give 3Dfx's top three chipsets a closer look.

The Good, the Bad, and the Banshee?

The best of both worlds, that was 3Dfx's goal with the release of the Banshee; essentially a stripped down, yet overclocked version of the Voodoo2, the 3Dfx Banshee was designed to be the mid-range contender from the formerly 3D-only realm of 3Dfx.

  • Multi-Texturing

    The Banshee acquires its stripped down description from the nature of its processor configuration.  Unlike the Voodoo2 (which carries 2 Texture Units & 1 Pixel Unit), the Banshee comes equipped with a single processor which houses a single V2 texture unit and a single V2 pixel unit.  These two units are what handle the 3D processing of the Banshee chipset, and as you might be able to tell, the presence of a single texture unit does have its downsides.  While the Voodoo2, and other cards with two texture units, can process multi-textured environments in a single pass, the Banshee is forced to make two passes in order to render the same object with multiple textures on it.   This gives it a huge disadvantage in comparison to its bigger brother, the Voodoo2, as games such as Unreal as well as upcoming 3D titles make heavy use of multi-textured environments.  

    If you look at 3D rendering as painting a wall, a single coat of paint can easily be accomplished by virtually any brush, while that same brush will require two strokes to place two separate coats of paint on the wall (1 texture processor).  Now imagine a brush capable of placing two coats of paint on a wall in a single pass (2 texture processors).  By using the latter type of brush you are essentially doubling your productivity.  In 3D gaming and rendering situations the application of such a technique is a bit more complex, yet it follows the same basic principle.  If a wall in a game, such as Unreal, happens to have a texture placed on it, such as a brick texture, followed by another layer, say a reflection from a nearby fire, you basically have two textures on that one surface.

  • An Overclocked Voodoo2

    In order to compensate for its single texture unit, 3Dfx placed the recommended clock speed for the Banshee at the 100MHz level rather than rating it at 90MHz, as its Voodoo2 counterparts have been from the start.  This 100MHz clock speed does give the Banshee the edge over the Voodoo2 in cases where single textured surfaces are present; in games such as Forsaken, and even Quake 2, whose multi-textured effects are barely noticeable in terms of performance, the Banshee performs just as well as, if not better than, the Voodoo2. 

    This essentially overclocked nature of the Banshee does have its downsides, the primary one being that the processor runs extremely hot and can dramatically affect the stability of your system if you do not have a well ventilated case.  The cheapest way around this would be to pick up an older clip-on 486 fan from Radio Shack and plop it on the Banshee's heatsink.
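The single- versus dual-texture-unit trade-off described above reduces to a simple pass count.  A hypothetical sketch (the unit counts and clock speeds are the figures quoted in this article; the one-pass-per-layer-batch cost model is a deliberate simplification):

```python
import math

def render_passes(textures_per_surface, texture_units):
    """Rendering passes needed to apply all of a surface's texture layers."""
    return math.ceil(textures_per_surface / texture_units)

# Banshee: 1 texture unit at 100MHz; Voodoo2: 2 texture units at 90MHz.
for name, units, mhz in [("Banshee", 1, 100), ("Voodoo2", 2, 90)]:
    single = render_passes(1, units)  # e.g. a plain brick wall
    multi = render_passes(2, units)   # brick plus a firelight reflection
    print(f"{name} ({mhz}MHz): {single} pass single-textured, "
          f"{multi} passes multi-textured")
```

With a single texture the 100MHz Banshee keeps pace with the 90MHz Voodoo2, but the moment a second texture layer appears, the Banshee's workload doubles while the Voodoo2's does not.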

The Banshee provides 3D image quality virtually on par with that of the Voodoo2, which can be considered average in the round-up presented before us.  Where the Banshee excels in visual quality is in the fact that, unlike the Voodoo2, it is an all-in-one solution and doesn't require a pass-through cable to connect to your 2D card, since the Banshee is a 2D and 3D graphics accelerator.  The presence of another filter through which the video signal must pass degrades the final image quality by, in some cases, a noticeable degree, and the Banshee effectively eliminates that possibility.

While the Banshee does support the AGP specification, its implementation is incredibly poor and only allows for 1X transfers over the AGP bus.  At the same time, the Banshee doesn't allow for AGP texture storage, meaning that the textures which would normally be transferred over the AGP bus for storage in system memory must remain in the local memory of the graphics card.  This will cause a considerable performance hit once games begin to exceed the available memory for texture storage, however for the present, the Banshee will do just fine. 

The sweet spot for the Banshee is, naturally, the 800 x 600 resolution setting since it offers virtually no performance degradation in comparison to running at 640 x 480.  Unfortunately, the jump to 1024 x 768 may bring performance down to a level below 30 frames per second, making the gaming experience seem much more like stop-and-go driving than fluid motion.   The current drivers for the Banshee chipset are still being tweaked, with the MiniGL drivers just recently released.  The driver support as of now is average, however with the backing of 3Dfx you can expect that support to change quickly.  The Banshee does support a technology known as color expansion, which essentially allows for improved image quality through the usage of a somewhat speculative calculation of "in-between" colors, however enabling this feature will most likely result in a performance drop of 50% or more and is therefore not too practical. 

3Dfx Voodoo - Still a Player

It seems that almost yesterday we were rushing to our doorsteps to greet the delivery man who carefully carried our brand new Monster 3D's in his arms.  The rush to the 3D gaming scene was thrust into full speed with the introduction of 3Dfx's Voodoo chipset.  Truly the first of its kind, the Voodoo was a 3D-only add-on giving users the ability to fine tune their system to the level of having the best 2D performance and the best 3D performance at the same time. 

The Voodoo brings below average gaming performance to the table when dealing with today's advanced titles, however it still has the ability to make its presence known as a player in the 3D gaming realm.  The original beauty of the Voodoo was its relatively weak dependence on the speed of your processor in deriving its own performance.  This means that owners of slower processors will experience decent performance from the Voodoo, while higher end systems will be cheated out of a considerable amount of performance, as the Voodoo was never designed to take full advantage of the power of a Pentium II 450 or an overclocked Celeron A. 

With below average image quality, and no 2D support out of the box, you can consider the 3Dfx Voodoo, more or less, an entry level 3D accelerator.  Its 640 x 480 resolution limitation (Pure3D excluded) and its relatively weak performance in most complex games will keep it from becoming a major contender in the 3D race; it looks like it's time to finally put the good ol' Voodoo into retirement.  One benefit of having a chipset that has been around for so long is the incredible driver support and its well established presence in the gaming industry, however support can only take you so far, as performance quickly becomes an important issue.

Powerful or Powerless? The 3Dfx Voodoo2

At the time of its release, the Voodoo2 managed to set a new standard for gaming.  Virtually abolishing the 640 x 480 resolution and replacing it with high performance at 800 x 600, the Voodoo2 made it to the top of the market by its sheer power.  Again a 3D-only solution from 3Dfx, the Voodoo2 boasts 2 texture units and a single pixel processing unit.  The two texture units allow for multi-texture rendering in a single pass, which gives it the edge over the competition in games such as Unreal, in which multi-textured environments are more common than one could possibly imagine. 

The Voodoo2 offers the standard for multi-textured gaming performance, and will continue to be a high performer from now until the day its fate repeats that of the original Voodoo.  The beauty of the Voodoo2 is that it picks up where the Voodoo left off, its performance remains completely independent of the presence (or lack thereof) of L2 cache and as long as you have a mid-range to high-end processor, it'll give you its all. 

The Voodoo2 is also upgrade friendly; if you one day find that you crave even more performance out of your system, you can go out and purchase a second Voodoo2 and enable what 3Dfx calls Scan Line Interleave (SLI) Mode.   With two Voodoo2's running in conjunction with each other, taking advantage of SLI (each card handles a separate scan line, theoretically doubling performance), you can breathe new life into your system at just about any time.  This extends the longevity of the Voodoo2 beyond that of most other graphics accelerators, especially when you consider that the Voodoo2's performance scales quite well with the performance of your CPU.
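The scan line division behind SLI can be pictured with a toy sketch (illustrative only; real SLI splits the work in hardware, not in software):

```python
def sli_split(frame_height):
    """Assign each scan line of a frame to one of two Voodoo2 boards (0 or 1)."""
    return {line: line % 2 for line in range(frame_height)}

assignment = sli_split(600)  # an 800 x 600 frame has 600 scan lines
per_card = [sum(1 for c in assignment.values() if c == board) for board in (0, 1)]
print(per_card)  # [300, 300]: each board renders half the lines
```

Because each board fills only every other line, each has half the pixels to render per frame, which is where the theoretical doubling of performance comes from.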

Unfortunately, the 3D-only design of the Voodoo2 and its ability to be run in pairs (SLI) quickly eats up your PCI slots, which can be a major problem for users with only a few slots available.  In light of this, 3Dfx will be providing the specifications for a single board SLI Voodoo2 to remain competitive with the rest of the market as well as extend 3Dfx's reign over the market for at least a few more months. 

The 0.35 micron design of the Voodoo2 keeps it running nice and warm, and if you do decide to overclock it you may want to work some cooling enhancements into your current system, but for the most part, as long as you have a well ventilated case, the Voodoo2 shouldn't be too much of a hassle. 

The image quality of the Voodoo2 can be considered average by today's standards; it isn't the best, and at the same time it isn't the worst.  It is an unfortunate truth that the Voodoo2 has no single card 1024 x 768 support until the single card SLI boards become more readily available, however the excellent driver support that the Voodoo2 has acquired in the time that it has been present in the 3D accelerator market, and the support 3Dfx backs it with, almost make up for this fact.  Supporting Direct3D, OpenGL, and Glide, as do all 3Dfx chipsets, the Voodoo2 has made its presence well known among developers and gamers alike in the entertainment community.



What happens when you take the king of 2D and give it a third dimension to play around in?  You get the best, most crisp image quality available on the market.  If your primary concern with a graphics accelerator is image quality, then the Matrox MGA-G200 is definitely what you're looking for.  Boasting the absolute best image quality in both 2D and 3D environments, Matrox has brought its excellence to the game once again with the sheer beauty of the G200 chipset. 

The chipset itself isn't much of a performer in comparison to the Voodoo2 class of performers in the market, however it provides you with what you need in order to remain productive and have fun at the same time.  Its Direct3D performance is average, with its OpenGL performance still unproven due to the continued delay of Matrox's OpenGL ICD for the G200 chipset.  When the ICD becomes available, G200 users will probably see decent performance under OpenGL, however don't expect the G200 to be the elusive Voodoo2 killer anytime soon.

Taken from the AnandTech Matrox Mystique G200 Review

128-bit DualBus

Imagine that you are on an 8-lane highway.   The 8 lanes of this highway allow for more traffic to move from one end of it to the other, however there is a catch.  The cars on the highway can only be moving in one direction at a time, meaning that all the cars must either be moving up the highway or down it, but not both at the same time (all 8 lanes move in the same direction).   Now consider the limited functionality of an internal 128-bit Data Bus when applied to video cards: on any given clock cycle, the data being transferred via the internal 128-bit Data Bus can only flow in one direction (to the graphics engine).   On the following clock cycle the data can be transferred down the bus in the other direction (from the graphics engine).  While this approach does have its benefits, when dealing with 2D images and bitmaps, where the data that must be transferred down the bus remains quite small (less than 128 bits), there is a much more efficient way of approaching this.

Let's take that highway example from above; now, instead of making it an 8-lane highway, let's split it up into a 4-lane going and a 4-lane coming highway, meaning that 4 lanes of cars can be traveling on the highway in the opposite direction of the 4 lanes on the other side (4 lanes can be leaving the city while 4 lanes can be entering).  If there is no need for 8 lanes to be open for transportation in any one direction, then the first 8-lane highway wouldn't be as efficient as this modified 4/4-lane highway.  The same theory applies to the Matrox G200.

Instead of occupying the entire width of a 128-bit bus to transfer data in 64-bit chunks, why not create a dual 64-bit setup, with one bus dedicated to sending data to the graphics engine and the other dedicated to receiving data from it?  This is what the G200's 128-bit DualBus architecture is: in essence, it is two 64-bit buses offering the same combined bandwidth as a single 128-bit data bus while allowing for data to be sent in parallel to and from the graphics engine.  It is this technology that gives the G200 the edge over the competition in its 2D performance, allowing for 24-bit desktop color depths at a faster level of performance than most of the competition can achieve at 16-bit color depths.
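The highway analogy can be reduced to a cycle count.  Under the simplifying assumptions that each bus completes one transfer per cycle and that a single shared bus can serve only one direction per cycle, the benefit of the split looks like this (a sketch of the idea, not Matrox's actual arbitration logic):

```python
def cycles_single_128bit_bus(sends, receives):
    # One wide bus, one direction per cycle: every transfer queues serially.
    return sends + receives

def cycles_dualbus(sends, receives):
    # Two 64-bit buses: one send and one receive can overlap each cycle.
    return max(sends, receives)

print(cycles_single_128bit_bus(100, 100))  # 200 cycles
print(cycles_dualbus(100, 100))            # 100 cycles
```

When traffic is balanced in both directions and each item fits in 64 bits, the DualBus finishes in half the cycles despite having the same total width, which is exactly the 2D bitmap case the article describes.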

3D Performance and Image Quality

What good is a 2D combo card without a powerful 3D punch to back up its dimension crippled counterpart in performance?  The G200 doesn't lose any points here either; with a 100 million pixels/second fill rate, one would expect the G200 to be able to hold its ground fairly well in 3D games and applications.  You must keep in mind that the G200 was never intended to be a Voodoo2-killer, but rather a lower cost alternative for those who don't have the funds to accommodate a single Voodoo2 + 2D accelerator, which justifies the sub-Voodoo2 levels of performance you'll be seeing from the G200. 

One advantage, outside of price, that the G200 holds over the Voodoo2 as well as all other 2D/3D combo cards is its top notch image quality.  Using Vibrant Color Quality (VCQ) Rendering, the G200 is capable of rendering images in 32-bits per pixel color (meaning 8-bits each for red, green, blue and alpha); even if the rendering is set to 16-bpp, the internal calculations are done with 32-bpp accuracy and dithered down to 16-bpp upon displaying the images.  Since the number of games that make use of 32-bit textures is extremely low, this feature doesn't carry as much weight as performance does, for example.  But for now you can rest assured knowing that one day you'll be able to make more use of the G200's advanced rendering capabilities - better safe than dithered in this case. 
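A fill rate figure like the G200's 100 million pixels/second translates directly into a frame rate ceiling.  A rough estimate (the overdraw factor of 2.5 is an assumed typical value for games of this era, not a measured figure):

```python
def fill_rate_fps_ceiling(fill_rate_pixels_per_sec, width, height, overdraw=2.5):
    """Upper bound on frame rate imposed by fill rate alone.

    overdraw accounts for each screen pixel being drawn multiple times
    per frame (walls behind walls, multiple texture passes, etc.).
    """
    return fill_rate_pixels_per_sec / (width * height * overdraw)

print(round(fill_rate_fps_ceiling(100e6, 800, 600)))   # ~83 fps at 800 x 600
print(round(fill_rate_fps_ceiling(100e6, 1024, 768)))  # ~51 fps at 1024 x 768
```

These are ceilings, not predictions; geometry setup, CPU speed, and memory bandwidth all pull real frame rates well below the fill rate limit, which is why the G200's in-game numbers sit under the Voodoo2's.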



It's All About Looks - Riva 128

The predecessor of nVidia's latest chipset, the Riva 128 gained its fame by dethroning the 3Dfx Voodoo as the fastest overall 3D accelerator of its time.  The Riva 128 is still an average performer by today's standards; while it's not going to give the next generation chipsets a run for their money, it gives you what you need, and lots of it.  The Riva 128 features proven Direct3D and OpenGL support, and a generally welcome presence among games. 

In situations where multi-texturing hasn't been completely taken advantage of, the Riva 128 can even perform at levels greater than its "more powerful" brother, the TNT, on lower end systems.  Games such as SiN perform just fine with the Riva 128, even better than with some of the latest and greatest video accelerator offerings from S3, and even from nVidia themselves.  If this were a perfect world, then the Riva 128 would probably be looking quite appetizing right now; unfortunately this isn't a perfect world (thankfully, in some cases - who would want to live in a boring world?), which is what leads us to the weaknesses of the Riva 128.

Not to be superficial, but the image quality of the Riva 128 is simply horrid by today's standards.  The quality of the 3D output is extremely poor, and compounded by its still flaky OpenGL ICD drivers, the Riva 128 is by no means a viable solution for someone that actually wants to enjoy their games.  If you're just looking for a card to handle 2D and 3D with no real care as to how your games look and feel, then the Riva 128 is great for the money, however if you actually feel like using your video accelerator for its purpose, then the Riva 128 can be thrown out of the comparison just as easily as it was introduced.   

While the Riva 128 is available in an AGP 1X compliant card (and the 128ZX as a 2X version), the AGP implementation is quite poor; it doesn't help the card break its 960 x 720 resolution limit and definitely doesn't help it eliminate the noticeable 800 x 600 resolution performance penalty.  For lower end systems the Riva 128 isn't too bad, however if you've got a high end system and are running a Riva 128, it is best that you keep that fact on the silent side, as you're doing your system a huge disservice with such a combination.

Live and Learn - The Riva TNT

The best overall video accelerator money can buy, the Riva TNT is everything the Riva 128 was during its time, and more.  Boasting outstanding performance on higher end systems, and above average image quality, the Riva TNT picks up where the 128 left off.  Taking a complete 180 degree turn from their history, nVidia worked hard to ensure that the Riva TNT's implementation of the AGP 2X specification was the best on the market, and this they did with incredible success.  

The 0.35 micron, extremely hot running chip uses a twin texel 32-bit graphics pipeline, allowing for incredible multi-textured performance on par with, if not greater than, that of the 3Dfx Voodoo2.  While its performance under single textured environments is generally lower than or equal to that of a Voodoo2, its ability to scale incredibly with its host processor speed gives it the edge over 3Dfx on the high end.  Unlike the Riva 128 at its introduction, the Riva TNT ships with a full OpenGL ICD out of the box, and a truly excellent driver implementation as well, since the TNT doesn't seem to have any noticeable problems with its OpenGL performance or its image quality.

The 24-bit hardware Z-buffer support leaves the TNT prepared for the next wave of 3D gaming titles, and as mentioned before, its processor scalability will ensure its domination over most of the competition for months to come.  The full support for 3D resolutions up to 1600 x 1200 makes the TNT an appetizing solution, however performance rapidly degrades after 800 x 600, and unless you have an extremely fast processor, the TNT isn't a smooth performer at 1024 x 768.  In a few months nVidia will introduce the 0.25 micron version of the TNT, which will run at a 125MHz clock speed versus the 90MHz clock of the current TNT chipsets.  This increased clock speed should come with at least a 20 - 25% boost in performance, as AnandTech's first tests of a TNT at 125MHz have hinted; it should also remove most of the heat problems the current TNT cards experience as a result of the 0.35 micron chip design.  It is highly recommended that you get a fan for your TNT if you want the most reliable performance out of your card.  

Unfortunately, the current drivers for the TNT do require a bit of maturing before rising to the level of those of the Voodoo2 or other competitors, especially on Super7/Socket-7 systems.  Most Pentium II users won't have that big of a problem with the TNT's drivers, however if you are experiencing any problems you may want to give up the conveniences of your manufacturer supplied drivers in favor of the nVidia reference drivers. 

The winner, hands down, out of the entire roundup of chipsets in terms of overall quality, performance, and even price is nVidia's Riva TNT.  A beautiful redemption of the Riva name for nVidia, the TNT is everything a gamer could ask for, except maybe a Voodoo2 SLI killer.



The Founding Father - The Long Lost Verite

If you remember back to the first days of the 3D accelerator hype, the Rendition Verite V1000 chipset was often considered the founding father of this race for the best.  Who would've guessed that the world of Voodoo2's and TNT's evolved from a chipset most of the recently added passengers on the video accelerator bandwagon never knew existed?  Inspiring such creations as the 3Dfx Voodoo, Rendition's own successor to its success story, the V2x00, had its 15 minutes of fame when the rush to find the Voodoo killer took place not too long ago. 

The strengths of the V2x00 include support for resolutions up to 800 x 600, performance greater than that of the 3Dfx Voodoo, as well as image quality that is much more than decent, even by today's standards.   Unfortunately, its performance issues and not-so-great future outlook will keep the V2x00 on the same track to retirement as the 3Dfx Voodoo.  If you're looking to upgrade from a V2x00, you probably don't want to walk down the path of the Voodoo2 due to its image quality; the ideal step in your journey away from the V2x00 will probably lie in the realm of the Riva TNT, or maybe the final chipset of this comparison, the S3 Savage3D.  

A Nice Try - The S3 Savage3D

Promises of Voodoo2 levels of performance at a $100 price level, outstanding quality, and incredible driver support were left unfulfilled by S3 with their latest concoction, the Savage3D.  When AnandTech reviewed the initial revision of the Savage3D, those promises were made, and when AnandTech took another look at the final revision of the Savage3D and its drivers, those promises seemed to fade away into the horrendous looking fog present in games such as Unreal, courtesy of S3's horrible OpenGL drivers. 

If you're looking for something in the price range of the Savage3D, you're better off getting a 3Dfx Banshee, or coughing up the extra cash and picking up a Riva TNT.  While S3 has some interesting technology on their hands with the Texture Compression the Savage3D supports, the immediate effects of that technology have yet to be seen, as the current selection of gaming titles doesn't take much advantage of the Savage3D's texture compression.  In the future the Savage3D may become a better product, however for now, it is a pre-teen chipset trying desperately to make its way into adolescence, and definitely not the right video accelerator for the masses until it gets a decent OpenGL ICD, to start with. 

What can we look forward to from the Savage3D?   Well, the crusher.dm2 benchmarks show that the Savage3D can easily handle immense amounts of data, as the Crusher benchmark tests just that, overflowing the bus with explosions, textures, and more in its simulation of the worst-case performance scenario.  In the future, with more mature drivers, we can expect the Savage3D to possibly replace the 3Dfx Banshee, especially since the Savage3D boasts fairly decent image quality in comparison to the Banshee, though not on par with that of the Matrox G200.   A nice try at redemption after 3 years of virtually idle production time; however, it would've been better had S3 not jumped the gun by releasing a chipset that wasn't ready for introduction by a long shot.



AnandTech Video Chipset Feature Comparison Chart

                               3Dfx       3Dfx      3Dfx      Matrox     nVidia    nVidia     Rendition  S3
                               Banshee    Voodoo    Voodoo2   G200       Riva 128  Riva TNT   V2x00      Savage3D

Type
  Standalone 3D                -          -         -         -          -         -          -          -
  2D/3D                        -          -         -         -          -         -          -          -

Interface Support
  PCI 2.1                      -          -         -         -          -         -          -          -
  AGP 1X                       -          -         -         -          -         -          -          -
  AGP 2X                       -          -         -         -          -         -          -          -

Rendering
  16bpp Color                  -          -         -         -          -         -          -          -
  32bpp Color                  -          -         -         -          -         -          -          -
  Single Pass Multi-Texturing  -          -         -         -          -         -          -          -
  3D Resolution Limit          1600x1200  800x600   800x600   1600x1200  960x720   1600x1200  800x600    1600x1200

API Support
  Direct3D                     -          -         -         -          -         -          -          -
  OpenGL                       -          -         -         *          -         -          -          -
  Glide                        -          -         -         -          -         -          -          -

* At the time of publication, the Matrox G200 OpenGL ICD was not available



Absent from the Test
The following cards/chipsets were absent from the test and will be added at a later time:
  3Dfx Voodoo2 Single Card SLI
  3DLabs Permedia 3
  ATI Rage 128
  Intel i740
  Number Nine Revolution IV

Test Configuration

This review consists of benchmarks on Slot-1 processors only; AnandTech will produce a separate article dealing with the Super7/Socket-7 performance of the cards tested here.

The Slot-1 Pentium II Test System AnandTech used was configured as follows:


CPUs
Intel Celeron 300A
Intel Pentium II 266
Intel Pentium II 400

Motherboard
ABIT BH6

Memory
64MB Mushkin SEC PC100 SDRAM

CD-ROM Drive
AOpen 32X IDE CD-ROM Drive

Operating System
Microsoft Windows 98

Benchmarking Software (full versions)
Direct3D
Forsaken Nuke Demo
Turok TMark
OpenGL
Unreal FPSTimedemo
SiN Rocket Demo
Quake 2 Demo1 & Crusher Demo

VSYNC was disabled during AnandTech's tests.  (VSYNC is the synchronization of all buffer swaps to the refresh rate of your monitor, theoretically limiting the attainable frame rate to the refresh rate your monitor is set at.  Disabling it will improve performance but may degrade visual quality by introducing "tearing.")

All video cards/chipsets were run using their respective manufacturer's reference drivers.

For the in-depth gaming performance tests, Brett "3 Fingers" Jacobs' Crusher.dm2 demo was used to simulate the worst-case scenario in terms of Quake 2 performance, the point below which your frame rate will rarely drop.  In contrast, the demo1.dm2 demo was used to simulate the ideal situation in terms of Quake 2 performance, the average high point for your frame rate in normal play.  The range covered by the two benchmarks can be interpreted as the range in which you can expect average frame rates during gameplay.

At the time of publication, the Matrox G200 OpenGL ICD was not available and therefore the G200 ran the Unreal Benchmark in Direct3D.



[Benchmark graphs: Pentium II 266, Pentium II 400, Celeron 300A, Celeron 450, Celeron 450A - click images to enlarge]


[Benchmark graphs: Pentium II 266, Pentium II 400, Celeron 300A, Celeron 450, Celeron 450A - click images to enlarge]



The nVidia Riva 128 takes all? Yep, you read that right.  SiN's unique rendering nature, which focuses primarily upon single-textured surfaces, renders the Riva TNT's twin texel pipelines useless and gives the S3 Savage3D a nice little thrashing, as the older first and second generation chipsets came out on top of their "more advanced" competitors.





[Benchmark graphs for the Pentium II 266 and Pentium II 400 at 640 x 480, 800 x 600, and 1024 x 768 - click images to enlarge]


[Benchmark graphs for the Pentium II 266 and Pentium II 400 at 640 x 480, 800 x 600, and 1024 x 768 - click images to enlarge]


3Dfx Banshee (left) vs. Matrox G200 (right)

On the left we have the 3Dfx Banshee, whose image quality is reminiscent of the Voodoo2, the Savage3D (when viewing a scene not affected too badly by the poor OpenGL drivers), and other cards of that class.  On the right, the Matrox G200 displays the image quality that is the trademark of most of the other next generation chipsets, such as the nVidia TNT, and, of course, the G200 itself.



In the end, the final decision comes down to the choices you made in building your system the first time around. 

For Pentium II owners, regardless of clock speed, if you don't mind using two cards to handle your video tasks, the 3Dfx Voodoo2 still seems to be the best solution from an overall performance standpoint.   Unfortunately, this doesn't carry over to the value standpoint, as you are throwing away quite a bit of money when you choose to go with a Voodoo2 due to its 3D-only nature.

If money is a definite factor, as it is for many of us, the 3Dfx Banshee is the best overall graphics accelerator for the money.   While the best overall graphics accelerator regardless of price may be the nVidia Riva TNT, for the money, it seems as if nothing can beat 3Dfx's Banshee.  The only chipset in the roundup that was never intended to dethrone a previous champ, the Banshee proved to be an excellent performer on all platforms, offering support for Glide-based games in addition to Direct3D and OpenGL titles, as well as the backing of the largest company of its kind, 3Dfx. 

S3 has much potential with their Savage3D, which is quite reminiscent of the Banshee in terms of overall value; unfortunately, the Savage3D is a card that suffers from a premature release and horrible drivers.   Currently the Savage3D has no place in any recommendations, mainly because of its poor driver support.  Once the chipset matures, and its drivers with it, it could possibly replace the Banshee as the most cost-effective, high-performing chipset on the market. 

Finally, as you all probably know by now, the nVidia Riva TNT isn't the chipset of choice if you have a low-end Pentium II processor (anything slower than a 333), and at the same time, the Matrox G200 is nothing more than a chipset for users whose primary concerns lie in image quality and 2D performance.  If you grab a G200, don't expect to be able to watch Quake 2 fly by anytime soon, as Matrox has delayed their OpenGL ICD once again. 

What will the future hold?  3Dfx and their single card Voodoo2 SLI board should breathe more life into that chipset, while Number Nine's Revolution IV could steal some sales away from Matrox.  ATI is promising quite a bit with the Rage 128; however, if you ask this reviewer, the way things appear to be now is the way they'll stay for at least a few more months.  This market is all about giving the consumer what they want, and fast...and what happens when a company can't do just that?  Ask ATI, S3, and Number Nine, who will find out the hard way if they don't get their acts together soon.   It's the harsh reality of the ever-changing times that can make or break a company with a single product release; cross your fingers and hope for the best, but expect the realistic as you set out to find your next video accelerator.
