One thing that specifically sets MSI's G210 apart from the other cards is that it comes with more than the bare minimum usually found at this price level. In terms of included hardware you don't get anything beyond the card, the manual, and the brackets – it's the bundled software that makes the difference here.
MSI offers several software utilities for all of their cards, and the cornerstone of these is their Afterburner software. In a nutshell, Afterburner is a distilled descendant of the long-favored RivaTuner utility, using RivaTuner's technology to offer a straightforward video card overclocking and monitoring tool. We'll be looking at Afterburner and other overclocking utilities in depth later this month, but we wanted to take a quick look at it today.
As a RivaTuner descendant, Afterburner offers overclocking of the core, shader, and memory clocks, along with the usual suite of clock and temperature monitoring. Furthermore, despite being an MSI utility, it isn't locked to MSI hardware: its core features work generically on any NVIDIA or AMD GPU the tool supports. As a trump card for MSI's own cards, it's also capable of voltage tweaking on most of them. In the case of the G210, however, this feature is not supported – which is just as well, since raising the voltage on a passively cooled card would be a bad idea anyhow.
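Afterburner itself is a Windows GUI tool, but the kind of periodic clock and temperature polling behind its monitoring graphs is easy to illustrate. Below is a minimal sketch of such a loop – our own illustration, not Afterburner's code – using NVIDIA's nvidia-smi utility, and assuming a driver package recent enough to ship nvidia-smi with query support:

```python
import csv
import subprocess
import time

# Minimal GPU monitoring loop in the spirit of Afterburner's graphs.
# Assumes NVIDIA's nvidia-smi tool is installed and on the PATH.
def poll_gpu(index=0):
    """Return (temperature in C, core clock in MHz) for the given GPU."""
    out = subprocess.check_output([
        "nvidia-smi", "-i", str(index),
        "--query-gpu=temperature.gpu,clocks.gr",
        "--format=csv,noheader,nounits",
    ], text=True)
    temp, clock = (field.strip() for field in out.split(","))
    return int(temp), int(clock)

if __name__ == "__main__":
    with open("gpu_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "temp_c", "core_mhz"])
        start = time.time()
        for _ in range(60):  # log once per second for a minute
            temp, clock = poll_gpu()
            writer.writerow([round(time.time() - start, 1), temp, clock])
            time.sleep(1)
```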
Finally, we took a look at the G210's usability in an HTPC setting. With the same VP4 decoder and 8-channel LPCM audio capabilities as the rest of NVIDIA's 40nm G(T) 200 series, the G210 has the potential to be a solid HTPC card on paper. As with the other low-profile cards we've been looking at this month, we ran it through the Cheese Slices HD deinterlacing test, which as we've seen can quickly expose any flaws or limitations in a card's video decoding and post-processing capabilities.
Unfortunately, the G210 did extremely poorly here. In our testing the G210 would consistently drop frames when running the Cheese Slices test, processing only around 2 out of every 3 frames. NVIDIA doesn't offer any deinterlacing settings beyond enabling/disabling Inverse Telecine support, so the deinterlacing method used here is whatever the card/drivers support, which looks to be an attempt at Vector Adaptive deinterlacing.
[Cheese Slices deinterlacing comparison: GeForce 210 vs. GeForce GT 220]
The quality is reminiscent of VA deinterlacing; however, it's not as clean as what we've seen on the GT 220. More to the point, the G210 clearly doesn't have the processing power to do this, yet it's unable to fall back to a lesser mode. Cheese Slices isn't a fair test by any means, but it does mean something when a card can't gracefully fail it. Once we take deinterlacing out of the equation, however, the G210 has no problem playing back progressively encoded MPEG-2 and H.264 material. It only appears to be seriously limited when deinterlacing, which means the G210 is at a real disadvantage only with interlaced material such as live television.
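As a practical aside, it's easy to check whether a given file is interlaced and would therefore lean on the G210's weak deinterlacer. Here is a hypothetical helper of our own – not part of the review's test suite – that reads the first video stream's field order using ffprobe from the FFmpeg project (assumed to be installed):

```python
import subprocess
import sys

# Report whether a video file is interlaced, using ffprobe from FFmpeg.
def field_order(path):
    """Return the first video stream's field order, e.g. 'progressive' or 'tt'."""
    out = subprocess.check_output([
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=field_order",
        "-of", "default=noprint_wrappers=1:nokey=1",
        path,
    ], text=True)
    return out.strip()

if __name__ == "__main__":
    order = field_order(sys.argv[1])
    if order == "progressive":
        print("Progressive content: no deinterlacing needed.")
    else:
        print(f"Field order '{order}': playback will rely on deinterlacing.")
```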
24 Comments
gumdrops - Thursday, February 18, 2010 - link
Where can I find this card for $30? Froogle and Newegg both list this card at $40, which is only $2 cheaper than a 5450. In fact, the cheapest 210 of *any* brand is $38.99. With only a $2-$5 difference versus the 5450, is it really value for money to go with this card?
Taft12 - Thursday, February 18, 2010 - link
In a word: No.

ncix.com has the BFG version of this card on sale for $29.99 CAD, but Ryan makes it pretty clear MSI is the only OEM that produces a G210 worth owning.
mindless1 - Thursday, February 18, 2010 - link
If you're building a small form factor system you have to be a bit more concerned, because you may not have anywhere to put bigger 92mm+ fans, so for any particular airflow rate your smaller fans are already running at higher RPM. If you're building toward low noise, your system will be quieter with a lower intake and exhaust rate plus a very low RPM fan on the heatsink, rather than a passive heatsink.

That way it will also accumulate less dust, and the fan helps cool other areas like the power regulation circuitry (MOSFETs). It also makes a product more compatible to have a single-height heatsink, without the elaborate surface-area-maximizing construction you'd need if that single-height sink were passive.

Don't fear or avoid fans, just avoid high(er) RPM fans. Low RPM fans are inaudible and last a long time, provided you don't pick a very low quality fan.
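For a rough sense of the numbers behind this, the standard fan affinity laws say that for geometrically similar fans, airflow scales with RPM and roughly with the cube of the diameter. The sketch below works through that scaling under those idealized assumptions; real fans vary with blade design, so treat it as ballpark only:

```python
# Rough fan-affinity estimate: airflow ~ rpm * diameter^3 for
# geometrically similar fans. Idealized; treat results as ballpark.
def rpm_for_equal_airflow(ref_diameter_mm, ref_rpm, new_diameter_mm):
    """RPM a fan of new_diameter_mm needs to match the reference fan's airflow."""
    return ref_rpm * (ref_diameter_mm / new_diameter_mm) ** 3

if __name__ == "__main__":
    # An 80mm fan at 2000 RPM vs. the speed a 120mm fan needs for the same airflow.
    needed = rpm_for_equal_airflow(80, 2000, 120)
    print(f"120mm fan needs ~{needed:.0f} RPM")  # ~593 RPM: far slower, far quieter
```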
greenguy - Wednesday, February 24, 2010 - link
You've got a good point there – that's why I went with the Megahalems and a PWM fan (as opposed to the Ninja), and Scythe Kama PWM fans on both intake and exhaust (at 400rpm or so). I probably should have done the same with the graphics card, but didn't do the research. Do you have any pointers to specific cards or coolers?

I might have to come up with a more localized fan or some ducting.
AnnonymousCoward - Wednesday, February 17, 2010 - link
I just wanted to say, great article and I love the table on Page 1. Without it, it's so hard to keep model numbers straight.

teko - Wednesday, February 17, 2010 - link
Come on, does it really make sense to benchmark Crysis for this card? Choose something that the card's buyers will actually use/play!

killerclick - Wednesday, February 17, 2010 - link
Once my discrete graphics card died on me on a Saturday afternoon, and since I didn't have a spare or an IGP my computer was useless until Monday around noon. I'm going to get this card to keep as a spare. It has passive cooling, it's small, it's only $30, and I'm sure it'll perform better than any IGP even if I had one.

greenguy - Wednesday, February 17, 2010 - link
I was quite amazed to see this review of the card I had just purchased two of. I wasn't sure at first, but I have since determined that you can run two 1920x1200 monitors from the one card (using the DVI port and the HDMI port). This is pretty cool – it doesn't force you to use the D-SUB port if you want multiple monitors, so you keep all that fine detailed resolutiony goodness.

It looks very promising that I will be able to get the quad-monitor portrait setup working in Linux like I wanted to, using two of these cards rather than an expensive Quadro solution. Fingers crossed that I can also do it in FreeBSD or OpenSolaris. I really want the self-healing properties of ZFS, because this will be a developer workstation and I don't want any errors not of my own introduction.
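For anyone attempting the same two-card, quad-monitor setup under the proprietary NVIDIA driver, a hypothetical xorg.conf fragment along these lines is roughly what's involved – the identifiers and BusID values here are invented for illustration and must match what lspci reports on the actual system:

```
# Hypothetical two-card layout; BusIDs are examples only (check lspci).
Section "Device"
    Identifier "Card0"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection

Section "Device"
    Identifier "Card1"
    Driver     "nvidia"
    BusID      "PCI:2:0:0"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
    # TwinView drives both monitors on one card with the NVIDIA driver
    Option     "TwinView" "true"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "Card1"
    Option     "TwinView" "true"
EndSection

Section "ServerLayout"
    Identifier "QuadLayout"
    Screen     0 "Screen0"
    Screen     1 "Screen1" RightOf "Screen0"
    # Xinerama merges the screens into one desktop (may disable some acceleration)
    Option     "Xinerama" "on"
EndSection
```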
I'm using a P183 case, and I've found that the idle temperature of the heatsinks is 61 degrees C without the front fan (the one in front of the top 3.5" enclosure). Installing a Scythe Kama PWM fan there got this down to 47 degrees C. (Note that for both of these readings I had both exhaust fans installed, though they are only doing about 500rpm tops.)
Using nvidia-settings to monitor the actual temperature of the GPU itself, I am getting 74 degrees C on the card that is running two displays with Compiz, while the other is running at 54 degrees C.
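For reference, the per-card readings greenguy describes can also be scripted, since nvidia-settings accepts command-line queries. A minimal sketch, assuming the proprietary NVIDIA driver, its nvidia-settings tool, and a running X session:

```python
import subprocess

# Query core temperature for each GPU via NVIDIA's nvidia-settings tool.
# Requires the proprietary driver and a running X session.
for gpu in (0, 1):
    temp = subprocess.check_output([
        "nvidia-settings",
        "-q", f"[gpu:{gpu}]/GPUCoreTemp",
        "-t",  # terse output: print just the value
    ], text=True).strip()
    print(f"GPU {gpu}: {temp} C")
```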
Note that the whole system is a Xeon 3450 (equivalent to an i5-750 with HT) with 8GB of RAM and a Seasonic X-650, and it is idling at 62-67 Watts. Phenomenal.
Exelius - Tuesday, February 16, 2010 - link
I'd be interested in seeing how this card performs as an entry-level CAD card. I understand it's not going to set any records, but for a low-end CAD station coupled with 8GB of RAM and a Core i7, does this card perform acceptably with AutoCAD 2010 (or perform at all)?

I'm not a CAD guy, btw, so don't flame me too hard if this is totally unacceptable (and I know you can't benchmark AutoCAD, so I'm not expecting numbers). This card just shows up in a lot of OEM configurations, so I'm curious if I'd need to replace it with something beefier for a CAD station.
LtGoonRush - Tuesday, February 16, 2010 - link
The reality is that cards at this price point don't really provide any advantages over onboard video to justify their cost. There's so little processing power that they still can't game at all and can't provide a decent HTPC experience; all they're capable of is the same basic video decode acceleration as any non-Atom chipset. This sort of makes sense when you're talking about an Ion 2 drop-in accelerator for an Atom system to compete with Broadcom, but I just don't see the value proposition over the AMD HD 4200 or Intel GMA X4500 (much less Intel HD Graphics in Clarkdale). I'd like to see how the upcoming AMD 800-series chipsets with onboard graphics stack up.