Thursday, February 01, 2007

NVIDIA for MythTV

We migrated our production MythTV environment from ATI to NVIDIA.

Why? The business case:
We had serious issues with the latest ATI driver on our new kernel in combination with the latest MythTV. (Read more) Downtime kept growing due to all the testing and research.
The WAF (wife acceptance factor) was going down, and we calculated that this could potentially have a serious impact on the productivity of our data center.

Investment:
The NVIDIA 6200LE PCI-e is cheap. I found one with composite out for 20 euros; a Google search for the 6200LE turns up plenty more.
[Photo: the passively cooled card with the component-out dongle]

The implementation:
It took about 45 minutes in total to open the machine, disable the onboard ATI crap, and migrate from the ATI drivers to the NVIDIA drivers.
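
For anyone wanting to repeat this: the onboard chip is switched off in the BIOS, and on the X side the migration mostly comes down to swapping the Device section in xorg.conf. Here is a minimal sketch, assuming the NVIDIA binary driver package is already installed; the Identifier and BusID below are made-up examples, so check lspci for the card's real bus address:

    Section "Device"
        Identifier "nvidia6200"      # hypothetical name for this card
        Driver     "nvidia"          # the binary driver, replacing "fglrx"/"ati"
        BusID      "PCI:1:0:0"       # example value, verify with lspci
    EndSection

    Section "Screen"
        Identifier   "Screen0"
        Device       "nvidia6200"    # must match the Device Identifier above
        DefaultDepth 24
    EndSection

After restarting X, /var/log/Xorg.0.log should show the NVIDIA driver and its GLX module loading instead of fglrx.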

3 comments:

Anonymous said...

Thanks moose. I have the same issue with MythTV and ATI. I've been following your work for the last few weeks.

Think I'll do the same thing.

Anonymous said...

I was an ATI zealot... until today. After some days of tweaking the fglrx driver and xorg.conf, slowly destroying my system in the process, I started thinking about switching to NVIDIA.

Now your post has made me realize I'm heading in the right direction.

Thanks!

Anonymous said...

Heard. I've been trying to get a Radeon 8500 working since the product came out, and I'm not about to buy another ATI product until this one works even slightly acceptably.

By working, I mean the simplest and most standard configuration someone buying a 3D-capable video card with TV-out could expect: 2D, 3D, a flatscreen, and video overlay on TV-out.

ATI is performing abysmally. There are so many problems, and they could have caught at least 95% of them so easily if they had just tested their driver in a couple of configurations before releasing it. It's painfully clear that they don't.

I'm not too happy about switching to NVIDIA either, since the driver is closed source. I'd much prefer to go with Intel, because they manage to develop their driver within the open-source and X.org community. Oh Intel, when will you get proper 3D performance on your GPUs?