[mythtv-users] Can an FX5200 drive a Westinghouse LVM-42w2

Marc Infield marc at infielddesign.com
Thu Jul 13 00:28:48 UTC 2006


>> Hi,
>>
>> I just purchased a Westinghouse LVM-42w2, and my existing Myth setup
>> has a Chaintech FX5200 card with DVI. Can an FX5200 card drive the
>> LVM-42w2 at 1080p?
>>
>> I've been searching around and reading posts for more than an hour,
>> and I seem to be going around in circles. There are many posts on the
>> FX5200 card, and I see some conflicting posts on whether it can
>> handle 1080i through the DVI port, plus some issues about frequency
>> and drivers, but I haven't found much on 1080p. I'm a little nervous
>> that I'll wreck a reliable MythTV setup if I start fiddling with any
>> modeline parameters.
>>
>> Currently, when I hook the monitor up with a DVI cable, it shows the
>> text startup but drops the signal when it goes into X; the S-Video
>> output works fine.
>>
>> Any links or pointers would be appreciated.
>>
>> By the way, I'm pretty impressed with the LVM-42w2 so far.
>>
>
>
> I have that exact TV, as well as that exact Chaintech FX5200 card. I
> found that with the newest NVIDIA proprietary drivers, the FX5200 is
> limited to a 135 MHz pixel clock over the DVI output. With earlier
> drivers this apparently wasn't a limitation, but getting the older
> drivers to work with newer kernels is non-trivial. The 135 MHz clock
> is not enough to drive 1080p (with the timings below, 1080p at 60 Hz
> needs 2200 x 1125 x 60 = 148.5 MHz), so it refused to work. However,
> over the 15-pin VGA connector, I was able to do 1080p. I had another
> machine, a dual-boot Windows/FC5 box, that I tried it out on also,
> with a 6x00-series video card (can't remember the exact model off the
> top of my head). With that card, the binary drivers did 1080p just
> fine over DVI. Testing is fairly easy, as you don't have to specify
> custom modelines to make it work. You can specify "1920x1080" as the
> resolution, and it just works. However, you can muck with the
> modelines if you want to, and I found these modelines in somebody
> else's config:
>
>         # The modeline I use:
>         Modeline "1920x1080-59.94p" 148.352 1920 1960 2016 2200 1080 1082 1088 1125 +hsync -vsync
>
>         # Another modeline I've used:
>         Modeline "1920x1080-60p" 148.5 1920 1960 2016 2200 1080 1082 1088 1125
>
>
> I tried the 1920x1080-59.94p modeline with my TV, and that worked
> fine also (in the same combinations as just using "1920x1080", not on
> the FX5200's DVI). However, I didn't really notice any difference, so
> I went back to just specifying "1920x1080". I was originally going to
> upgrade the video card to a low-end 6x00 or 7x00 card that wouldn't
> have that issue, but I ended up deciding on more comprehensive
> upgrades so that I could do hi-def without using XvMC.
>
> Shawn
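
For anyone hitting the same wall, the X server log is the quickest way
to see why the DVI output drops the signal at startup: the server
records there which modes were accepted or rejected during validation.
A simple check, assuming the usual log location on your distribution:

    grep -i -e "mode" -e "clock" /var/log/Xorg.0.log

If the 1920x1080 mode shows up as rejected for exceeding the pixel
clock, that confirms the 135 MHz DVI limit Shawn describes.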

Wow, that makes pretty good sense of things! Thanks so much.

To summarize: with the current NVIDIA driver I need to use the VGA
port. I'm at work now so I can't double-check, but I think I just have
S-Video and DVI on my FX5200. I will check when I get home. If that's
true, then the coward's way out would be to get something like a
fanless GeForce 6200
[http://www.newegg.com/Product/Product.asp?Item=N82E16814145118],
which, I might add, is only a few dollars more than the DVI cable I
picked up from Best Buy.
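
In case I do work up the nerve to fiddle with modelines, my (untested)
understanding is that Shawn's first modeline would slot into xorg.conf
roughly like this. This is only a sketch: the "Westy" identifier is
made up, and the identifiers and depth should match whatever the
existing config already uses:

    Section "Monitor"
        Identifier "Westy"
        # Modeline from Shawn's post: 1080p at 59.94 Hz
        Modeline "1920x1080-59.94p" 148.352 1920 1960 2016 2200 1080 1082 1088 1125 +hsync -vsync
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Monitor    "Westy"
        DefaultDepth 24
        SubSection "Display"
            Depth 24
            # List plain "1920x1080" after the custom mode as a fallback
            Modes "1920x1080-59.94p" "1920x1080"
        EndSubSection
    EndSection

Listing the plain "1920x1080" mode second should leave the server a
fallback if the custom timing gets rejected.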

[Off-topic] But I'm considering turning this system into a backend and
getting a Mac mini as a front end so we can also run iTunes.

Thanks,
-marc



