[mythtv-users] What do I have to do to get HD working?

Steven Adeff adeffs.mythtv at gmail.com
Sat Apr 15 17:38:38 UTC 2006


On 4/15/06, Joe Votour <joevph at yahoo.com> wrote:
> --- Steven Adeff <adeffs.mythtv at gmail.com> wrote:
> > On 4/15/06, Ivan Kowalenko
> > <ivan.kowalenko at gmail.com> wrote:
> > > On Apr 15, 2006, at 01.52, Steven Adeff wrote:
> > > > On 4/15/06, Marco Nelissen <marcone at xs4all.nl>
> > wrote:
> > > >>>>> Whoever said that a P4 3.2 is fast enough
> > for HD doesn't know what
> > > >>>>> they are talking about.  Get an AMD dual
> > core.  I have a 3800
> > > >>>>> and it
> > > >>>>> works well (It replaced a Prescott 3.2 that
> > I used before after
> > > >>>>> making
> > > >>>>> the same mistake that you just made).
> > > >>>>
> > > >>>> Processor-envy aside, a dual-core 3800+ is
> > overkill. I'm using an
> > > >>>> Athlon64 3200+, and it is sufficient for HD,
> > even without XvMC.
> > > >>>
> > > >>> It is only sufficient if you reduce the
> > quality of playback.  A
> > > >>> 3200+
> > > >>> will not be able to deinterlace to 60fps.
> > This may not be important
> > > >>> to everyone, but it is quite noticeable on
> > sports, Letterman, etc.
> > > >>
> > > >> I suppose if your TV only supports progressive,
> > you might need a
> > > >> beefier
> > > >> CPU. I just play my recordings in their native
> > formats, since the TV
> > > >> supports both interlaced and progressive input
> > signals.
> > > >
> > > > I don't understand that comment about a 3200+
> > not being able to
> > > > deinterlace 60fps. I had Myth converting 1080i
> > to 720p output before
> > > > the new nvidia driver was released and never had
> > a problem.
> > >
> > > I think we're talking about going from 1080i to
> > 1080p (though this
> > > hasn't been explicitly stated, as far as I'm
> > aware) in real-time.
> > > Probably would help, though, if we knew for sure.
> >
> > I've got a 3200+ and use BOB with 1080i and still
> > only hit 90% occasionally.  The downside is that the
> > CPU needed to record from 3 DVB cards can make it
> > choppy; it just means I need to build a dedicated
> > master backend, but I can definitely go from 1080i
> > to 1080p with a 3200+.
> >
>
> Steve,
>
> Your recent messages have made a light bulb go off in
> my head.  I admit, I haven't played with HD for too
> long, so I'm always looking to learn some new tricks.

I didn't always; 8756 lets me do so now that I can get 1080i working.
I used to play everything back in 720p, as that's all I could get
working with my TV.
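
(That's an SVN revision, for anyone following along. To run the same
build, something like this should do it, though the repository URL is
from memory so double-check it:

    svn checkout http://svn.mythtv.org/svn/trunk/mythtv mythtv
    cd mythtv
    svn update -r 8756    # pin the working copy to revision 8756

and then rebuild as usual.)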


> You mentioned that you play the recordings in their
> native mode, how exactly do you do that?  I take it
> that you have modelines for 720p and 1080i, and you
> use XRandR (via the MythTV configuration under
> Appearance/Separate video modes for GUI and TV
> playback) to do that?  If that's the case, I'll be
> making a 1080i modeline to see if that helps.
> (Currently, I think I'm playing everything back in
> 720p).

That's how I do it. But like I said, I used to do everything in 720p
with 7676 (no XvMC) and it worked fabulously. I had to run a
deinterlacer for 1080i content (I just used BOB; it seemed to use the
least CPU with acceptable results on my TV).
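
For the archives, the modelines I mean are just the stock CEA timings,
something like this in the Monitor section of xorg.conf (check them
against what your TV actually accepts before trusting them; posted
1080i modelines vary between a 1124 and 1125 vtotal):

    # inside your existing Section "Monitor" ... EndSection
    # 720p60: 74.25 MHz pixel clock, 1650x750 total
    ModeLine "1280x720"  74.25 1280 1390 1430 1650  720  725  730  750 +hsync +vsync
    # 1080i60: 74.25 MHz pixel clock, 2200x1125 total, interlaced
    ModeLine "1920x1080" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +hsync +vsync

Put both mode names on the Modes line of your Screen section, and the
"Separate video modes for GUI and TV playback" setting can then switch
between them with XRandR.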


> But, does that mean that you use any deinterlacing at
> all?  I suspect not, if you're running the TV
> interlaced.

You'd think so, and from what people say OpenGL vsync is supposed to
let me skip it, but I get really bad frame flicker if I don't use BOB,
with or without OpenGL vsync enabled. In fact (I still have to test
this), I think that if I disable it in MythTV and use the GL vsync
from nvidia-settings, it actually works that way. But I need to play
some more. This change has been both a pleasure and a pain for me. I'm
playing with 1080i, XvMC and OpenGL vsync all at the same time, and it
seems like something in the whole equation doesn't quite work right;
at times I feel like going back to 7676 and running in 720p again...
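
(If anyone wants to try the nvidia-settings route, this is roughly
what I mean; the attribute and variable names are from the driver
README, so verify them on your own setup:

    # sync OpenGL buffer swaps to the vertical retrace
    nvidia-settings -a SyncToVBlank=1

    # or force it just for the frontend via the driver's env variable
    __GL_SYNC_TO_VBLANK=1 mythfrontend

with MythTV's own OpenGL vsync option left off, so the two don't fight
each other.)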


> Hmmm...  I need to be able to deinterlace my SD
> recordings, but if I can get by without deinterlacing
> my (interlaced) HD recordings and have MythTV
> automatically recognise that, it would work well.

I agree, but I don't think this is an option yet unless you transcode
your SD content with a deinterlace filter.
Or... and this just occurred to me... run your SD content at 1080i?
Does scaling it affect the interlacing? I dunno...
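
(If you do try the transcode route, a quick test outside Myth with
mplayer/mencoder's linear-blend filter is an easy way to see whether
you like the result first; the codec and options here are just an
example, untested on my box:

    # preview with linear-blend deinterlacing
    mplayer -vf pp=lb recording.mpg

    # re-encode a copy with the same filter
    mencoder -vf pp=lb -ovc lavc -oac copy recording.mpg -o deint.avi

No idea how it looks scaled up to 1080i, though.)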


--
Steve

