[mythtv-users] Latest thoughts on small silent frontends

Gary Buhrmaster gary.buhrmaster at gmail.com
Mon Mar 10 17:57:51 UTC 2014


On Sun, Mar 9, 2014 at 1:56 PM, Stephen Worthington
<stephen_agent at jsw.gen.nz> wrote:
....
> So that is good news - the cheaper Nvidia 610 chipset is OK for 1080i,
> so it is easy to get a silent video card for any frontend box.

It may be good news, but it is not accurate news.  The GT610 is
not capable (by objective, measurable values) of Advanced 2X
de-interlacing of high bit-rate 1080i 60Hz content in all cases
(it might, barely, be able to do so in 50Hz countries).

I am not saying that Greg is wrong.  He may very well observe
a 610 is "buttery smooth".  But that is not (and cannot be)
true for all configurations or requirements or individuals.

Here are some possible reasons for the disconnect:

* Sometimes (all too often) nVidia will use the same marketing
  name for different underlying chips.  In this particular case,
  both the GT610 and GT610M are currently GF119 chips, so it
  should not apply, but at some time in the future a GT610 might
  be something else entirely (see the GT630 comment below, should
  this post be read in the future).

* Some TVs have great de-interlacers and/or video processors
  that hide "badness".  Ignoring the 960Hz video processing
  marketing claims by some manufacturers, some TVs are
  still very good at hiding artifacts (and the default is usually
  to smooth the picture and remove artifacts).

* Some TVs (especially smaller ones) down-sample everything
  to a lower resolution (sometimes 1280x720) which can hide
  artifacts as part of their post-processing.

* Some AV receivers have video processors that are as good as
  (or better than) those in the TVs they are connected to.

* Some TVs (screens) are so slow that they hide some
  artifacts (e.g. slow refresh rate LCD screens).

* Some individuals' visual processing (brains) simply does not
  see artifacts.  Just like some people never see 60Hz refresh
  rate artifacts (while for others, the headaches are terrible).
  I am sure there is more recent research, but the old papers
  I read showed no clear correlation between a person's ability
  to observe the visual artifacts and anything else about them.
  Some people just observed what others did not.

* Some individuals just want to believe so hard that they
  convince themselves there is no issue.  Selective hearing.
  Selective seeing.  The mind is an impressive piece of work.
  It matters not whether it is your new Bugatti, or your new
  Pioneer Elite Kuro Pro.  One wants (needs) to believe it was
  a good choice.

The "objective" qvdpautest results for a GT610 show a value of
    MIXER TEMPORAL_SPATIAL + IVTC (1920x1080): 51 fields/s
which is less than the required 60 (plus some extra for
headroom) needed to truly minimize the visual artifacts of
de-interlacing in all cases.
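
If you want to run the same check on your own hardware, a sketch
like the one below (an illustration, not a tested tool) can pull
the relevant number out of the qvdpautest output and compare it
against your content's field rate plus some headroom.  It assumes
qvdpautest is installed, is run from inside an X session with the
nVidia VDPAU driver loaded, and that its output still uses the
line format quoted above:

    import re
    import subprocess

    FIELD_RATE = 60   # 60 for 1080i60 content; use 50 in 50Hz countries
    HEADROOM = 1.1    # ask for ~10% spare capacity

    # Run qvdpautest and grab the 1080i "TEMPORAL_SPATIAL + IVTC" figure.
    out = subprocess.run(["qvdpautest"], capture_output=True, text=True).stdout
    m = re.search(r"MIXER TEMPORAL_SPATIAL \+ IVTC \(1920x1080\):\s*(\d+)\s*fields/s", out)
    if m:
        measured = int(m.group(1))
        needed = FIELD_RATE * HEADROOM
        verdict = "adequate" if measured >= needed else "NOT adequate"
        print(f"{measured} fields/s measured, {needed:.0f} wanted -> {verdict}")
    else:
        print("Could not find the 1080i TEMPORAL_SPATIAL + IVTC line in the output")

For the 51 fields/s above, the card clears a bare 50Hz requirement
but not 60Hz, which is why it "might, barely" manage in 50Hz
countries and falls short for 1080i 60Hz.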

However, all that said, subjective results may be what counts
(to you).  Unless you have the exact same FE, TV, AV receiver,
and visual processing, your experience *will* vary from any
other individual.  It may be better, or worse.

Caveat emptor.



btw, at this time, the lowest end current generation nVidia card
that has objective numbers that seem adequate for true Adv 2X
is the gen 2 GT630.  The gen 2 is important: it uses the GK208
chipset.  Some manufacturers have a passively cooled, low profile
capable version in the market.  [For the record, there was an OEM
GT630 with a (sort of) gen 1.5 chipset.  I do not recall ever seeing
results from that version of the GT630.]
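
If you are unsure which chip a particular card actually carries
(given the marketing name reuse noted above), the PCI device
string is a better guide than the box.  Below is a minimal sketch,
assuming the pciutils lspci tool is installed; recent pci.ids
databases usually print the chip family (GF119, GK208, etc.) next
to the marketing name, though the exact strings vary:

    import subprocess

    # List nVidia display adapters so the underlying chip (GF119, GK208,
    # etc.) can be compared against the marketing name on the box.
    out = subprocess.run(["lspci", "-nn"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "NVIDIA" in line and ("VGA" in line or "3D controller" in line):
            print(line)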

