[mythtv-users] Nvidia GPU for HP Microserver
Thomas Boehm
mythtv-users at lists.boehmi.net
Mon Jul 28 15:17:13 UTC 2014
On 28/07/14 01:02, Stephen Worthington wrote:
> A 210 will not do playback of interlaced 1080i properly. If you live
> somewhere where the TV does not broadcast 1080i, then that may be OK
> for you. If you have a 210, you need to have a CPU powerful enough to
> do the deinterlacing for you. That used to mean, as a rule of thumb,
> having to have a 3.0 GHz or above CPU, as it had to be able to do it
> with one core. But that might have changed. Having your CPU do the
> deinterlacing uses lots of electricity and creates plenty of heat, so
> if you have fans, they will usually speed up a bit and make noise.
> Having the GPU do the deinterlacing seems to be virtually free - it
> does not seem to cost any noticeable extra energy use.
>
> In the 200 series, you need a 220 for everything to work correctly.
> They are available as silent boards, but I have no idea if you can
> still get one in the form factor you need. My normal height Asus
> Bravo 220 takes up all of the next slot for its heatsink.
>
> It has been reported that in the current Nvidia chipsets, a 610 is
> capable of playing 1080i in its GPU without any CPU assistance.
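Whether a given card can do the deinterlacing in hardware can be checked with the `vdpauinfo` tool, which lists the VDPAU video-mixer features the driver advertises. Below is a minimal sketch that looks for the temporal-spatial deinterlacer in that output; the sample text is illustrative, not captured from an actual G210 or GT610.

```python
# Sketch: check whether the VDPAU driver advertises the
# temporal-spatial deinterlacer used for smooth 1080i playback.
import subprocess

def supports_advanced_deint(vdpauinfo_text: str) -> bool:
    """True if the video-mixer features include the
    high-quality temporal-spatial deinterlacer."""
    return "DEINTERLACE_TEMPORAL_SPATIAL" in vdpauinfo_text

# Illustrative sample of vdpauinfo's video-mixer section
# (assumed shape, not real output from a specific card):
SAMPLE = """\
Video mixer:
DEINTERLACE_TEMPORAL          y
DEINTERLACE_TEMPORAL_SPATIAL  y
"""

if __name__ == "__main__":
    try:
        out = subprocess.run(["vdpauinfo"], capture_output=True,
                             text=True, check=True).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        out = SAMPLE  # vdpauinfo not installed; use the sample
    print("Temporal-spatial deinterlacing:",
          supports_advanced_deint(out))
```

Running this on the frontend box itself (with vdpauinfo installed) tells you whether the GPU, rather than the CPU, can take over the 1080i deinterlacing work.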
Thanks for this explanation. The thing is that my current frontend,
which has the G210 in it, has a P4 3.2GHz. This CPU has a CPU Mark
of 362, whereas the Turion N54L of the Microserver has a CPU Mark of 1419:
http://www.cpubenchmark.net/compare.php?cmp[]=477&cmp[]=1074
So the Microserver should run even better with a G210 than the current
frontend does. But maybe you're right and I should go for the GT610,
since the backend and a few other things run on the server too.
Thanks
Thomas