[mythtv-users] System specifications review...
Michael T. Dean
mtdean at thirdcontact.com
Mon Nov 16 18:52:38 UTC 2009
On 11/16/2009 01:15 PM, Patrick Doyle wrote:
> On Mon, Nov 16, 2009 at 12:37 PM, Yeechang Lee wrote:
>
>> Patrick Doyle says:
>>
>>> So if I go with this processor, which I selected based on reports of
>>> its idle and max power consumption (6.8W and 31W), is there any point
>>> in adding a VDPAU graphics card later?
>>>
>> Yes.
>>
>> Let me reword that: No matter how much CPU horsepower you have, there
>> is no point in *not* getting a VDPAU-capable card *now*. For $30-50,
>> how can you lose?
>>
> I could lose if it increased the overall power consumption of my box.
> That one could be tough to measure, as there is a certain "idle" power
> consumption that almost certainly must be higher just by adding a
> card to the box. But that would, ideally, be offset by the total
> reduction in power of using the VDPAU decoder instead of using CPU
> cycles.
>
Agreed.
> Hmmm... as I think about this more, it occurs to me that it is more
> complex than I first thought... after all, when I turn off the TV, the
> MythTV box is likely to stay powered on, still tuned to and displaying
> whatever channel it was last tuned to and displaying. (At least
> that's the mode that my family and I have been trained to expect from
> the TiVo.)
Myth does not do this. Myth uses the capture cards /only/ when
recordings are in progress (including LiveTV recordings). Myth does
/not/ record random garbage when you're not using it. This means that
Myth actually /does/ benefit from idle CPU/GPU power savings.
VDPAU is not currently used for commercial flagging or transcoding;
therefore, VDPAU /only/ comes into play when you're actively /watching/ TV.
Even if you set your system to record 24/7, VDPAU only affects
playback. Leaving your system playing back recordings when no one is
watching/listening (especially when the TV is powered off) is a terrible
waste of power.
> Therefore, the so called "idle" power consumption could be
> lower with a VDPAU-capable card, just because it would (hopefully) be
> more efficient at decoding the OTA video than the CPU would be. Of
> course, if we change our habits such that the default screen is the
> menu screen and we never watch live TV through the Myth box, then the
> idle power would (presumably) be much lower.
>
Which means that, really, the difference in power consumption that VDPAU
vs. no-VDPAU will make (assuming all hardware is constant) is the
difference between the increase in GPU power usage when decoding with
VDPAU and the increase in CPU power usage when decoding without VDPAU.
Therefore, plugging in a discrete video card means you get a total extra
power consumption of:
(Gi * (24 - Np)) + (Gv * Np)
Where
Gi is GPU idle power consumption
Gv is GPU VDPAU power usage
Np is number of hours of playback per day
Without VDPAU, you get extra power consumption of:
Cd * Np
Where
Cd is /extra/ CPU power usage when decoding
Therefore, if you:
a) run BOINC or some other program that maxes your CPU all the time,
you will see /absolutely/ no increase in power usage without VDPAU, but
/will/ see an increase in power usage with a discrete VDPAU-supporting GPU.
b) let your CPU idle when not in use, you may see either higher or lower
power usage when adding a discrete VDPAU-supporting GPU--depending on
whether the increase in power consumption when doing software
(CPU-based) decoding for the number of hours you actually /watch/ TV
each day is greater or less than the increase in power usage from the
discrete video card.
The only way to know for sure is to actually measure usage with and
without and have a good idea of your average number of hours of TV
viewing per day.
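If you do want to plug real measurements in, here's a minimal sketch of
the arithmetic above in Python; the function names are mine, the
variables mirror the definitions I used, and any figures you feed it
would have to come from your own watt-meter readings:

# Extra daily energy (Wh) from adding a discrete VDPAU card: it idles
# for (24 - Np) hours and decodes for Np hours of playback.
def extra_wh_with_gpu(Gi, Gv, Np):
    return Gi * (24 - Np) + Gv * Np

# Extra daily energy (Wh) from decoding in software instead: the CPU
# only draws the extra Cd watts while it is actually decoding.
def extra_wh_without_gpu(Cd, Np):
    return Cd * Np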
Based on the link posted by tortise at
http://www.gossamer-threads.com/lists/mythtv/users/407591#407591 -- a
link to
http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#GeForce_8_.288xxx.29_series
--which shows the GF 220 with a TDP of 58W, I seriously believe that
VDPAU will cost you power.
If we assume 4 hours of TV watching per day (Np = 4hrs) and a nicely
conservative 10W GPU idle draw (Gi = 10W) for that 58W GPU, we're talking
about an extra 200Wh per day just at idle. Then, we add Np*Gv. For
fun, let's say that decoding ATSC is so easy that there's no increase in
power usage on the card (Gv = (a fantastic) 10W), so we get 40Wh per day
decoding ATSC. That means that adding the discrete graphics card is
costing you 240 (fancifully low) Wh per day.
Assuming you find a 45W TDP CPU and--for fun--saying it takes 0W at idle
and 45W when decoding ATSC, that's only 180Wh for that same 4hrs of TV
viewing. Even with a 65W TDP CPU with a full 0/65W spread, it's 260Wh.
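For what it's worth, plugging those (admittedly made-up) figures into the
little Python sketch above gives the same totals:

# Hypothetical numbers from the example above, in Wh per day.
print(extra_wh_with_gpu(Gi=10, Gv=10, Np=4))   # 240 -- discrete card
print(extra_wh_without_gpu(Cd=45, Np=4))       # 180 -- 45W TDP CPU
print(extra_wh_without_gpu(Cd=65, Np=4))       # 260 -- 65W TDP CPU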
If you watch, on average, less than 4hrs (1/6 of the entire day or 1/4
of the "waking day") of TV per day, the scale slides more in favor of
the software decoding. If you get some real numbers for the discrete
card Gi and Gv and for Cd, I think you'll find that the scale again
slides more in favor of software decoding.
I think you'll also find that no CPU takes 0W at idle (safe bet there
:), and that decoding ATSC on a modern CPU (such as the Regor-based
Athlon II X2) doesn't use the full design power. I would also guess that
Gi=10W and, especially, Gv=10W are low guesses for a 58W TDP video card.
All of the above, of course, is focused /only/ on energy-usage savings,
and it also leaves out the power savings from pausing video playback
(how much time do you spend getting snacks, etc., when watching TV? :).
If you use Myth's feature to allow you to shut down the system when not
in use, the above numbers would change--and in that situation, when a
majority of the power-on time is spent decoding--VDPAU playback may
actually save power, even when you have to add a discrete video card to
get it. Whether it does, though, would depend on whether Gv is greater
or less than Cd.
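As a rough sketch of that case--assuming the box really is only powered
up for the Np hours someone is watching, so the GPU idle term mostly
disappears--the comparison collapses to Gv versus Cd:

# With auto-shutdown, the (24 - Np) hours of GPU idle largely go away,
# so the extra energy is roughly Gv * Np versus Cd * Np, i.e. Gv vs. Cd.
def extra_wh_with_shutdown(decode_watts, Np):
    return decode_watts * Np
# Compare extra_wh_with_shutdown(Gv, Np) to extra_wh_with_shutdown(Cd, Np).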
Just my $0.02.
Mike