[mythtv-users] Video jitter possibly caused by low cpu usage
Marius Schrecker
marius.schrecker at lyse.net
Tue Dec 17 08:42:36 UTC 2013
Thanks Gary, Stuart and Jean-Yves for your feedback.
On Monday, December 16, 2013 23:25 CET, Gary Buhrmaster <gary.buhrmaster at gmail.com> wrote:
On Mon, Dec 16, 2013 at 10:10 PM, Stuart Morgan wrote:
> On Tuesday 17 Dec 2013 08:55:36 Jean-Yves Avenard wrote:
>> There should be no need to fiddle with CPU frequencies when you use vdpau.
>> Not with any recent drivers or intel proc (only amd had issues)
>
> Unless the GPU is on-board and sharing system memory.
The impacts depend on the micro-architecture of the processor.
Westmere processors showed almost no impact on memory
speed, while Sandy Bridge based processors show a larger
impact. Bus speed (underclocking) is yet another dimension
that can be seen on some architectures.
As with much else, the specific details (processor, chipset,
planar design, etc.) matter (and sometimes finding the info
makes finding the needle in a haystack child's play).
_______________________________________________
mythtv-users mailing list
mythtv-users at mythtv.org
http://www.mythtv.org/mailman/listinfo/mythtv-users
I'm including some details so that we know what we're dealing with here. The CPU is, indeed, an AMD, and since I notice a clear difference when I raise the frequency, I think we can assume we're affected by bus speed.
I made a mistake about the GPU: it's a GT 610, but most of my content is progressive, so I'm not using it for any deinterlacing. That the jitterometer showed only 15% CPU usage even at 800 MHz would indicate that I am using VDPAU.
The only thing I don't understand is why I notice a further improvement when raising the frequency on both cores, and why I need to go as high as 2300 MHz for the problem to disappear completely.
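One way to sanity-check the bus-speed theory (my assumption, nothing I've measured yet) would be to compare raw memory throughput at the two frequencies. Copying /dev/zero to /dev/null never touches a disk, so the MB/s figure dd reports is dominated by memory/bus speed:

```shell
#!/bin/bash
# Rough memory-throughput probe: run once with scaling_min_freq at 800000
# and once at 2300000, then compare the throughput line dd prints.
dd if=/dev/zero of=/dev/null bs=1M count=2048 2>&1 | tail -n 1
```

If the reported throughput scales with the CPU frequency, that would support the bus-speed explanation on this Athlon II.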
Anyway, here's one core from /proc/cpuinfo (the other core is identical):
processor : 1
vendor_id : AuthenticAMD
cpu family : 16
model : 6
model name : AMD Athlon(tm) II X2 250 Processor
stepping : 2
microcode : 0x10000c7
cpu MHz : 800.000
cache size : 1024 KB
physical id : 0
siblings : 2
core id : 1
cpu cores : 2
apicid : 1
initial apicid : 1
fpu : yes
fpu_exception : yes
cpuid level : 5
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm 3dnowext 3dnow constant_tsc rep_good nopl nonstop_tsc extd_apicid pni monitor cx16 popcnt lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt hw_pstate npt lbrv svm_lock nrip_save
bogomips : 6029.95
TLB size : 1024 4K pages
clflush size : 64
cache_alignment : 64
address sizes : 48 bits physical, 48 bits virtual
power management: ts ttp tm stc 100mhzsteps hwpstate
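Since the power management line shows hw_pstate and 100mhzsteps, the standard cpufreq sysfs tree should expose the governor and the frequency steps in play. A guarded loop (paths per the kernel's cpufreq sysfs interface; the [ -r ] check just skips anything a given kernel doesn't expose) to dump the relevant files:

```shell
#!/bin/bash
# Dump the current cpufreq state for cpu0; harmless to run anywhere.
for f in scaling_governor scaling_available_frequencies scaling_min_freq scaling_max_freq; do
    p="/sys/devices/system/cpu/cpu0/cpufreq/$f"
    if [ -r "$p" ]; then
        printf '%s: %s\n' "$f" "$(cat "$p")"
    fi
done
```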
Some lspci info (other device info available on request):
01:00.0 VGA compatible controller: NVIDIA Corporation GF119 [GeForce GT 610] (rev a1) (prog-if 00 [VGA controller])
Subsystem: ASUSTeK Computer Inc. Device 840d
Flags: bus master, fast devsel, latency 0, IRQ 18
Memory at fa000000 (32-bit, non-prefetchable) [size=16M]
Memory at d8000000 (64-bit, prefetchable) [size=128M]
Memory at d6000000 (64-bit, prefetchable) [size=32M]
I/O ports at dc00 [size=128]
[virtual] Expansion ROM at fbe80000 [disabled] [size=512K]
Capabilities: <access denied>
Kernel driver in use: nvidia
01:00.1 Audio device: NVIDIA Corporation GF119 HDMI Audio Controller (rev a1)
Subsystem: ASUSTeK Computer Inc. Device 840d
Flags: bus master, fast devsel, latency 0, IRQ 19
Memory at fbe7c000 (32-bit, non-prefetchable) [size=16K]
Capabilities: <access denied>
Kernel driver in use: snd_hda_intel
and some output from nvidia-smi:
Driver Version : 310.44
Attached GPUs : 1
GPU 0000:01:00.0
Product Name : GeForce GT 610
Persistence Mode : Disabled
GPU UUID : GPU-bda18178-2424-4734-8dd3-d2915077b96f
VBIOS Version : 75.19.55.00.02
PCI
Bus : 0x01
Device : 0x00
Domain : 0x0000
Device Id : 0x104A10DE
Bus Id : 0000:01:00.0
Sub System Id : 0x840D1043
Memory Usage
Total : 1023 MB
Used : 211 MB
Free : 812 MB
Compute Mode : Default
Any advice on how I can modify my scripts to make sure both cores get the higher minimum frequency for playback?
Current scripts:
/usr/local/bin/frequencydown.sh
#!/bin/bash
echo 800000 | sudo tee /sys/devices/system/cpu/cpu0/cpufreq/scaling_min_freq > /dev/null
echo 800000 | sudo tee /sys/devices/system/cpu/cpu1/cpufreq/scaling_min_freq > /dev/null
/usr/local/bin/frequencyup.sh
#!/bin/bash
echo 2300000 | sudo tee /sys/devices/system/cpu/cpu0/cpufreq/scaling_min_freq > /dev/null
echo 2300000 | sudo tee /sys/devices/system/cpu/cpu1/cpufreq/scaling_min_freq > /dev/null
and /etc/sudoers.d/cpufrequency_video_playback
mythuser ALL=(ALL) PASSWD: ALL, NOPASSWD: /usr/bin/tee /sys/devices/system/cpu/cpu0/cpufreq/scaling_min_freq
mythuser ALL=(ALL) PASSWD: ALL, NOPASSWD: /usr/bin/tee /sys/devices/system/cpu/cpu1/cpufreq/scaling_min_freq
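For what it's worth, one way to guarantee every core gets the new minimum is to loop over whatever cpufreq actually exposes instead of hard-coding cpu0/cpu1. A sketch (the set_min_freq name and the SYSFS_ROOT/SUDO overrides are mine, added so the loop can be dry-run against a scratch directory without root; with the sudoers entries above you'd keep the default SUDO=sudo):

```shell
#!/bin/bash
# set_min_freq <kHz>: write the given minimum to every core's cpufreq node.
# SYSFS_ROOT and SUDO are overridable, e.g. for testing without root.
SYSFS_ROOT="${SYSFS_ROOT:-/sys/devices/system/cpu}"
SUDO="${SUDO-sudo}"

set_min_freq() {
    local khz="$1" f
    for f in "$SYSFS_ROOT"/cpu[0-9]*/cpufreq/scaling_min_freq; do
        [ -e "$f" ] || continue
        echo "$khz" | $SUDO tee "$f" > /dev/null
    done
}

# frequencyup.sh then reduces to:   set_min_freq 2300000
# frequencydown.sh then reduces to: set_min_freq 800000
```

Note that the sudoers entries match tee's arguments literally, so on a machine with more cores each extra scaling_min_freq path would need its own NOPASSWD line for this to stay password-free.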
Cheers!
Marius