[mythtv-users] Re: Tearing in playback

Bruce Markey bjm at lvcm.com
Fri Jan 7 21:20:56 EST 2005


Doug Larrick wrote:
> William Uther wrote:
> 
>> Quick question: how does RTC sync work?
>>
>> RTC stands for Real Time Clock, right?
>> I assume that the frequency has to be higher than the total number of 
>> horizontal lines in your X modeline.  If myth then got a known sync 
>> signal at any point in the image display, it could use that to find the 
>> blanking period.  Do you know at what point in the frame the sync 
>> signal is sent and how myth gets it?
> 
> 
> Actually, the RTC vsync method is not one that synchronizes to the 
> vertical retrace... you have to use /dev/nvidia0, OpenGL, or DRI for 
> that.  It simply uses the RTC's programmable periodic interrupt to sleep 
> for an accurate delay until it's time to display the next frame.  That 
> said, it 
> uses less CPU than "usleep with busy wait," so you are more likely to 
> get accurate frame timing.  There's still a possibility of tearing, but 
> it will most likely either consistently be OK, or consistently tear (at 
> which point you'll probably get fed up and start playback over).
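
(Roughly, the RTC path Doug describes boils down to the sketch below.
This is not MythTV's actual code; rtc_timing_init() and wait_for_frame()
are made-up names, and the delay is slept away on /dev/rtc periodic
interrupts at 1024Hz.)

  /* Sketch only, not MythTV source: sleep on /dev/rtc periodic
   * interrupts until it is time to show the next frame. */
  #include <fcntl.h>
  #include <stdint.h>
  #include <sys/ioctl.h>
  #include <sys/time.h>
  #include <unistd.h>
  #include <linux/rtc.h>

  static int rtc_fd = -1;

  /* Open the RTC and enable periodic interrupts; rates above 64Hz may
   * need root or a raised /proc/sys/dev/rtc/max-user-freq. */
  int rtc_timing_init(unsigned long hz)
  {
      rtc_fd = open("/dev/rtc", O_RDONLY);
      if (rtc_fd < 0)
          return -1;
      if (ioctl(rtc_fd, RTC_IRQP_SET, hz) < 0 ||   /* e.g. hz = 1024 */
          ioctl(rtc_fd, RTC_PIE_ON, 0) < 0) {
          close(rtc_fd);
          rtc_fd = -1;
          return -1;
      }
      return 0;
  }

  static int64_t now_usec(void)
  {
      struct timeval tv;
      gettimeofday(&tv, NULL);
      return (int64_t)tv.tv_sec * 1000000 + tv.tv_usec;
  }

  /* Block until 'deadline' (usec).  Each read() sleeps until the next
   * RTC tick, so the wakeup can only land on a 1/1024-second boundary. */
  void wait_for_frame(int64_t deadline)
  {
      unsigned long data;
      while (now_usec() < deadline)
          read(rtc_fd, &data, sizeof(data));
  }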

Good explanation, Doug. A couple of things. Busy wait only uses
~6% of the CPU time with 100Hz timeslices (6% of the time regardless
of the CPU speed ;-). At 1024Hz it's ~0.6%. The RTC can only wake
on the next 1/1024th-of-a-second boundary, whereas busy wait sleeps
until the delay is close to zero and then spins the rest of the way.
As it stands right now, busy wait is more accurate. However, without
retrace info, either of these is subject to periods of rapid-fire
jitter.
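
For comparison, "usleep with busy wait" amounts to roughly this (again
just a sketch, not the actual code; busy_wait_until() is a made-up name
and 10ms stands in for one 100Hz timeslice):

  /* Sketch only: sleep in coarse chunks while more than one timeslice
   * remains, then burn CPU for the last stretch. */
  #include <stdint.h>
  #include <sys/time.h>
  #include <unistd.h>

  static int64_t now_usec(void)
  {
      struct timeval tv;
      gettimeofday(&tv, NULL);
      return (int64_t)tv.tv_sec * 1000000 + tv.tv_usec;
  }

  void busy_wait_until(int64_t deadline)
  {
      /* On a 100Hz kernel, usleep() can overshoot by roughly one 10ms
       * timeslice, so stop calling it while more than that remains. */
      while (deadline - now_usec() > 10000)
          usleep(1000);

      /* Spin the rest of the way; this is where the extra CPU goes,
       * but the wakeup lands almost exactly on the deadline. */
      while (now_usec() < deadline)
          ;
  }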

--  bjm


