[mythtv-users] Little OT: RealTime Parallel Multi-PC Transcoding

belcampo belcampo at zonnet.nl
Wed Jun 16 16:38:39 UTC 2010


Raymond Wagner wrote:
> 
>>> You would want the video cuts to be tens of seconds long to reduce 
>>> data and startup overhead.
>> Could you be more specific on this ?
> 
> Starting up processes is expensive.  The startup and teardown of 
> something complex like x264 is probably going to be a couple seconds.  
> If you're only going to be encoding chunks of a couple seconds long, 
> that is a huge amount of overhead.
The original whole-file transcode takes 262 seconds for 1523 frames.
The 34-chunk transcode takes 254 seconds, but only 1400 frames remain.
However counterintuitive it may look, on encode time per frame the chunk
approach comes out about 5% faster.
Here is what happens:
Split bbchd.ts every 2 seconds with tsMuxeR, giving 34 chunks.
Join those 34 chunks straight back together with tsMuxeR into a new
bbchdnew.ts and I get a perfectly playing file with exactly the same
number of frames.

Every individual file is correct and complete. Playing the individual 
files with a PopCornHour is 100% glitch-free.

Encoding the 34 chunks with ffmpeg and then stitching them together again
with tsMuxeR results in a slightly unsmooth file, because of the missing
frames, plus four small glitches/blocky spots.

Conclusion so far:
No indication of a huge amount of overhead because of the chunks.
Running it on 6 cores it completes in 44 seconds, i.e. 262/44 = roughly
595% of the whole-file speed. Correcting for the 8.78% missing frames
leaves a total score of about 547%, or roughly 91% per core, which is
much better than any multithreading approach I have seen so far.
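
Spelled out, the arithmetic behind those percentages is:

# Quick check of the percentages above, using the numbers from this post.
whole_time, whole_frames = 262.0, 1523        # whole-file transcode
chunk_time_6cores, chunk_frames = 44.0, 1400  # 34 chunks on 6 cores

speedup = whole_time / chunk_time_6cores            # ~5.95 -> "595%"
corrected = speedup * chunk_frames / whole_frames   # ~5.47 -> "547%"
per_core = corrected / 6                            # ~0.91 -> "91% per core"
print(f"{speedup:.0%} raw, {corrected:.0%} frame-corrected, {per_core:.0%} per core")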

But ffmpeg should be able to pick up every chunk as 100% valid, which it
doesn't seem to do.
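
One way to see where the ~120 frames disappear would be to count the
decodable video frames per chunk before and after the ffmpeg pass, e.g.
with ffprobe (a sketch, again assuming the placeholder chunk_*.ts names):

# Count decodable video frames in each chunk with ffprobe, so the chunks
# where frames go missing after encoding can be pinpointed.
import glob
import subprocess

def frame_count(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-count_frames",
         "-select_streams", "v:0",
         "-show_entries", "stream=nb_read_frames",
         "-of", "default=nokey=1:noprint_wrappers=1", path],
        capture_output=True, text=True, check=True).stdout.strip()
    return int(out)

total = 0
for chunk in sorted(glob.glob("chunk_*.ts")):
    n = frame_count(chunk)
    total += n
    print(chunk, n)
print("total frames in chunks:", total)  # compare against 1523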

Please comment/suggest/flame or whatever.

Henk


