[mythtv-users] Populating the .xmltv file for my grabber from mythtv database?

David Watkins watkinshome at gmail.com
Mon May 10 07:08:47 UTC 2021


On Sun, 9 May 2021 at 20:22, Larry Kennedy <lunchtimelarry at gmail.com> wrote:

>
>
> On Wed, May 5, 2021 at 4:19 AM David Watkins <watkinshome at gmail.com>
> wrote:
>
>>
>>
>> On Wed, 5 May 2021 at 05:06, Stephen Worthington <
>> stephen_agent at jsw.gen.nz> wrote:
>>
>>> On Tue, 4 May 2021 13:32:55 -0400, you wrote:
>>>
>>> >On Tue, May 4, 2021 at 12:58 PM David Watkins <watkinshome at gmail.com>
>>> >wrote:
>>> >
>>> >>
>>> >>
>>> >> On Sun, 2 May 2021 at 14:55, Larry Kennedy <lunchtimelarry at gmail.com>
>>> >> wrote:
>>> >>
>>> >>> I am just now attempting the upgrade to version 31, and the cutover
>>> >>> from DD to XMLTV.
>>> >>>
>>> >>> I successfully configured the sqlite grabber to pull from SD, and
>>> >>> noticed that it downloaded all of the channels on my SD lineup, even
>>> >>> though most are not selected.  I guess this is intended behavior?
>>> >>>
>>> >>> Seeing how tedious it is to mark channels as "selected" in the
>>> >>> Sqlite database, I came up with a slightly different way to transfer
>>> >>> the channels I currently care about.
>>> >>>
>>> >>> I run this query in my existing mythconverg database:
>>> >>>
>>> >>> SELECT CONCAT('update channels set selected = 1 where channum = ',
>>> >>> channum, ';') AS combined FROM channel WHERE visible = 1
>>> >>>
>>> >>> Pipe this output to a file named selected.sql; this gives you the
>>> >>> sqlite update statements that set the correct channels to "selected"
>>> >>> in the new grabber database, matching what you now have as visible
>>> >>> channels in mythtv.
>>> >>>
>>> >>> Then, just use the grabber tool to mark all the channels as "not
>>> >>> selected" by default, and then feed the file from above to sqlite
>>> >>> as shown on the xmltv wiki:
>>> >>>
>>> >>> sqlite3 $HOME/.xmltv/SchedulesDirect.DB < selected.sql
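>>> >>>
>>> >>> Putting the pieces together, the whole transfer is roughly the
>>> >>> pipeline below (the mysql login options are placeholders for
>>> >>> whatever you normally use, and clearing the selected flag in
>>> >>> sqlite first is just an alternative to deselecting everything
>>> >>> with the grabber tool):
>>> >>>
>>> >>> # dump one sqlite update statement per visible mythtv channel
>>> >>> # (add quotes around channum if yours aren't plain numbers)
>>> >>> mysql -N -B -u mythtv -p mythconverg \
>>> >>>   -e "SELECT CONCAT('update channels set selected = 1 where channum = ', channum, ';') FROM channel WHERE visible = 1" \
>>> >>>   > selected.sql
>>> >>>
>>> >>> # start with nothing selected, then apply the generated statements
>>> >>> sqlite3 $HOME/.xmltv/SchedulesDirect.DB 'update channels set selected = 0;'
>>> >>> sqlite3 $HOME/.xmltv/SchedulesDirect.DB < selected.sql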
>>> >>>
>>> >>> This seems simpler than some of the other methods I've seen on the
>>> >>> wiki and elsewhere...
>>> >>>
>>> >>> Moving on... I need some advice before I cross the Rubicon on this
>>> >>> v31 + xmltv upgrade:
>>> >>>
>>> >>> After I do the 31 upgrade, and since I only use an HDHR, I plan to
>>> >>> just blow away my channels and repopulate.  I think all the channel
>>> >>> ids will line up the same as before.
>>> >>>
>>> >>> Currently, in v30, I run mythfilldatabase as a cron job.  Should I
>>> >>> continue to do that, or switch to another method?  I've never let
>>> >>> mythtv natively run the mythfilldatabase process.  I'm having a hard
>>> >>> time following the intent of the setup-video-sources wiki page on
>>> >>> this topic.  Advice appreciated!
>>> >>>
>>> >>> Thanks,
>>> >>> Larry
>>> >>>
>>> >>>
>>> >> I have a combined BE/FE which shuts down when not being used and
>>> >> wakes up for recordings.  I run this script from cron.daily and it
>>> >> seems to work out OK.
>>> >>
>>> >> #!/usr/bin/bash
>>> >> /usr/local/bin/mythshutdown --lock
>>> >>
>>> >> tv_grab_zz_sdjson --days 10 --config-file ~/.xmltv/tv_grab_zz_sdjson.conf \
>>> >>     --output ~/sd_listing.xml 2>/dev/null
>>> >>
>>> >> /usr/local/bin/mythfilldatabase --only-update-guide --max-days 10 --file \
>>> >>     --sourceid 2 --xmlfile ~/sd_listing.xml 2>/dev/null
>>> >>
>>> >> /usr/local/bin/mythshutdown --unlock
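>>> >>
>>> >> Hooking a script like this into cron.daily is just a matter of
>>> >> dropping it into place; the file name here is made up, and note that
>>> >> on Debian-style systems run-parts skips names containing a dot, so
>>> >> leave off any .sh extension:
>>> >>
>>> >> sudo install -m 755 update_epg /etc/cron.daily/update_epg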
>>> >>
>>> >> It's not an ideal solution, because I have a low-powered ION
>>> >> motherboard with only 4GB RAM and this more or less guarantees that
>>> >> MFD will run while a recording is in progress, when I'd prefer to
>>> >> ensure it ran while the box wasn't recording.  Also, if I ever go
>>> >> more than 10 days without a scheduled recording the whole thing will
>>> >> come to a stop, because it will run out of guide data and never wake
>>> >> up to load any more.  In practice neither of those things causes a
>>> >> problem.
>>> >>
>>> >>
>>> >> Although SD returns 18 days of guide data, I've found that many
>>> >> channels just have 'boilerplate' programme information after 10 days
>>> >> or so, so I limit both SD and MFD to 10.
>>> >>
>>> >>
>>> >I attempted an upgrade over the weekend, but ended up rolling it back.
>>> >Kudos to the myth devs that created the backup and restore scripts!
>>> >
>>> >I'm still on v30 with Schedules Direct DataDirect and a cron job to run
>>> >mythfilldatabase.  To get to v31 and the new xmltv grabber, I decided to
>>> >do the XMLTV upgrade now, followed by an upgrade to v31, say, a week
>>> >later.  I chose this path since it appears that v31 would force the same
>>> >upgrade of XMLTV, and I wanted to decouple these two things.  If I
>>> >approached this incorrectly, please let me know.
>>> >
>>> >I ran the xmltv install, then followed the wiki to configure the sqlite
>>> >grabber.  This all appeared to work, as I was able to see something like
>>> >1,000 rows of channel data in the sqlite database.  As indicated in this
>>> >thread, I created a simple way to set the right channels as "selected".
>>> >
>>> >After that, I went into mythtv-setup, channel editor, and deleted all my
>>> >channels.  I only have one source, which is my HDHR.
>>> >
>>> >Then, I added a new video source that matches the name of the grabber
>>> >config, set this up as Multinational, and done.
>>> >
>>> >Then I went into the other setup screen where I map the video source to
>>> >my HDHR tuners.  I forget the name of that one.  Capture cards?
>>> >
>>> >After this, I ran mythfilldatabase.  Looking at mythweb, I could see it
>>> >populated the channels back into the database, but I am pretty sure it
>>> >put in more than the ones I marked as "selected" in the Sqlite database.
>>> >Strange.  The listings started to appear, but VERY slowly.
>>> >
>>> >Mythfilldatabase ran for like 4+ hours, and since I was getting close to
>>> >the time I needed to finish or bail out, I killed it.  I then tried
>>> >running the script that does 3 days at a time, and this ran to full
>>> >completion right away, but never fully populated my listings.  I still
>>> >had major gaps in the listings.  Some were there and some never showed
>>> >up.
>>> >
>>> >Any advice before I give this another go this coming weekend?  Is my
>>> >strategy flawed?  Did I miss a step?  How best to run mythfilldatabase
>>> >so that it doesn't take all day?  Is it the 3 days at a time script?
>>> >
>>> >Larry
>>>
>>> As I understand it (not being an SD user), the first time you run
>>> mythfilldatabase after installing the new setup, it does take hours to
>>> run.  Subsequent updates need to be limited to only a few days to make
>>> the time reasonable.  This has been discussed here several times, and
>>> also on the forum, so a search here:
>>>
>>> https://lists.archive.carbon60.com/mythtv/users/
>>>
>>> and
>>>
>>> here:
>>>
>>> https://forum.mythtv.org/
>>>
>>> should help.
>>>
>>> Unfortunately, it seems that Google searches these days are not
>>> finding the mailing list references much, unless you have a very
>>> specific keyword.
>>>
>>> Looking on as a disinterested party, I have never understood why the
>>> SD json EPG requires such long times to run.  The actual process of
>>> putting the EPG data from my XMLTV generated sources into the database
>>> takes less than a minute.  I have two sources, 141 channels and one
>>> week of EPG data.
>>
>>
>>
>> When I configured xmltv for the Schedules Direct lineup at my location it
>> picked up about 140 channels, which is consistent with what MythTV picks up
>> with a channel scan and puts into the database.
>>
>> In MythTV I've marked all but 51 channels as 'invisible', and I've
>> disabled those channels in xmltv as well, by replacing the '=' with a '!'
>> in the relevant line of the xmltv config file.
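>>
>> So the relevant lines of the grabber config end up looking something
>> like this (channel ids made up for illustration, and the exact id format
>> depends on the grabber) -- the first channel is still fetched, the second
>> is skipped:
>>
>> channel=I10001.json.schedulesdirect.org
>> channel!I10002.json.schedulesdirect.org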
>>
>> I have pretty low-powered hardware (Zotac ION / 4GB RAM).  Running the
>> xmltv grab for 10 days takes 2 or 3 minutes to grab the file.  Running
>> mythfilldatabase on that file takes about 20 minutes the first time (when
>> the guide database table is empty) and about 5 minutes when it only has to
>> update the 10th day (assuming that no earlier entries have changed).
>>
>> So my daily mythfilldatabase run takes about 7 minutes. Reducing the
>> number of channels from 140 to 51 helped, as did reducing the days from 18
>> to 10.
>>
>> I only have one data source.  When I had two, I ran xmltv once to grab
>> the data, but I had to run mythfilldatabase twice (once for each source).
>> This slowed things down somewhat.
>>
>> I have to say that working out the xmltv IDs for each channel is not
>> trivial, and there can be maintenance to do after a channel rescan.  I
>> have a couple of SQL queries and an Excel spreadsheet to help keep track
>> of things, and I've seen people post similar schemes here and on the
>> wiki.  I believe that there's work in hand to preserve xmltvid, icon name
>> and other channel parameters after a rescan, so things should get a bit
>> easier sometime.
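>>
>> As an example of the kind of check involved (not my exact query, but it
>> shows the idea), this lists visible channels that have lost their
>> xmltvid after a rescan:
>>
>> SELECT channum, callsign, name, xmltvid
>> FROM channel
>> WHERE visible = 1 AND (xmltvid IS NULL OR xmltvid = '')
>> ORDER BY channum;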
>>
>> HTH
>>
>> D
>>
>
> I'm attempting the XMLTV upgrade again today.  For background, my myth
> backend is a virtual machine with 8GB RAM and 4 vCPUs on an i7-9700K host.
> I ran the script to optimize the database, just in case.
>
> My number one issue is the length of time it is taking mythfilldatabase to
> run.  I've only got 125 channels in my lineup with one source.  I'm running
> the script that chunks the work into 3-day increments, as seen on the
> mythfilldatabase wiki.
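>
> For reference, the idea of the chunking is roughly the loop below.  This
> is not the wiki script verbatim; the sourceid and config file name are
> placeholders, and it assumes the grabber supports the usual
> --days/--offset options:
>
> #!/bin/bash
> # grab and load the guide data in 3-day chunks
> for offset in 0 3 6 9 12 15; do
>     tv_grab_zz_sdjson_sqlite --days 3 --offset $offset \
>         --config-file ~/.xmltv/tv_grab_zz_sdjson_sqlite.conf \
>         --output /tmp/guide_chunk.xml
>     mythfilldatabase --only-update-guide --file --sourceid 1 \
>         --xmlfile /tmp/guide_chunk.xml
> done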
>
> The first 3-day chunk took 70 minutes.  This seems quite long to me, but
> at least it seems to be working -- the gaps in my listings are disappearing.
>
> Should I expect this to be the case every day?  I'm thinking seven 3-day
> chunks of 70 minutes each will take a total of ~8 hours.
>
> Larry
>


That seems an extraordinarily long time.  As I said, my 10-day chunk of 50
channels takes about 25 minutes on way less hardware.

Can you find out whether it's the xmltv stage or the mythfilldatabase stage
which is taking the time?  Maybe look in the logs, or run them separately
by hand.
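
Something along these lines, run by hand, would show where the time goes
(the grabber config name and sourceid below are just placeholders for yours):

time tv_grab_zz_sdjson_sqlite --days 3 \
    --config-file ~/.xmltv/tv_grab_zz_sdjson_sqlite.conf \
    --output /tmp/sd_listing.xml

time mythfilldatabase --only-update-guide --max-days 3 --file \
    --sourceid 1 --xmlfile /tmp/sd_listing.xml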

D