Tuesday, January 15, 2013

Multi-homed MythTV backend

This is the first of my non-OpenWRT-related posts.

I have a peculiar situation.  I split my life between two houses.  I don't want to pay for cable in both houses, but I also want access to some TV shows whose newer episodes are only available on cable and not via Netflix/Hulu/Amazon.

So to solve this, I have set up MythTV in both houses.  Both setups run Mythbuntu 12.04.1 (www.mythbuntu.org).  In the house without cable I have a SiliconDust HDHomeRun.


The HDHomeRun is set to record OTA (over-the-air) content exclusively.  I get all the basic TV networks this way in that house.

In the second house I have a SiliconDust HDHomeRun Prime.  I'm fortunate that in that house I have Comcast, which is one of the friendlier companies with regard to copy protection.  I'm able to record pretty much everything except premium networks like HBO and Showtime.
 

In the second house I duplicate all the OTA recording schedules but also schedule things on the other stations I care about.

Secondary House

Now to get the recordings from the second house to the first house, I had to set up a collection of scripts that run as staggered cron jobs.  The first one runs at the secondary house.  I first determined which channel IDs I needed to export.  These can be found by looking through MythWeb or by examining the database with a tool like phpMyAdmin; look for 'chanid' in the 'channel' table.
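
You can also pull the channel list straight out of MySQL.  A minimal sketch, assuming the default mythconverg database name and the mythtv database user:

mysql -u mythtv -p mythconverg -e "SELECT chanid, channum, name FROM channel ORDER BY channum;"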

Once I had those channels figured out, I created the script below.  On each daily run it finds all recordings on those channels whose starttime falls after the previous export, and dumps all the relevant SQL data about them into a flat SQL file that can be imported at the primary house.

Next it creates a list of recordings that need to be synced over to the primary house.  The list is based on the last successful sync date reported back by the primary house: it covers every recording between the lastrun date and "tomorrow".  That way, if the primary backend misses a day for some reason, the next sync still picks everything up.  It's also important to only cover a handful of days, because autoexpire will be happening at different rates on the two backends.

I store all this content in the directory /var/lib/mythtv/exported-sql

sudo mkdir -p /var/lib/mythtv/exported-sql 
sudo chown mythtv:mythtv /var/lib/mythtv/exported-sql

/home/mythtv/export.sh 

#!/bin/sh

# Channels (chanid values) to export, and where the SQL dumps go
chanid="2657 2659 2622"
DIRECTORY=/var/lib/mythtv/exported-sql/

# Date of the most recent export, recovered from the newest exported-*.sql file
BIGGEST_DATE=`find $DIRECTORY -maxdepth 1 -name '*.sql' | sort -r | head -1 | sed 's,.*exported-,,; s,\.sql,,'`

# Build a WHERE clause covering every channel, limited to recordings newer
# than the last export (when one exists)
for chan in $chanid; do
    [ -n "$where" ] && where="$where or"
    [ -n "$BIGGEST_DATE" ] && date=" and starttime > '$BIGGEST_DATE 00:00:00'"
    where="$where (chanid='$chan'$date)"
done

# Pull the database credentials out of the MythTV config, then dump only the
# recording metadata tables, data only (no CREATE statements)
CONFIG=$HOME/.mythtv/config.xml
DATE=`date '+%F'`
if [ "$DATE" = "$BIGGEST_DATE" ]; then
    echo "Already ran today, not running SQL generation again"
else
    db=`xpath -q -e 'string(//DatabaseName)' $CONFIG 2>/dev/null`
    user=`xpath -q -e 'string(//UserName)' $CONFIG 2>/dev/null`
    pass=`xpath -q -e 'string(//Password)' $CONFIG 2>/dev/null`
    host=`xpath -q -e 'string(//Host)' $CONFIG 2>/dev/null`
    fname=/var/lib/mythtv/exported-sql/exported-$DATE.sql
    mysqldump -h$host -u$user -p$pass $db recorded recordedseek recordedrating recordedprogram recordedmarkup recordedcredits --where="$where" --no-create-db --no-create-info > $fname
fi


# Generate the list of recording files the primary house should pull.  The
# window runs from the last successful sync date (written back by the
# primary house) through tomorrow.
lastrun=/home/mythtv/lastrun
tomorrow=$(date --date="tomorrow" '+%Y%m%d')
if [ -f $lastrun ]; then
    tmp=$(cat $lastrun)
else
    tmp=$tomorrow
fi
while [ "$tmp" != "$tomorrow" ]; do
    test_dates="$test_dates $tmp"
    tmp=$(date --date="1 day $tmp" '+%Y%m%d')
done
test_dates="$test_dates $tomorrow"

# Recording files are named <chanid>_<starttime>*, so glob for every
# channel/date combination
for date in $test_dates; do
    for chan in $chanid; do
        from="$from /var/lib/mythtv/recordings/${chan}_${date}*"
    done
done
ls $from 2>/dev/null | tee /var/lib/mythtv/exported-sql/recordings-list
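
To make that concrete: with the channels above and a previous export on 2013-01-14, the generated WHERE clause would be

(chanid='2657' and starttime > '2013-01-14 00:00:00') or (chanid='2659' and starttime > '2013-01-14 00:00:00') or (chanid='2622' and starttime > '2013-01-14 00:00:00')

and recordings-list ends up holding absolute file paths, something like /var/lib/mythtv/recordings/2657_20130114030000.mpg, one per line.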

Next I set up rsync to export my /var/lib/mythtv directory (read-only), plus a module the primary house can write the last successful run date back into (write-only).

/etc/rsyncd.conf

max connections = 2
log file = /var/log/rsync.log
timeout = 300

[mythtv]
comment = mythtv
path = /var/lib/mythtv
read only = yes
list = yes
uid = nobody
gid = nogroup
auth users = rsync
secrets file = /etc/rsyncd.secrets

[lastrun]
comment = last rsync run
path = /home/mythtv/
write only = yes
read only = no
list = no
uid = mythtv
gid = mythtv
auth users = rsync
secrets file = /etc/rsyncd.secrets
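
The 'auth users' lines mean the daemon also needs a secrets file.  Its format is user:password, one entry per line, and rsync will refuse to use it unless it's unreadable by other users.  Something like this, with a placeholder password:

sudo sh -c 'echo "rsync:SECRETPASSWORD" > /etc/rsyncd.secrets'
sudo chmod 600 /etc/rsyncd.secrets

On Ubuntu the standalone rsync daemon is also disabled by default; set RSYNC_ENABLE=true in /etc/default/rsync and run 'sudo service rsync start'.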

The last thing to do at the secondary house is to set up the crontab to run at night.  I set it for 11:05 PM every day; it should only take 10-15 seconds to run.

5 23 * * * /home/mythtv/export.sh
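
This belongs in the mythtv user's crontab, since the script reads $HOME/.mythtv/config.xml and writes into mythtv-owned directories:

sudo crontab -u mythtv -e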

Because of the way this all works, I've decided to leave my secondary house backend on all the time.

Primary House

Now at my primary house I need to sync the recordings and SQL data, and then update the last successful run date at the secondary house.

/home/mythtv/import.sh

#!/bin/sh

domain=rsync@address
sql_directory=/home/mythtv/sql
recordings_directory=/var/lib/mythtv/recordings
password_file=/home/mythtv/password-file
lastrun=/home/mythtv/lastrun

sync_recordings()
{
    today=$(date '+%Y%m%d')
    # Rewrite the paths in recordings-list into rsync module URLs
    from=$(cat $sql_directory/recordings-list | sed "s,/var/lib/mythtv,$domain::mythtv,")
    # Keep retrying while rsync exits with 30 (timeout in data send/receive)
    RET=30
    while [ $RET -eq 30 ]; do
        rsync -av --partial --timeout 120 --progress $from --password-file=$password_file $recordings_directory
        RET=$?
    done
    echo "rsync return code: $RET"
    [ $RET -ne 0 ] && exit 1
    # Record the successful sync date and push it back to the secondary house
    echo "$today" > $lastrun
    rsync -avz --password-file=$password_file $lastrun $domain::lastrun/
}

sync_sql()
{
    rsync -avz $domain::mythtv/exported-sql/* --password-file=$password_file $sql_directory
}

insert_sql()
{
    CONFIG=$HOME/.mythtv/config.xml
    db=`xpath -q -e 'string(//DatabaseName)' $CONFIG 2>/dev/null`
    user=`xpath -q -e 'string(//UserName)' $CONFIG 2>/dev/null`
    pass=`xpath -q -e 'string(//Password)' $CONFIG 2>/dev/null`
    host=`xpath -q -e 'string(//Host)' $CONFIG 2>/dev/null`
    # The dumps carry the secondary backend's hostname; rewrite it to the
    # primary's, and use INSERT IGNORE so a re-run doesn't duplicate rows
    old_host=supermario
    new_host=kingkoopa
    for fname in $(find $sql_directory -maxdepth 1 -name '*.sql'); do
        if [ ! -f ${fname}.imported ]; then
            sed "s,${old_host},${new_host},g; s,INSERT INTO,INSERT IGNORE INTO,g" $fname > ${fname}.imported
            mysql --host=$host --user=$user -p$pass $db < ${fname}.imported
        fi
    done
}

suspend()
{
    sudo /usr/local/bin/setwakeup.sh $(date --date='18:00' '+%s')
    sudo pm-suspend
}

mythfilldatabase
sync_sql
sync_recordings
insert_sql
#suspend
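
The client-side password-file referenced above is simpler than the daemon's secrets file: it contains just the bare password, no username, and again rsync insists it not be readable by other users:

echo "SECRETPASSWORD" > /home/mythtv/password-file
chmod 600 /home/mythtv/password-file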

I'm careful about the order I do things in.  The SQL has to be inserted last, so that if the recordings fail to sync (or haven't all synced yet) I don't end up with database entries for recordings that aren't actually on disk while I'm watching.
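
If you wanted to be extra defensive, a hypothetical guard at the top of insert_sql could confirm that every file in recordings-list actually made it over before touching the database:

# hypothetical guard, not part of my script
for f in $(cat $sql_directory/recordings-list); do
    [ -f "$f" ] || { echo "missing $f, skipping SQL import"; exit 1; }
done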

I currently don't suspend afterwards due to some instability on my system, but I have been experimenting with that too.  If S3 is reliable on your hardware, you can configure the setup to suspend after the sync is done and wake up the next time you need to use it or record from it.
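
I haven't shown setwakeup.sh here, but a minimal sketch of that kind of script, assuming the kernel exposes the usual RTC wakealarm interface, would be:

#!/bin/sh
# clear any previously set alarm, then arm the RTC with the wakeup
# time passed in as seconds since the epoch
echo 0 > /sys/class/rtc/rtc0/wakealarm
echo $1 > /sys/class/rtc/rtc0/wakealarm

Since suspend() calls both it and pm-suspend via sudo from a cron job, the mythtv user also needs passwordless sudo for those two commands.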

I set the cron job to run at midnight every day on the primary backend.

0 0 * * * /home/mythtv/import.sh