Google Docs Backup with GDocBackup

I was searching for a simple way to make local backups of my Google Docs and found gdocbackup on Google Code. According to the project docs, it runs on Windows and on Linux (with Mono), so I tried it on both Windows 7 and Ubuntu 9.10 Desktop.

On the Windows 7 PC, I downloaded the installer from the Google Code project page, installed the application, and ran it. After configuring the backup directory and the export formats for the documents, I ran the backup and it worked fine.

Running it on Ubuntu took a bit more setup since I did not have Mono installed. First I installed the required Mono packages.

sudo apt-get install mono-runtime libmono-winforms2.0-cil mono-devel

The mono-devel package provides the mozroots utility, which is needed to install a certificate required to access Google Docs (see http://gs.fhtino.it/gdocbackup/faq).

Next I imported the Mozilla root certificates into Mono (see http://manpages.ubuntu.com/manpages/intrepid/man1/mozroots.1.html).

mozroots --import --sync

I downloaded GDocBackup_0.4.9.71_BIN.zip from the gdocbackup project’s Downloads page and extracted it to a GDocBak directory I created in my home directory. I also created a Data directory under GDocBak to hold the backup files.
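
Assuming the ZIP was saved to ~/Downloads (a hypothetical path; adjust to taste), those steps in a terminal were roughly:

mkdir -p ~/GDocBak/Data
cd ~/GDocBak
unzip ~/Downloads/GDocBackup_0.4.9.71_BIN.zip

Then, from the GDocBak directory, I ran GDocBackup.exe in Mono.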

mono ./GDocBackup.exe

At this point, the GDocBackup application worked the same as on Windows 7. It looks a little different, but it downloaded the documents without errors. Now I just need to automate the backups.

Pair Networks Database Backup Automation

I have a couple of WordPress blogs, this being one of them, hosted at Pair Networks. I also have another non-blog site that uses a MySQL database. I had been backing up the databases manually through Pair’s Account Control Center (ACC) web interface on a somewhat regular basis, but it was bugging me that I hadn’t automated it. I finally got around to doing so.

A search led to this blog post by Brad Trupp. He describes how to set up an automated database backup on a Pair Networks host. I used “technique 2” from his post as the basis for the script I wrote.

Automating the Backup on the Pair Networks Host

First I connected to my assigned server at Pair Networks using SSH (I use PuTTY for that). There was already a directory named backup in my home directory where the backups done through the ACC were written. I decided to use that directory for the scripted backups as well.

In my home directory I created a shell script named dbbak.sh.

touch dbbak.sh

Since the script will contain database passwords, its permissions should be set to make it private as well as executable.

chmod 700 dbbak.sh

I used the nano editor to write the script.

nano -w dbbak.sh

The script stores the current date and time (formatted as YYYYmmdd_HHMM) in a variable and then runs the mysqldump utility to create the database backups. The resulting backup files are simply SQL text that will recreate the objects in a MySQL database and insert the data. The shell script I use backs up three different MySQL databases, so the example below does the same.

#!/bin/sh

dt=`/bin/date +%Y%m%d_%H%M`

/usr/local/bin/mysqldump -hDBHOST1 -uDBUSERNAME1 -pDBPASSWORD1 USERNAME_DBNAME1 > /usr/home/USERNAME/backup/dbbak_${dt}_DBNAME1.sql

/usr/local/bin/mysqldump -hDBHOST2 -uDBUSERNAME2 -pDBPASSWORD2 USERNAME_DBNAME2 > /usr/home/USERNAME/backup/dbbak_${dt}_DBNAME2.sql

/usr/local/bin/mysqldump -hDBHOST3 -uDBUSERNAME3 -pDBPASSWORD3 USERNAME_DBNAME3 > /usr/home/USERNAME/backup/dbbak_${dt}_DBNAME3.sql

Replace these placeholder tags in the example above with your database and account details (a filled-in example follows the list):

  • DBHOSTn is the database server, such as db24.pair.com.
  • DBUSERNAMEn is the full access username for the database.
  • DBPASSWORDn is the password for that database user.
  • USERNAME_DBNAMEn is the full database name that has the account user name as the prefix.
  • USERNAME is the Pair Networks account user name.
  • DBNAMEn is the database name without the account user name prefix.
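
For example, with made-up details (account user bill, a database named blog1 on db24.pair.com, a full-access database user dbuser1, and a bogus password), the first mysqldump line would look like this:

/usr/local/bin/mysqldump -hdb24.pair.com -udbuser1 -pMYSECRET bill_blog1 > /usr/home/bill/backup/dbbak_${dt}_blog1.sql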

Once the script was written and tested manually on the host, I used the ACC (Advanced Features / Manage Cron jobs) to set up a cron job to run the script daily at 4:01 AM.
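
For reference, the crontab entry behind that ACC schedule is equivalent to something like this (same USERNAME placeholder as above):

1 4 * * * /usr/home/USERNAME/dbbak.sh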

Automating Retrieval of the Backup Files

It was nice having the backups running daily without any further work on my part but, if I wanted a local copy of the backups, I still had to download them manually. Though FileZilla is easy to use, downloading files via FTP seemed like a prime candidate for automation as well. I turned to Python for that. Actually I turned to an excellent book that has been on my shelf for a few years now, Foundations of Python Network Programming by John Goerzen. Using the ftplib examples in the book as a foundation, I created a Python script named getdbbak.py to download the backup files automatically.

#!/usr/bin/env python
# getdbbak.py

from ftplib import FTP
from datetime import datetime
from DeleteList import GetDeleteList
import os, sys
import getdbbak_email

logfilename = 'getdbbak-log.txt'
msglist = []

def writelog(msg):
    scriptdir = os.path.dirname(sys.argv[0])
    filename = os.path.join(scriptdir, logfilename)
    logfile = open(filename, 'a')
    logfile.write("%s\n" % msg)
    logfile.close()

def say(what):
    print what
    msglist.append(what)
    writelog(what)

def retrieve_db_backups():
    host = sys.argv[1]
    username = sys.argv[2]
    password = sys.argv[3]
    local_backup_dir = sys.argv[4]
    
    say("START %s" % datetime.now().strftime('%Y-%m-%d %H:%M'))
    say("Connect to %s as %s" % (host, username))

    f = FTP(host)
    f.login(username, password)

    ls = f.nlst("dbbak_*.sql")
    ls.sort()
    say("items = %d" % len(ls))
    for filename in ls:
        local_filename = os.path.join(local_backup_dir, filename)
        if os.path.exists(local_filename):
            say("(skip) %s" % local_filename)
        else:
            say("(RETR) %s" % local_filename)
            local_file = open(local_filename, 'wb')
            f.retrbinary("RETR %s" % filename, local_file.write)
            local_file.close()
            
    # The date in each file name begins right after the "dbbak_" prefix,
    # at character position 6. Keep the 5 newest daily backups, plus one
    # per week for 6 weeks and one per month for 4 months before that.
    date_pos = 6
    keep_days = 5
    keep_weeks = 6
    keep_months = 4
    del_list = GetDeleteList(ls, date_pos, keep_days, keep_weeks, keep_months)
    if len(del_list) > 0:
        if len(ls) - len(del_list) >= keep_days:
            for del_filename in del_list:
                say("DELETE %s" % del_filename)
                f.delete(del_filename)
        else:
            say("WARNING: GetDeleteList failed sanity check. No files deleted.")
    
    f.quit()
    say("FINISH %s" % datetime.now().strftime('%Y-%m-%d %H:%M'))
    getdbbak_email.SendLogMessage(msglist)


if len(sys.argv) == 5:
    retrieve_db_backups()
else:
    print 'USAGE: getdbbak.py Host User Password LocalBackupDirectory'

This script runs via cron on a PC running Ubuntu 8.04 LTS that I use as a local file/Subversion/Trac server. The script does a bit more than just download the files: it deletes older files from the host based on rules for the number of days, weeks, and months to keep, writes messages to a log file, and sends an email with the current session’s log entries.

To set up the cron job in Ubuntu I opened a terminal and ran the following command to edit the crontab file:

crontab -e

The crontab file specifies commands to run automatically at scheduled times. I added an entry to the crontab file that runs a script named getdbbak.sh at 6 AM every day. Here is the crontab file:

MAILTO="" 

# m h dom mon dow command 

0 6 * * * /home/bill/GetDbBak/getdbbak.sh 

The first line prevents cron from emailing the output of the commands it runs. The getdbbak.py script sends its own email, so I don’t need one from cron. I can always enable the cron email later if I need that output to debug a failing script.

Here is the getdbbak.sh shell script that is executed by cron:

#!/bin/bash 

/home/bill/GetDbBak/getdbbak.py FTP.EXAMPLE.COM USERNAME PASSWORD /mnt/data2/files/Backup/PairNetworksDb 

This shell script runs the getdbbak.py Python script and passes the FTP login credentials and the destination directory for the backup files as command line arguments.

As I mentioned, the getdbbak.py script deletes older files from the host based on rules. The call to GetDeleteList returns a list of files to delete from the host. That function is implemented in a separate module, DeleteList.py:

#!/usr/bin/env python
# DeleteList.py

from datetime import datetime
import KeepDateList


def GetDateFromFileName(filename, datePos):
    """Expects filename to contain a date in the format YYYYMMDD starting 
       at position datePos.
    """   
    try:
        yr = int(filename[datePos : datePos + 4])
        mo = int(filename[datePos + 4 : datePos + 6])
        dy = int(filename[datePos + 6 : datePos + 8])
        dt = datetime(yr, mo, dy)
        return dt
    except ValueError:
        # The file name does not contain a valid date.
        return None
 

def GetDeleteList(fileList, datePos, keepDays, keepWeeks, keepMonths):
    dates = []
    for filename in fileList:
        dt = GetDateFromFileName(filename, datePos)
        if dt is not None:
            dates.append(dt)
    keep_dates = KeepDateList.GetDatesToKeep(dates, keepDays, keepWeeks, keepMonths)
    del_list = []
    for filename in fileList:
        dt = GetDateFromFileName(filename, datePos)
        if (dt is not None) and (dt not in keep_dates):
            del_list.append(filename)
    return del_list

That module in turn uses the function GetDatesToKeep, defined in the module KeepDateList.py, to decide which files to keep in order to maintain the desired days, weeks, and months of backup history. If a file’s name contains a date that’s not in the list of dates to keep, it goes in the list of files to delete.

#!/usr/bin/env python
# KeepDateList.py

from datetime import datetime


def ListHasOnlyDates(listOfDates):
    dt_type = type(datetime(2009, 11, 10))
    for item in listOfDates:
        if type(item) != dt_type:
            return False
    return True
    

def GetUniqueSortedDateList(listOfDates):
    if len(listOfDates) < 2:
        return listOfDates
    listOfDates.sort()
    result = [listOfDates[0]]
    last_date = listOfDates[0].date()
    for i in range(1, len(listOfDates)):
        if listOfDates[i].date() != last_date:
            last_date = listOfDates[i].date()
            result.append(listOfDates[i])
    return result
    
    
def GetDatesToKeep(listOfDates, daysToKeep, weeksToKeep, monthsToKeep):
    if daysToKeep < 1:
        raise ValueError("daysToKeep must be greater than zero.")
    if weeksToKeep < 0:
        raise ValueError("weeksToKeep must not be less than zero.")
    if monthsToKeep < 0:
        raise ValueError("monthsToKeep must not be less than zero.")

    dates = GetUniqueSortedDateList(listOfDates)
    if len(dates) == 0:
        return []

    keep = []
    tail = len(dates)

    # Keep the most recent daysToKeep dates.
    days_left = daysToKeep
    while (days_left > 0) and (tail > 0):
        tail -= 1
        days_left -= 1
        keep.append(dates[tail])

    # Working backward, keep one date per ISO week for weeksToKeep weeks.
    year, week_number, weekday = dates[tail].isocalendar()
    weeks_left = weeksToKeep
    while (weeks_left > 0) and (tail > 0):
        tail -= 1
        yr, wn, wd = dates[tail].isocalendar()
        if (wn != week_number) or (yr != year):
            weeks_left -= 1
            year, week_number, weekday = dates[tail].isocalendar()
            keep.append(dates[tail])

    # Continuing backward, keep one date per month for monthsToKeep months.
    month = dates[tail].month
    year = dates[tail].year
    months_left = monthsToKeep
    while (months_left > 0) and (tail > 0):
        tail -= 1
        if (dates[tail].month != month) or (dates[tail].year != year):
            months_left -= 1
            month = dates[tail].month
            year = dates[tail].year
            keep.append(dates[tail])

    return keep
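
To make the retention rules concrete, here is a hypothetical usage sketch (not one of the original scripts). With 30 consecutive daily dates and the same settings getdbbak.py uses, GetDatesToKeep returns the 5 newest dates plus one date per earlier ISO week; monthly dates would show up once the history reaches back far enough:

#!/usr/bin/env python
# keepdatelist_demo.py (hypothetical example)

from datetime import datetime, timedelta
import KeepDateList

# Thirty consecutive daily backup dates ending 2009-11-30.
dates = [datetime(2009, 11, 30) - timedelta(days=n) for n in range(30)]

# Keep 5 daily, 6 weekly, and 4 monthly backups.
keep = KeepDateList.GetDatesToKeep(dates, 5, 6, 4)
for d in sorted(keep):
    print d.strftime('%Y-%m-%d')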

I also put the function SendLogMessage, which sends the session log via email, in a separate module, getdbbak_email.py:

#!/usr/bin/env python
# getdbbak_email.py

from email.MIMEText import MIMEText
from email import Utils
import smtplib

def SendLogMessage(msgList):
    from_addr = 'atest@bogusoft.com'
    to_addr = 'wm.melvin@gmail.com'
    smtp_server = 'localhost'
    
    message = ""
    for s in msgList:
        message += s + "\n"

    msg = MIMEText(message)
    msg['To'] = to_addr 
    msg['From'] = from_addr 
    msg['Subject'] = 'Download results'
    msg['Date'] = Utils.formatdate(localtime = 1)
    msg['Message-ID'] = Utils.make_msgid()

    smtp = smtplib.SMTP(smtp_server)
    smtp.sendmail(from_addr, to_addr, msg.as_string())
    smtp.quit()
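
For a quick manual test of the email setup, the module can be exercised from the Python interpreter with a couple of throwaway lines (a hypothetical snippet, assuming an SMTP server on localhost):

import getdbbak_email

# Sends a short test message through the local SMTP server.
getdbbak_email.SendLogMessage(["test line one", "test line two"])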

Here is a ZIP file containing the set of Python scripts, including some unit tests (such as they are) for the file deletion logic: GetDbBak.zip

I hope this is useful to others who want to automate MySQL database backups and FTP transfers and haven’t come up with their own solution yet. Even if you don’t use Pair Networks as your hosting provider, some of the techniques may still apply. I’m still learning too, so if you find mistakes or come up with improvements to this solution, please let me know.