Expat-IT Tech Bits




    Creative Commons License
    This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
    PyBlosxom

    This site has no ads. To help with hosting, crypto donations are accepted:
    Bitcoin: 1JErV8ga9UY7wE8Bbf1KYsA5bkdh8n1Bxc
    Zcash: zcLYqtXYFEWHFtEfM6wg5eCV8frxWtZYkT8WyxvevzNC6SBgmqPS3tkg6nBarmzRzWYAurgs4ThkpkD5QgiSwxqoB7xrCxs

    Sun, 05 Apr 2009


    /Admin/backups/misc: Semi-Automating My Monthly Backup

    Boring repetitive tasks should be scripted. Backups *really* should be automated. So here is a first step down that path for the tarball that I send to my hosted server every month:

    #!/bin/sh
    cd /path/to/script/directory
    echo "My monthly backup:"
    echo "First archive mail trash"
    ./archivemail.sh
    echo "Now build the tar file."
    FILENAME="Backup`date +%Y%m%d`.tar"
    PATHFILE="/scratch/"$FILENAME
    echo "Will backup to " $PATHFILE
    echo "Archive /home/userid..."
    tar -cf $PATHFILE /home/userid
    echo "Add /etc..."
    tar -rf $PATHFILE /etc
    /etc/init.d/apache2 stop
    /etc/init.d/mysql stop
    echo "add /var/www..."
    tar -rf $PATHFILE /var/www
    echo "add /var/lib/mysql/"
    tar -rf $PATHFILE /var/lib/mysql/
    /etc/init.d/apache2 start
    /etc/init.d/mysql start
    echo "Backup complete, list contents of archive"
    tar -tvf $PATHFILE

    And then I get an e-mail telling me it's all done, and there is a huge tarball waiting for me in /scratch. I run this script from cron on the 1st of every month. archivemail.sh uses archivemail[1] to clean out my Mail trash folder; I split it out into a separate script because I run it more often (once a week).
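    For the record, the monthly cron entry looks something like the following (the time of day and the script name are illustrative, not my actual crontab):

```
# run at 03:00 on the 1st of every month (hypothetical schedule)
0 3 1 * * /path/to/script/directory/monthly-backup.sh
```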

    [1] http://blog.langex.net/index.cgi/SW/email/

    posted at: 02:26 | path: /Admin/backups/misc | permanent link to this entry

    Sat, 17 Jan 2009


    /Admin/backups/misc: The Problem with a Hardlink Backup Strategy

    Most of the Open Source backup software that I am aware of that does incremental backups (including backuppc, which I use) re-creates the entire directory structure for each increment. Files with multiple identical copies are then hard-linked together so that only one copy exists on disk, with obvious savings in disk usage.

    This post[1] points out that as the amount of stuff being backed up increases, an fsck on the partition in question can start taking a *very* long time, perhaps even running out of memory.

    I have not even noticed, because I did not place my backuppc archive on the root partition, and it is only the root partition that occasionally experiences an automatic fsck on boot. Generally speaking, for better or worse, I simply never fsck non-root partitions without good reason, and Linux gives me very little such "good reason". I am inclined to say this seems to be more of a hypothetical than a real problem, and until I actually see evidence of breakage I am not going to worry about it.
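    The hard-link mechanics are easy to demonstrate in a throwaway directory (the filenames below are made up for illustration):

```shell
# two names, one inode: this is how hardlink-based backup tools
# store many identical copies as a single file on disk
tmp=$(mktemp -d)
cd "$tmp"
echo "same content" > increment1.txt
ln increment1.txt increment2.txt        # hard link, not a copy
links=$(stat -c '%h' increment1.txt)    # link count (GNU stat)
echo "link count: $links"               # prints "link count: 2"
```

    Delete one name and the data survives under the other; only when the last link goes does the disk space come back, which is exactly why an fsck has to chase every link.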

    [1] http://feeds.feedburner.com/~r/ThoughtsByTed/~3/510209390/

    posted at: 08:19 | path: /Admin/backups/misc | permanent link to this entry

    Sun, 23 Nov 2008


    /Admin/backups/misc: What Works on a Slower Machine

    I have this thing about keeping older machines usable for as long as possible. In other words, I resist bloat-ware that just assumes any computer more than two years old should be placed in a dumpster. So I currently own about half a dozen laptops, and none of them is faster than a late-model Pentium III. And this works fine for me, as long as I make judicious choices about what software should run on what machine, and when.

    Getting backups done painlessly has caused just such a "judicious choice"....

    As it turns out, SpiderOak[1] has a lot going for it, but fast it is not. Unsurprisingly, SpiderOak is a Python app, and Python is also a language that "has a lot going for it, but fast it is not". It is not just that SpiderOak is slow: like its Python sibling, Miro[4], it tends to bog down my whole system and reduce responsiveness. For the moment, I will resist the urge to add SpiderOak to my list[2] of open source resource hogs, as I have not yet experimented with running it "niced".

    This brings backuppc[3] back into favor for me. And I have found a partial fix for the fact that backuppc also bogs down the server it is running on: put this in root's crontab:

    1 * * * * /usr/bin/renice 15 -u backuppc > /dev/null 2>&1

    backuppc starts backups right on the hour. This cron job reduces the priority of all running backuppc processes one minute after every hour. Much better. And no operator intervention required, unless I am watching a really CPU-intensive video on that box and need to stop backuppc entirely.
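    The effect of renice can be sketched on a throwaway process; the cron line above does the same thing to every process owned by the backuppc user rather than to a single PID:

```shell
# start a harmless background process, then lower its priority,
# which is what the cron job does to all backuppc processes
sleep 30 &
pid=$!
renice 15 -p "$pid" > /dev/null
ni=$(ps -o ni= -p "$pid" | tr -d ' ')
echo "niceness: $ni"                    # prints "niceness: 15"
kill "$pid"
```

    Note that an unprivileged user can only raise niceness (lower priority), never lower it, so the cron job is safe to run as root or as backuppc itself.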

    [1] http://blog.langex.net/index.cgi/Admin/backups/spideroak.html
    [2] http://blog.langex.net/index.cgi/Linux/memory-hogs.html
    [3] http://blog.langex.net/index.cgi/Admin/backups/backuppc/
    [4] http://www.getmiro.com/

    posted at: 09:51 | path: /Admin/backups/misc | permanent link to this entry

    Sat, 04 Oct 2008


    /Admin/backups/misc: Backing up a MySQL Database[1]

    Simply making a copy of the files in /var/lib/mysql/ while the database is running is not guaranteed to work, as MySQL *might* complain about corruption and refuse to start with such "hot" copies. Of course, if you can afford to stop MySQL while you are taking a snapshot of /var/lib/mysql/, then it should work fine.... The simplest way to grab a copy of a running database is with 'mysqldump'. I use the following, run from cron a couple of times a week:

    mysqldump --user=**** --password=**** name-of-database | bzip2 > /var/www/name-of-database/db-backup/name-of-database-backup-`date +%Y-%m-%d`.sql.bz2

    backuppc, running on another machine, makes daily backups of the whole /var/www/ directory. If the security of the contents of the database is a concern, do not put the dump in /var/www/.

    To delete files that are older than 20 days on a Linux system, add this to your cron:

    find /var/www/name-of-database/db-backup/name-of-database-backup* -mtime +20 -exec rm {} \;
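    That pruning logic can be rehearsed safely on dummy files first (the directory and filenames below are stand-ins):

```shell
# create one "old" and one "new" dummy backup, then prune
tmp=$(mktemp -d)
touch -d "30 days ago" "$tmp/db-backup-old.sql.bz2"   # GNU touch
touch "$tmp/db-backup-new.sql.bz2"
find "$tmp" -name 'db-backup-*' -mtime +20 -exec rm {} \;
ls "$tmp"                               # only db-backup-new.sql.bz2 remains
```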

    [1] http://dev.mysql.com/doc/refman/4.1/en/backup.html

    posted at: 09:46 | path: /Admin/backups/misc | permanent link to this entry

    Wed, 24 Sep 2008


    /Admin/backups/misc: Easy Linux Off-site Backups

    Probably the lowest-tech route is to use tar, gpg, and some free file storage service. For instance, at the root prompt (since we will be backing up some privileged files), let's gather all the files up into one tar archive file, beginning with the /home directory:

    tar -cvf Backup20080901.tar /home
    Append the /etc directory:
    tar -rvf Backup20080901.tar /etc
    Now encrypt the result with gpg (you will be prompted for a password):
    gpg -c Backup20080901.tar
    Now upload the file to your favorite file storage service.
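    The create-then-append tar pattern above can be rehearsed on scratch directories before pointing it at the real /home and /etc (everything below is a stand-in):

```shell
# stand-in directories playing the role of /home and /etc
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p home etc
echo "user data" > home/file1
echo "config"    > etc/conf
tar -cf Backup.tar home                 # -c creates the archive
tar -rf Backup.tar etc                  # -r appends to it
tar -tf Backup.tar                      # -t lists the contents
```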

    Some storage options:

    1. Should you have access to an off-site server:

    scp Backup20080901.tar.gpg www.urltoserver.com:
    This may be a very big file and a very long transfer. If there is an interruption, don't start over again from scratch: rsync can resume an interrupted scp transfer. Just replace "scp" in the last command with "rsync --partial --progress --rsh=ssh", i.e.
    rsync --partial --progress --rsh=ssh Backup20080901.tar.gpg www.urltoserver.com:

    2. Exchange encrypted backups with a friend:

    Since both ends encrypt, trust is not even an issue. But how to exchange potentially very large files?

    If both of you have access to a UNIX environment where you can unblock / forward ports, sendfile[1] sounds REALLY cool.

    If one of you has root on a UNIX server, F*EX[2] also looks like an option.

    [1] http://fex.rus.uni-stuttgart.de/saft/sendfile.html
    [2] http://fex.rus.uni-stuttgart.de/

    posted at: 00:47 | path: /Admin/backups/misc | permanent link to this entry

    Sun, 07 Sep 2008


    /Admin/backups/misc: Review / Comparison of rdiff-backup[1] & backuppc[2]

    I have used both, and backuppc has some clear advantages.

    In a word, backuppc is truly an enterprise-class piece of software, highly recommended for big complex backup situations.

    However, there is a price to pay for all that automation and all those features.

    I like both of them very much, but they are suited for quite different situations. If you are backing up several machines or more, and you have one machine with a lot of disk space that you can devote largely if not entirely to backuppc, backuppc is probably the way to go. Any lesser requirements are probably best met with rdiff-backup.

    [1] http://rdiff-backup.nongnu.org/
    [2] http://backuppc.sourceforge.net/

    posted at: 03:20 | path: /Admin/backups/misc | permanent link to this entry