Saturday, June 16, 2012

Back to Cable TV with mixed feelings

I gave in to the pressure to return to cable.

The kids flat out won this battle; we have been under assault all year to bring cable back, and now they are back to watching their favorite Disney and Nickelodeon shows.  I hate to admit it, but I've been watching quite a bit of Motocross, MotoGP, and Superbike racing lately, and what a season of racing we've had across all the different series.  I wish it were different, but I even priced out renting/buying the content via iTunes, Amazon Video & TV, and DVDs, and at an average of 2.99 a show the cost adds up.  If it were just a few shows I could see the big cost savings, but once you're into half a dozen TV shows, plus movies to boot, you're right up there with cable.  Please prove me wrong if you can, but Hollywood greed has got us by the wallet.

Saturday, June 9, 2012

Ubuntu 12.04 migrate from 32-bit to 64-bit

So I see many posts asking if there is a way to upgrade from 32-bit to 64-bit, and it can be done without the in-place upgrade mentality, which also gives you a forced cleanup of cruft-o-packages. I'm doing this because I wanted to play with KVM and some other 64-bit OSes, so I need a 64-bit kernel. Go figure.... like I don't have enough outstanding honey-do's.
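If you are heading down this road for the same reasons, it is worth confirming up front that the CPU can actually do 64-bit and hardware virtualization. A quick check (nothing specific to this migration, just the usual /proc/cpuinfo flags):

# "lm" (long mode) means the CPU is 64-bit capable
grep -cw lm /proc/cpuinfo
# vmx (Intel) or svm (AMD) means KVM can use hardware virtualization
egrep -cw 'vmx|svm' /proc/cpuinfo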

Disclaimer: I have been doing SA-related work for many years. This is not a HOWTO, but more the tale of how I accomplished this, to give less savvy home-admin types yet another reference.

Pre-reqs:
 - Verify the install CD when you boot!!! The ubuntu-12.04-alternate-amd64.iso had a file with a bad MD5 ( https://bugs.launchpad.net/bugs/1010757 ) and bailed on the install, so I had to start over with the ubuntu-12.04-beta2-alternate-amd64.iso. You can also check the downloaded ISO itself; see the sketch just after this list.
 - Make sure you have two backups! ( one should be on another machine/disk ).
 - Don't freak out if you think something went wrong just stay calm.
 - Pay attention when in the disk partition section that you don't format anything but /boot, swap and /.
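Checking the downloaded ISO before you even burn it goes something like this (a rough sketch; the MD5SUMS URL is an assumption, grab the checksum list from wherever you downloaded the image):

cd ~/Downloads
wget http://releases.ubuntu.com/12.04/MD5SUMS
# prints "ubuntu-12.04-alternate-amd64.iso: OK" if the image checks out
md5sum -c MD5SUMS 2>/dev/null | grep amd64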

Packages I run include web, mythtv-backend, kde, xfce4, and a mail server w/ spam, amavis, and clamav cleaners catching all the nasties coming in over the ethers.  I like to play with different window managers because the devs screw it up and lose followers with complete "it's gotta be better" re-writes (hint: GNOME devs and Unity).  Look, if you just had a toggle, UP: show me all the tweaks, and DOWN: hide all the tweaks, that would make more users happy.  Heck, I might even give E17 another look.

My disk layout:
RAID-1 pair
/dev/md0           101018      60169     35633  63% /boot
/dev/md1  ( my swap partition )
/dev/md2         29230360    8181368  19583872  30% /
/dev/md3         67290936   30562360  33310352  48% /home
/dev/md4        382552620  195536600 167583400  54% /data

RAID-5 four 1TB ( mythtv recordings, backups, ripped DVD's, etc )
/dev/md5       2927499732 2455517884 471981848  84% /backup


Section 1 ( info gathering and backup ) :
  - apt-get update; apt-get -y dist-upgrade
  - fdisk -l > before_fdisk_output
  - cat /proc/mdstat > before_mdstat_output
  - dpkg -l > before_package_list
  - Backup the system TWICE.

Here are my backup commands ( make sure destination directories exist ):

backitup for backup to the RAID-5
rsync -avx --delete /boot/ /backup/mirror/webby_boot/
rsync -avx --delete / /backup/mirror/webby_root/
rsync -avx --delete /dev/ /backup/mirror/webby_root/dev/
rsync -avx --exclude=".gvfs" --delete-after /home/ /backup/home/

backitup_toquad for backup to my other machine
rsync -avx --numeric-ids --delete /boot/ root@quad:/home/webby_backup/webby_boot/
rsync -avx --numeric-ids --delete / root@quad:/home/webby_backup/webby_root/
rsync -avx --numeric-ids --delete /dev/ root@quad:/home/webby_backup/webby_root/dev/
rsync -avx --numeric-ids --exclude=".gvfs" --delete-after /home/ root@quad:/home/webby_backup/home/

Section 2 (  Install 64-bit version ):
  - Boot the install CD and verify disk for defects!!
  - Install and do the manual disk partitioning and only format /boot, swap and / partitions.

Section 3 ( Install/Remove some packages and make a few backups of new /etc files ):
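In the commands below, /$BACKUP stands for wherever you can reach the copy of the old root filesystem, e.g. the RAID-5 copy from the backitup script once /backup is mounted, or the copy pulled back from the other machine. Something like this (an assumption, adjust to your own path):

# point BACKUP at the rsync'd root copy; no leading slash, it gets added as /$BACKUP
BACKUP=backup/mirror/webby_root
ls /$BACKUP/etc/ssh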
 Install ssh server and get back old keys and setup
  - apt-get install openssh-server openssh-blacklist openssh-blacklist-extra
  - cp -a /etc/ssh /etc/ssh.sav
  - rsync -axv /$BACKUP/etc/ssh/ /etc/ssh/
  - /etc/init.d/ssh restart
  Now you can log in remotely if you want, i.e. if you have keys set up ;)

  Optional -  I have scripts and info I need here
  - rsync -axv /$BACKUP/root/ /root/

  Optional - this is more of a server machine so I whack Network Manager
  - apt-get remove network-manager

 Copy hosts, fstab, passwd, group, interfaces ( the relative cp commands below are run from /etc and save the fresh 64-bit install's versions first )
  - cp fstab fstab.64-bit
  You might want to merge the other filesystems into the new fstab if you did not set those up during install.  I just added /home, /data and /backup to the new /etc/fstab.
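  For reference, the added lines would look something like this (a sketch assuming ext4 on the md devices from the layout above; check your own filesystem types and mount options):

# hypothetical additions to the new /etc/fstab
/dev/md3   /home     ext4   defaults   0   2
/dev/md4   /data     ext4   defaults   0   2
/dev/md5   /backup   ext4   defaults   0   2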

  - mount -av
  Make sure at least /backup mounts so you can copy the necessary files back.

  - cp -a passwd passwd.64-bit
  - cp -a group group.64-bit
  - cp -a shadow shadow.64-bit


  - cp /$BACKUP/etc/hosts /etc/
  - cp /$BACKUP/etc/network/interfaces  /etc/network/
  - cd /$BACKUP/etc ; cp passwd group shadow shadow- passwd- group- /etc/
  NOTE: user and group ids are not consistent even between installs of the same distro,
  so now that we copied our old versions back we must fix up ownership.

  - chown -R lightdm:lightdm /var/lib/lightdm
  - chown -R avahi-autoipd:avahi-autoipd /var/lib/avahi-autoipd
  - chown -R couchdb:couchdb /var/lib/couchdb
  - chown -R avahi:avahi /var/run/avahi-daemon
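  Those four covered my install. If you want to hunt for anything else left owned by orphaned ids, something like this will flag it (a quick sketch, not part of my original steps):

# list files whose uid/gid no longer maps to a name in the copied-back passwd/group
find /var /etc -xdev \( -nouser -o -nogroup \) -ls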
  
Moved this step up as my sound card and logins were getting ConsoleKit dbus error messages when logging in.  This is because the user and group ids for system accounts are not consistent, i.e. the user creation scripts just pick unused uid and gid numbers.
  - apt-get --reinstall install pulseaudio pulseaudio-module-gconf pulseaudio-module-x11 pulseaudio-utils dbus dbus-x11 gstreamer0.10-pulseaudio python-dbus python-dbus-dev

Section 4 ( reboot, copy back files and re-install packages ):


  - reboot
  Log in and make sure everything fires up A-OK.


  - cd /$BACKUP/etc ; rsync -axv Mutt* amavis apache2 fail2ban* my* nx* postfix razor spamassassin /etc/
  - cd /$BACKUP/var/lib ; rsync -axv amavis spamassassin nxserver /var/lib/
  - cd /$BACKUP/usr/local/bin; cp librarian-notify-send MythDataGrabber mythicalLibrarian myth-status procmail-check.pl spamfilter /usr/local/bin/
  - cd /$BACKUP/var ;  rsync -axv www /var/
  - cd /$BACKUP/etc ; rsync -axv apache2/ /etc/apache2/
  - cd /$BACKUP/etc/default ; cp google-musicmanager spamassassin mythweb /etc/default/
  - cd /$BACKUP/etc/apt/sources.list.d/ ; cp freenx-team-ppa-lucid.list chromium-daily-ppa-lucid.list google-musicmanager.list /etc/apt/sources.list.d/
  - cd /$BACKUP/etc/apt/ ; cp trusted* /etc/apt/

  NOW compare your dpkg -l output with the before_package_list file created above and look for packages that need to be re-installed.


  - dpkg -l > now_package_list
  - diff -b -y now_package_list before_package_list
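  The side-by-side diff is noisy since version and architecture columns change; if you only care about names, a comparison along these lines (a sketch) narrows it down to packages that were installed before but are missing now:

# extract just the installed package names from both lists and compare
dpkg -l | awk '/^ii/ {print $2}' | sort > now_names
awk '/^ii/ {print $2}' before_package_list | sort > before_names
comm -13 now_names before_names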


Here are a few runs I did:

  - apt-get install vim amavisd-new apache2 php5 spamassassin postfix mythtv-backend mythweb flashplugin-installer clamav clamav-daemon lha arj unrar zoo nomarch lzop cabextract p7zip ttf-mscorefonts-installer mythtv-frontend mythtv-database xmltv-util procmail mysql-server razor bsd-mailx libclamunrar6



  - dpkg -l > now_package_list
  - diff -b -y now_package_list before_package_list

  - apt-get install openjdk-6-jdk icedtea-6-plugin icedtea-plugin fonts-ipafont-mincho ttf-telugu-fonts ttf-oriya-fonts ttf-kannada-fonts ttf-bengali-fonts fonts-ipafont-gothic
  - apt-get install lame rsstail openshot lzop lynx logwatch picard mp3info iptraf irssi git ethtool curl cvs arj agrep irssi-scripts git-cvs git-svn
  - apt-get install imagemagick enscript ffmpeg x264 mencoder ddclient mythplugins mythnetvision mythweather

  - dpkg -l > now_package_list
  - diff -b -y now_package_list before_package_list

Yeah..... everything should be right as rain :)

Thursday, February 16, 2012

Adventures migrating photos from iPhoto 9 to Digikam 2.1.1

After our Mac mini's update to Snow Leopard, the system was clearly under-powered, and since my wife and I both stay logged in most of the time, running just browsers caused us to swap way too much. I decided to build a Shuttle SH67H3 with an i5-2500K CPU and 16GB of RAM from NewEgg, and we are now running Linux Mint 12. The most important part of the transition was to not lose all our iPhoto Albums. I also did not want to redo them from scratch, so I set out on Google and found photokam ( https://sites.google.com/site/laurentbovet/photokam ), which did the majority of the lifting, but I did not want a straight copy from iPhoto.

iPhoto splits your image tree into Masters(Originals, now a link)/Year/Date(Roll)/etc... and Previews(Modified, now a link)/Year/Date(Roll)/etc..., while digikam ( http://www.digikam.org/ ) lets you manage your tree however you like and makes a new version of a photo by appending _v1 to the basename of the file.
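Roughly, the two layouts look like this (illustrative made-up paths, using the -v1 suffix my script below writes rather than digikam's own versioning):

# iPhoto: originals and edits live in parallel trees
iPhoto Library/Masters/2011/2011-08-28/IMAG0242.jpg
iPhoto Library/Previews/2011/2011-08-28/IMAG0242.jpg

# what I wanted on the Linux box: one tree, the edit sitting next to its master
/home/photos/2011/09-11-PhoneDump/IMAG0242.jpg
/home/photos/2011/09-11-PhoneDump/IMAG0242-v1.jpg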

Just to make sure I did not leave myself with the easiest transition possible :-P I decided to just rsync my iPhoto Library Masters directory to /home/photos on the new machine.  I did not want the separate directory tree, so once all the Masters/Originals were sync'd I copied the Previews/Modified tree to /home/photo_temp and set out to move each modified file into the matching path under /home/photos as a -v1 version. I'm not sure why, but iPhoto changes the extension from .jpg to .JPG on or after the copy process. So I wrote a script, but it only gets you about 90% there; you'll still have to do some manual cleanup afterwards because iPhoto sometimes copies a modified version (e.g. a crop) into an entirely different directory (Roll).
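The copies themselves were nothing fancy; something along these lines (the source paths are an assumption, mine actually came over from the Mac):

# hypothetical: Masters to the new photo root, Previews to a staging area,
# with the iPhoto Library copied onto a USB disk mounted at /mnt/usb first
rsync -av "/mnt/usb/iPhoto Library/Masters/"  /home/photos/
rsync -av "/mnt/usb/iPhoto Library/Previews/" /home/photo_temp/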


Here is my update_photo script to move the Previews(Modified) versions over to the new master location:
----------------- cut -------------------------
#!/usr/bin/perl
# update_photo - run from the top of the copied Previews tree (/home/photo_temp in my case).
# For every file found, look for the matching master under /home/photos (same name first,
# then .jpg -> .JPG, then .JPG -> .jpg) and move the preview in beside it as basename-v1.ext.

use strict;
use warnings;
use File::Copy;
use File::Find;

find({ wanted => \&process_file, no_chdir => 1 },  "." );

sub process_file {
  my $tempfilename = $File::Find::name;
  if ( -f $tempfilename ) {
    #print " This is a file: $tempfilename";
    if ( -f "/home/photos/$tempfilename" ) {
      print " and matching file exists\n";
      my $new_name = "/home/photos/$tempfilename";
      $new_name =~ s/(.*)(\..+$)/$1-v1$2/;    # insert -v1 before the extension
      print "move $tempfilename $new_name\n";
      move($tempfilename, $new_name);
    } else {
      #print " and matching file does not exist\n";
      my $upper_name = $tempfilename;
      $upper_name =~ s/\.jpg$/.JPG/;          # iPhoto sometimes upper-cases the extension
      if ( -f "/home/photos/$upper_name" ) {
        #print " Upper case is there: /home/photos/$upper_name\n";
        my $new_upper_name = "/home/photos/$upper_name";
        $new_upper_name =~ s/(.*)(\..+$)/$1-v1$2/;
        print "move $tempfilename $new_upper_name\n";
        move($tempfilename, $new_upper_name);
      } else {
        # last guess: the master kept a lower-case extension; if it is missing too,
        # the move may fail or leave a stray -v1 file to clean up by hand
        my $lower_name = $tempfilename;
        $lower_name =~ s/\.JPG$/.jpg/;
        #print " Lower case is there: /home/photos/$lower_name\n";
        my $new_lower_name = "/home/photos/$lower_name";
        $new_lower_name =~ s/(.*)(\..+$)/$1-v1$2/;
        print "move $tempfilename $new_lower_name\n";
        move($tempfilename, $new_lower_name);
      }
    }
  } else {
    print " This is NOT a file: $tempfilename\n";
  }
}
----------------- cut -------------------------
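Since the script works from relative paths, it expects to be run from the top of the copied Previews tree, something like:

cd /home/photo_temp
perl /path/to/update_photo      # or ./update_photo if it lives there and is executable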
So now you have to tweak the photokam scripts to migrate the album information from the AlbumData.xml file copied over from your iPhoto Library directory. I just commented out the copy commands, since the files were already copied over:
# diff prepare.py photokam-0.5/prepare.py 
8c8
< file_extensions=('jpg', 'JPG', 'jpeg', 'JPEG')
---
> file_extensions=('jpg', 'JPG', 'jpeg', 'JPEG', 'tif', 'TIF', 'tiff', 'TIFF', 'avi', 'AVI')
23c23
<         input = '.'
---
>         input = 'iPhoto Library'
30c30
<     debug = True
---
>     debug = False
128,129c128
<         print('                     Fullname --- '+to_date_string(date))
<         fullname = to_date_string(date)+' - '+roll['RollName']
---
>         fullname = to_date_string(date)+' - '+roll['AlbumName']
166c165
<     #            copy_file(input, original_source_path, out+'/'+target_path, mtime, True)
---
>                 copy_file(input, original_source_path, out+'/'+target_path, mtime, True)
177,178c176,177
<     #            copy_file(input, original_source_path, out+'/'+original_target_path, mtime, True)
<     #            copy_file(input, source_path, out+'/'+target_path, mtime)
---
>                 copy_file(input, original_source_path, out+'/'+original_target_path, mtime, True)
>                 copy_file(input, source_path, out+'/'+target_path, mtime)
183c182
<     #        copy_file(input, source_path, out+'/'+target_path, mtime)
---
>             copy_file(input, source_path, out+'/'+target_path, mtime)
After running the process-digikam-db.py script I got errors immediately, so I started hacking away; here is the diff:
$ diff process-digikam-db.py photokam-0.5/process-digikam-db.py 
26c26
<         input = args[0]+'/digikam4.db'
---
>         input = args[0]+'/digikam3.db'
59,60d58
<             if debug:
<                 print( ' The pieces  '+pieces[0])
66,75c64,73
<     #-#print('Setting album dates')
<     #-#p=re.compile('[1-2][0-9][0-9][0-9]-[0-1][0-9]-[0-3][0-9]')
<     #-#c=con.cursor()
<     #-#c.execute("select id, relativePath from Albums")
<     #-#for id, relativePath in c.fetchall():
<     #-#    name=relativePath.split('/')[-1]                
<     #-#    if len(name) >= 10 and p.match(name):
<     #-#        date=time.strptime(name[:10]+' 12', "%Y-%m-%d %H") #12h offset to avoid tz shifts
<     #-#        params=(name[:10], id)
<     #-#        c.execute('update albums set date=date(?) where id=?', params)
---
>     print('Setting album dates')
>     p=re.compile('[1-2][0-9][0-9][0-9]-[0-1][0-9]-[0-3][0-9]')
>     c=con.cursor()
>     c.execute("select id, url from Albums")
>     for id, url in c.fetchall():
>         name=url.split('/')[-1]                
>         if len(name) >= 10 and p.match(name):
>             date=time.strptime(name[:10]+' 12', "%Y-%m-%d %H") #12h offset to avoid tz shifts
>             params=(name[:10], id)
>             c.execute('update albums set date=date(?) where id=?', params)
116d113
<     #print( '    Album path '+album_path+'      Image name '+image_name )
121c118
<             "where Images.album=Albums.id and relativePath=? and name=?", params)
---
>             "where Images.dirid=Albums.id and url=? and name=?", params)
I thought that would get me home, but I kept getting errors about photos not being found in the database, and it turned out the tag-mappings.txt file was somehow not quite right... A sample:

2011/2011-08-28 - 09-11-PhoneDump/IMAG0242.jpg=Albums/Favorites/Summer_2011
2011/2011-08-28 - 09-11-PhoneDump/IMAG0245.jpg=Albums/Favorites/Summer_2011
2011/2011-08-30 - Aug 27, 2011/P1090019.JPG=Albums/Favorites/Summer_2011
2011/2011-08-30 - Aug 27, 2011/P1090023.JPG=Albums/Favorites/Summer_2011

So I hacked up this:
$ cat update_tag_map.sh 
#!/bin/bash
# update_tag_map.sh - rewrite photokam's tag-mappings.txt so the paths match where
# the photos actually live under /home/photos (my rolls lost their "YYYY-MM-DD - "
# prefix). Reads the original mappings from /home/photo_temp and appends the fixed
# lines to /home/photos/tag-mappings.txt.

while read line
do
  # year prefix, e.g. "2011"
  myyear=`echo "$line" | cut -c 1-4`

  # everything after the leading "YYYY/" and before the "=", i.e. the old roll/file path
  mydir=`echo "$line" | sed 's!^[^/]*/!!; s!=.*$!!'`
  # just the file name
  myfile=`echo "$mydir" | sed 's!^.*/!!g'`

  # find where that file really lives and strip the "/home/photos/YYYY/" prefix
  # ( cut -c 19- assumes exactly that prefix length )
  mynewfile=`locate "$myfile" | grep "/home/photos/$myyear" | cut -c 19-`
  echo " this is my new file --- $mynewfile"

  # swap the old path for the real one and append to the new mappings file
  mynewline=`echo "$line" | sed s!"$mydir"!"$mynewfile"!g`
  echo "$mynewline" >> /home/photos/tag-mappings.txt
done < /home/photo_temp/tag-mappings.txt
And the result is:
2011/09-11-PhoneDump/IMAG0242.jpg=Albums/Favorites/Summer_2011
2011/09-11-PhoneDump/IMAG0245.jpg=Albums/Favorites/Summer_2011
2011/Aug 27, 2011/P1090019.JPG=Albums/Favorites/Summer_2011
2011/Aug 27, 2011/P1090023.JPG=Albums/Favorites/Summer_2011
NOTE: make sure the tag-mappings.txt file does not have any blank lines or the process-digikam-db.py script will fail.
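A quick way to strip any empty lines from it first (simple sketch):

# remove blank lines in place from the rewritten mappings file
sed -i '/^$/d' /home/photos/tag-mappings.txt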

Everything seems OK for now but I'll post back if I see any other gotchas.

