• Creating an offline replica of our Windows Environment, Part 1

    I am working on my new project: upgrading our Exchange 2003 environment to Exchange 2010. I wanted to create an offline replica of our current environment. This is how I set about doing that.

    • Before I started, I created a new Windows 7 workstation on our ESX server. This machine is joined to the domain, but no one has ever logged on to it (it was deployed via SCCM OSD). I left it overnight to make sure it was in AD and AD had replicated.
    • Next I created a network in ESX that is not attached to any adapters (and that no other machines are connected to), called “offline network”.
    • I then used the built-in ESX cloning function to create a clone of one of our Domain Controllers (DCs).
    • This clone was assigned to the offline network, and the Windows 7 VM was moved to the offline network as well.
    • Since the DC has AD, DNS, and DHCP all running on it, I should be able to reboot the Windows 7 machine and log in with my account.
    • Since I have never logged on to this Windows 7 VM before, I know it is not using a cached copy of my account; the Windows 7 VM has to be communicating with the DC.
    • Finally, I seized the FSMO roles from the other DCs, which do not exist on the offline network (see the ntdsutil sketch below).
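
    Seizing the roles happens in an ntdsutil session on the cloned DC. This is a sketch from memory, so verify the exact verbs on your version: the server name OFFLINEDC01 is a placeholder, and on some versions “seize naming master” is spelled “seize domain naming master”.

    ntdsutil
    roles
    connections
    connect to server OFFLINEDC01
    quit
    seize schema master
    seize naming master
    seize pdc
    seize rid master
    seize infrastructure master
    quit
    quit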

    That gets me a functioning DC and DNS server, plus a test Windows 7 machine with Office 2010.

    Next up: recovering an Exchange 2003 cluster to a single machine on an offline network.


  • Xnest, XDMCP, and X11 on CentOS

    It has been a while since I have used Xnest. It works. Slow, but it works. I can ssh into a box and bring a full X session back to my Mac.

    In CentOS 5.5 I had to edit /etc/gdm/custom.conf and add Enable=true under the [xdmcp] section:

    [xdmcp]
    Enable=true

    Restart X, and now I can run:

    Xnest :1 -geometry 1024x768 -kb -query localhost

    which will bring a GNOME session back to my X11 server.
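
    Put together, the whole round trip looks something like this (the user and hostname are made up for the example):

    # On the Mac: ssh to the CentOS box with X11 forwarding enabled
    ssh -X admin@centos-box

    # On the CentOS box: nest an XDMCP login session in a 1024x768 window,
    # displayed on the Mac over the forwarded X11 connection
    Xnest :1 -geometry 1024x768 -kb -query localhost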

    Very easy, without having to open any ports!


  • GNU date vs BSD date

    I usually develop and test my bash scripts on my Mac, mostly for use on RedHat systems. Occasionally I run into problems with this workflow. Recently I realized there was a difference between the date command on RedHat and the date command in OS X. It turns out BSD date != GNU date. The workaround: install coreutils from MacPorts, and add this alias to my .bashrc:

    alias date="/opt/local/bin/gdate"

    Update: gdate is part of GNU coreutils, and the MacPorts install command is: sudo port install coreutils
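
    As a concrete example of the kind of difference that bites: relative dates. These two should both print yesterday’s date, but the flags are completely different:

    # GNU date (RedHat, or gdate from coreutils)
    date -d "1 day ago" +%Y-%m-%d

    # BSD date (stock OS X)
    date -v-1d +%Y-%m-%d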


  • My MacPorts cheat sheet

    These 3 commands will check for new ports, upgrade outdated ports, and remove older versions.
    • sudo port selfupdate
    • sudo port upgrade outdated
    • sudo port uninstall inactive

    To keep your install slimmed down, use port_cutleaves to remove unnecessary ports. There are often “Build Dependencies” (like autoconf, automake, libtool, m4, help2man, p5-locale-gettext) that are no longer needed after a package is installed.

    • sudo port install port_cutleaves
    • sudo port_cutleaves (I run this a couple of times)
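
    To preview the candidates before letting port_cutleaves prompt you, the leaves pseudo-port should list them (from memory, so double-check against your MacPorts version):

    port echo leaves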

  • Check if a file exists on a remote server

    I was working on a script to copy a WordPress site to a local machine, and I wanted to check whether the remote path was actually a WordPress site.
    Here is the code I used.

    # Remote command: echo 0 if wp-config.php exists, non-zero if it does not
    CHECKCMD="ls /path/to/wp-config.php | grep wp-config.php > /dev/null; echo \$?"
    # Run the check on the source server and capture the result
    CHECKFILE=$(ssh "$SRCSERVER" "$CHECKCMD")
    if [ "$CHECKFILE" -eq 0 ]; then
    	echo "file exists"
    fi
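
    A simpler variant of the same idea is to let ssh hand back the remote exit status directly and skip the grep/echo round trip; a minimal sketch, assuming the same $SRCSERVER:

    if ssh "$SRCSERVER" "test -f /path/to/wp-config.php"; then
    	echo "file exists"
    fi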
    

  • Run a task sequence after a completed OSD

    We wanted to run a Task Sequence after an Operating System Deployment (OSD). The OSD image would have only Windows, Office, and antivirus. The Task Sequence would have all the other common packages we roll out (Adobe Reader and Flash, Firefox, QuickTime, Java, and so on). The problem was that I wanted an easy way to deploy all the packages at once, and an easy way to keep the set up to date.

    First we created a Collection (that updates every 5 minutes) based on the following query:

    SELECT SMS_R_System.Name, SMS_G_System_OPERATING_SYSTEM.InstallDate 
    FROM SMS_R_System inner join SMS_G_System_OPERATING_SYSTEM on SMS_G_System_OPERATING_SYSTEM.ResourceId = SMS_R_System.ResourceId 
    WHERE DATEDIFF(dd,SMS_G_System_OPERATING_SYSTEM.InstallDate,GetDate()) < 2
    ORDER BY SMS_G_System_OPERATING_SYSTEM.InstallDate DESC
    

    This query returns all the machines that have had their operating system installed in the last 2 days.

    Next we created a Task Sequence that Installed all the packages, and advertised it to the new collection.

    Now, within a few minutes of a machine adding itself to SCCM, it shows up in the collection and the advertised Task Sequence can run.

    The key was the query to find the machines that had been installed recently. Thanks, xrobx99, for your help with this!


  • Quick check if a MySQL database exists

    Here is my bash code that checks if a db exists before I try to create one in a script:

    # Note: no leading $ when assigning a variable in bash
    DBNAME="dblookingfor"
    # SHOW DATABASES LIKE prints the name if the db exists; grep's exit status records it
    DBEXISTS=$(mysql --batch --skip-column-names -e "SHOW DATABASES LIKE '"$DBNAME"';" | grep "$DBNAME" > /dev/null; echo "$?")
    if [ "$DBEXISTS" -eq 0 ]; then
    	echo "A database with the name $DBNAME already exists. exiting"
    	exit 1
    fi
    

    This will exit if there is a database with the name you are searching for. The tricky part for me (as it always is) was getting the double quotes around $DBNAME inside the single quotes in the LIKE statement.
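
    An alternative that skips grep entirely is to test whether the query printed anything at all; a sketch under the same assumptions (credentials coming from ~/.my.cnf, same DBNAME variable):

    # -n is true if the query produced any output, i.e. the database exists
    if [ -n "$(mysql --batch --skip-column-names -e "SHOW DATABASES LIKE '"$DBNAME"';")" ]; then
    	echo "A database with the name $DBNAME already exists. exiting"
    	exit 1
    fi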


  • Rackspace Cloud Files download script

    A newer addition to the services I use and recommend is Rackspace Cloud Servers and Rackspace Cloud Files.

    We were evaluating cloud services to host client websites, and I ended up choosing Rackspace’s cloud offerings. I really like the services they provide.

    With Cloud Files, I can upload files that can be accessed from anywhere. I decided I wanted to put our common scripts there; that way, when we provision a new server, behind a firewall or in the cloud, we can pull from the same place, and all I have to do is keep the scripts up to date in one spot.

    Before I knew about Chef (a future project I can’t wait to have time for), I created simple scripts to install a common set of packages on every server: our SOE (Standard Operating Environment). Once a server is provisioned, we can update it from any other server to have the same core set of packages and configurations. The most important part of this is that we install Git and pull down python-cloudfiles:

    yum install git -y
    git clone git://github.com/rackspace/python-cloudfiles.git
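
    The clone by itself isn’t on the Python path yet; with python-cloudfiles of that vintage, the usual setup.py install is the next step (a sketch, not lifted from our actual SOE script):

    cd python-cloudfiles
    python setup.py install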
    

    Once python-cloudfiles is installed, we use the following script to pull down the common set of scripts:

    import os
    import cloudfiles

    # container, sourcepath, and destpath are placeholders for our real values
    container = "scripts"
    sourcepath = "common"
    destpath = "/usr/local/scripts"

    conn = cloudfiles.get_connection('username', 'keynumberthatisreallylong')
    cont = conn.get_container(container)
    for obj in cont.get_objects(path=sourcepath):
    	destfile = os.path.join(destpath, os.path.basename(obj.name))
    	print "Downloading " + os.path.join("/", container, sourcepath, os.path.basename(obj.name)) + " to " + destpath
    	obj.save_to_filename(destfile)
    	# Trim last_modified ("2011-01-02T03:04:05.123") to CCYYMMDDhhmm for touch -t
    	timestamp = obj.last_modified[:obj.last_modified.find(".")-3].replace('-','').replace(':','').replace('T','')
    	# Set the local mtime to match the Cloud Files last_modified date
    	os.system("touch -m -t " + timestamp + " " + destfile)
    

    What this does is pull down each file in a directory in the Cloud Files infrastructure and save it locally. Then I added the extra step of setting each file’s modified date to the Cloud Files last_modified date, so we can tell which downloaded files have been changed recently (that is, uploaded to Rackspace Cloud Files).

    I plan to replace this with Chef one day, but right now it works really well for us.