• Using git and a post-receive hook to update production node.js apps

    I have been trying to figure out the best way to deploy and maintain node.js apps in development and production. If I have a local git repo on my machine, what is the best way to push it to production? I don’t think the .git files should be there, and I don’t keep my modules in the repo either, so I need a way to push updates and make sure the newest dependencies end up on the server.
    I found that people use a post-receive script to update the site. This is what I ended up with. Put it in a file named post-receive in the hooks folder of the repo on the server (not in your local repo) and make it executable.

    #!/bin/sh
    # Check out the pushed code into the app directory, then install dependencies
    GIT_WORK_TREE=/opt/node/nodapp
    git --work-tree=$GIT_WORK_TREE checkout --force
    cd $GIT_WORK_TREE
    npm install
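
    For context, here is roughly how the two ends might be wired up. The paths, server name, and remote name are examples, not from my actual setup:

    # On the server: create a bare repo and install the hook
    git init --bare /opt/git/nodapp.git
    cp post-receive /opt/git/nodapp.git/hooks/post-receive
    chmod +x /opt/git/nodapp.git/hooks/post-receive

    # On your local machine: add the server as a remote and push to deploy
    git remote add production ssh://user@yourserver/opt/git/nodapp.git
    git push production master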
    

    I may take this a step further and recycle pm2, but that is another post!
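
    A minimal sketch of that next step, assuming the app is already registered with pm2 under the hypothetical name "nodapp": the hook would just gain one line.

    #!/bin/sh
    GIT_WORK_TREE=/opt/node/nodapp
    git --work-tree=$GIT_WORK_TREE checkout --force
    cd $GIT_WORK_TREE
    npm install
    # restart the pm2-managed process so the new code is picked up
    pm2 restart nodapp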


  • Using PowerShell to extract all contacts from MS CRM 2011

    We are moving from MS CRM 2011 to Salesforce and need to get our data out so we can import it. Here is the PowerShell script I am using to export contacts to CSV.

    $url="http://crm.sardverb.com/Company/xrmservices/2011/OrganizationData.svc/ContactSet?`$filter=StatusCode/Value eq 1"
    
    $assembly = [Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions")
    $count=0
    $output = @()
    
    function GetData ($url) {
        $webclient = New-Object System.Net.WebClient
        $webclient.UseDefaultCredentials = $true
        $webclient.Headers.Add("Accept", "application/json")
        $webclient.Headers.Add("Content-Type", "application/json; charset=utf-8")
        return $webclient.DownloadString($url)
    }

    while ($url){
        $data = GetData $url | ConvertFrom-Json
        $output += $data
        $count = $count + $data.d.results.length
        Write-Host $count
        # The OData endpoint pages its results; d.__next holds the URL of the next page
        if ($data.d.__next){
            $url = $data.d.__next.ToString()
        }
        else {
            $url = $null
        }
    }
    
    $output.d.results | Select -ExcludeProperty ParentCustomerId,__metadata @{l="ParentCustomerID";e={$_.ParentCustomerID.Id}},* | Export-Csv -NoTypeInformation C:\Contact.csv
    

    Hope that helps someone.


  • When using PowerShell to pull REST data from MS CRM, escape $filter!

    Note to self.

    When trying to filter a REST response in PowerShell by using the “$filter” parameter in the URL (as with MS CRM 2011), you must escape the “$” with a backtick: “`$”.

    For example:

    Does not work:
    $url="http://crmserver.company.com/Organization/xrmservices/2011/OrganizationData.svc/ContactSet?$filter=StateCode/Value eq 0"

    Works:
    $url="http://crmserver.company.com/Organization/xrmservices/2011/OrganizationData.svc/ContactSet?`$filter=StateCode/Value eq 0"

    Gets me every time, and I can’t figure out why my filters are being ignored!
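
    Alternatively (my note, not part of the original post), a single-quoted string avoids variable expansion entirely, so no escaping is needed:

    $url='http://crmserver.company.com/Organization/xrmservices/2011/OrganizationData.svc/ContactSet?$filter=StateCode/Value eq 0'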


  • Microsoft Certified Solutions Expert: Cloud Platform and Infrastructure

    I just became a Microsoft Certified Solutions Expert: Cloud Platform and Infrastructure (MCSE)!
    I haven’t taken any new exams in over a year – I guess I already had enough of the requirements!

    Nice?


  • Using jsforce and node.js to connect to Salesforce

    I wanted to write a node.js app to pull data from Salesforce, and found the npm library jsforce. I added it to the dependencies in my package.json:

      "dependencies": {
        "express": "*",
        "dotenv": "*",
        "jsforce": "*"
      }
    

    I also added “dotenv”, which I use to load my client secret and the rest of my configuration from a hidden .env file. That file is not in my git repo, so I can have different values in production and development.

    Here is what I have in my .env file:

    CLIENTID=zWHRIM8F87FChMcfHpZKS9LhQeeLwfthDbaiL9iXNO7ZBwfUwFPFqpDzC2HruNkJfIxrOdeITtftxBg20WEIm
    CLIENTSECRET=123456789987654
    REDIRECTURI=localhost
    [email protected]
    PASSWORD=PASSWORDANDCODE
    LOGINURL=https://sitename-dev-ed.my.salesforce.com
    

    Here is the code to pull in the .env values, define the OAuth2 connection, and log in. Note that with jsforce’s username/password login, the password is typically your Salesforce password with your security token appended (hence the PASSWORDANDCODE placeholder above).

    var dotenv  = require('dotenv').load();
    var jsforce = require('jsforce');

    var conn = new jsforce.Connection({
      oauth2 : {
        loginUrl : process.env.LOGINURL,
        clientId : process.env.CLIENTID,
        clientSecret : process.env.CLIENTSECRET,
        redirectUri : process.env.REDIRECTURI
      }
    });
    var username = process.env.USERNAME;
    var password = process.env.PASSWORD;
    conn.login(username, password, function(err, userInfo) {
      if (err) { return console.error(err); }
      console.log(conn.accessToken);
      console.log(conn.instanceUrl);
      console.log("User ID: " + userInfo.id);
      console.log("Org ID: " + userInfo.organizationId);
    });
    

    Once connected and logged in, we can query using SOQL. The query below pulls all Opportunities together with their contacts and contact roles, and their team members and team member roles. I am using it to show the relationships between Opportunities, their Contacts, and their team members with d3.js. More on that later.

        var query = "SELECT Id, Name,(SELECT Contact.Name,Contact.Email,Contact.Id,Contact.AccountId,ContactId,Role,Contact.Account.Name FROM OpportunityContactRoles),(SELECT User.Name,User.Email,User.Id,UserId,TeamMemberRole FROM OpportunityTeamMembers) FROM Opportunity"
        conn.query(query, function(err, results) {
          if (err) { return console.error(err); }
          console.log("Query: " + results.totalSize)
          console.log(JSON.stringify(results, null, 2))
        });
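
    As a rough sketch of the d3.js side (this part is my own illustration, not from the original code), the records could be flattened into node and link arrays. jsforce returns each subquery as an object with a records array, or null when there are no children:

        var nodes = {};   // keyed by Id so a contact or user shared across opportunities is only added once
        var links = [];
        results.records.forEach(function(opp) {
          nodes[opp.Id] = { name: opp.Name, type: 'opportunity' };
          var contactRoles = opp.OpportunityContactRoles ? opp.OpportunityContactRoles.records : [];
          contactRoles.forEach(function(cr) {
            nodes[cr.ContactId] = { name: cr.Contact.Name, type: 'contact' };
            links.push({ source: opp.Id, target: cr.ContactId, role: cr.Role });
          });
          var teamMembers = opp.OpportunityTeamMembers ? opp.OpportunityTeamMembers.records : [];
          teamMembers.forEach(function(tm) {
            nodes[tm.UserId] = { name: tm.User.Name, type: 'user' };
            links.push({ source: opp.Id, target: tm.UserId, role: tm.TeamMemberRole });
          });
        });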
    

  • Extract ADFS signing certificate from the Federation Metadata URL

    I did not write this, but I liked it, so I thought I would pass it on!

    Use this script to extract the ADFS signing certificate from the FederationMetadata URL (https://sts.yourserver.com/FederationMetadata/2007-06/FederationMetadata.xml):

    https://raw.githubusercontent.com/bergie/passport-saml/master/docs/adfs/retrieve_adfs_certificate.sh


  • My script/procedure to move Hyper-V VMs to Azure

    We have been moving resources from ESXi to Hyper-V to Azure. ESXi to Hyper-V is done via the Microsoft Virtual Machine Converter (MVMC). Here is the checklist/script/procedure I have been using to get from Hyper-V to Azure.

    1. Once the machine is in Hyper-V, make sure the VM’s disks are VHD and not VHDX (a Convert-VHD sketch follows at the end of this post)
    2. Make sure DHCP is set on the VM
    3. Make sure RDP is enabled (ours is set via group policy)
    4. Power down the VM
    5. Run the PowerShell below to upload the disk(s) (Add-AzurermVhd) and create a new VM in Azure:
    Login-AzureRmAccount
    $VMName="NAMEOFMACHINE"
    $DestinationVMSize="Standard_A1"
    $DestinationAvailabilitySet="AvailabilitySetName"
    $PrivateIpAddress="192.168.5.55"
    $ResourceGroupName="YourResourceGroup"
    $DestinationNetworkName="YourNetwork"
    $DestinationNetworkSubnet="YourLanSubnet"
    $Location="East US 2"
    $OSType="Windows"
    [switch]$DataDisk=$false
    $DataDiskSize=100 # in GB; only used when $DataDisk is set
    $SourceSystemLocalFilePath="C:\PathToYour\VHDs\$($VMName)-System.vhd"
    $SourceDataLocalFilePath="C:\PathToYour\VHDs\$($VMName)-Data.vhd"
    $DestinationStorageAccountName="yourstorageaccount"
    $DestinationSystemDiskUri="http://$DestinationStorageAccountName.blob.core.windows.net/vhds/$VMName-System.vhd"
    $DestinationDataDiskUri="http://$DestinationStorageAccountName.blob.core.windows.net/vhds/$VMName-Data.vhd"
    $DestinationSystemDiskName="$($VMName)_SYSTEM.vhd"
    $DestinationDataDiskName="$($VMName)_DATA01.vhd"

    # Upload the VHD(s) to the destination storage account
    Add-AzurermVhd -Destination $DestinationSystemDiskUri -LocalFilePath $SourceSystemLocalFilePath -ResourceGroupName $ResourceGroupName
    if ($DataDisk){
        Add-AzurermVhd -Destination $DestinationDataDiskUri -LocalFilePath $SourceDataLocalFilePath -ResourceGroupName $ResourceGroupName
    }

    #region Build New VM
    $DestinationVM = New-AzureRmVMConfig -VMName $VMName -VMSize $DestinationVMSize -AvailabilitySetId $(Get-AzureRmAvailabilitySet -ResourceGroupName $ResourceGroupName -Name $DestinationAvailabilitySet).Id
    $nicName="$($VMName)_NIC01"
    $vnet = Get-AzureRmVirtualNetwork -Name $DestinationNetworkName -ResourceGroupName $ResourceGroupName
    $subnet = $vnet.Subnets | where {$_.Name -eq $DestinationNetworkSubnet}
    $nic = New-AzureRmNetworkInterface -Name $nicName -ResourceGroupName $ResourceGroupName -Location $Location -SubnetId $subnet.Id -PrivateIpAddress $PrivateIpAddress
    $DestinationVM = Add-AzureRmVMNetworkInterface -VM $DestinationVM -Id $nic.Id

    # Attach the uploaded disks instead of creating new ones
    If ($OSType -eq "Windows"){
        $DestinationVM = Set-AzureRmVMOSDisk -VM $DestinationVM -Name $DestinationSystemDiskName -VhdUri $DestinationSystemDiskUri -Windows -CreateOption attach
        if ($DataDisk){
            $DestinationVM = Add-AzureRmVMDataDisk -VM $DestinationVM -Name $DestinationDataDiskName -VhdUri $DestinationDataDiskUri -CreateOption attach -DiskSizeInGB $DataDiskSize
        }
    }
    #endregion

    New-AzureRmVM -ResourceGroupName $ResourceGroupName -Location $Location -VM $DestinationVM
    

    The most important part is to use “-CreateOption attach” with “Set-AzureRmVMOSDisk”, so Azure attaches the uploaded disk rather than creating a fresh one.
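
    For step 1, the VHDX-to-VHD conversion can be scripted as well. A minimal sketch using Hyper-V’s Convert-VHD cmdlet (paths are examples; Azure also needs fixed-size VHDs, hence -VHDType Fixed):

    # Convert a VHDX to the fixed-size VHD format that Azure expects
    Convert-VHD -Path "C:\PathToYour\VHDs\NAMEOFMACHINE-System.vhdx" -DestinationPath "C:\PathToYour\VHDs\NAMEOFMACHINE-System.vhd" -VHDType Fixed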

    Hope that helps someone.


  • Using Let’s Encrypt, certbot-auto with Apache on CentOS 6

    There are plenty of better documented examples out there, so this is more of a note to self.

    cd /opt
    mkdir YourDir
    cd YourDir/
    wget https://dl.eff.org/certbot-auto
    chmod a+x certbot-auto
    
    ./certbot-auto --apache certonly -d www.FirstDomain.com -d FirstDomain.com -d www.SecondDomain.com -d SecondDomain.com -d www.ThirdDomain.com -d ThirdDomain.com -d www.FourthDomain.com -d FourthDomain.com
    

    The name on the cert will be the first domain you list in the command above. All the other names will be part of the SAN cert.

    And to renew, cron this up:
    /opt/YourDir/certbot-auto renew
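
    For example (my addition; the schedule is arbitrary), a crontab entry could run the check daily. Only certificates close to expiry are actually renewed:

    # added via crontab -e: run the renewal check every day at 3am
    0 3 * * * /opt/YourDir/certbot-auto renew --quiet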