Forcing an update of the BitLocker recovery key to AD

Use the following commands to back up the recovery key of a BitLocker-enabled drive to Active Directory:

manage-bde -protectors -get c:

BitLocker Drive Encryption: Configuration Tool version 6.1.7600
Copyright (C) Microsoft Corporation. All rights reserved.

Volume C: [Windows]
All Key Protectors

    Numerical Password:
      ID: {9557D616-0BD0-4B2A-8A2A-9DD4C5C21CCC}
      Password:
        527560-068585-114378-134288-010131-496430-662706-631224

    TPM:
      ID: {5EB69F42-4ABC-4D6B-87C5-C894A3840FC4} 

manage-bde -protectors -adbackup c: -id {9557D616-0BD0-4B2A-8A2A-9DD4C5C21CCC}
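
On newer versions of Windows (8 / Server 2012 and up), the BitLocker PowerShell module can do the same thing. A minimal sketch, assuming the module is available:

# Find the numerical password (recovery) protector on C: and back it up to AD
$volume = Get-BitLockerVolume -MountPoint "C:"
$recovery = $volume.KeyProtector | Where-Object { $_.KeyProtectorType -eq "RecoveryPassword" }
Backup-BitLockerKeyProtector -MountPoint "C:" -KeyProtectorId $recovery.KeyProtectorId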

Hope that helps someone.


My PowerShell scripts to encrypt Azure VM disks

These are the steps I took from this very long document.

First we need to create a Key Vault and an AAD application, then connect them. Make note of the output of $aadClientID.

$KeyVaultName="YourName-EastUS"
$ResourceGroupName="Default-EastUS"
$Location="East US"


#Create New KeyVault
New-AzureRmKeyVault -VaultName $KeyVaultName -ResourceGroupName $ResourceGroupName -Location $Location

#Create New AAD Application
$aadClientSecret = "YourLongSecret"
$azureAdApplication = New-AzureRmADApplication -DisplayName "Encryption-EastUS" -HomePage "https://IThinkAnythingCanGoHere" -IdentifierUris "https://IThinkAnythingCanGoHereURi" -Password $aadClientSecret
$servicePrincipal = New-AzureRmADServicePrincipal -ApplicationId $azureAdApplication.ApplicationId
$aadClientID = $azureAdApplication.ApplicationId
$aadClientID
Set-AzureRmKeyVaultAccessPolicy -VaultName $KeyVaultName -ServicePrincipalName $aadClientID -PermissionsToKeys all -PermissionsToSecrets all -ResourceGroupName $ResourceGroupName;
Set-AzureRmKeyVaultAccessPolicy -VaultName $KeyVaultName -EnabledForDiskEncryption
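
To sanity-check the vault afterwards, Get-AzureRmKeyVault shows the access policies and whether disk encryption is enabled:

#Verify the vault settings took
Get-AzureRmKeyVault -VaultName $KeyVaultName -ResourceGroupName $ResourceGroupName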

Once that is set up, you can encrypt a VM:

$KeyVaultName="YourName-EastUS"
$ResourceGroupName="Default-EastUS"
$Location="East US"
$vmName="VMNAME"

$aadClientSecret = "YourLongSecret"
$aadClientID = "YouMadeNoteOfThisAbove"
$KeyVault = Get-AzureRmKeyVault -VaultName $KeyVaultName -ResourceGroupName $ResourceGroupName;
$diskEncryptionKeyVaultUrl = $KeyVault.VaultUri;
$KeyVaultResourceId = $KeyVault.ResourceId;

Set-AzureRmVMDiskEncryptionExtension -ResourceGroupName $ResourceGroupName -VMName $vmName -AadClientID $aadClientID -AadClientSecret $aadClientSecret -DiskEncryptionKeyVaultUrl $diskEncryptionKeyVaultUrl -DiskEncryptionKeyVaultId $KeyVaultResourceId;
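
To check on the encryption afterwards (it can take a while), Get-AzureRmVMDiskEncryptionStatus reports the state of the OS and data volumes:

#Check encryption progress/state
Get-AzureRmVMDiskEncryptionStatus -ResourceGroupName $ResourceGroupName -VMName $vmName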

If you did not make note of your aadClientID, you can run:

Get-AzureRmADApplication

And the ApplicationId is what you are looking for.
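
If you have a lot of applications, you can narrow the list down by display name (assuming your AzureRM version has the -DisplayNameStartWith parameter):

Get-AzureRmADApplication -DisplayNameStartWith "Encryption" | Select-Object DisplayName, ApplicationId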

I forgot how I set this up, so I went back and made some notes, and now I hope this helps someone.


Using git and a post-receive hook to update production node.js apps

I have been trying to figure out the best way to deploy and maintain node.js apps in development and production. If I have a local git repo on my machine and I want to push it to production, what is the best way to do it? I don’t think the .git files should be on the server, and I don’t keep my modules in the repo either, so I need a way to push updates and make sure the newest dependencies end up on the server.
I figured out that people are using a post-receive hook to update the site. This is what I ended up with. Put it in a file named post-receive in the hooks folder of the repo on the server (not in your local repo):

#!/bin/sh
# Check the pushed code out into the app directory, then refresh dependencies
GIT_WORK_TREE=/opt/node/nodapp
git --work-tree=$GIT_WORK_TREE checkout --force
cd $GIT_WORK_TREE
npm install

I may take this a step further and recycle pm2, but that is another post!


Using PowerShell to extract all contacts from MS CRM 2011

We are moving to Salesforce from MS CRM 2011. We need to get our data out so we can import it into Salesforce. Here is the PowerShell script I am using to export contacts to CSV.

$url="http://crm.sardverb.com/Company/xrmservices/2011/OrganizationData.svc/ContactSet?`$filter=StatusCode/Value eq 1"

$assembly = [Reflection.Assembly]::LoadWithPartialName("System.Web.Extensions")
$count=0
$output = @()

# Fetch one page of results from the OData endpoint as JSON
function GetData ($url) {
    $webclient = New-Object System.Net.WebClient
    $webclient.UseDefaultCredentials = $true
    $webclient.Headers.Add("Accept", "application/json")
    $webclient.Headers.Add("Content-Type", "application/json; charset=utf-8")
    return $webclient.DownloadString($url)
}

# The endpoint only returns a page of records at a time, so keep
# following the __next link until there are no pages left
while ($url){
    $data = GetData $url | ConvertFrom-Json
    $output += $data
    $count = $count + $data.d.results.length
    Write-Host $count
    if ($data.d.__next){
        $url = $data.d.__next.ToString()
    }
    else {
        $url = $null
    }
}

# Flatten the ParentCustomerId lookup to its Id, drop the metadata, and export
$output.d.results | Select-Object @{l="ParentCustomerID";e={$_.ParentCustomerId.Id}},* -ExcludeProperty ParentCustomerId,__metadata | Export-Csv -NoTypeInformation C:\Contact.csv
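
The same loop should work for any other entity set by just changing the starting URL; for example, accounts (hypothetical filter, and the Select-Object flattening at the end would need adjusting to the account fields):

$url="http://crm.sardverb.com/Company/xrmservices/2011/OrganizationData.svc/AccountSet?`$filter=StatusCode/Value eq 1"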

Hope that helps someone.


When using PowerShell to pull REST data from MS CRM, escape `$filter!

Note to self.

When trying to filter a REST response in PowerShell using the “$filter” parameter in the URL (as with MS CRM 2011), you must escape the “$” with a backtick: “`$”.

For example:

Does not work:
$url="http://crmserver.company.com/Organization/xrmservices/2011/OrganizationData.svc/ContactSet?$filter=StateCode/Value eq 0"

Works:
$url="http://crmserver.company.com/Organization/xrmservices/2011/OrganizationData.svc/ContactSet?`$filter=StateCode/Value eq 0"

Gets me every time, and I can’t figure out why my filters are being ignored!
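
Another option: single-quoted strings in PowerShell are literal, so nothing gets expanded and no escaping is needed at all:

#Single quotes keep $filter from being interpolated
$url='http://crmserver.company.com/Organization/xrmservices/2011/OrganizationData.svc/ContactSet?$filter=StateCode/Value eq 0'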


Using jsforce and node.js to connect to Salesforce

I wanted to write a node.js app to pull data from Salesforce, and found the NPM library jsforce. I added it to the dependencies in my package.json:

  "dependencies": {
    "express": "*",
    "dotenv": "*",
    "jsforce": "*"
  }

I also added “dotenv”, which I use to load my client secret and all configuration data from a hidden .env file. That file is not in my git repo, so I can have different values in production and development.

Here is what I have in my .env file:

CLIENTID=zWHRIM8F87FChMcfHpZKS9LhQeeLwfthDbaiL9iXNO7ZBwfUwFPFqpDzC2HruNkJfIxrOdeITtftxBg20WEIm
CLIENTSECRET=123456789987654
REDIRECTURI=localhost
USERNAME=username@yourdomain.com
PASSWORD=PASSWORDANDCODE
LOGINURL=https://sitename-dev-ed.my.salesforce.com

Here is the code to pull in the .env values, define the OAuth2 connection, and log in:

var dotenv = require('dotenv').load(); // loads the .env values into process.env
var jsforce = require('jsforce');

var conn = new jsforce.Connection({
  oauth2 : {
      loginUrl : process.env.LOGINURL,
      clientId : process.env.CLIENTID,
      clientSecret : process.env.CLIENTSECRET,
      redirectUri : process.env.REDIRECTURI
    }
});
var username = process.env.USERNAME;
var password = process.env.PASSWORD;
conn.login(username, password, function(err, userInfo) {
  if (err) { return console.error(err); }
  console.log(conn.accessToken);
  console.log(conn.instanceUrl);
  console.log("User ID: " + userInfo.id);
  console.log("Org ID: " + userInfo.organizationId);
});

Once connected and logged in, we can query using SOQL. This query pulls all Opportunities, their contacts with contact roles, and their team members with team member roles. I am using it to show the relationships between Opportunities and their Contacts and team members using d3.js. More on that later.

    var query = "SELECT Id, Name,(SELECT Contact.Name,Contact.Email,Contact.Id,Contact.AccountId,ContactId,Role,Contact.Account.Name FROM OpportunityContactRoles),(SELECT User.Name,User.Email,User.Id,UserId,TeamMemberRole FROM OpportunityTeamMembers) FROM Opportunity"
    conn.query(query, function(err, results) {
      if (err) { return console.error(err); }
      console.log("Query: " + results.totalSize)
      console.log(JSON.stringify(results, null, 2))
    });

My script/procedure to move Hyper-V VMs to Azure

We have been moving resources from ESXi to Hyper-V to Azure. ESXi to Hyper-V is done via the Microsoft Virtual Machine Converter (MVMC). Here is the checklist/script/procedure I have been using to get from Hyper-V to Azure.

  1. Once the machine is in Hyper-V, make sure the VM's disks are VHD and not VHDX (see the Convert-VHD note after the script)
  2. Make sure DHCP is set on the VM
  3. Make sure RDP is enabled (ours is set via group policy)
  4. Power down VM
  5. Run the PowerShell below to upload the disks (Add-AzureRmVhd) and create a new VM in Azure:
Login-AzureRmAccount
$VMName="NAMEOFMACHINE"
$DestinationVMSize="Standard_A1"
$DestinationAvailabilitySet="AvailabilitySetName"
$PrivateIpAddress="192.168.5.55"
$ResourceGroupName="YourResourceGroup"
$DestinationNetworkName="YourNetwork"
$DestinationNetworkSubnet="YourLanSubnet"
$Location="East US2"
$OSType="Windows"
[switch]$DataDisk=$false
$SourceSystemLocalFilePath="C:\PathToYour\VHDs\$($VMName)-System.vhd"
$SourceDataLocalFilePath="C:\PathToYour\VHDs\$($VMName)-Data.vhd"
$DestinationStorageAccountName="yourstorageaccount"
$DestinationSystemDiskUri= "http://$DestinationStorageAccountName.blob.core.windows.net/vhds/$VMName-System.vhd"
$DestinationDataDiskUri= "http://$DestinationStorageAccountName.blob.core.windows.net/vhds/$VMName-Data.vhd"
$DestinationSystemDiskName="$($VMName)_SYSTEM.vhd"
$DestinationDataDiskName="$($VMName)_DATA01.vhd"
$DataDiskSize=127 # GB; placeholder - only used when $DataDisk is $true, set to your data disk's size
 
Add-AzurermVhd -Destination $DestinationSystemDiskUri -LocalFilePath $SourceSystemLocalFilePath -ResourceGroupName $ResourceGroupName
if ($DataDisk){
Add-AzurermVhd -Destination $DestinationDataDiskUri -LocalFilePath $SourceDataLocalFilePath -ResourceGroupName $ResourceGroupName
}
 
#region Build New VM
$DestinationVM = New-AzureRmVMConfig -VMName $VMName -VMSize $DestinationVMSize -AvailabilitySetId $(Get-AzureRmAvailabilitySet -ResourceGroupName $ResourceGroupName -Name $DestinationAvailabilitySet).Id
$nicName="$($VMName)_NIC01" 
$vnet = Get-AzureRmVirtualNetwork -Name $DestinationNetworkName -ResourceGroupName $ResourceGroupName
$subnet = $vnet.Subnets | where {$_.Name -eq $DestinationNetworkSubnet}
$nic = New-AzureRmNetworkInterface -Name $nicName -ResourceGroupName $ResourceGroupName -Location $Location -SubnetId $Subnet.Id -PrivateIpAddress $PrivateIpAddress
$DestinationVM = Add-AzureRmVMNetworkInterface -VM $DestinationVM -Id $nic.Id
 
If ($OSType -eq "Windows"){
$DestinationVM = Set-AzureRmVMOSDisk -VM $DestinationVM -Name $DestinationSystemDiskName -VhdUri $DestinationSystemDiskUri -Windows -CreateOption attach
if ($DataDisk){
$DestinationVM = Add-AzureRmVMDataDisk -VM $DestinationVM -Name $DestinationDataDiskName -VhdUri $DestinationDataDiskUri -CreateOption attach -DiskSizeInGB $DataDiskSize
}
}
 
New-AzureRmVM -ResourceGroupName $resourceGroupName -Location $Location -VM $DestinationVM

The most important part is to use “-CreateOption attach” with “Set-AzureRmVMOSDisk”, since the disk already has an OS on it.
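
For step 1, if a disk is still VHDX, the Hyper-V module's Convert-VHD cmdlet can convert it before upload; a quick sketch with placeholder paths (Azure wants fixed-size VHDs, hence -VHDType Fixed):

#Run on the Hyper-V host before uploading
Convert-VHD -Path "C:\PathToYour\VHDs\NAMEOFMACHINE-System.vhdx" -DestinationPath "C:\PathToYour\VHDs\NAMEOFMACHINE-System.vhd" -VHDType Fixed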

Hope that helps someone.


Using Let’s Encrypt’s certbot-auto with Apache on CentOS 6

There are plenty of better documented examples out there, so this is more of a note to self.

cd /opt
mkdir YourDir
cd YourDir/
wget https://dl.eff.org/certbot-auto
chmod a+x certbot-auto

./certbot-auto --apache certonly -d www.FirstDomain.com -d FirstDomain.com -d www.SecondDomain.com -d SecondDomain.com -d www.ThirdDomain.com -d ThirdDomain.com -d www.FourthDomain.com -d FourthDomain.com

The name on the cert will be the first domain you list in the command above. All the other names will be included as Subject Alternative Names (SAN).

And to renew, cron this up:
/opt/YourDir/certbot-auto renew

