Moving to the new (and future) Azure PowerShell Module: Az

It looks like we need to move to the new Az module. It is not required, but future functionality will not be added to AzureRM, so I decided to make the switch. Here is how I went about it.

First, to enable backwards compatibility, you need to add the command "Enable-AzureRmAlias" to your profile. You can edit your profile and append the line with:

notepad $profile

or you can just append it by:

Add-Content $profile "`nEnable-AzureRmAlias"

(remember, ISE has its own $profile, so you may need to modify that one too)

Once you have “Enable-AzureRmAlias” in your profile, all your old scripts should still work.
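
For example, with the alias in place an old AzureRM cmdlet name is just an alias for its Az counterpart. A quick sanity check (assuming you are already signed in with Connect-AzAccount):

Get-Alias Get-AzureRmResourceGroup          # resolves to Get-AzResourceGroup
Get-AzureRmResourceGroup | Select-Object ResourceGroupName, Location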

Next, I wanted to remove all the old AzureRM modules. I had several versions installed, so it took a long time! Note: this code removes any module whose name starts with Azure*.

foreach ($module in (Get-Module -ListAvailable Azure*).Name) {
    Write-Host "Removing module $module"
    Uninstall-Module $module -Force
}

Now that we are feeling clean, add the new module:

Install-Module Az

For some reason it didn’t install the new Az resource graph module, so I added it:

Install-Module Az.ResourceGraph
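
To double-check what actually ended up installed (same pattern I used above to find the old modules):

Get-Module -ListAvailable Az* | Select-Object Name, Version | Sort-Object Name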

I am ready for the future. Hope that helps someone.


Code to query Azure Load Balancer Metrics to verify Availability (VipAvailability)

This one was fun to put together.

I wanted to write code to query the status of an Azure Load Balancer. I couldn’t find much out there. This code queries the Azure Load Balancer’s metrics for VipAvailability through the REST API. If it returns 100 then you are good to go. Anything else, and there may be an issue. You can query any metric and set any time range; I am just looking at the last minute.

Note: This is for a Standard Load Balancer, not Basic.

Some of the Metrics Available:

VipAvailability : Average count of availability of VIP endpoints, based on probe results.
DipAvailability : Average count of availability of DIP endpoints, based on probe results.
ByteCount : Total number of bytes processed per front-end.
PacketCount : Total number of packets processed per front-end.
SynCount : Total number of SYN packets received.
SnatConnectionCount : Total number of new SNAT connections, that is, outbound connections that are masqueraded to the Public IP address front-end.

And the same metrics are often referred to by different names (this was confusing to me):

value               localizedValue                
-----               --------------                
VipAvailability     Data Path Availability        
DipAvailability     Health Probe Status           
ByteCount           Byte Count                    
PacketCount         Packet Count                  
SYNCount            SYN Count                     
SnatConnectionCount SNAT Connection Count         
AllocatedSnatPorts  Allocated SNAT Ports (Preview)
UsedSnatPorts       Used SNAT Ports (Preview) 
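
Those value / localizedValue pairs come from the metric definitions. If you want to pull them yourself, you can list the metric definitions for the load balancer using the same auth header that is built in the code below (a sketch; the load balancer name is a placeholder and the api-version may need adjusting):

$lbName = "YourLBName"
$defUrl = "https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Network/loadBalancers/$lbName/providers/microsoft.insights/metricDefinitions?api-version=2018-01-01"
(Invoke-RestMethod -Uri $defUrl -Headers $headerParams -Method Get).value |
    Select-Object @{n='value';e={$_.name.value}}, @{n='localizedValue';e={$_.name.localizedValue}}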

Here is the code (bonus: BASH/cURL too) to find the VipAvailability of Azure Load Balancers:

$SubscriptionId = "$($env:SubscriptionId)"
$TenantId       = "$($env:TenantId)" 
$ClientID       = "$($env:ClientID)"      
$ClientSecret   = "$($env:ClientSecret)"  
$TenantDomain   = "$($env:TenantDomain)" 
$loginURL       = "https://login.microsoftonline.com/$TenantId/oauth2/token"
$resource      = "https://management.core.windows.net/" 
$resourceGroupName = "eastUS-01"
$body           = @{grant_type="client_credentials";resource=$resource;client_id=$ClientID;client_secret=$ClientSecret}
$oauth          = Invoke-RestMethod -Method Post -Uri $loginURL -Body $body
$headerParams = @{'Authorization'="$($oauth.token_type) $($oauth.access_token)"}

$start=((Get-Date).AddMinutes(-1)).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:00Z")
$end=(Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:00Z")
$filter = "(name.value eq 'VipAvailability') and aggregationType eq 'Average' and startTime eq $start and endTime eq $end and timeGrain eq duration'PT1M'"
$url = "https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Network/loadBalancers/jemurphyLB01/providers/microsoft.insights/metrics?`$filter=${filter}&api-version=2016-09-01"
$results=Invoke-RestMethod -Uri $url -Headers $headerParams -Method Get
$results.value | select -ExpandProperty data | select timestamp,average
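
Since all I really care about is whether that average is 100, a quick pass/fail check on the PowerShell result might look like this (a sketch, assuming the query returned at least one data point):

$latest = ($results.value | Select-Object -ExpandProperty data | Select-Object -Last 1).average
if ($latest -eq 100) { Write-Host "VipAvailability looks healthy" }
else { Write-Warning "VipAvailability is '$latest' - there may be an issue" }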

And the BASH/cURL version:

SUBSCRIPTIONID=""
RESOURCEGROUPNAME=""
CLIENTID=""
CLIENTSECRET=""
TENANTID=""
LBNAME=""

LOGINURL="https://login.microsoftonline.com/$TENANTID/oauth2/token"
RESOURCE="https://management.core.windows.net/" 

TOKEN=$(curl --silent --request POST $LOGINURL --data-urlencode "resource=https://management.core.windows.net" --data-urlencode "client_id=$CLIENTID" --data-urlencode "grant_type=client_credentials" --data-urlencode "client_secret=$CLIENTSECRET" | jq -r '.access_token')

STARTTIME=$(date -u +'%Y-%m-%dT%H:%M:00' --date='-1 min')
ENDTIME=$(date -u +'%Y-%m-%dT%H:%M:00')

FILTER="(name.value eq 'VipAvailability') and aggregationType eq 'Average' and startTime eq $STARTTIME and endTime eq $ENDTIME and timeGrain eq duration'PT1M'"
URL="https://management.azure.com/subscriptions/$SUBSCRIPTIONID/resourceGroups/$RESOURCEGROUPNAME/providers/Microsoft.Network/loadBalancers/$LBNAME/providers/microsoft.insights/metrics"

RESULTS=$(curl -s -G --header "authorization: Bearer $TOKEN" --data-urlencode "\$filter=$FILTER" --data-urlencode "api-version=2016-09-01" $URL | jq .value[].data[].average)

echo "$RESULTS"

I think the hardest part was trying to get the date and time in the right format. Why is that so hard?

This HAS to be helpful to someone!


Note to self: cURL with data-urlencode for GET/QueryString values

I know I would lose this if I didn’t blog it.
With cURL, you can use “--data-urlencode” with query string params on a GET if you include the “-G” parameter. Of course you still have to escape things, but I found it easier to add each query string param separately. All the examples I could find were for POSTs.

FILTER="ReallyLongStringWIth"$VARS" SPACES and ' SINGLE quotes and a &"
curl -s -G --header "authorization: Bearer $TOKEN" --data-urlencode "\$filter=$FILTER" --data-urlencode "api-version=2016-09-01" $URL 
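
For what it’s worth, the rough PowerShell equivalent when you are building the query string by hand is [uri]::EscapeDataString (a sketch; $baseUrl and $headerParams here are placeholders):

$filter = "Really long string with SPACES and ' single quotes and a &"
$url    = $baseUrl + '?$filter=' + [uri]::EscapeDataString($filter) + '&api-version=2016-09-01'
Invoke-RestMethod -Uri $url -Headers $headerParams -Method Get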


Using Azure Traffic Manager with IP White-listed resources

The question was, how can you use Azure Traffic Manager if the destinations are restricted with IP white lists?
This is the only way I could find:

  1. There is a blob that contains the source IPs of the probes; the file’s URL is in the code below (the Traffic Manager docs also reference it).
  2. This list would need to be queried often, because I couldn’t find any indication of when it gets updated.
  3. I wrote the PowerShell below to parse the results and put them into an NSG rule:
$RGName= "Your RG Name"
$NSGName = "Your NSG Name"
$NSGRuleName = "Your Rule Name"
$Priority = 120
$DestinationPortRange = 443
$url="https://azuretrafficmanagerdata.blob.core.windows.net/probes/azure/probe-ip-ranges.json"
$results=Invoke-RestMethod -Uri $url
[email protected]()
foreach ($address in $results.ipv4_prefixes){
$allAddresses += $address.ip_prefix
}
# for some reason, get-AzureRmNetworkSecurityRuleConfig errors out if there is no matchin name
# could use a try - catch
if  ((Get-AzureRmNetworkSecurityGroup -ResourceGroupName $RGName -Name $NSGName | get-AzureRmNetworkSecurityRuleConfig -Name $NSGRuleName -ErrorAction SilentlyContinue) -eq $null){
# Creating RUle
Get-AzureRmNetworkSecurityGroup -ResourceGroupName $RGName -Name $NSGName | `
Add-AzureRmNetworkSecurityRuleConfig -Name $NSGRuleName -Description "Allow Probe from ATM" -Access Allow -Protocol Tcp -Direction Inbound -Priority $Priority -SourceAddressPrefix $allAddresses -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange $DestinationPortRange | Set-AzureRmNetworkSecurityGroup
}
else {
# Updating Rule
Get-AzureRmNetworkSecurityGroup -ResourceGroupName $RGName -Name $NSGName | `
Set-AzureRmNetworkSecurityRuleConfig -Name $NSGRuleName -Description "Allow Probe from ATM" -Access Allow -Protocol Tcp -Direction Inbound -Priority $Priority -SourceAddressPrefix $allAddresses -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange $DestinationPortRange | Set-AzureRmNetworkSecurityGroup
}
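
As the comment in the script says, the existence check could also be written with a try/catch instead of -ErrorAction SilentlyContinue; a rough sketch:

try {
    $null = Get-AzureRmNetworkSecurityGroup -ResourceGroupName $RGName -Name $NSGName |
        Get-AzureRmNetworkSecurityRuleConfig -Name $NSGRuleName -ErrorAction Stop
    $ruleExists = $true
}
catch {
    $ruleExists = $false
}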

Hope that helps.


PowerShell to move a VM to a new Log Analytics Workspace

This code uninstalls the Microsoft Monitoring Agent extension and re-installs it pointed at a new workspace.

# change to your VM name and its resource group
$vm = get-azurermvm -VMName YourVMName -ResourceGroupName VMResourceGroup
Remove-AzureRmVMExtension -ResourceGroupName $vm.ResourceGroupName -VMName $vm.Name -Name MicrosoftMonitoringAgent -Force
# put in your new workspaceId & workspaceKey
$workspaceId = "NewWorksSpaceID"
$workspaceKey = "SupaSecretKey"

$PublicSettings = @{"workspaceId" = $workspaceId;"stopOnMultipleConnections" = $false}
$ProtectedSettings = @{"workspaceKey" = $workspaceKey}

Set-AzureRmVMExtension -ExtensionName "MicrosoftMonitoringAgent" -ResourceGroupName $vm.resourcegroupname -VMName $vm.name `
-Publisher "Microsoft.EnterpriseCloud.Monitoring" `
-ExtensionType "MicrosoftMonitoringAgent" `
-TypeHandlerVersion 1.0 `
-Settings $PublicSettings `
-ProtectedSettings $ProtectedSettings `
-Location $vm.Location
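
Afterwards, a quick way to confirm the extension came back healthy (just a check, not part of the move itself):

Get-AzureRmVMExtension -ResourceGroupName $vm.ResourceGroupName -VMName $vm.Name -Name "MicrosoftMonitoringAgent" |
    Select-Object Name, ProvisioningState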

Nothing special, just thought I would put it here. Maybe it will help someone?


Use the REST API to create a new Project in Azure DevOps

As the title says, I wanted to create a new project in VSTS / Azure DevOps, whatever you want to call it. Here is the code to do that. You need a Personal Access Token to authenticate with.

$User="[email protected]"
$PAT="YourPAT"
$Organization="YourOrg"
$base64authinfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $User, $PAT)))
$url="https://dev.azure.com/$Organization/_apis/projects?api-version=5.1-preview.4"
$body = @"
{
  "name": "FabrikamTravel",
  "description": "Frabrikam travel app for Windows Phone",
  "capabilities": {
    "versioncontrol": {
      "sourceControlType": "Git"
    },
    "processTemplate": {
      "templateTypeId": "6b724908-ef14-45cf-84f8-768b5384da45"
    }
  }
}
"@
Invoke-RestMethod -Method POST -ContentType application/json -Uri $url -Headers @{Authorization=("Basic {0}" -f $base64authinfo)} -Body $Body
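
The create call is queued asynchronously, so to confirm the project actually shows up you can list the projects with the same credentials (a sketch):

$listUrl = "https://dev.azure.com/$Organization/_apis/projects?api-version=5.1-preview.4"
(Invoke-RestMethod -Method Get -Uri $listUrl -Headers @{Authorization=("Basic {0}" -f $base64authinfo)}).value |
    Select-Object name, state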

Hope that helps someone?


Using PowerShell to query Azure Log Analytics via the REST API

I wanted to pull some data out of Azure Log Analytics using PowerShell and the REST API. Here is the code to pull all errors from the Application event logs on VMs that are pushing their logs into Log Analytics via the MicrosoftMonitoringAgent.

Hopefully this may help someone:

$SubscriptionId = "$($env:SubscriptionId)"
$TenantId       = "$($env:TenantId)" 
$ClientID       = "$($env:ClientID)"      
$ClientSecret   = "$($env:ClientSecret)"  
$TenantDomain   = "$($env:TenantDomain)" 
$loginURL       = "https://login.microsoftonline.com/$TenantId/oauth2/token"
$resource       = "https://api.loganalytics.io"         

$body           = @{grant_type="client_credentials";resource=$resource;client_id=$ClientID;client_secret=$ClientSecret}
$oauth          = Invoke-RestMethod -Method Post -Uri $loginURL -Body $body
$headerParams = @{'Authorization'="$($oauth.token_type) $($oauth.access_token)"}

$Workspacename="Your WS Name"
$WorkspaceId="Your WS ID"

$url="https://api.loganalytics.io/v1/workspaces/$WorkspaceId/query"
$body = @{query = 'Event | where EventLog == "Application" | where EventLevelName == "Error" | order by TimeGenerated asc | project Computer,EventLog,Source,EventLevelName,EventID,RenderedDescription,TimeGenerated'} | ConvertTo-Json
$webresults=Invoke-RestMethod -UseBasicParsing -Headers $headerParams -Uri $url -Method Post -Body $body -ContentType "application/json"

Notes:

  1. I keep my subscription information in env variables. It is easier for me to switch to a different tenant.
  2. This returns the results in tables. To move the tables into an object, look at this person’s code at line 60: https://blog.tyang.org/2017/11/14/searching-oms-using-the-new-search-language-kusto-rest-api-in-powershell/
  3. My interpretation of the code in #2:
$resultsTable = $webresults   # Invoke-RestMethod has already converted the JSON response
$count = 0
foreach ($table in $resultsTable.Tables) {
    $count += $table.Rows.Count
}
$results = New-Object object[] $count
$i = 0
foreach ($table in $resultsTable.Tables) {
    foreach ($row in $table.Rows) {
        # Create a dictionary of properties
        $properties = @{}
        for ($columnNum = 0; $columnNum -lt $table.Columns.Count; $columnNum++) {
            $properties[$table.Columns[$columnNum].name] = $row[$columnNum]
        }
        $results[$i] = (New-Object PSObject -Property $properties)
        $null = $i++
    }
}
$results
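
Once converted, the rows behave like regular objects, so the usual PowerShell filtering and grouping works, for example:

$results | Group-Object Computer | Sort-Object Count -Descending | Select-Object Name, Count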




Removing machines from Azure State Configuration (DSC)

I have been provisioning machines over and over trying to learn all the VM extensions. One of the extensions that I have been playing with is the DSC extension. Every time I provision with this extension, it adds an additional record into State Configuration, resulting in many stale machines. I wanted to clear out all the old machines. I couldn’t find a way to do it in PowerShell, so I figured out how to do it via the REST API (and PowerShell).

Here is the code to remove all machines from Azure State Configuration (DSC)

$SubscriptionId = "$($env:SubscriptionId)"
$TenantId       = "$($env:TenantId)" 
$ClientID       = "$($env:ClientID)"      
$ClientSecret   = "$($env:ClientSecret)"  
$TenantDomain   = "$($env:TenantDomain)" 
$loginURL       = "https://login.microsoftonline.com/$TenantId/oauth2/token"
$resource       = "https://management.core.windows.net/"    
$resourceGroupName = "YourResourceGroupName "
$automationAccountsName ="YourAutomationAccountsName "

# get the OAUTH token & prepare header
$body           = @{grant_type="client_credentials";resource=$resource;client_id=$ClientID;client_secret=$ClientSecret}
$oauth          = Invoke-RestMethod -Method Post -Uri $loginURL -Body $body
$headerParams = @{'Authorization'="$($oauth.token_type) $($oauth.access_token)"}

# main query to find all the nodes
$url="https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Automation/automationAccounts/$automationAccountsName/nodes?api-version=2018-01-15"
$results=Invoke-RestMethod -Uri $url -Headers $headerParams -Method Get
# Loop through all the nodes and delete them all.
foreach ($node in $($results.value | Select-Object -ExpandProperty properties | Select-Object nodeId)){
    $url="https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Automation/automationAccounts/$automationAccountsName/nodes/$($node.nodeId)?api-version=2018-01-15"
    Invoke-RestMethod -Uri $url -Headers $headerParams -Method Delete
}
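
If you would rather see what is registered before deleting anything, the same GET works on its own (a sketch; the property names beyond nodeId are what I would expect from this API, so double-check them on your side):

$url="https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Automation/automationAccounts/$automationAccountsName/nodes?api-version=2018-01-15"
(Invoke-RestMethod -Uri $url -Headers $headerParams -Method Get).value |
    Select-Object -ExpandProperty properties |
    Select-Object nodeId, status, lastSeen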

Notes:

  1. I put all my SPN info into environment variables (easier to switch if needed).
  2. Put in your RG name and Automation Account name.
  3. Warning: this will delete all nodes!

HTH


My WSL/BASH setup

Not sure if this is a “Bootstrap” or not, but I wanted my WSL/Bash home directory to match my Windows home directory. This is the code I use when I set up a new WSL/Bash instance.

This will find your home directory via PowerShell and put it in a variable “$WINHOME”.

Then I make soft links to the directories in my Windows profile (Documents and Downloads).

Finally, I add the first part (the WINHOME lines) to my .bashrc.

# Ask Windows (via powershell.exe) for the Windows user profile path
WINHOME=/mnt/$(powershell.exe -noprofile -noninteractive -command '& {(gci env:USERPROFILE).Value}')
# Convert backslashes to slashes, strip the trailing CR and the drive colon
WINHOME=$(echo $WINHOME | sed 's/\\/\//g' | sed 's/\r$//' | sed 's/\://g' )
# Lowercase the first "C" so the path matches the /mnt/c mount point
WINHOME=$(echo ${WINHOME/C/c})
export WINHOME=$WINHOME

# Soft links into the Windows profile
ln -s $WINHOME/Documents
ln -s $WINHOME/Downloads

My SSL/Certificate Cheatsheet

Whenever a certificate needs to be renewed, I always have to scramble to remember how to update/renew. I finally put a cheat sheet together.

I decided I will do all cert-related stuff from Linux. Here are some commands:

To request a new CSR with a new key:

openssl req -newkey rsa:2048 -keyout yourcompany.com.key -out yourcompany.com.csr

Generating a 2048 bit RSA private key
.............................................................................................+++
.............+++
writing new private key to 'yourcompany.com.key'
Enter PEM pass phrase:
Verifying - Enter PEM pass phrase:
-----
You are about to be asked to enter information that will be incorporated
into your certificate request.
What you are about to enter is what is called a Distinguished Name or a DN.
There are quite a few fields but you can leave some blank
For some fields there will be a default value,
If you enter '.', the field will be left blank.
-----
Country Name (2 letter code) [AU]:US
State or Province Name (full name) [Some-State]:New York
Locality Name (eg, city) []:New York
Organization Name (eg, company) [Internet Widgits Pty Ltd]:Your Company name
Organizational Unit Name (eg, section) []:IT
Common Name (e.g. server FQDN or YOUR name) []:*.yourcompany.com
Email Address []:

Please enter the following 'extra' attributes
to be sent with your certificate request
A challenge password []:
An optional company name []:

To request a new CSR with an existing key:

openssl req -new -key yourcompany.com.key -out yourcompany.com.csr 

To make a PFX from a private key and a cert:

openssl pkcs12 -export -out yourcompany.com.pfx -inkey yourcompany.com.key -in yourcompany.com.crt

To extract the private key and cert from a PFX (3 steps):

Export the private key

openssl pkcs12 -in yourcompany.com.pfx -nocerts -out yourcompany.com.pem -nodes

Export the certificate

openssl pkcs12 -in yourcompany.com.pfx -nokeys -out yourcompany.com.crt

Remove the passphrase from the private key

openssl rsa -in yourcompany.com.pem -out yourcompany.com.key 