• Using Azure Resource Graph for your inventory script

    In the past I have always run Get-AzureRmResourceGroup, then looped through all the resource groups and the VMs inside them. This can be slow, so I put together this script to leverage Azure Resource Graph for your inventory. I hope this helps someone:

    
    # Resource Graph returns at most 5000 rows per call, so page until a partial page comes back
    $batch = Search-AzureRmGraph -Query "where type != ''" -First 5000
    $everything = $batch
    while ($batch.Count -eq 5000) {
        $batch = Search-AzureRmGraph -Query "where type != ''" -First 5000 -Skip $everything.Count
        $everything += $batch
    }
    
    $VMs    = $everything | Where {$_.type -eq 'microsoft.compute/virtualmachines'}
    $NICs   = $everything | Where {$_.type -eq 'microsoft.network/networkinterfaces'}
    $pubIPs = $everything | Where {$_.type -eq 'microsoft.network/publicipaddresses'}
    $NSGs   = $everything | Where {$_.type -eq 'microsoft.network/networksecuritygroups'}
    $VMSizes = @()
    $locations = $VMs | Select location -Unique
    foreach ($location in $locations.location) {
        $sizes = Get-AzureRmVMSize -Location $location | Select @{Name="Location";Expression={$location}},Name,NumberOfCores,MemoryInMB,MaxDataDiskCount,OSDiskSizeInMB,ResourceDiskSizeInMB
        $VMSizes += $sizes
    }
    
    
    $output=$VMs `
    | select *,@{N='vmSize';E={$_.properties.hardwareProfile.vmSize}} `
    | select *,@{N='CurrentSku';E={$s=$_.VMSize;$l=$_.location;$VMSizes | where {$_.Location -eq $l -and $_.Name -eq $s}}} `
    | select *,@{N='NumberOfCores';E={$_.CurrentSku.NumberOfCores}} `
    | select *,@{N='MemoryInMB';E={$_.CurrentSku.MemoryInMB}} `
    | select *,@{N='MaxDataDiskCount';E={$_.CurrentSku.MaxDataDiskCount}} `
    | select *,@{N='ResourceDiskSizeInMB';E={$_.CurrentSku.ResourceDiskSizeInMB}} `
    | select *,@{N='NICInfo';E={$NICId=$_.id;$NICs | Where {$_.properties.virtualMachine.id  -eq $NICId }}} `
    | select *,@{N='NicName';E={(($_.NICInfo).Name)}} `
    | select *,@{N='NSGID';E={(($_.NICInfo).properties).networkSecurityGroup.id}} `
    | select *,@{N='NSGInfo';E={$NSGID=$_.NSGID;($NSGs | Where {$_.Id -eq $NSGID}).Properties}} `
    | select *,@{N='securityRules';E={(($_.NSGInfo).securityRules).Name}} `
    | select *,@{N='PrivIP';E={(((($_.NICInfo).Properties).ipConfigurations[0]).properties).privateIPAddress}} `
    | select *,@{N='PubIPID';E={(((($_.NICInfo).Properties).ipConfigurations[0]).properties).publicIPAddress.id }} `
    | select *,@{N='PubIPInfo';E={$PUBIPID=$_.PubIPID;($pubIPs | Where {$_.Id -eq $PUBIPID}).Properties}} `
    | select *,@{N='publicIPAllocationMethod';E={(($_.PubIPInfo)).publicIPAllocationMethod}} `
    | select *,@{N='publicIPAddress';E={(($_.PubIPInfo).ipAddress)}}
    
    
    

    This pulls back everything, and then you can pull out whatever you want.
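
    As an example of what you can do with it, here is a sketch that flattens a few of the computed columns into a CSV report (the file name is just an example):

    $output |
        Select name,location,vmSize,NumberOfCores,MemoryInMB,NicName,PrivIP,publicIPAddress |
        Export-Csv -Path .\vm-inventory.csv -NoTypeInformation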


  • Add a Document to CosmosDB via the REST API using PowerShell

    There are a lot of examples out there on how to POST a document to Cosmos DB, but they weren’t working for me. I kept getting a 400 Bad Request. After far too long, I finally got it to work: I needed the “x-ms-documentdb-partitionkey” header.

    Code for anyone who needs it (I adapted the original code here, which wasn’t working for me):

    Function Generate-MasterKeyAuthorizationSignature{
    	[CmdletBinding()]
    	Param(
    		[Parameter(Mandatory=$true)][String]$verb,
    		[Parameter(Mandatory=$true)][String]$resourceLink,
    		[Parameter(Mandatory=$true)][String]$resourceType,
    		[Parameter(Mandatory=$true)][String]$dateTime,
    		[Parameter(Mandatory=$true)][String]$key,
    		[Parameter(Mandatory=$true)][String]$keyType,
    		[Parameter(Mandatory=$true)][String]$tokenVersion
    	)
    	$hmacSha256 = New-Object System.Security.Cryptography.HMACSHA256
    	$hmacSha256.Key = [System.Convert]::FromBase64String($key)
    
    	If ($resourceLink -eq $resourceType) {
    		$resourceLink = ""
    	}
    
    	$payLoad = "$($verb.ToLowerInvariant())`n$($resourceType.ToLowerInvariant())`n$resourceLink`n$($dateTime.ToLowerInvariant())`n`n"
    	$hashPayLoad = $hmacSha256.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($payLoad))
    	$signature = [System.Convert]::ToBase64String($hashPayLoad)
    
    	# System.Web is not loaded by default in Windows PowerShell
    	Add-Type -AssemblyName System.Web
    	[System.Web.HttpUtility]::UrlEncode("type=$keyType&ver=$tokenVersion&sig=$signature")
    }
    

    The code above just creates the auth header; it is called below:

    Function Post-CosmosDocuments{
    	[CmdletBinding()]
    	Param(
    		[Parameter(Mandatory=$true)][String]$EndPoint,
    		[Parameter(Mandatory=$true)][String]$DBName,
    		[Parameter(Mandatory=$true)][String]$CollectionName,
    		[Parameter(Mandatory=$true)][String]$MasterKey,
    		[String]$Verb="POST",
            [Parameter(Mandatory=$true)][String]$JSON
    	)
    	$ResourceType = "docs";
    	$ResourceLink = "dbs/$DBName/colls/$CollectionName"
        $partitionkey = "[""$(($JSON |ConvertFrom-Json).id)""]"
    
    	$dateTime = [DateTime]::UtcNow.ToString("r")
    	$authHeader = Generate-MasterKeyAuthorizationSignature -verb $Verb -resourceLink $ResourceLink -resourceType $ResourceType -key $MasterKey -keyType "master" -tokenVersion "1.0" -dateTime $dateTime
    	$header = @{authorization=$authHeader;"x-ms-version"="2015-12-16";"x-ms-documentdb-partitionkey"=$partitionkey;"x-ms-date"=$dateTime}
    	$contentType= "application/json"
    	$queryUri = "$EndPoint$ResourceLink/docs"
        #$header
        #$queryUri
    
        [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
    	$result = Invoke-RestMethod -Method $Verb -ContentType $contentType -Uri $queryUri -Headers $header -Body $JSON 
    }
    

    And to run the functions above:

    $CosmosDBEndPoint = "https://YourDBAccount.documents.azure.com:443/"
    $DBName = "yourDB"
    $CollectionName = "YourCollection"
    $MasterKey = "YourPrimaryKey"
    
    Post-CosmosDocuments -EndPoint $CosmosDBEndPoint -MasterKey $MasterKey -DBName $DBName -CollectionName $CollectionName -JSON ($SomeObject | ConvertTo-Json)
    

    The key was to set the correct contentType and add “x-ms-documentdb-partitionkey” to the headers. The partition key value needs to match what you set your DB up with. I am using “id”.
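
    For example, if your collection were partitioned on a path like “/company” instead of “/id” (a hypothetical setup), the header value would be a JSON array holding that property’s value, not the path:

    $doc = $JSON | ConvertFrom-Json
    # the header carries the partition key VALUE as a one-element JSON array
    $partitionkey = "[""$($doc.company)""]"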

    As a bonus, here is the code to query a DB. Leveraging the same first function to create the auth header:

    Function Query-CosmosDocuments{
    	[CmdletBinding()]
    	Param(
    		[Parameter(Mandatory=$true)][String]$EndPoint,
    		[Parameter(Mandatory=$true)][String]$DBName,
    		[Parameter(Mandatory=$true)][String]$CollectionName,
    		[Parameter(Mandatory=$true)][String]$MasterKey,
            [Parameter(Mandatory=$true)][String]$JSON,		
            [String]$Verb="POST"
    	)
    	$ResourceType = "docs";
    	$ResourceLink = "dbs/$DBName/colls/$CollectionName"
    $query=@"
    {  
      "query": "SELECT * FROM contacts c WHERE c.id = @id",  
      "parameters": [  
        {  
          "name": "@id",  
          "value": "$(($JSON |ConvertFrom-Json).id)"  
        }
      ]  
    } 
    "@
    
    	$dateTime = [DateTime]::UtcNow.ToString("r")
    	$authHeader = Generate-MasterKeyAuthorizationSignature -verb $Verb -resourceLink $ResourceLink -resourceType $ResourceType -key $MasterKey -keyType "master" -tokenVersion "1.0" -dateTime $dateTime
    	$header = @{authorization=$authHeader;"x-ms-version"="2015-12-16";"x-ms-documentdb-isquery"="True";"x-ms-date"=$dateTime}
    	$contentType= "application/query+json"
    	$queryUri = "$EndPoint$ResourceLink/docs"
        #$header
        #$queryUri
    
        [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
    	$result = Invoke-RestMethod -Method $Verb -ContentType $contentType -Uri $queryUri -Headers $header -Body $query 
        return $result
    }
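
    Calling it looks just like the POST example (same placeholder account values as above); the matching documents come back in the response’s Documents property:

    $result = Query-CosmosDocuments -EndPoint $CosmosDBEndPoint -MasterKey $MasterKey -DBName $DBName -CollectionName $CollectionName -JSON ($SomeObject | ConvertTo-Json)
    $result.Documents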
    
    

    Hope that helps someone


  • Connecting to the Azure REST API from an Azure Automation RunBook

    I was looking for data that I couldn’t find in a PowerShell command, so I needed an access token to run a query against an Azure API.

    I was stuck with the basic problem of how to query the Azure REST endpoints from a RunBook. In my last post, I learned that you can use the RunAs account for the Automation Account in a “Login-AzureRmAccount” session:

    $connection = Get-AutomationConnection -Name AzureRunAsConnection
    $loginresults=Login-AzureRmAccount -ServicePrincipal -Tenant $connection.TenantID `
    -ApplicationId $connection.ApplicationID -CertificateThumbprint $connection.CertificateThumbprint
    

    Taking that a step further, I can then get an access token from the logged in context, and use that with an REST API call:

    $connection = Get-AutomationConnection -Name AzureRunAsConnection
    $loginresults=Login-AzureRmAccount -ServicePrincipal -Tenant $connection.TenantID `
    -ApplicationId $connection.ApplicationID -CertificateThumbprint $connection.CertificateThumbprint
    
    $context = Get-AzureRmContext
    $SubscriptionId = $context.Subscription
    $cache = $context.TokenCache
    $cacheItem = $cache.ReadItems()
    $AccessToken=$cacheItem[$cacheItem.Count -1].AccessToken
    $resourceGroup="MyResourceGroup"
    
    $headerParams = @{'Authorization'="Bearer $AccessToken"}
    $url="https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Compute/virtualMachines?api-version=2018-06-01"
    $results=Invoke-RestMethod -Uri $url -Headers $headerParams -Method Get
    Write-Output $results.value
    

    Hope that helps someone !


  • Azure Runbook to enable Auto-Shutdown for New VMs

    I have an Azure lab subscription (as do you, I am sure). In this lab, I am always provisioning, deleting, and scaling VMs. In order to keep my costs down, I want to enable the Auto-Shutdown feature on all new VMs. I can do that easily with an ARM template. But for machines that aren’t provisioned by one of my templates, I always forget to enable Auto-Shutdown.

    Here is code to loop through all of your machines and enable the Auto-Shutdown setting. I know there are other ways to do this, but I wanted it to be visible when looking at the VM (in the Auto-Shutdown section).

    Take this code and put it in an Azure Automation RunBook, and it will run every night (I run it an hour before the Auto-Shutdown time!)

    Also, I have never used runbooks before, so I learned that you need the first two lines below to connect to Azure as the Azure Automation RunAs account (no passwords in code!)

    Here is the code:

    $connection = Get-AutomationConnection -Name AzureRunAsConnection
    $loginresults=Login-AzureRmAccount -ServicePrincipal -Tenant $connection.TenantID `
    -ApplicationId $connection.ApplicationID -CertificateThumbprint $connection.CertificateThumbprint
    
    foreach ($rg in (Get-AzureRmResourceGroup).ResourceGroupName) {
        foreach ($vm in (Get-AzureRmVM -ResourceGroupName $rg)) {
            $shutdown_time = "22:00"
            $shutdown_timezone = "Eastern Standard Time"
            $properties = @{
                "status" = "Enabled";
                "taskType" = "ComputeVmShutdownTask";
                "dailyRecurrence" = @{"time" = $shutdown_time };
                "timeZoneId" = $shutdown_timezone;
                "notificationSettings" = @{
                    "status" = "Disabled";
                    "timeInMinutes" = 30
                }
                "targetResourceId" = $vm.Id
            }
            $scheduleId = "/subscriptions/{0}/resourceGroups/{1}/providers/microsoft.devtestlab/schedules/shutdown-computevm-{2}" -f (Get-AzureRmContext).Subscription.Id, $rg, $vm.Name
            # reset so the previous VM's status doesn't leak into this iteration
            $Status = $null
            try {
                $Status = (Get-AzureRmResource -ResourceId $scheduleId -ErrorAction Stop).Properties.Status
            }
            catch {
                write-output "Setting $($vm.Name) to auto shutdown @ $shutdown_time (was never enabled)"
                New-AzureRmResource -ResourceId $scheduleId -Location $vm.Location -Properties $properties -Force
            }
            if ($Status -eq "Disabled") {
                write-output "Setting $($vm.Name) to auto shutdown @ $shutdown_time (was disabled)"
                New-AzureRmResource -ResourceId $scheduleId -Location $vm.Location -Properties $properties -Force
            }
            elseif ($Status) {
                write-output "$($vm.Name) is already set to auto shutdown"
            }
        }
    }
    

    As you can see, I am using some “write-output”s in the code. How can I see them without having to navigate to the history of the job? Log Analytics! To enable Azure Automation to write to Log Analytics:

    https://docs.microsoft.com/en-us/azure/automation/automation-manage-send-joblogs-log-analytics

    And here is a Kusto query to see the output of the Automation Job:

    AzureDiagnostics
    | where ResourceProvider == "MICROSOFT.AUTOMATION"
    | where RunbookName_s == "MyJobName"
    | where Category == "JobLogs" or Category == "JobStreams"
    | where TimeGenerated > now() - 1d
    | order by TimeGenerated
    | project TimeGenerated, CorrelationId, RunbookName_s, ResultDescription, ResultType
    

    Now you can set up a Logic App to run the Kusto query and email you the results!

    Note: This will set all your VMs to Auto-Shutdown. Make sure you don’t run this against your production environment!

    Hope that helps someone.


  • vim: How to stop the tab annoyingness when you paste

    This is a NoteToSelf. Often I paste into vi/vim, and the tabs go crazy and fly across the screen, leaving the text starting in the middle of the line. The next line is even further! I can never remember how to fix the issue.

    To fix:

    :set paste
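
    Remember to turn it back off after pasting, or auto-indent (and your mappings) stay disabled. If you do this a lot, you can put a toggle in your .vimrc; the <F2> key here is an arbitrary choice:

    :set nopaste
    " or, in your .vimrc:
    set pastetoggle=<F2>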


  • Moving to the new (and future) Azure PowerShell Module : Az

    It looks like we need to move to the new Az module. It is not required, but future functionality will not be added to AzureRM, so I decided to make the switch. Here is how I went about it.

    First, to enable backwards compatibility, you need to add the command “Enable-AzureRmAlias” to your profile. You can edit your profile and append the line by:

    notepad $profile
    

    or you can just append it by:

    Add-Content $profile "`nEnable-AzureRmAlias"
    

    (remember the ISE has its own $profile, so you may need to modify it too)

    Once you have “Enable-AzureRmAlias” in your profile, all your old scripts should still work.
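
    With the alias in place, old AzureRM cmdlet names are transparently mapped to their Az equivalents, so an existing script like this keeps working unchanged (the resource group name is made up):

    Enable-AzureRmAlias
    # Get-AzureRmVM is now an alias that calls Get-AzVM under the covers
    Get-AzureRmVM -ResourceGroupName "MyResourceGroup"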

    Next, I wanted to remove all the old AzureRM modules. I had several versions installed, so it took a long time! Note: this code removes any module that starts with Azure*

    foreach ($module in (Get-Module -ListAvailable Azure*).Name | Select -Unique) {
        write-host "Removing Module $module"
        # -AllVersions removes every installed version of the module, not just the newest
        Uninstall-Module $module -AllVersions -Force
    }
    

    Now that we are feeling clean, add the new module:

    Install-Module Az
    

    For some reason it didn’t install the Az.ResourceGraph module, so I added it:

    Install-Module Az.ResourceGraph
    

    I am ready for the future. Hope that helps someone.


  • Code to query Azure Load Balancer Metrics to verify Availability (VipAvailability )

    This one was fun to put together.

    I wanted to write code to query the status of an Azure Load Balancer, but I couldn’t find much out there. This code queries the Azure Load Balancer’s metrics for VipAvailability through the REST API. If it returns 100, you are good to go. Anything else, and there may be an issue. You can query any metric, and you can set a time range; I am just looking at the last minute.

    Note: This is for a Standard Load Balancer, not Basic.

    Some of the Metrics Available:

    VipAvailability : Average count of availability of VIP endpoints, based on probe results.
    DipAvailability : Average count of availability of DIP endpoints, based on probe results.
    ByteCount : Total number of bytes processed per front-end.
    PacketCount : Total number of packets processed per front-end.
    SynCount : Total number of SYN packets received.
    SnatConnectionCount : Total number of new SNAT connections, that is, outbound connections that are masqueraded to the Public IP address front-end.
    

    And the same metrics are often referred to by different names (this was confusing to me):

    value               localizedValue                
    -----               --------------                
    VipAvailability     Data Path Availability        
    DipAvailability     Health Probe Status           
    ByteCount           Byte Count                    
    PacketCount         Packet Count                  
    SYNCount            SYN Count                     
    SnatConnectionCount SNAT Connection Count         
    AllocatedSnatPorts  Allocated SNAT Ports (Preview)
    UsedSnatPorts       Used SNAT Ports (Preview) 
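
    Those value/localizedValue pairs come from the metric definitions endpoint, and you can list them yourself with the same bearer token the code below builds ($lbName is a placeholder for your load balancer, and the api-version here is my best guess for this endpoint):

    $defUrl = "https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Network/loadBalancers/$lbName/providers/microsoft.insights/metricDefinitions?api-version=2016-03-01"
    (Invoke-RestMethod -Uri $defUrl -Headers $headerParams -Method Get).value |
        select -ExpandProperty name | select value,localizedValue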
    

    Here is the code (bonus: BASH/cURL too) to find the VipAvailability of Azure Load Balancers:

    $SubscriptionId = "$($env:SubscriptionId)"
    $TenantId       = "$($env:TenantId)" 
    $ClientID       = "$($env:ClientID)"      
    $ClientSecret   = "$($env:ClientSecret)"  
    $TenantDomain   = "$($env:TenantDomain)" 
    $loginURL       = "https://login.microsoftonline.com/$TenantId/oauth2/token"
    $resource      = "https://management.core.windows.net/" 
    $resourceGroupName = "eastUS-01"
    $body           = @{grant_type="client_credentials";resource=$resource;client_id=$ClientID;client_secret=$ClientSecret}
    $oauth          = Invoke-RestMethod -Method Post -Uri $loginURL -Body $body
    $headerParams = @{'Authorization'="$($oauth.token_type) $($oauth.access_token)"}
    
    $start=((Get-Date).AddMinutes(-1)).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:00Z")
    $end=(Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:00Z")
    $filter = "(name.value eq 'VipAvailability') and aggregationType eq 'Average' and startTime eq $start and endTime eq $end and timeGrain eq duration'PT1M'"
    $url = "https://management.azure.com/subscriptions/$SubscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Network/loadBalancers/jemurphyLB01/providers/microsoft.insights/metrics?`$filter=${filter}&api-version=2016-09-01"
    $results=Invoke-RestMethod -Uri $url -Headers $headerParams -Method Get
    $results.value | select -ExpandProperty data | select timestamp,average
    
    SUBSCRIPTIONID=""
    RESOURCEGROUPNAME=""
    CLIENTID=""
    CLIENTSECRET=""
    TENANTID=""
    LBNAME=""
    
    LOGINURL="https://login.microsoftonline.com/$TENANTID/oauth2/token"
    RESOURCE="https://management.core.windows.net/" 
    
    TOKEN=$(curl --silent --request POST $LOGINURL --data-urlencode "resource=$RESOURCE" --data-urlencode "client_id=$CLIENTID" --data-urlencode "grant_type=client_credentials" --data-urlencode "client_secret=$CLIENTSECRET" | jq -r '.access_token')
    
    STARTTIME=$(date -u +'%Y-%m-%dT%H:%M:00' --date='-1 min')
    ENDTIME=$(date -u +'%Y-%m-%dT%H:%M:00')
    
    FILTER="(name.value eq 'VipAvailability') and aggregationType eq 'Average' and startTime eq $STARTTIME and endTime eq $ENDTIME and timeGrain eq duration'PT1M'"
    URL="https://management.azure.com/subscriptions/$SUBSCRIPTIONID/resourceGroups/$RESOURCEGROUPNAME/providers/Microsoft.Network/loadBalancers/$LBNAME/providers/microsoft.insights/metrics"
    
    RESULTS=$(curl -s -G --header "authorization: Bearer $TOKEN" --data-urlencode "\$filter=$FILTER" --data-urlencode "api-version=2016-09-01" $URL | jq .value[].data[].average)
    
    echo "$RESULTS"
    

    I think the hardest part was trying to get the date and time in the right format. Why is that so hard?

    This HAS to be helpful to some one!


  • Note to self: cURL with data-urlencode for GET/QueryString values

    I know I would lose this if I didn’t blog it.
    With cURL, you can use “--data-urlencode” with query string params and a GET if you include the “-G” parameter. Of course you still have to escape things out; I just found it easier to add all the QueryString params separately. All the examples I could find were for POSTs.

    FILTER="ReallyLongStringWith${VARS} SPACES and ' SINGLE quotes and a &"
    curl -s -G --header "authorization: Bearer $TOKEN" --data-urlencode "\$filter=$FILTER" --data-urlencode "api-version=2016-09-01" "$URL"