Archive | PowerShell

Useful PowerShell Get-HotFix commands

I can never remember these, and now that I have posted them, I don’t have to! I will add more as I need them:
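A few of the Get-HotFix variations I reach for most often (the KB number and computer name below are placeholders, not from any real environment):

```powershell
# List all installed hotfixes, newest first
Get-HotFix | Sort-Object InstalledOn -Descending

# Check whether a specific update is installed (placeholder KB number)
Get-HotFix -Id KB2852386

# Query a remote machine for updates installed in the last 30 days (placeholder name)
Get-HotFix -ComputerName SERVER01 | Where-Object { $_.InstalledOn -gt (Get-Date).AddDays(-30) }
```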



PowerShell 3: Using Invoke-RestMethod to refresh a new oAuth 2 token

I wanted to translate this code into PowerShell. Below is the PowerShell code to request a refresh token from Google using OAuth 2.

$CLIENTID = "1234567890.apps.googleusercontent.com"
$CLIENTSECRET = "aBcDeFgHiJkLmNoPqRsTuVwXyZ"
$REFRESHTOKEN = "1/551G1yXUqgkDGnkfFk6ZbjMLMDIMxo3JFc8lY8CAR-Q"
$URL = "https://accounts.google.com/o/oauth2/token"
$Body = 'client_secret={0}&grant_type=refresh_token&refresh_token={1}&client_id={2}' -f $CLIENTSECRET, $REFRESHTOKEN, $CLIENTID
Invoke-RestMethod -Uri $URL -Method Post -Body $Body
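Invoke-RestMethod parses the JSON response into an object, so the refreshed token can be pulled straight off the result. A sketch of using it, where the userinfo URL is just an example endpoint and not something from the original post:

```powershell
$Response = Invoke-RestMethod -Uri $URL -Method Post -Body $Body
# The parsed response exposes access_token and expires_in as properties
$Headers = @{ Authorization = "Bearer $($Response.access_token)" }
Invoke-RestMethod -Uri "https://www.googleapis.com/oauth2/v1/userinfo" -Headers $Headers
```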


Hope that helps someone.

My upgrade of SharePoint 2007 to 2010 “script”

One of my most recent projects was the migration of our intranet from SharePoint 2007 to 2010. Since we were going to change the name of the site, I was able to run through this “script” several times as practice to make sure I had everything correct.

I decided to do a detach and attach method. Here are some of the things we did.

1. We performed several test detach-and-attach upgrades with the new URL. This allowed us to test everything using the new URL and make changes back in the original 2007 site so that it would work once we performed the final live cutover.
2. All new code/pages/hacks were added to the 2010 site in new document libraries. These were backed up using this script and restored after every detach-and-attach test. This way all the new code would be in place for the final live cutover.
3. Since we were building new navigation, we created it in the old 2007 site and hid it by audience. One of the steps below changes the audience, which un-hides the new navigation during the final cutover.

Step 1. Back up all the new code/pages/hacks that have been added to the new site and need to be restored.

$folderDate = $(Get-Date -UFormat "%Y-%m-%d")
$folderHour = $(Get-Date -UFormat "%H")
$backupDir = "\\Path\To\Backup\$folderDate\$folderHour"
foreach ($web in $(Get-SPSite | Get-SPWeb)) {
    foreach ($list in $web.Lists) {
        mkdir -Force "$backupDir\$($web.Title.replace(' ','').replace('/','-'))\"
        Export-SPWeb $($web.Url) -ItemUrl "$($list.RootFolder.ServerRelativeUrl)" -Path "$backupDir\$($web.Title.replace(' ','').replace('/','-'))\$($list.Title.replace(' ','').replace('/','-')).cmp"
    }
}

Now we have captured all the changes that were made to the new site (which we will restore after the next cutover test).

Step 2. Remove the previous test cutover site.

Remove-SPWebApplication "SharePoint - intranet.company.com80" -RemoveContentDatabases -DeleteIISSite

Step 3. Re-create the web application and application pool.

New-SPWebApplication -Name "SharePoint - intranet.company.com80" -Port 80 -HostHeader intranet.company.com -URL "http://intranet.company.com" -ApplicationPool "SharePoint - intranet.company.com80"

Step 4. Remove the content database that is created by default.

Get-SPContentDatabase -WebApplication "SharePoint - intranet.company.com80" | Remove-SPContentDatabase

Step 5. Back up the 2007 content database to a share and restore it to the new SQL server (or to a new database name on the existing SQL server).

Backup:

$SRCSQLSERVER = 'OldSQLServer'
$DESTSQLSERVER = 'NewSQLServer'
$sqlcmdBackup = "BACKUP DATABASE [Content_Intranet] TO DISK = N'\\path\to\network\share\Content_Intranet.bak' WITH NOFORMAT, NOINIT, NAME = N'Content_Intranet FullBackup', SKIP, NOREWIND, NOUNLOAD, STATS = 10"
Invoke-Sqlcmd -Query "$sqlcmdBackup" -ServerInstance $SRCSQLSERVER -QueryTimeout 1200

Restore:

$sqlcmdRestore = "RESTORE DATABASE [Content_Intranet] FROM DISK = N'\\path\to\network\share\Content_Intranet.bak' WITH FILE = 1, MOVE N'Content_Intranet' TO N'K:\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\Content_Intranet.mdf', MOVE N'Content_Intranet_log' TO N'K:\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\Content_Intranet_log.LDF', NOUNLOAD, REPLACE, STATS = 10"
Invoke-Sqlcmd -Query "$sqlcmdRestore" -ServerInstance $DESTSQLSERVER -QueryTimeout 1200

Step 6. Mount the restored database and upgrade the user experience.

Mount-SPContentDatabase -Name Content_Intranet -DatabaseServer NewSQLServer -WebApplication http://intranet.company.com -UpdateUserExperience

Step 7. Re-import the exported document libraries that contain the new code/pages/apps.

Import-SPWeb http://intranet.company.com -Path \\path\to\network\share\date\hour\LibraryName.cmp

Step 8. Clean up the navigation by changing the audience, and change the homepage to the new page in the restored document libraries.

Step 9. Alter web.config for the Cisco WebVPN.

Step 10. Allow inline PDFs.

That was it. I ran through it several times, and it ended up being a smooth cutover.

PowerShell command to allow inline PDF viewing in SharePoint 2010

My users like to view PDFs in their browser on our SharePoint site, and I needed to allow this in 2010. Here is the PowerShell to allow inline PDF viewing in SharePoint 2010:

$webapps = Get-SPWebApplication "SharePoint - intranet.company.com80"
foreach ($webapp in $webapps)
{
    $webapp.AllowedInlineDownloadedMimeTypes.Add("application/pdf")
    $webapp.Update()
}
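To confirm the change (or back it out later), the same collection can be inspected and edited. A sketch against the web application above:

```powershell
$webapp = Get-SPWebApplication "SharePoint - intranet.company.com80"
# Verify the MIME type was added
$webapp.AllowedInlineDownloadedMimeTypes -contains "application/pdf"
# To revert:
# $webapp.AllowedInlineDownloadedMimeTypes.Remove("application/pdf")
# $webapp.Update()
```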


PowerShell 3 – Invoke-WebRequest and Invoke-RestMethod, unable to set the Accept header?

With both of these commands, when I try to add an Accept header (I want to receive my CRM 2011 data in JSON format, so I need Accept=application/json), I receive the error:

“This header must be modified using the appropriate property or method.”

I think this is a bug. This link shows the bug, and I agree that the workaround does not apply.
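One workaround that should sidestep the cmdlets entirely is dropping down to the .NET HttpWebRequest class, where Accept is an ordinary settable property. A sketch, with $Url standing in for the CRM OData endpoint:

```powershell
$request = [System.Net.WebRequest]::Create($Url)
$request.Accept = "application/json"          # no restriction at this level
$request.UseDefaultCredentials = $true
$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())
$json = $reader.ReadToEnd()                   # raw JSON body as a string
$reader.Close()
$response.Close()
```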

PowerShell 3: Invoke-WebRequest vs Invoke-RestMethod and a SharePoint 2010 list with more than 1000 entries

When using Invoke-RestMethod against a SharePoint 2010 list via ListData.svc, it returns a "System.Xml.XmlElement#http://www.w3.org/2005/Atom#entry" object. I am not sure why, but the end result is that you can't get at the "rel=next" URL. Or am I doing something wrong?

$Results = Invoke-RestMethod -Uri $ListUrl -UseDefaultCredentials
$Results | gm

TypeName: System.Xml.XmlElement#http://www.w3.org/2005/Atom#entry

I had to use Invoke-WebRequest and then put the Content into an XML variable; only then could I get to the next page of items.

$MailingLabels = Invoke-WebRequest -Uri $ListUrl -UseDefaultCredentials
[xml]$MailingLabelsXML = $MailingLabels.Content
$Next = ($MailingLabelsXML.feed.link | ?{ $_.rel -eq "next" }).href


Thoughts?
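Building on that workaround, the "rel=next" links can be followed in a loop until the feed runs out, which pages through the whole 1000+ item list. A sketch using the same Invoke-WebRequest/[xml] approach:

```powershell
$AllEntries = @()
$NextUrl = $ListUrl
while ($NextUrl) {
    $Page = Invoke-WebRequest -Uri $NextUrl -UseDefaultCredentials
    [xml]$Feed = $Page.Content
    $AllEntries += $Feed.feed.entry
    # href is $null when there is no rel="next" link, which ends the loop
    $NextUrl = ($Feed.feed.link | Where-Object { $_.rel -eq "next" }).href
}
```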

PowerShell: foreach with a first and last item number

I wanted to work through 1000+ items in a PowerShell foreach loop, but I wanted to do it X at a time. I figured out the following foreach syntax to loop through all items after first and before last:

$First = 15
$Last = 45
foreach ($Row in $Rows | select -First $Last | select -Last (($Last - $First) + 1)) {
    . . . .
}

Using PowerShell to add a Contact to a CRM 2011 MarketingList (SOAP)

We had a user delete a marketing list, and I needed to recreate it. I went to a database backup and found the GUID of the deleted list. Then I used the following SQL query to find the GUIDs of all the members of that list:

SELECT FullName
      ,ParentCustomerIdName
      ,[EntityId]
      ,[ListId]
      ,[ListMemberId]
FROM [CRMDataBaseName].[dbo].[ListMember], [CRMDataBaseName].[dbo].Contact
WHERE ListId = '787b77ca-c47d-431b-863e-12a98969b097'
  AND [EntityId] = ContactId
ORDER BY LastName, FirstName

I saved the EntityId column to a text file, and then used the following PowerShell code to loop through the GUIDs and add them to a new marketing list:

$ListMembers = Get-Content C:\IT\Temp\ListMemberGUIDs.txt
foreach ($EntityId in $ListMembers) {
    $xml =  ""
    $xml += "<s:Envelope xmlns:s='http://schemas.xmlsoap.org/soap/envelope/'>";
    $xml += "  <s:Body>";
    $xml += "    <Execute xmlns='http://schemas.microsoft.com/xrm/2011/Contracts/Services' xmlns:i='http://www.w3.org/2001/XMLSchema-instance'>";
    $xml += "      <request i:type='b:AddMemberListRequest' xmlns:a='http://schemas.microsoft.com/xrm/2011/Contracts' xmlns:b='http://schemas.microsoft.com/crm/2011/Contracts'>";
    $xml += "        <a:Parameters xmlns:c='http://schemas.datacontract.org/2004/07/System.Collections.Generic'>";
    $xml += "          <a:KeyValuePairOfstringanyType>";
    $xml += "            <c:key>ListId</c:key>";
    $xml += "            <c:value i:type='d:guid' xmlns:d='http://schemas.microsoft.com/2003/10/Serialization/'>5deb4efb-4ed7-47f3-8e8e-bb487e0db423</c:value>";
    $xml += "          </a:KeyValuePairOfstringanyType>";
    $xml += "          <a:KeyValuePairOfstringanyType>";
    $xml += "            <c:key>EntityId</c:key>";
    $xml += "            <c:value i:type='d:guid' xmlns:d='http://schemas.microsoft.com/2003/10/Serialization/'>$($EntityId)</c:value>";
    $xml += "          </a:KeyValuePairOfstringanyType>";
    $xml += "        </a:Parameters>";
    $xml += "        <a:RequestId i:nil='true' />";
    $xml += "        <a:RequestName>AddMemberList</a:RequestName>";
    $xml += "      </request>";
    $xml += "    </Execute>";
    $xml += "  </s:Body>";
    $xml += "</s:Envelope>";

    $url = "http://crm.sardverb.com/SardVerbinnen/XRMServices/2011/Organization.svc/web"

    $http_request = New-Object -ComObject Msxml2.XMLHTTP
    $http_request.Open('POST', $url, $false)
    $http_request.setRequestHeader("SOAPAction", "http://schemas.microsoft.com/xrm/2011/Contracts/Services/IOrganizationService/Execute")
    $http_request.setRequestHeader("Content-Type", "text/xml; charset=utf-8")
    $http_request.setRequestHeader("Content-Length", $xml.length)
    $http_request.send($xml)
}
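The same request could probably be sent with PowerShell 3's Invoke-RestMethod instead of the Msxml2.XMLHTTP COM object. An untested sketch, assuming the same $url and $xml variables built in the loop above:

```powershell
$Headers = @{ SOAPAction = "http://schemas.microsoft.com/xrm/2011/Contracts/Services/IOrganizationService/Execute" }
Invoke-RestMethod -Uri $url -Method Post -Body $xml `
    -ContentType "text/xml; charset=utf-8" -Headers $Headers -UseDefaultCredentials
```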



PowerShell script to backup all SharePoint 2010 lists in all webs in all sites

The more backups, the better. I wanted a file-level backup of every list, so I used PowerShell to iterate through all the lists on the server and dump them into a folder:

$backupDir = "C:\Temp"
foreach ($web in $(Get-SPSite | Get-SPWeb)) {
    foreach ($list in $web.Lists) {
        mkdir -Force "$backupDir\$($web.Title.replace(' ',''))\"
        Export-SPWeb $($web.Url) -ItemUrl "$($list.RootFolder.ServerRelativeUrl)" -Path "$backupDir\$($web.Title.replace(' ',''))\$($list.Title.replace(' ','')).cmp"
    }
}