I'm writing a script that I plan to turn into a cmdlet we can reuse as part of our custom PowerShell module. It works, but it is very slow, and I don't think the slowness comes from my own code: I have to load each site individually to retrieve its title, because even though the Title property is visible on the $site object returned by Get-SPOSite, it is always empty/null.
Is there anything else I can do to improve the efficiency?
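For context, $credential and the SharePoint Online connection are set up earlier in the script along these lines (the admin URL and the CSOM assembly path are placeholders for our actual values):

# Assumed setup earlier in the script (URL and DLL path are placeholders)
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
$credential = Get-Credential
Connect-SPOService -Url "https://contoso-admin.sharepoint.com" -Credential $credential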
$path = "C:\Users\$env:USERNAME\Desktop\SPOExport.csv"
$csv = "Title,URL,Sharing Capability,Storage Quota (MB),Template`r`n"
$sites = Get-SPOSite -Limit All | Select-Object Url, SharingCapability, StorageQuota, Template
foreach ($site in $sites) {
    try {
        # Load the web for each site individually, since Title is not populated on the Get-SPOSite result
        $context = New-Object Microsoft.SharePoint.Client.ClientContext($site.Url)
        $context.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($credential.UserName, $credential.Password)
        $web = $context.Web
        $context.Load($web)
        $context.ExecuteQuery()
        $title = $web.Title
    }
    catch {
        $title = "Error fetching title"
    }
    finally {
        if ($context) { $context.Dispose() }
        # Append this site's row (with CRLF so each site gets its own line)
        $csv += $title + "," + $site.Url + "," + $site.SharingCapability + "," + $site.StorageQuota + "," + $site.Template + "`r`n"
    }
}
$fso = New-Object -ComObject Scripting.FileSystemObject
$file = $fso.CreateTextFile($path, $true)
$file.Write($csv)
$file.Close()
Note that I have not fully adapted it into a cmdlet yet; that version will have all the commenting and help text. For now I'm more concerned with the code itself and how long this operation takes.
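For reference, this is roughly the shape I have in mind for the finished cmdlet (the function and parameter names are placeholders, and the body would be the loop above):

function Export-SPOSiteInventory {
    <#
    .SYNOPSIS
        Exports title, URL, sharing capability, storage quota and template for every site collection to a CSV file.
    #>
    [CmdletBinding()]
    param(
        [Parameter(Mandatory = $true)]
        [string]$Path,

        [Parameter(Mandatory = $true)]
        [System.Management.Automation.PSCredential]$Credential
    )

    # ... the Get-SPOSite loop and CSV output from above would go here ...
}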