As part of a project to consolidate three organizations into one new organization, we decided that, as part of the IT consolidation, end users would go through a reset and re-register process to migrate their devices from either BYOD or the old corporate (Intune-managed) environment to the new corporate (Intune-managed) environment. This process would occur over a short window of time (such as a weekend) for all users.
While sounding simple enough on paper, there are some snags along the way that need to be addressed. A big one: how do we migrate the Autopilot v1 device hashes from each device's old Intune environment to the new one? Microsoft doesn't currently allow you to extract these en masse from Intune, which leaves us with only one option: touch each device individually.
Sounds like a job requiring a bit of automation, no?
So we know how to output what we need and what the import requirements are; now we need to build some scale into the process so we can collect the output and import it in minimal steps (given the short window of time).
Okay, we have declared that Azure Tables shall provide the central storage of our device hashes for import into the new organisation's Intune. Let's quickly build a Storage Account, the associated Table Service and Table, and output a service-level SAS key to be used by our device-level PowerShell to send up the device hash. We can do this by flexing some Bicep skills.
Save the below as a .bicep file.
@maxLength(24)
@minLength(3)
@description('Specifies the name of the Azure Storage account.')
param storageAccountName string = 'autopilot${uniqueString(resourceGroup().id, deployment().name)}'

@description('Specifies the name of the Azure Table.')
param tableName string = 'autopilotinfo'

@description('Specifies the expiry of the service level SAS key.')
param tableSasExpiry string = '2025-01-01T00:00:00Z'

@description('Specifies the location in which the Azure Storage resources should be deployed.')
param location string = resourceGroup().location

resource sa 'Microsoft.Storage/storageAccounts@2023-01-01' = {
  name: storageAccountName
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
  properties: {
    accessTier: 'Hot'
  }
}

resource tableServices 'Microsoft.Storage/storageAccounts/tableServices@2023-05-01' = {
  parent: sa
  name: 'default'
}

resource table 'Microsoft.Storage/storageAccounts/tableServices/tables@2023-05-01' = {
  parent: tableServices
  name: tableName
}

var sasConfig = {
  canonicalizedResource: '/table/${sa.name}/${table.name}'
  signedPermission: 'rau'
  signedExpiry: tableSasExpiry
  signedProtocol: 'https'
  keyToSign: 'key2'
}

output sasToken string = sa.listServiceSas(sa.apiVersion, sasConfig).serviceSasToken
To kick off the deployment, create the necessary resource group; for example:
az group create --name Autopilot-RG --location "Australia East"
Now let’s deploy our bicep file (in this example, the above bicep was saved as deployaztable.bicep):
az deployment group create \
--name AutopilotTableDeployment \
--resource-group Autopilot-RG \
--template-file deployaztable.bicep \
--output none
When the deployment succeeds, grab the service-level SAS key from the deployment's outputs; we'll use it to access the table later.
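Since we passed --output none above, the outputs aren't printed to the console; one simple way to pull the SAS token back out is to query the deployment afterwards (deployment and resource group names as used above):

az deployment group show \
  --name AutopilotTableDeployment \
  --resource-group Autopilot-RG \
  --query properties.outputs.sasToken.value \
  --output tsv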
With our central storage for the device hashes sorted, we now need to piece together something that will extract said device hashes and upload them to the Azure Table, and do so in a scalable way.
We already have the extraction method sorted with the provided Get-WindowsAutopilotInfo.ps1 PowerShell script, so let’s dig into the script and extend upon it.
Fortunately, the script has a great level of detail, including built-in examples. A quick glance shows that, without any parameters defined, the script will extract the necessary information via WMI from the local host and output it back to the shell as an object:
# Create a pipeline object
$c = New-Object psobject -Property @{
    "Device Serial Number" = $serial
    "Windows Product ID"   = $product
    "Hardware Hash"        = $hash
}
Perfect, this means we can easily manipulate it with PowerShell to do what we need.
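For example, we can capture that object into a variable and pick out the fields we care about; a quick sketch, assuming the script sits in the current directory:

# Run the script with no parameters and capture the object it emits
$Autopilot = & .\Get-WindowsAutopilotInfo.ps1

# Each piece of Autopilot data is then just a property on the object
$Autopilot.'Device Serial Number'
$Autopilot.'Hardware Hash'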
Now we just need to take this outputted object and send it to the Azure Table. Enter the AzTable module.
Now, this module is a bit of a weird one. It's somewhat official (the copyright is Microsoft Corp) but the PowerShell Gallery listing specifically references a personal website that no longer works…
Never fear, as Microsoft Learn has some examples of how to use the module. But if your experience is anything like mine, you'll quickly find out the method described to authenticate and retrieve a table doesn't work.
Most likely, this is because tables use an authentication abstraction referred to as a Context. The documentation retrieves this context during the creation of an example storage account that the table lives in.
The documentation does not provide any steps for getting this context for an existing table.
Yikes
Fortunately, the internet taketh away but also giveth. Microsoft MVP Travis Roberts provides a great example of how to connect to an existing Storage Account, and the Tables within, via the creation of the context object:
# Step 2, Connect to Azure Table Storage
$storageCtx = New-AzureStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken
$table = Get-AzureStorageTable -Name $tableName -Context $storageCtx
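A small note: those cmdlets come from the older Azure/AzureRM storage module. If you're on the current Az modules, the equivalent calls look something like this (and the AzTable cmdlets we use next expect the table's CloudTable object rather than the wrapper Get-AzStorageTable returns):

# Az.Storage equivalents, assuming the same variable names as above
$storageCtx = New-AzStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken
$table = (Get-AzStorageTable -Name $tableName -Context $storageCtx).CloudTable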
With authentication sorted, now we just need to write our table data. For that, we need four key elements:
table
This is our Azure Table, already retrieved as part of our context building earlier
partitionKey
Partition Keys enable us to separate data into several, you guessed it, partitions. This isn't needed in our case, so our key will be the same for every row
rowKey
Row Keys are a unique identifier for each table entry. New-Guid will be handy here
property
This is our meat and potatoes: our Autopilot data, including the hash, goes here in the form of a hashtable.
With this knowledge, we can place these puzzle pieces together into a single PowerShell script that installs the dependencies needed for each script/module to work, connects to our table, retrieves the Autopilot hash, and enters that output as a new row in our table. Without further ado, LFG:
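(What follows is a minimal sketch of such a script rather than a definitive implementation: it assumes the Az.Storage and AzTable modules plus the Get-WindowsAutoPilotInfo script from the PowerShell Gallery, the storage account name, table name and SAS token are placeholders to fill in, and the file is saved as Get-AutoPilotInfoAndSend.ps1 for the packaging step below.)

# --- Install dependencies ---
Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force
Install-Script -Name Get-WindowsAutoPilotInfo -Force
Install-Module -Name Az.Storage, AzTable -Force

# --- Connection details for the Azure Table built earlier (placeholders) ---
$storageAccountName = "<storage account name>"
$tableName          = "autopilotinfo"
$sasToken           = "<service-level SAS token from the Bicep output>"

# --- Connect to the existing table via a storage context ---
$storageCtx = New-AzStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken
$table = (Get-AzStorageTable -Name $tableName -Context $storageCtx).CloudTable

# --- Collect the Autopilot hash from the local device ---
# Install-Script drops the script into the AllUsers scripts folder by default
$Autopilot = & "$env:ProgramFiles\WindowsPowerShell\Scripts\Get-WindowsAutoPilotInfo.ps1"

# --- Write the hash as a new row (column names without spaces, see below) ---
Add-AzTableRow -Table $table -PartitionKey "Autopilot" -RowKey (New-Guid).ToString() `
    -property @{
        "DeviceName"   = "$($env:COMPUTERNAME)"
        "SerialNumber" = "$(($Autopilot).'Device Serial Number')"
        "Hash"         = "$(($Autopilot).'Hardware Hash')"
    }

# --- Set detection key to indicate the script has run ---
New-Item -Path 'HKLM:\SOFTWARE\AutoPilotCollection' -Force
New-ItemProperty -Path "HKLM:\SOFTWARE\AutoPilotCollection" -Name "AutoPilotInfoCaptured" -Value 1 -PropertyType DWORD -Force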
Two things to call out in the script.
Azure Tables don’t enjoy spaces in their column headers. This is why, when writing to the table row, we are not using the names defined in the object as output by Get-WindowsAutopilotInfo.ps1:
-property @{
    "DeviceName"   = "$($env:COMPUTERNAME)"
    "SerialNumber" = "$(($Autopilot).'Device Serial Number')"
    "Hash"         = "$(($Autopilot).'Hardware Hash')"
}
Jumping ahead a bit, but we will deploy this script at scale with Intune as a Win32 package, and so, to help detect its successful execution, we are writing a value to the registry to check against:
# Set detection key to indicate script has run
New-Item -Path 'HKLM:\SOFTWARE\AutoPilotCollection' -Force
New-ItemProperty -Path "HKLM:\SOFTWARE\AutoPilotCollection" -Name "AutoPilotInfoCaptured" -Value 1 -PropertyType DWORD -Force
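To package the script for Intune, wrap it into a .intunewin file; for example, with the Win32 Content Prep Tool (the folder paths here are illustrative):

IntuneWinAppUtil.exe -c C:\AutopilotCollection -s Get-AutoPilotInfoAndSend.ps1 -o C:\AutopilotCollection\Output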
Set your Install command to powershell.exe -ExecutionPolicy Bypass -File .\Get-AutoPilotInfoAndSend.ps1
Remember that the Intune agent runs as an x86 application, so the detection rule should be a registry rule pointing at the value we wrote above (HKEY_LOCAL_MACHINE\SOFTWARE\AutoPilotCollection, value AutoPilotInfoCaptured equal to 1), with the option to associate the key with a 32-bit app on 64-bit clients set to Yes. If we don’t set that last option to Yes, Intune will look for the registry key in the wrong place (the value actually lands under the WOW6432Node key).
That’s it.
All that’s left is to extract the data from the Azure Table (I suggest using Azure Storage Explorer) to a CSV and upload it into Intune.
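If you'd rather script that last step too, here's a rough sketch of pulling the rows back out with the same modules and reshaping them into the column headers the Autopilot CSV import expects (the storage details are placeholders again):

# Reconnect to the table
$storageCtx = New-AzStorageContext -StorageAccountName "<storage account name>" -SasToken "<SAS token>"
$table = (Get-AzStorageTable -Name "autopilotinfo" -Context $storageCtx).CloudTable

# Pull every row and shape it into the Autopilot import CSV format
Get-AzTableRow -Table $table |
    Select-Object @{ n = 'Device Serial Number'; e = { $_.SerialNumber } },
                  @{ n = 'Windows Product ID';   e = { '' } },
                  @{ n = 'Hardware Hash';        e = { $_.Hash } } |
    Export-Csv -Path .\AutopilotImport.csv -NoTypeInformation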
It was quite an entertaining challenge to put this together and bridge the (intentional) gap left by Microsoft. I hope this helps others who face a similar challenge, or anyone who just wants to get started interacting with Azure Tables via PowerShell.