Azure Resource Graph Explorer is a very useful tool for auditing cloud resources across multiple subscriptions under the same tenant.
Currently, the Azure Portal has no view that shows which resources are actually connected to a Virtual Network, because most resources are attached to a VNet through their Network Interfaces. So when we look at the Connected Devices tab, we see only the names of the NICs, not the names of the resources behind them.
With the Kusto Query Language (KQL) it is possible to build more complex views by joining information about several resource types. In our case, to get the list of VMs with their associated VNets, we need three of them: VMs, NICs and VNets:
Resources
| where type == "microsoft.compute/virtualmachines"
| project name, vmnics = (properties.networkProfile.networkInterfaces)
| mv-expand vmnics
| project name, vmnics_id = tostring(vmnics.id)
| join (Resources | where type == "microsoft.network/networkinterfaces" | project nicname=(name), vmnics_id = tostring(id), properties) on vmnics_id
| mv-expand ipconfigs = (properties.ipConfigurations)
| extend subnet_resource_id = split(tostring(ipconfigs.properties.subnet.id), '/')
| order by name asc, nicname asc
| project vmname=(name), nicname, vnetname=tostring(subnet_resource_id[8]), subnetname=tostring(subnet_resource_id[10])
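In the subnet resource ID, element 8 of the split array is the VNet name and element 10 is the subnet name, which is why the final projection indexes into subnet_resource_id. The same query can also be run outside the Portal; below is a minimal usage sketch, assuming the Azure CLI resource-graph extension is installed and the query above is saved to a file (vm-vnet-query.kql is just an example name):

az extension add --name resource-graph
az graph query -q "$(cat vm-vnet-query.kql)"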
The only way I have found to avoid this error is to use the parameters_body argument. In this case we lose all validation by Terraform except the validity of the parameters JSON itself, but we become able to pass parameters of any type. Any further validation of the parameters is done by ARM.
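A minimal sketch of the idea with the azurerm_template_deployment resource (the resource names and parameter values here are only illustrative assumptions):

resource "azurerm_template_deployment" "example" {
  name                = "example-deployment"
  resource_group_name = azurerm_resource_group.example.name
  deployment_mode     = "Incremental"
  template_body       = file("${path.module}/template.json")

  # parameters_body takes raw JSON instead of the string-only "parameters" map,
  # so object and array parameters pass through; ARM validates them on deployment
  parameters_body = jsonencode({
    location = { value = "westeurope" }
    tags     = { value = { environment = "dev", owner = "team-a" } }
  })
}

Since the whole parameters document is built with jsonencode, at least its JSON validity is still guaranteed at plan time.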
To avoid using Azure Storage account keys and to give a user just enough access, it is recommended to use Azure AD authentication and RBAC.
To download or read a blob from a Storage Account with a private container, the user needs at least the “Storage Blob Data Reader” role (even if he is an Owner of the Storage Account resource).
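The role can be assigned, for example, with the Azure CLI (the assignee and scope below are placeholders):

az role assignment create \
    --assignee "user@example.com" \
    --role "Storage Blob Data Reader" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"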
Azure CLI script example:
az account set -s $subscription_id
az storage blob download --account-name "$storage_account_name" \
--container-name "$container_name" \
--name "$blob_path" \
--file "$output_file_path" \
On Linux you also have Python by default, and Python is included with the Azure CLI installation (with all the Azure, Azure AD and Azure Storage Python modules), so the following Python script can be used to get a result similar to the Azure CLI one, with Device Login:
# legacy SDKs are assumed here: pip install adal azure-storage-blob==2.1.0
import adal
from azure.storage.blob import BlockBlobService
from azure.storage.common import TokenCredential

storage_account_name = "<storage-account-name>"
container_name = "<container-name>"
blob_path = "<blob-name>"
output_file_path = "<local-file-path>"

# only for example: the well-known Azure CLI Application ID
client_id = '04b07795-8ddb-461a-bbee-02f9e1bf7b46'
# your organisation's Tenant ID, which is used for the Storage RBAC
tenant_id = '<tenant-id>'

authority_uri = ('https://login.microsoftonline.com/' + tenant_id + '/')
resource_uri = 'https://storage.azure.com'

# device login flow: print the instructions, then wait for the user
# to authenticate in a browser with the displayed code
context = adal.AuthenticationContext(authority_uri, api_version=None)
code = context.acquire_user_code(resource_uri, client_id)
print(code['message'])
mgmt_token = context.acquire_token_with_device_code(resource_uri, code, client_id)

# wrap the access token so that BlockBlobService uses Azure AD authentication
block_blob_service = BlockBlobService(
    account_name=storage_account_name,
    token_credential=TokenCredential(mgmt_token['accessToken']))

block_blob_service.get_blob_to_path(container_name, blob_path, output_file_path)
Sometimes we need to produce a CSV as the output of a script. Manual string concatenation is not a beautiful solution for object-oriented PowerShell. For me, the easiest way is to create the table as objects and export it to CSV:
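A minimal sketch of this approach (the cmdlet and the properties are just an example, here with VMs from the Az module):

# build one object per row; Export-Csv takes care of headers and quoting
$report = foreach ($vm in Get-AzVM) {
    [PSCustomObject]@{
        Name          = $vm.Name
        ResourceGroup = $vm.ResourceGroupName
        Location      = $vm.Location
    }
}
$report | Export-Csv -Path .\vms.csv -NoTypeInformation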
Usually any console uses monospaced fixed-width fonts, so all characters should be well aligned. But even if we can understand the figures, it does not look like a chessboard. To make it more realistic we can use escape sequences. Now you can modify your code to add the right background colors for the chessboard cells:
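A minimal sketch of the coloring itself, assuming the console supports ANSI escape sequences (47 and 100 are the standard white and grey background codes):

$esc = [char]27
for ($row = 0; $row -lt 8; $row++) {
    $line = ''
    for ($col = 0; $col -lt 8; $col++) {
        # alternate the background: 47 = white, 100 = bright black (grey)
        $bg = if (($row + $col) % 2 -eq 0) { 47 } else { 100 }
        $line += "$esc[${bg}m   $esc[0m"
    }
    $line
}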
Before thinking about nested templates, it is possible to imagine a lightweight scenario based on standard ARM template capabilities. The types of variables and parameters in an ARM template are not limited to scalars: they can also represent objects and arrays.
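For example, a single parameter can carry a whole object (the names and values below are only illustrative), and its fields are then read with expressions like parameters('networkSettings').vnetName:

"parameters": {
    "networkSettings": {
        "type": "object",
        "defaultValue": {
            "vnetName": "vnet-01",
            "addressPrefix": "10.0.0.0/16",
            "subnets": [
                { "name": "frontend", "prefix": "10.0.1.0/24" },
                { "name": "backend", "prefix": "10.0.2.0/24" }
            ]
        }
    }
}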
Troubleshooting and debugging PowerShell DSC configurations can sometimes be very painful. If you are preparing your DSC configuration for use in Azure, or even on-premises, you will want to test it in a real environment.