Azure File Share: Remove old files with PowerShell

Azure File Share is a cloud-based file storage service in Microsoft Azure.

Azure File Share provides highly available, secure, and scalable storage for applications, services, and virtual machines running in the cloud or on-premises. It can be used for a variety of scenarios, including storing files for distributed applications, sharing files across different platforms and devices, and backing up data to the cloud.

With Azure File Share, users can easily store and share files in the cloud without the need for complex infrastructure or expensive hardware.

In this post we will go through creating a script to report on, and if required remove, files that are older than a specified number of days.

First, we need to get the storage account so we can use its context to query the file share.

Get-AzStorageAccount -ResourceGroupName Resourcegroup -Name StorageAccountName

To get an Azure file share there is no parameter for the resource group or storage account name, which is why we need to use the storage account context instead.

$sacontext = (Get-AzStorageAccount -ResourceGroupName Resourcegroup -StorageAccountName Name).Context
Get-AzStorageFile -Context $sacontext -ShareName sharename

To get the list of folders we can use the CloudFileDirectory property.

(Get-AzStorageFile -Context $sacontext -ShareName Name).CloudFileDirectory.Name

To get files we can use the CloudFile property.

(Get-AzStorageFile -Context $sacontext -ShareName operations-share).CloudFile

When checking the file properties, most are blank.

To populate the properties we need to fetch them by calling the FetchAttributes method on each file.

$files = (Get-AzStorageFile -Context $sacontext -ShareName name).CloudFile
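With the attributes fetched, the LastModified property can be compared against a cutoff date. Below is a minimal sketch; the share name and the 30-day threshold are placeholders, and it assumes the $sacontext variable from earlier.

```powershell
# Sketch: find files older than a set number of days (assumes $sacontext exists)
$days  = 30
$limit = (Get-Date).AddDays(-$days)

$files = (Get-AzStorageFile -Context $sacontext -ShareName name).CloudFile
foreach ($file in $files) {
    $file.FetchAttributes()   # populate Properties, including LastModified
    if ($file.Properties.LastModified -lt $limit) {
        Write-Output "$($file.Name) is older than $days days"
    }
}
```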

Once we have the properties we can create the script.

When running the script I have set two parameters: one to export to CSV and a second to delete the files. If neither is set, the script defaults to outputting to the PowerShell console.

When using -reportexport, it will output to a CSV file.

When using -delete, the script will remove the files from the Azure file share.
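The full script is in the repository linked below, but its overall shape is roughly the following. The parameter names -reportexport and -delete match the post; everything else (resource names, the default age threshold) is a placeholder in this hedged sketch.

```powershell
# Rough sketch of the script's structure; resource and share names are placeholders
param (
    [string]$reportexport,          # CSV path; when set, results are exported
    [switch]$delete,                # when set, matching files are removed
    [int]$DaysOld = 30
)

$sacontext = (Get-AzStorageAccount -ResourceGroupName Resourcegroup -StorageAccountName Name).Context
$limit     = (Get-Date).AddDays(-$DaysOld)
$results   = @()

foreach ($file in (Get-AzStorageFile -Context $sacontext -ShareName sharename).CloudFile) {
    $file.FetchAttributes()
    if ($file.Properties.LastModified -lt $limit) {
        $results += [pscustomobject]@{
            Name         = $file.Name
            LastModified = $file.Properties.LastModified
        }
        if ($delete) {
            Remove-AzStorageFile -Context $sacontext -ShareName sharename -Path $file.Name
        }
    }
}

if ($reportexport) { $results | Export-Csv -Path $reportexport -NoTypeInformation }
else               { $results }   # default: write to the console
```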

There is currently a limitation with the Get-AzStorageFile command: it has no recursive parameter, so going more than one folder down is difficult. This means the script might not work for everyone.
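One way around the missing recursion is to walk the directories manually. The helper below is a rough sketch I have not exercised against deeply nested shares; the function name is my own, and it relies on piping a directory item back into Get-AzStorageFile to list its contents.

```powershell
# Sketch: recursively emit CloudFile items since Get-AzStorageFile has no -Recurse
function Get-ShareFilesRecursive {
    param ($Context, [string]$ShareName, [string]$Path = '')

    $items = if ($Path) {
        # Piping a directory item to Get-AzStorageFile lists that directory's contents
        Get-AzStorageFile -Context $Context -ShareName $ShareName -Path $Path | Get-AzStorageFile
    } else {
        Get-AzStorageFile -Context $Context -ShareName $ShareName
    }

    foreach ($item in $items) {
        $child = if ($Path) { "$Path/$($item.Name)" } else { $item.Name }
        if ($item.GetType().Name -eq 'AzureStorageFileDirectory') {
            Get-ShareFilesRecursive -Context $Context -ShareName $ShareName -Path $child
        } else {
            $item   # a file; emit it to the pipeline
        }
    }
}
```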

The full script can be downloaded from the GitHub repository below.


Just wanted to do a quick post, as I was having an issue connecting with the Az CLI. When connecting, it was failing to validate the certificate.

I was getting the below verification error.

HTTPSConnectionPool(host='', port=443): Max retries exceeded with url: /organizations/v2.0/.well-known/openid-configuration (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:997)')))

The issue looked to be caused by SSL inspection being done on the firewall. To check the cert being used by the firewall, I used developer mode in Edge (F12): once in developer mode, go to the Security tab and view the certificate.

There were a few different recommendations online to set environment variables and run Python commands, but none of these fixed the issue for me. They were all related to connection requests going through a proxy, but in this case I wasn't using a proxy.

I found this GitHub issue page and tried the Python command to use the local system cert store, but that didn't work. Someone in the thread suggested adding the cert file content to the cacert.pem file in the Microsoft SDK, so I tried adding it manually.

The steps below fixed the issue for me.

  1. Using the MMC console, export the root cert used for the SSL inspection from the local cert store as Base-64 encoded
  2. Use OpenSSL to view the cert content: I used "openssl x509 -in 'Firewall_Root_Cert.cer' -text" (you can also use Notepad, but that doesn't show the issuer or subject details)
  3. Go to C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2\Lib\site-packages\certifi\cacert.pem
  4. Open it in Notepad++ or Notepad and add the cert content to the bottom, after the last cert
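Steps 3 and 4 can also be done from PowerShell rather than a text editor. A small sketch follows; the cert path is an assumption from step 1, and the bundle path assumes a default Azure CLI install.

```powershell
# Back up the CA bundle, then append the Base-64 root cert after the last entry
$certPath = 'C:\Temp\Firewall_Root_Cert.cer'     # exported in step 1 (assumed path)
$caBundle = 'C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2\Lib\site-packages\certifi\cacert.pem'

Copy-Item -Path $caBundle -Destination "$caBundle.bak"   # keep a backup first
Add-Content -Path $caBundle -Value (Get-Content -Path $certPath -Raw)
```

Note that a CLI upgrade may replace cacert.pem, so the cert might need to be re-appended afterwards.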

Below is the updated cacert.pem file.

Once I added the root cert content, I was then able to connect without issue.