The following post is for anyone who needs to manage storage policies at scale and uses vROps.

Imagine the scene: you have thousands of servers, most with the default storage policy for their datastore, but some edge-case VMs require a specific policy, or let's say a specific VMDK does… Unfortunately, there does not appear to be an easy or useful way to track this over time.

vCenter events get cleared out faster than you can blink, vRLI can't just have endless amounts of storage, and anyone who has ever used vRLI's NFS archive knows how "useful" that is to restore from…

This solution uses PowerCLI to pull the data from vCenter and then pushes it all into vROps as custom properties against each VM. The Get-SpbmEntityConfiguration data collection does take a very long time when you have thousands of VMs, because of the inefficient looping. I have some ideas to improve that in the near future, as well as a "diff" mode so that the script only pushes updates to vROps instead of a full push every time it runs.
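To give a feel for the collection side, here is a minimal sketch, assuming an active Connect-VIServer session; the output columns and file path are illustrative, not the exact ones my script uses:

```powershell
# Sketch: pull the effective storage policy for every VM and every VMDK.
# Assumes VMware.PowerCLI is loaded and Connect-VIServer has already run.
$results = foreach ($vm in Get-VM) {
    # VM home policy
    $vmConfig = $vm | Get-SpbmEntityConfiguration
    [pscustomobject]@{
        VM     = $vm.Name
        Entity = 'VMHome'
        Policy = $vmConfig.StoragePolicy.Name
    }
    # Per-disk policies
    foreach ($diskConfig in (Get-HardDisk -VM $vm | Get-SpbmEntityConfiguration)) {
        [pscustomobject]@{
            VM     = $vm.Name
            Entity = $diskConfig.Entity.Name
            Policy = $diskConfig.StoragePolicy.Name
        }
    }
}
$results | Export-Csv -Path '.\Output\policies.csv' -NoTypeInformation
```

This per-VM loop is exactly where the slowness comes from; one obvious improvement is passing the whole VM array to Get-SpbmEntityConfiguration in a single call rather than once per entity.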

I only run the script once a day, as I am just trying to catch changes that are mistakes. I can then use all the usual bells and whistles from vROps for alerting on changes, reporting, etc.
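On the vROps side, the values land as resource properties via the suite-api REST interface. The following is a rough sketch only, assuming you already have a token and the resource ID of the target VM; the statKey name and property value are made-up examples:

```powershell
# Sketch: push one custom property to a vROps resource via suite-api.
# $vropsAddress, $token and $resourceId are assumed to exist already;
# 'StoragePolicy|VMHome' and 'Gold-RAID1' are illustrative values.
$epoch = [DateTimeOffset]::UtcNow.ToUnixTimeMilliseconds()

$body = @{
    'property-content' = @(
        @{
            statKey    = 'StoragePolicy|VMHome'
            timestamps = @($epoch)
            values     = @('Gold-RAID1')
        }
    )
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post `
    -Uri "https://$vropsAddress/suite-api/api/resources/$resourceId/properties" `
    -Headers @{
        Authorization  = "vRealizeOpsToken $token"
        'Content-Type' = 'application/json'
        Accept         = 'application/json'
    } `
    -Body $body
```

Because vROps keeps property history, pushing the same value every day is harmless but wasteful, which is why a diff mode that only pushes changed values is on the to-do list.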

You will need to generate the credential files and place them in the \config directory (one for vROps and another for vCenter if the same account can't be used):

$cred = Get-Credential
$cred | Export-Clixml -Path "HOME-VC.xml"
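The script can then load the file back into a PSCredential object. Keep in mind that Export-Clixml encrypts the password with Windows DPAPI, so the file will only decrypt for the same user account on the same machine that created it:

```powershell
# Sketch: load the stored credential back and connect.
# Only works for the same Windows user on the same machine that ran
# Export-Clixml, because the password is DPAPI-encrypted.
$cred = Import-Clixml -Path ".\config\HOME-VC.xml"
Connect-VIServer -Server $vc -Credential $cred
```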

Anyway, grab the latest version of the script from my Git repo; here is an example of running the script with the required parameters:

.\getStoragePolicy.ps1 -vc '' -creds 'Home-VC' -vRopsAddress '' -vRopsCreds 'Home-VROPS' -FileName 'D:\StoragePolicy\Output\vcsa.csv' -ImportType 'Full'

Make sure to keep an eye on \Log\LogFile.log for the status/progress of the script. Once it's complete, check vROps for your new properties.

Well, that's about all I've got for now; hope it was helpful.