$dirs = Get-ChildItem -Path \\server\SQLbackup\* -Recurse |
    Where-Object { ($_.FullName -notlike "*Blah*") -and $_.PSIsContainer -and ($_.FullName -like "*servername*") }
foreach ($dir in $dirs)
{
    #Write-Host $dir.FullName
    # Collect files (not folders) in this tree older than 60 days.
    $fs = Get-ChildItem $dir.FullName -Recurse |
        Where-Object { (-not $_.PSIsContainer) -and ($_.LastWriteTime -lt (Get-Date).AddDays(-60)) }
    foreach ($f in $fs)
    {
        Write-Host "Deleting file $($f.FullName)" -ForegroundColor Red
        Remove-Item $f.FullName | Out-Null
    }
}
This simply starts at a root directory and finds every folder whose path does NOT contain "Blah", is actually a folder, and includes the root name of one of my servers in its path. It then collects every file in those folders whose LastWriteTime is older than 60 days, and for each one it issues a Remove-Item against the full path.
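One small tweak that helps on large trees: compute the cutoff date once up front instead of calling Get-Date inside the Where-Object block for every file. A minimal sketch of the same filter, using the example share path from above:

```powershell
# Compute the 60-day cutoff once, not once per file.
$cutoff = (Get-Date).AddDays(-60)

# Same enumeration as the script above; \\server\SQLbackup is the example root.
$oldFiles = Get-ChildItem -Path \\server\SQLbackup\* -Recurse |
    Where-Object { (-not $_.PSIsContainer) -and ($_.LastWriteTime -lt $cutoff) }
```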
Some issues here...
- For big directories it is somewhat slow.
- If by some chance the directory name to search on is invalid or not accessible, the search can fall back to local drives - that is VERY bad, because the script would then delete local files.
- Perhaps sending the initial output to a file and then reading that file back in would be safer - at least I could examine the results before deleting anything.
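One way to address the last two points, sketched under the assumption that the review list lives at C:\temp\delete-list.txt (a hypothetical path): verify the share is reachable before enumerating anything, write the candidates to a file for inspection, and dry-run the delete with -WhatIf until the output looks right.

```powershell
$root = '\\server\SQLbackup'

# Bail out early if the share is unreachable, so the search can never
# fall back to enumerating (and deleting from) local drives.
if (-not (Test-Path -Path $root)) {
    Write-Error "Cannot reach $root - aborting."
    return
}

$cutoff   = (Get-Date).AddDays(-60)
$listFile = 'C:\temp\delete-list.txt'   # hypothetical review file

# Phase 1: record the candidate files instead of deleting immediately.
Get-ChildItem -Path $root -Recurse |
    Where-Object { (-not $_.PSIsContainer) -and ($_.LastWriteTime -lt $cutoff) } |
    Select-Object -ExpandProperty FullName |
    Set-Content -Path $listFile

# Phase 2 (after reviewing the file by hand): -WhatIf only reports what
# WOULD be removed; drop it once the list is confirmed correct.
Get-Content -Path $listFile | Remove-Item -WhatIf
```

The -WhatIf switch is supported by Remove-Item natively, so the dry run costs nothing to keep in place until the review file has been checked.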