Thursday, February 16, 2012

I am currently working my way through some PowerShell scripts for finding and removing old SQL backup files that have been hanging around for some time now.  The automated process that creates these backups is supposed to keep only a certain number of files that are a certain number of days old, but it is not fail-safe and it leaves older backups behind.  I want to keep the total number of files to a minimum, so I am implementing this cleanup in PS.  Here is what I have so far...


$dirs = dir -Path \\server\SQLbackup\* -Recurse | Where-Object -FilterScript { ($_.FullName -notlike "*Blah*") -and ($_.PSIsContainer -eq $true) -and ($_.FullName -like "*servername*") }
foreach ($dir in $dirs)
{
    #write-host $dir.FullName
    # Every file under this folder that is more than 60 days old
    $fs = Get-ChildItem $dir.FullName -Recurse | Where-Object { -not $_.PSIsContainer } | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-60) }
    foreach ($f in $fs)
    {
        # Report and delete each old file individually
        Write-Host "Deleting File $($f.FullName)" -ForegroundColor "Red"
        Remove-Item $f.FullName | Out-Null
    }
}

This simply starts at a root directory and looks for all folders whose path is NOT LIKE Blah and that contain the root name of one of my servers.  Within each of those folders it gets every file whose LastWriteTime is older than 60 days, and for each one it issues a Remove-Item against the FullName.  (Note the delete has to happen inside the inner loop against $f; my first cut referenced an undefined $File variable outside the loop.)
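Before letting the loop actually delete anything, the whole filter can be rehearsed as one pipeline using -WhatIf on Remove-Item, which prints each file it would remove without touching anything.  This is just a minimal sketch reusing the same \\server\SQLbackup path, *Blah*/*servername* filters, and 60-day cutoff from above:

# Dry run: -WhatIf reports what Remove-Item would delete, but deletes nothing
Get-ChildItem \\server\SQLbackup\* -Recurse |
    Where-Object { -not $_.PSIsContainer -and
                   $_.FullName -notlike "*Blah*" -and
                   $_.FullName -like "*servername*" -and
                   $_.LastWriteTime -lt (Get-Date).AddDays(-60) } |
    Remove-Item -WhatIf

Swapping -WhatIf for -Confirm would prompt on each file instead.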

Some issues here...

  • For big directory trees it is somewhat slow.
  • If by some chance the directory name to search on is invalid or not accessible, the search can fall back to the local drives, and that is VERY bad because the script would then delete local files.  Guarding the root path up front (see the sketch after this list) would make it abort instead.
  • Sending the initial output to a file and then reading that file back in would be safer; at least I could examine the results before deleting.  I sketch that two-phase approach below as well.
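Along those lines, here is a rough two-phase sketch of that safer approach.  The Test-Path check covers the bad-path worry from the second bullet, and the review file covers the third; C:\temp\OldBackups.txt is just a placeholder path I made up:

# Phase 1: refuse to run if the root is unreachable, then write candidates to a review file
$root = "\\server\SQLbackup"
if (-not (Test-Path $root)) { throw "Backup root $root is not reachable, aborting" }

Get-ChildItem $root -Recurse |
    Where-Object { -not $_.PSIsContainer -and
                   $_.FullName -notlike "*Blah*" -and
                   $_.FullName -like "*servername*" -and
                   $_.LastWriteTime -lt (Get-Date).AddDays(-60) } |
    Select-Object -ExpandProperty FullName |
    Out-File C:\temp\OldBackups.txt

# Phase 2: after eyeballing the list, read it back and delete
Get-Content C:\temp\OldBackups.txt | Remove-Item

Phase 2 only runs once I have opened the text file and am happy with the list, so a bad filter shows up as lines in a file instead of as deleted backups.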




I am back... Will be posting here a lot more often...