Copy Veeam backups to AWS S3 using PowerShell

Recently I was looking for a way to move monthly Veeam backups into AWS S3 without having the client invest in Veeam Cloud Connect or AWS Storage Gateway.

So I started by re-configuring the monthly Veeam backup job to produce a full backup instead of the usual incremental. This made each .vbk file (i.e. full backup file) generated at the end of the job independent of the others, which allowed me to upload it to S3 and reduce my local retention to one.
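If you want to double-check that the job is really leaving standalone fulls behind, a quick look at the repository folder is enough. This is just a sanity-check sketch, and the folder path is only an example:

Get-ChildItem -Path "D:\Monthly-Backups" | Where-Object { $_.Extension -in ".vbk", ".vib" } | Sort-Object LastWriteTime -Descending | Select-Object Name, Length, LastWriteTime
#A correctly re-configured monthly job should only show .vbk files here; any .vib files mean it is still producing incrementals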

The goal was to automate this process and avoid manual monthly uploads. This naturally called for some PowerShell scripting.

Reducing the retention to one meant that each month the .vbk file would be overwritten, so I had to make sure the uploaded .vbk file in S3 was intact. The only way to verify this was to generate an MD5 hash of the .vbk file in the local repository and compare it with the S3 object's ETag once the upload finished. Unfortunately this approach didn't work, because the Write-S3Object cmdlet uses the multipart upload API and sends the file to S3 in chunks, so the ETag you get back is calculated from the individual part hashes rather than being an MD5 of the whole file. My workaround was simply to generate the MD5 hash, upload the file, download it back again, generate a second hash and compare it with the first.
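If you want to see the mismatch for yourself, compare the local MD5 against the ETag on the uploaded object; for anything uploaded in multiple parts the ETag ends in a -<number of parts> suffix and will never match the file's MD5. A rough sketch, with the bucket name, key and credential variables as placeholders:

$localhash = (Get-FileHash "D:\Monthly-Backups\MonthlyBackup.vbk" -Algorithm MD5).Hash
$etag = (Get-S3Object -BucketName "my-bucket" -Key "MonthlyBackup.vbk" -AccessKey $AccessKey -SecretKey $SecretKey).ETag.Trim('"')
Write-Host "Local MD5: $localhash"
Write-Host "S3 ETag: $etag"
#For a multipart upload the ETag looks like <hash>-<parts> and does not match the local MD5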

Anyway, here is the script. Just configure it to run after the job, using the Advanced Settings\Scripts tab of your monthly backup job, as shown below.
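In the post-job script field I point Veeam at the saved script with a line along these lines (the path and file name are just examples, adjust them to wherever you keep the script):

powershell.exe -ExecutionPolicy Bypass -File "D:\Monthly-Backups\S3\Copy-MonthlyToS3.ps1"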

Also note that I’ve added basic logging and alerting functionality to the script.

I hope someone out there finds this script useful!!!
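One prerequisite worth mentioning: Write-S3Object, Get-S3Object and Copy-S3Object all come from the AWS Tools for PowerShell, so the module needs to be present on the backup server before the script will run. If it isn't already there, something like the following should take care of it (run once from an elevated prompt; on older servers the MSI installer from AWS works just as well):

Install-Module -Name AWSPowerShell
Import-Module AWSPowerShell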


#Written by Cengiz Ulusahin 25/05/17
#This script runs after each Veeam monthly backup job and copies the latest full backup file into Amazon S3
#It offers a MD5 check on uploaded files, basic logging and alerting functionality

#Function for logging
Function Write-Log
{
    Param ([string]$logstring)
    $stamp = (Get-Date).ToString("yyyy/MM/dd HH:mm:ss")
    $line = "$stamp $logstring"
    Add-Content -Path $logfile -Value $line
}

#Function for sending email alerts
Function Send-Alert
{
    Send-MailMessage -SmtpServer "Enter IP address" -To "Enter To email address" -From "Enter From Email address" -Subject "Company Monthly AWS S3 Copy Job" -Body "Company monthly backup copy job to AWS S3 has failed or completed with errors. Please alert the backup team immediately by opening a ticket. Copy job log: $logfile."
}

#Function to Pause script, use for debugging
Function Pause
{
    Read-Host 'Press Enter to continue...' | Out-Null
}

#Define static variables
$AccessKey = "Enter AWS Access Key"
$SecretKey = "Enter AWS Secret Key"
$bucketname = "Enter S3 bucket name"
$root = "Enter the path of your monthly backup folder (e.g. D:\Monthly-Backups)"
$temproot = "Enter a path for the temporary vbk file (e.g. D:\Monthly-Backups\S3\Temp)"
$filter = "*.vbk"
$logfile = "Enter a path for the script log file (e.g. D:\Monthly-Backups\S3\Company-S3-Copy.log)"

#Get latest vbk file
$vbkfile = (Get-ChildItem -Path $root -Filter $filter | Sort-Object LastWriteTime -Descending | Select-Object -First 1).Name
$vbkfilefullpath = Join-Path -Path $root -ChildPath $vbkfile

#Generate hash for vbk file
$hash = (Get-FileHash $vbkfilefullpath -Algorithm MD5).hash

#Upload vbk file to S3 bucket
Write-S3Object -BucketName $bucketname -CannedACLName bucket-owner-full-control -File $vbkfilefullpath -Key $vbkfile -AccessKey $AccessKey -SecretKey $SecretKey

#Get vbk file copied to S3 bucket
$vbks3 = (Get-S3Object -BucketName $bucketname -Key $vbkfile -AccessKey $AccessKey -SecretKey $SecretKey).key

#if else to check if the file was uploaded successfully
if ($vbks3 -ne $vbkfile)
{
    Write-Log "File upload to S3 was unsuccessful. Terminating script now!"
    Send-Alert
    Exit
}
else
{
    Write-Log "File upload to S3 was successful."
}

#Define Temp vbk file path and name
$temppath = "$temproot\$vbkfile"

#Download vbk file from S3 into Temp path
Copy-S3Object -BucketName $bucketname -Key $vbkfile -LocalFile $temppath -AccessKey $AccessKey -SecretKey $SecretKey

#Get Temp vbk file downloaded into Temp path
$tempfile = (Get-ChildItem -Path $temppath).Name

#if else to check if the file was downloaded successfully
if ($tempfile -ne $vbkfile)
{
    Write-Log "File download from S3 was unsuccessful. Terminating script now!"
    Send-Alert
    Exit
}
else
{
    Write-Log "File download from S3 was successful."
}

#Generate hash for Temp vbk file
$hash2 = (Get-FileHash $temppath -Algorithm MD5).hash

#if else to compare hashes and remove the Temp file if they are equal
if ($hash2 -ne $hash)
{
    Write-Log "Hashes are not equal! Terminating script now!"
    Send-Alert
    Exit
}
else
{
    Write-Log "Hash=$hash Hash2=$hash2 Hashes match!"

    #Remove Temp vbk file
    Remove-Item -Path $temppath
    $temppath2 = (Get-ChildItem -Path $temproot | Measure-Object).Count

    #if else to check if the file was deleted or not
    if ($temppath2 -eq 0)
    {
        Write-Log "Removed Temp vbk file successfully."
    }
    else
    {
        Write-Log "Couldn't remove Temp vbk file!"
        Send-Alert
    }
}
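
For reference, a clean run leaves a log that looks roughly like this (timestamps and hashes are placeholders):

2017/05/25 22:14:03 File upload to S3 was successful.
2017/05/25 22:41:17 File download from S3 was successful.
2017/05/25 22:41:55 Hash=<MD5> Hash2=<MD5> Hashes match!
2017/05/25 22:41:56 Removed Temp vbk file successfully.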
