Tag Archives: Veeam

Change VM MoRef ID in Veeam B&R database

Nothing revolutionary here, but I thought I'd document the procedure, as I find myself using it every now and then.

Every VM is assigned a MoRef ID when it is added to a vCenter inventory.

When you want to back up a VM with Veeam B&R, you add it to a pre-configured job by name; behind the scenes, however, Veeam associates the VM with that job by its MoRef ID.

If a VM being backed up by Veeam B&R is removed from the vCenter inventory (for example, because you removed a whole host from vCenter, or the VM was in an invalid state), the VM receives a new MoRef ID when it is added back into vCenter. Veeam will not register the MoRef ID change automatically; on the next backup cycle it treats the VM as a completely new one (due to the unique new MoRef ID) and kicks off a new full backup for the same VM it previously backed up.

To avoid a new full backup, you need to manually update the MoRef ID for that VM in the Veeam database before the next backup cycle runs.
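For reference, you can look up the MoRef ID that vCenter currently assigns to a VM with VMware PowerCLI. This is just a sketch: it assumes PowerCLI is installed and you are already connected to the vCenter with Connect-VIServer, and the VM name is a placeholder.

```powershell
# Show the current MoRef ID of a VM (VM name is hypothetical)
Get-VM -Name "MyVM" |
    Select-Object Name, @{Name = 'MoRefId'; Expression = { $_.ExtensionData.MoRef.Value }}
```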

Note that the following steps are not supported by Veeam officially. Mess with the Veeam database at your own risk!

Continue reading

Copy Veeam backups to AWS S3 using PowerShell

Recently I was looking for a way to move monthly Veeam backups into AWS S3 without having the client invest in Veeam Cloud Connect or AWS Storage Gateway.

So I started by re-configuring the monthly Veeam backup job to produce a full backup instead of the usual incremental. This made each .vbk file (i.e. full backup file) generated at the end of the job independent of the others, which allowed me to upload the files to S3 and reduce the local retention to one.

The goal was to automate this process and avoid manual monthly uploads. This naturally called for some PowerShell scripting.

Reducing the retention to one meant that each month the .vbk file would be overwritten, so I had to make sure the uploaded .vbk file in S3 was good. The only way I could verify the file was by generating an MD5 hash of the .vbk file in the local repository and comparing it with the S3 object ETag once the .vbk was uploaded. Unfortunately this approach didn't work: the Write-S3Object cmdlet uses the multipart upload API when uploading to S3 and sends the file in parts, so the ETag you get back is not an MD5 of the whole file but a hash computed from the individual parts. My workaround was to simply generate the MD5 hash, upload the file, download it back, generate another hash, and compare the two.
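As an alternative to downloading the whole file back, the multipart ETag can in principle be reproduced locally: for a multipart upload, S3 computes the ETag as the MD5 of the concatenated per-part MD5 digests, suffixed with a dash and the part count. The sketch below assumes you know the exact part size your upload used (the 5 MB value here is an assumption; check what your tooling actually uses, otherwise the result won't match):

```powershell
# Sketch: reproduce the multipart ETag S3 would report for a file,
# assuming a known, fixed part size used during the upload.
function Get-S3MultipartETag {
    param(
        [string]$Path,
        [int]$PartSizeMB = 5   # assumption: set to the part size your upload used
    )
    $partSize = $PartSizeMB * 1MB
    $md5 = [System.Security.Cryptography.MD5]::Create()
    $stream = [System.IO.File]::OpenRead($Path)
    $partDigests = New-Object System.Collections.Generic.List[byte]
    $buffer = New-Object byte[] $partSize
    $partCount = 0
    try {
        while (($read = $stream.Read($buffer, 0, $partSize)) -gt 0) {
            # MD5 of each part, appended to a running list of digests
            $partDigests.AddRange($md5.ComputeHash($buffer, 0, $read))
            $partCount++
        }
    }
    finally { $stream.Close() }
    if ($partCount -eq 1) {
        # Single part: the ETag is simply the MD5 of the file
        return ([System.BitConverter]::ToString($partDigests.ToArray()) -replace '-', '').ToLower()
    }
    # Multipart: MD5 of the concatenated part digests, plus "-<part count>"
    $final = $md5.ComputeHash($partDigests.ToArray())
    return (([System.BitConverter]::ToString($final) -replace '-', '').ToLower()) + "-$partCount"
}
```

I stuck with the download-and-compare approach in the script below, since it doesn't depend on guessing the part size.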

Anyway, here is the script. Configure it to run after the job, using the Advanced Settings\Scripts tab of your monthly backup job.

Also note that I’ve added basic logging and alerting functionality to the script.

I hope someone out there finds this script useful!


#Written by Cengiz Ulusahin 25/05/17
#This script runs after each Veeam monthly backup job and copies the latest full backup file into Amazon S3
#It offers a MD5 check on uploaded files, basic logging and alerting functionality

#Function for logging
Function Write-Log
{
    Param ([string]$logstring)
    $stamp = (Get-Date).ToString("yyyy/MM/dd HH:mm:ss")
    $line = "$stamp $logstring"
    Add-Content $logfile -Value $line
}

#Function for sending email alerts
Function Send-Alert
{
    Send-MailMessage -SmtpServer "Enter IP address" -To "Enter To email address" -From "Enter From Email address" -Subject "Company Monthly AWS S3 Copy Job" -Body "Company monthly backup copy job to AWS S3 has failed or completed with errors. Please alert the backup team immediately by opening a ticket. Copy job log: $logfile."
}

#Function to Pause script, use for debugging
Function Pause
{
    Read-Host 'Press any key to continue…' | Out-Null
}

#Define static variables
$AccessKey = "Enter AWS Access Key"
$SecretKey = "Enter AWS Secret Key"
$bucketname = "Enter S3 bucket name"
$root = "Enter the path of your monthly backup folder (e.g. D:\Monthly-Backups)"
$temproot = "Enter a path for the temporary vbk file (e.g. D:\Monthly-Backups\S3\Temp)"
$filter="*.vbk"
$logfile = "Enter a path for the script log file (e.g. D:\Monthly-Backups\S3\Company-S3-Copy.log)"

#Get the name of the latest vbk file (sort by last write time, which reflects the newest backup)
$vbkfile = (Get-ChildItem -Path $root -Filter $filter | Sort-Object LastWriteTime -Descending | Select-Object -First 1).Name
$vbkfilefullpath = Join-Path -Path $root -ChildPath $vbkfile

#Generate hash for vbk file
$hash = (Get-FileHash $vbkfilefullpath -Algorithm MD5).hash

#Upload vbk file to S3 bucket
Write-S3Object -BucketName $bucketname -CannedACLName bucket-owner-full-control -File $vbkfilefullpath -Key $vbkfile -AccessKey $AccessKey -SecretKey $SecretKey

#Get vbk file copied to S3 bucket
$vbks3 = (Get-S3Object -BucketName $bucketname -Key $vbkfile -AccessKey $AccessKey -SecretKey $SecretKey).key

#If/else to check whether the file was uploaded successfully
if ($vbks3 -ne $vbkfile)
{
    Write-Log "File upload to S3 was unsuccessful. Terminating script now!"
    Send-Alert
    Exit
}
else
{
    Write-Log "File upload to S3 was successful."
}

#Define Temp vbk file path and name
$temppath = "$temproot\$vbkfile"

#Download vbk file from S3 into Temp path
Copy-S3Object -BucketName $bucketname -Key $vbkfile -LocalFile $temppath -AccessKey $AccessKey -SecretKey $SecretKey

#Get Temp vbk file downloaded into Temp path
$tempfile = (Get-ChildItem -Path $temppath).Name

#If/else to check whether the file was downloaded successfully
if ($tempfile -ne $vbkfile)
{
    Write-Log "File download from S3 was unsuccessful. Terminating script now!"
    Send-Alert
    Exit
}
else
{
    Write-Log "File download from S3 was successful."
}

#Generate hash for Temp vbk file
$hash2 = (Get-FileHash $temppath -Algorithm MD5).hash

#If/else to compare hashes and remove the Temp file if they are equal
if ($hash2 -ne $hash)
{
    Write-Log "Hashes are not equal! Terminating script now!"
    Send-Alert
    Exit
}
else
{
    Write-Log "Hash=$hash Hash2=$hash2 Hashes match!"

    #Remove Temp vbk file (note: use ASCII hyphens for the parameters; en-dashes break parsing)
    Remove-Item -Path $temppath
    $tempcount = (Get-ChildItem -Path $temproot | Measure-Object).Count

    #If/else to check whether the file was deleted
    if ($tempcount -eq 0)
    {
        Write-Log "Removed Temp vbk file successfully."
    }
    else
    {
        Write-Log "Couldn't remove Temp vbk file!"
        Send-Alert
    }
}

Veeam V9.0 U2: Backup AlwaysOn SQL instance hosting vCenter database

Recently I was troubleshooting a Veeam backup issue for a client who uses the AlwaysOn SQL architecture in their environment. The client was receiving the following error when trying to back up any of their AlwaysOn SQL instances:

“Failed to freeze guest, wait timeout”

Note that the backup jobs started failing right after the client's vCenter databases were migrated to Availability Groups hosted on the AlwaysOn SQL instances.

The only bit of literature I could find on backing up AlwaysOn SQL instances was this (not very helpful) Veeam guide.

So I dug a little deeper and found this Veeam KB article, which suggests the issue may be caused by the vCenter database failing to be excluded from application-aware image processing.

Continue reading