Copy Veeam backups to AWS S3 using PowerShell

Recently I was looking for a way to move monthly Veeam backups into AWS S3 without having the client invest in Veeam Cloud Connect or AWS Storage Gateway.

So I started by re-configuring the monthly Veeam backup job to produce a full backup instead of the usual incremental. This made each .vbk file (i.e. full backup file) generated at the end of the job independent of the others, which allowed me to upload them to S3 and reduce my local retention to one.

The goal was to automate this process and avoid manual monthly uploads. This naturally called for some PowerShell scripting.

Since I wanted to reduce my retention to one, consequently overwriting my .vbk files every month, I had to make sure the copy of the .vbk file in S3 was good. My first idea was to generate an MD5 hash of the .vbk file in the local repository and compare it with the S3 object ETag once the .vbk had been uploaded. Unfortunately this approach didn't work, because the Write-S3Object cmdlet uses the multipart upload API and sends the file to S3 in chunks, so the ETag S3 returns is not the MD5 of the whole file but a hash derived from the hashes of the individual parts. My workaround was to generate the MD5 hash, upload the file, download the file back again, generate a second hash and compare it to the first.
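For illustration, here is a minimal sketch of the ETag comparison that turned out not to work (the bucket and key names are placeholders, and credentials are assumed to come from a configured AWS profile). An ETag produced by a multipart upload carries a -&lt;number of parts&gt; suffix, so it will never equal the plain MD5 of the file:

#Sketch only (placeholder bucket/key names; credentials assumed to be in a default AWS profile)
Import-Module AWSPowerShell

$localHash = (Get-FileHash 'D:\Monthly-Backups\Backup.vbk' -Algorithm MD5).Hash

#S3 returns the ETag wrapped in double quotes, e.g. "9b2cf...328-12" for a multipart upload
$etag = (Get-S3Object -BucketName 'my-bucket' -Key 'Backup.vbk').ETag.Trim('"')

if ($etag -match '-\d+$') {
    Write-Host "Multipart upload detected ($etag) - the ETag is not a plain MD5, so the comparison is meaningless."
}
elseif ($etag -eq $localHash) {
    Write-Host "Single-part upload and the hashes match."
}
else {
    Write-Host "Hashes do not match."
}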

Anyway, here is the script. Just configure it to run after the job under the Advanced Settings\Scripts tab of your monthly backup job.

Also note that I’ve added basic logging and alerting functionality to the script.

I hope someone out there finds this script useful!!!


#Written by Cengiz Ulusahin 25/05/17
#This script runs after each Veeam monthly backup job and copies the latest full backup file into Amazon S3
#It offers a MD5 check on uploaded files, basic logging and alerting functionality 

#Function for logging
Function Write-Log
{
   Param ([string]$logstring)
   $stamp = (Get-Date).ToString("yyyy/MM/dd HH:mm:ss")
   $line = "$stamp $logstring"
   Add-Content -Path $logfile -Value $line
}

#Function for sending email alerts
Function Send-Alert
{
Send-MailMessage -SmtpServer "Enter IP address" -To "Enter To email address" -From "Enter From Email address" -Subject "Company Monthly AWS S3 Copy Job" -Body "Company monthly backup copy job to AWS S3 has failed or completed with errors. Please alert the backup team immediately by opening a ticket. Copy job log: $logfile." 	
}

#Function to pause the script, useful for debugging
Function Pause
{
   Read-Host 'Press Enter to continue...' | Out-Null
}

#Define static variables
$AccessKey = "Enter AWS Access Key"
$SecretKey = "Enter AWS Secret Key"
$bucketname = "Enter S3 bucket name"
$root = "Enter the path of your monthly backup folder (e.g. D:\Monthly-Backups)"
$temproot = "Enter a path for the temporary vbk file (e.g. D:\Monthly-Backups\S3\Temp)"
$filter="*.vbk"
$logfile = "Enter a path for the script log file (e.g. D:\Monthly-Backups\S3\Company-S3-Copy.log)"

#Get latest vbk file (sort by LastWriteTime, which is more reliable than LastAccessTime; .Name ensures we get a plain file name string)
$vbkfile = (Get-ChildItem -Path $root -Filter $filter | Sort-Object LastWriteTime -Descending | Select-Object -First 1).Name
$vbkfilefullpath = Join-Path -Path $root -ChildPath $vbkfile

#Generate hash for vbk file
$hash = (Get-FileHash $vbkfilefullpath -Algorithm MD5).hash

#Upload vbk file to S3 bucket 
Write-S3Object -BucketName $bucketname -CannedACLName bucket-owner-full-control -File $vbkfilefullpath -Key $vbkfile -AccessKey $AccessKey -SecretKey $SecretKey

#Get vbk file copied to S3 bucket
$vbks3 = (Get-S3Object -BucketName $bucketname -Key $vbkfile -AccessKey $AccessKey -SecretKey $SecretKey).key

#if else to check if the file was uploaded successfully
if ($vbks3 -ne $vbkfile)
{
	Write-Log "File upload to S3 was unsuccessful. Terminating script now!" 
	Send-Alert
  	Exit
}
else
{
	Write-Log "File upload to S3 was successful."
}

#Define Temp vbk file path and name
$temppath = "$temproot\$vbkfile"
	
#Download vbk file from S3 into Temp path
Copy-S3Object -BucketName $bucketname -Key $vbkfile -LocalFile $temppath -AccessKey $AccessKey -SecretKey $SecretKey

#Get Temp vbk file downloaded into Temp path
$tempfile = (Get-ChildItem -Path $temppath).Name

#if else to check if the file was downloaded successfully
if ($tempfile -ne $vbkfile)
{
	Write-Log "File download from S3 was unsuccessful. Terminating script now!"   
	Send-Alert
  	Exit
}
else
{
	Write-Log "File download from S3 was successful."
}

#Generate hash for Temp vbk file
$hash2 = (Get-FileHash $temppath -Algorithm MD5).hash

#If/else to compare hashes and remove the Temp file if they are equal
if ($hash2 -ne $hash)
{
    Write-Log "Hashes do not match! Terminating script now!"
    Send-Alert
    Exit
}
else
{
    Write-Log "Hash=$hash Hash2=$hash2 Hashes match!"

    #Remove Temp vbk file
    Remove-Item -Path $temppath
    $tempcount = (Get-ChildItem -Path $temproot | Measure-Object).Count

    #If/else to check whether the Temp file was deleted
    if ($tempcount -eq 0)
    {
        Write-Log "Removed Temp vbk file successfully."
    }
    else
    {
        Write-Log "Couldn't remove Temp vbk file!"
        Send-Alert
    }
}

Authenticating Office 365 users using on-premises AD DS

This blog post covers the integration methods that allow administrators to authenticate their Office 365 users against on-premises Active Directory Domain Services (AD DS), focusing in particular on the method that utilizes Active Directory Federation Services (AD FS).

This post is not intended as a step-by-step guide on how to implement integration between Office 365 and on-premises Active Directory Domain Services; it's only intended as a high-level overview.

Continue reading

Dropbox on Server 2012 R2 closes unexpectedly

The Dropbox app closes unexpectedly (the service continues to run) while running on Server 2012 R2. When the app closes, syncing stops and users start complaining. So I wrote the following script and scheduled it to run every 30 minutes (see the scheduled-task sketch after the script).

FYI… although it installs, Dropbox is not officially supported on Server systems.

https://www.dropbox.com/en/help/3

The script checks whether the Dropbox process is running. If it is, it simply quits after displaying a message; if it isn't, it tries to start the Dropbox process and sends out an email if the process starts successfully.

$Dropbox = Get-Process "Dropbox" -ErrorAction SilentlyContinue
if ($Dropbox) {
    Write-Output "Dropbox is running!!!"
}
else {
    Start-Process -FilePath "C:\Program Files (x86)\Dropbox\Client\Dropbox.exe"
    $Dropbox = Get-Process "Dropbox" -ErrorAction SilentlyContinue
    if ($Dropbox) {
        Write-Output "Dropbox has been started!!!"
        Send-MailMessage -To "administrator@domain" -From "email@domain" -Subject "Dropbox has been started!!!" -Body "The Dropbox process on server SERVER NAME (SERVER IP) wasn't running, a script has started the Dropbox process." -SmtpServer "EMAIL SERVER IP" -Port "EMAIL SERVER PORT"
    }
}
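
For reference, here is a minimal sketch of how the 30-minute schedule could be registered with the ScheduledTasks module on Server 2012 R2. The script path and task name below are assumptions; the task should run under the account that owns the Dropbox session:

#Sketch only: register a task that runs the watchdog script every 30 minutes.
#The path and task name are assumptions - adjust them to your environment.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Start-Dropbox.ps1"
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 30) -RepetitionDuration ([TimeSpan]::MaxValue)
Register-ScheduledTask -TaskName "Dropbox Watchdog" -Action $action -Trigger $trigger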

CUCM integration in a Multi-Forest environment

Only a single Active Directory forest can be integrated with Cisco Unified Communications Manager (CUCM) for retrieving user information and performing authentication.

In Multi-Forest environments you can utilize AD LDS (Lightweight Directory Services), formerly known as ADAM, to get user information and perform authentication from different AD domains that exist in different forests.

AD LDS is a Lightweight Directory Access Protocol (LDAP) directory service that provides flexible support for directory-enabled applications, without the dependencies that are required for Active Directory Domain Services (AD DS). AD LDS provides much of the same functionality as AD DS, but it does not require the deployment of domains or domain controllers. You can run multiple instances of AD LDS concurrently on a single computer, with an independently managed schema for each AD LDS instance.
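
As a starting point, the AD LDS role itself can be installed with a single PowerShell command (a sketch assuming Server 2012 or later; the actual instance creation and schema configuration then follow the Cisco guide and the AD LDS Setup Wizard):

#Sketch only: install the AD LDS role; instance creation then follows the Cisco guide.
Install-WindowsFeature -Name ADLDS -IncludeManagementTools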

This was my first time configuring AD LDS. Hence I had to reference a number of blog posts and a load of Microsoft documentation to get it working.  In all honesty, it has been an absolute nightmare. I’m hoping this post will save you from all the headache I’ve endured.

The step-by-step instructions I've given below follow the official guide produced by Cisco. Make sure you have it open as you work through my instructions, as I reference the Cisco guide often (there was no point duplicating instructions that are already in the Cisco guide).

Continue reading

Veeam V9.0 U2: Backup AlwaysOn SQL instance hosting vCenter database

Recently I was troubleshooting a Veeam backup issue for a client who is utilizing the AlwaysOn SQL architecture in their environment. The client was receiving the following error when trying to back up any of their AlwaysOn SQL instances:

“Failed to freeze guest, wait timeout”

Note that the backup jobs started failing right after migrating the client's vCenter databases to the Availability Groups hosted on the AlwaysOn SQL instances.

The only bit of literature I could find on backing up AlwaysOn SQL instances was in this (not very helpful) Veeam guide.

So I dug a little deeper and found this Veeam KB article, which suggests the issue may be with the vCenter database not being excluded from Application-Aware image processing.

Continue reading

Script to create local administrator account on remote domain machine

As Microsoft no longer supports creating local user accounts on domain machines using GPO, I've put together the script below to achieve this. Note, however, that once the account is created it can still be modified using GPO.

This script will create a local user account on a remote domain machine, set the account password to never expire, and add the account to the local Administrators security group (or whichever other group you desire; just change the $local_security_group variable).

Run this script on a domain controller using a domain administrator account. Before executing the script, create a txt or csv file containing the names of all the computers on which you wish to create the local user account (and place it in the root of the C: drive), then define the user account variables (such as username, password and description) in the variables section of the script.


#Define variables
$computers = Get-Content C:\Computers.txt
#$computers = Import-CSV C:\Computers.csv | Select-Object -ExpandProperty Computer
$username = "Username"
$password = "Password"
$fullname = "Fullname"
$local_security_group = "Administrators"
$description = "Description"

Foreach ($computer in $computers) {
    $users = $null
    $comp = [ADSI]"WinNT://$computer"

    #Check if username exists   
    Try {
        $users = $comp.psbase.children | select -expand name
        if ($users -like $username) {
            Write-Host "$username already exists on $computer"

        } else {
            #Create the account
            $user = $comp.Create("User","$username")
            $user.SetPassword("$password")
            $user.Put("Description","$description")
            $user.Put("Fullname","$fullname")
            $user.SetInfo()         
             
            #Set password to never expire
            #And set user cannot change password
            $ADS_UF_DONT_EXPIRE_PASSWD = 0x10000 
            $ADS_UF_PASSWD_CANT_CHANGE = 0x40
            $user.userflags = $ADS_UF_DONT_EXPIRE_PASSWD + $ADS_UF_PASSWD_CANT_CHANGE
            $user.SetInfo()

            #Add the account to the local admins group
            $group = [ADSI]"WinNT://$computer/$local_security_group,group"
            $group.add("WinNT://$computer/$username")

                #Validate whether user account has been created or not
                $users = $comp.psbase.children | select -expand name
                if ($users -like $username) {
                    Write-Host "$username has been created on $computer"
                } else {
                    Write-Host "$username has not been created on $computer"
                }
        }
    }

    Catch {
        Write-Host "Error creating $username on $($computer): $($Error[0].Exception.Message)"
    }
}

Migrate a VMware View linked-clone replica to another ESXi host

The other day I was patching the hosts in the cluster that hosts our Virtual Desktop environment. I put the first host in the cluster into maintenance mode and migrated all the virtual desktops to the other host in the cluster. Unfortunately I also migrated the VMware View linked-clone replica residing on that host; I forgot to un-tick the "Powered off VMs" prompt that comes up after you initiate maintenance mode on the host. Luckily this didn't create a major issue, as VMware View doesn't care which host the replica resides on. It only cares about the datastore the replica and the linked clones are stored on (so it's best to turn off SDRS for VDI clusters). Nevertheless, I wanted to migrate the replica back to its original host. However, when I tried to do so, I realised the migrate option on the replica was greyed out.

Continue reading

Customising the Cisco Jabber MSI file using Microsoft Orca

Last year we moved to a Cisco-based telephony infrastructure and installed Cisco Jabber on our client machines. We deployed Cisco Jabber via Group Policy using the standard MSI file provided by Cisco. The deployment was successful; however, we ended up getting a lot of complaints from users about not being able to log in.

After some troubleshooting together with support, we established that the GPO-deployed Jabber application was trying to authenticate against a WebEx Connect server in the cloud rather than the Unified Communications server on the local LAN. Since there was no WebEx Messenger subscription, the login process was failing. The solution was to customise the MSI file and prevent the installed Jabber application from trying to authenticate against a WebEx Connect server.
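
For comparison, the same bootstrap settings can usually be passed straight to msiexec instead of editing the MSI in Orca. This is only a hedged sketch based on the Cisco Jabber for Windows installer properties (CLEAR=1 discards any previously cached bootstrap file and AUTHENTICATOR=CUP points Jabber at the on-premises IM and Presence service rather than WebEx Connect); verify the property names against the deployment guide for your Jabber release:

#Sketch only: command-line alternative to customising the MSI with Orca.
#Verify CLEAR and AUTHENTICATOR against the deployment guide for your Jabber version.
msiexec.exe /i CiscoJabberSetup.msi CLEAR=1 AUTHENTICATOR=CUP /quiet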

Continue reading

SVA installation issue – “Unable to install SVA: com.symantec.vsep.VSEPException: bad certificate…”

The Symantec Security Virtual Appliance (SVA) was failing to deploy onto my ESXi hosts, producing the following error in the logs and on screen:

“Unable to install SVA: com.symantec.vsep.VSEPException: bad certificate, fingerprint: 99:eb:e7:73:e1:63:54:2c:94:81:7a:aa:c3:b9:3a:67:04:73:2e:ee”

Continue reading