Uploading VEEAM Backups to Google Drive

We always copied our weekly full VEEAM backups to tape, then drove them across town to another building, so that in the event of a geographic emergency (tornado), we could recover. Tapes kept piling up, so I wanted to find a new way to store this data. We’re a Google Drive shop, which comes with unlimited storage – so I figured why not use that? The downside with Google Drive is that individual file transfers don’t go very fast. We have a full gigabit upload, but due to Google’s limits, we can only pull off about 10-13MB/sec. At that rate, a 1TB VEEAM backup would take forever.

Here is our setup, and how I have automated it:

  • VEEAM completes full weekly backups on Thursday nights for each of our 5 VM hosts.
  • We have 2 VEEAM servers doing the backups for all 5 hosts.
  • After backups complete Thursday night, VEEAM’s post-job script setting calls a batch file that automates the zipping/splitting of each VBK file.
  • 7zip takes the rest of Thursday night/Friday morning zipping/splitting the archives.
  • Friday afternoon, after the majority of users have gone home and bandwidth isn’t an issue, I kick off a batch file that uploads the files to Google Drive via RCLONE to max out our connection.

Every Thursday night, we end up with 5 different VBK full backup files (3 on one VEEAM server and 2 on the other – all in separate folders). I have a Server 2012 R2 VM that I use for random projects, and this is where I’ve mapped drives to both of those servers, so it can handle the uploading. We set VEEAM to run a post-backup batch file that zips everything up into 50GB chunks. The only issue with this is that VEEAM’s post-job script will eventually time out before 7zip finishes. You can extend the timeout period by adding a DWORD value to the registry on your VEEAM server (I think it’s 15 minutes by default).
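The original screenshot of the registry edit didn’t survive, so here is a rough sketch of the idea. The value name and timeout below are assumptions – check Veeam’s KB for the exact value name for your version before applying anything:

```batch
rem SKETCH ONLY -- the value name "PostJobScriptTimeoutSec" is an assumption;
rem confirm the exact name for your Veeam version in Veeam's KB first.
rem The data is the timeout in seconds (this example allows 8 hours for 7zip).
reg add "HKLM\SOFTWARE\Veeam\Veeam Backup and Replication" ^
    /v PostJobScriptTimeoutSec /t REG_DWORD /d 28800 /f
```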



Before we get ahead of ourselves, we need to create the batch file. This is heavily customized to our setup: drive letters, folder structure, etc. The thought process behind this batch file is that it moves the latest VBK file into a subfolder called “compress”, uses 7zip’s command-line utility to split it into 50GB chunks, then moves the VBK back to its original spot so that VEEAM can keep handling the deletion of old backup files. Using 2 different batch files may seem sloppy, but I was having problems with things kicking off with the correct timing, and other random errors – so this is what works for us.
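The batch file screenshot is missing here, so this is a minimal sketch of what it does – the drive letters, folder paths, and 7zip install location are placeholders for our setup:

```batch
@echo off
rem Placeholder paths for one host's backup folder and its "compress" subfolder.
set "BACKUPDIR=E:\Backups\Host1"
set "WORKDIR=%BACKUPDIR%\compress"

rem Move only the newest VBK into the compress subfolder, leaving the rest alone.
rem dir /b /o-d lists files newest-first; we grab the first hit and bail out.
for /f "delims=" %%F in ('dir /b /o-d "%BACKUPDIR%\*.vbk"') do (
    move "%BACKUPDIR%\%%F" "%WORKDIR%\"
    goto :split
)

:split
rem Zip and split into 50GB volumes (-v50g), then move the VBK back so
rem VEEAM's own retention settings can manage deleting old backups.
for %%F in ("%WORKDIR%\*.vbk") do (
    "C:\Program Files\7-Zip\7z.exe" a -v50g "%WORKDIR%\%%~nF.7z" "%%F"
    move "%%F" "%BACKUPDIR%\"
)
```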



I have one of these batch files for each of our 5 host backups – each pointing to its respective folder.

Once it’s done zipping, it moves the VBK full backup file back into its original location.



Once everything is done zipping and the VBK has been moved back, we upload everything using RCLONE. I won’t go through the RCLONE setup, since that’s heavily documented elsewhere online. I mapped both VEEAM servers to my main test server, where this batch file runs. It goes through each of the 5 folders and uploads all the 7zip archive files, then moves on to the next one. After each transfer is complete, it deletes that folder’s 7zip archive files so the drives don’t keep filling up every week.
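The upload batch file screenshot is also gone, so here is a sketch of the loop. The mapped drive letters, folder names, remote name “gdrive”, and destination path are all placeholders:

```batch
@echo off
rem Loop over the five mapped host backup folders (placeholder paths),
rem upload the split 7zip volumes, then delete them locally -- but only
rem if rclone exited cleanly.
for %%D in ("X:\Host1" "X:\Host2" "X:\Host3" "Y:\Host4" "Y:\Host5") do (
    rclone copy "%%~D\compress" "gdrive:VeeamOffsite/%%~nD" --include "*.7z*" --transfers 8
    if not errorlevel 1 del /q "%%~D\compress\*.7z*"
)
```

rclone’s `move` subcommand could combine the upload and the delete in one step, but copying first and deleting only on a clean exit code is a little safer for backups.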






  1. Raheel Chaudhry
    384 days ago

    It’s been a year. How is it going? Still doing the same thing or have you changed anything yet?

    • admin
      356 days ago

      Sorry for the late reply – the process works fine, but we’ve since stopped because Google capped how much data can be uploaded each day. We have a few TB that must be uploaded each weekend, and there’s no way we’d finish fast with the restrictions Google put on it. It’s nice to upload to the cloud, but I can’t wait a week for everything to finish uploading. We need redundancy. Granted, the chances of a VM failing and then our VEEAM backup failing as well are slim – but I also don’t feel like combating Google restrictions every few months just to wake up one day and find out we don’t have the backups done properly. Real backups would probably be sent to AWS, but it’s too costly at this point, especially with our internal policy striving for 7 years of backups – which would cost a fortune. Tape backups are still the most cost-efficient right now. Otherwise, if your backups are smaller and don’t have a time-sensitive requirement, I’d still suggest Google Drive if your enterprise/education environment provides unlimited storage.

  2. Jeff
    357 days ago

    Hi, Looking to do the same thing with our Veeam Backups. Wondering why you move the vbk file before zipping? Why not just leave it where it’s at and zip to the Archive folder?

    • admin
      356 days ago

      I’m guessing there are better ways to do it, but that’s what I did just to isolate the file and not touch any of the other VBK files in the folder.

      • Jeff
        351 days ago

        Ahh, Thanks.

        In my case, that will be the only vbk file in that location…
