(Part 1) Backup is good. Restore is great. But testing your data is even better

I think you will agree with me: backing up your data (workstations & servers) is one of the key pillars of security. Nobody can guarantee that a hardware failure, a security attack or simply a mistake by a team member won't cause major data loss and impact the business.

In this article, I want to focus on Azure Backup, especially on how to automate restores via scripting, with a final goal in mind: now that the data is restored, why don't we check it and detect problems that went unnoticed a month ago?

Quick intro about Azure Backup

Microsoft has a long history around security in general and backup in particular. Whether for workstations or servers, we have always provided solid products to back up and secure data.

Azure Backup – Microsoft's backup solution – fully embraces the cloud model, leveraging the best of "on-premises" and "cloud".

An agent on the machine can either back up directly over the internet (small sites, workstations, …) or to a local "backup server" (JBOD) that will then archive the data in Azure for the long term. This gives us the best approach to fully take advantage of a hybrid infrastructure.

In this blog, we will leverage not only the GUI (server side or Azure portal) but also scripting.

The goal of this blog: a clever approach

The idea came while discussing with financial institutions. For them (rules, regulations, certifications), validating a backup scenario is not just receiving an email from the software saying "it worked", but the ability – following a very strict schedule throughout the year – to test a restore from A to Z.

With a classical approach, especially if archiving is done on tapes (which requires dedicated hardware to write/read them), this is a bit complex: you first need to retrieve the tapes from a storage location (usually a few kilometers away), and then make sure you still have the appropriate device to read that tape format. For long-term archiving (months, or 5/15 or even 99 years), that is challenging, since you would need to keep a lot of devices as spares.

Of course, when you think hybrid, you simply remove this tape/device problem and store the data (you do need good bandwidth though, ExpressRoute for example) a few thousand kilometers away on highly secured, totally private and very inexpensive storage.

Continuing this discussion, these financial companies asked me to set up a prototype of this vision leveraging Microsoft datacenters around the world, and even to add a nice feature to the overall process: checking data consistency!

The final idea they put on the table is simple but very effective:

1) We back up the data to a Microsoft datacenter via Azure Backup, let's say West Europe.

2) In this same datacenter, we run every month – for just a few hours (the time needed to run the A-to-Z process) – a machine named "RestoreData", which contains a script that restores this data locally and performs the consistency checks. If the restore worked, we are good and compliant for this phase.

3) But now that we have the data at hand, why don't we run several checks such as antivirus (maybe some files are infected with something that was not detectable a few months/years ago but is detectable now), disk usage, … why not even behavioral checks to identify ransomware? (A sketch of this checks phase follows right after this list.)
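To make this third step concrete, here is a minimal PowerShell sketch of what such a checks phase could look like, assuming the restored data lands in D:\Restore (that path, like the exact set of checks, is my own assumption; the restore script itself is covered in the next post). It refreshes the Windows Defender signatures, scans only the restored files, and produces a quick size report:

# Assumed destination of the restore phase
$restorePath = "D:\Restore"

# 1) Antivirus: scan the restored files with today's signatures
Update-MpSignature                                          # refresh Defender definitions
Start-MpScan -ScanType CustomScan -ScanPath $restorePath    # scan only the restored data
Get-MpThreatDetection                                       # list anything Defender found

# 2) Disk usage: quick report on what was restored
Get-ChildItem -Path $restorePath -Recurse -File |
    Measure-Object -Property Length -Sum |
    Select-Object Count, @{Name = "SizeGB"; Expression = { [math]::Round($_.Sum / 1GB, 2) }}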

Here comes the idea of this blog: sharing with you how to automate this process, especially the restore phase.

How to automate restore with Azure Backup: let's install the components

Azure Backup is fully scriptable via PowerShell. You can create vaults, create policies, automate backups and, of course, restore data.
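As a first taste, here is a hedged sketch of what creating a vault looks like once the AzureRM module described later in this post is installed (the resource group, vault name and region below are placeholders of my own, not values from a real deployment):

Login-AzureRmAccount                                   # interactive sign-in

New-AzureRmResourceGroup -Name "RG-Backup" -Location "West Europe"

# Create a Recovery Services vault and choose its storage redundancy
$vault = New-AzureRmRecoveryServicesVault -Name "DemoVault" -ResourceGroupName "RG-Backup" -Location "West Europe"
Set-AzureRmRecoveryServicesBackupProperties -Vault $vault -BackupStorageRedundancy GeoRedundant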

But to do so, the machine performing this job needs several components. Let's see together how to install them (this process usually takes 15 to 30 minutes).

 

Here is the list of small steps you need to follow to run this scenario:

• Create your machine in Azure. An A2 size is a good fit, providing a C: drive and a D: (restore) drive.
• I would recommend using a Windows Server 2016 image, since it includes a lot of security features by default, in particular Windows Defender pre-installed (we will use it for the "checks" phase).
• It makes sense to run this machine in the same datacenter as your backup vault; restores are simply quicker that way.
• I would recommend adding network rules on this VM so that only your IP address (your proxy's, your home DSL's) can reach it. Why leave a machine exposed on the internet if it is not necessary? (See the sketch after this list.)
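For that last network rule, here is a minimal sketch using the AzureRM module installed later in this post (the resource names and the IP address are placeholders; you can of course do the same thing from the portal):

$myIp = "203.0.113.10"    # your proxy or home IP

# Allow RDP from that single IP; everything else inbound is denied by the NSG's default rules
$rdpRule = New-AzureRmNetworkSecurityRuleConfig -Name "Allow-RDP-From-Me" -Protocol Tcp `
    -Direction Inbound -Priority 100 -Access Allow `
    -SourceAddressPrefix $myIp -SourcePortRange * `
    -DestinationAddressPrefix * -DestinationPortRange 3389

New-AzureRmNetworkSecurityGroup -Name "RestoreData-NSG" -ResourceGroupName "RG-Backup" `
    -Location "West Europe" -SecurityRules $rdpRule

# Then associate the NSG with the VM's network interface or subnet (portal or Set-AzureRmNetworkInterface)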

Now that the machine is running, install the Azure Backup agent: https://go.microsoft.com/fwLink/?LinkID=288905&clcid=0x040c. This will also install the PowerShell commands needed to automate any kind of process, especially restore.
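If you prefer to script the download and installation rather than go through the browser, a sketch along these lines should work (the /q quiet switch is an assumption on my side; check the supported switches with MARSAgentInstaller.exe /?):

New-Item -Path "C:\Temp" -ItemType Directory -Force | Out-Null
Invoke-WebRequest -Uri "https://go.microsoft.com/fwLink/?LinkID=288905&clcid=0x040c" `
    -OutFile "C:\Temp\MARSAgentInstaller.exe" -UseBasicParsing
Start-Process "C:\Temp\MARSAgentInstaller.exe" -ArgumentList "/q" -Wait    # /q assumed to mean quiet install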

TIP: to allow downloads on this brand-new machine, change the IE security settings that Windows Server 2016 applies by default: in the Internet zone, go to the Downloads section and enable file downloads. Otherwise IE will block the download… then go back to the previous configuration if needed.
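If you would rather script that IE tweak, the following is, to the best of my knowledge, the registry equivalent (zone 3 is the Internet zone and value 1803 is the file-download action; treat this as an assumption and double-check on your build):

$zone3 = "HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\Zones\3"
Set-ItemProperty -Path $zone3 -Name "1803" -Value 0    # 0 = downloads enabled, 3 = disabled
# ... download the agent, then set the value back to 3 to restore the default if you wish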

     

Now that the agent is installed, it has to be registered with the backup vault. To do so, just follow the classic procedure, the same as if you wanted to back up this machine too (which is not our goal here). Once registered, this machine will be able to "talk" to the vault.

So far so good, but the machine is not ready yet.

     

We have several more steps before we can talk to Azure Backup via PowerShell. To follow the steps below, you must run PowerShell ISE as an administrator, otherwise some commands will fail:

• Install the latest version of Azure PowerShell: https://github.com/Azure/azure-powershell/releases. Download the executable and run it; it will install the components.
• In PowerShell ISE, run this command: Install-Module -Name AzureRM. You will be prompted several times; answer "yes" or "yes to all". This process takes a while, be patient.
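If you want to avoid the repeated prompts, a non-interactive variant of that step (run from an elevated PowerShell ISE) is:

Set-PSRepository -Name PSGallery -InstallationPolicy Trusted    # skip the "untrusted repository" prompt
Install-Module -Name AzureRM -AllowClobber -Force               # -Force answers the remaining confirmations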

Now, to continue the configuration, you will need to authenticate to Azure from PowerShell ISE.

To do so, run the command Login-AzureRmAccount and provide your identity, password and strong authentication if you have enabled it.

Once authenticated (I assume this identity has sufficient rights in Azure), we can continue the installation. Run this command: Register-AzureRmResourceProvider -ProviderNamespace "Microsoft.RecoveryServices"

TIP: You can check that the cmdlets are installed: Get-Command *azurermrecoveryservices*
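Putting these authentication and registration steps together, the sequence looks like this (the subscription name is a placeholder):

Login-AzureRmAccount                                             # interactive sign-in, MFA supported
Get-AzureRmSubscription                                          # list the subscriptions this identity can see
Select-AzureRmSubscription -SubscriptionName "My Subscription"   # pick the one holding the vault

Register-AzureRmResourceProvider -ProviderNamespace "Microsoft.RecoveryServices"
Get-Command *azurermrecoveryservices*                            # sanity check: the backup cmdlets are there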

Now comes the weird part; to me, this is a bug. Run a PowerShell command prompt (but not the ISE!) as an administrator, then run this command: Import-Module MSOnlineBackup.

TIP: If you are not an administrator, or if you run it in PowerShell ISE, you will get an error message.
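A quick sanity check once the import has succeeded:

Import-Module MSOnlineBackup
Get-Command -Module MSOnlineBackup    # you should see the OB* cmdlets, e.g. Get-OBRecoverableSource, Start-OBRecovery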

Now you are all set: this VM in Azure contains all the components needed to automate the restore phase.

You can run your script.
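The full script is the subject of the next post, but as a hedged preview, a minimal restore with the MSOnlineBackup cmdlets could look like this (the destination path and the choice of the first source/item are assumptions for illustration):

Import-Module MSOnlineBackup

$source = Get-OBRecoverableSource | Select-Object -First 1           # a source backed up in this vault
$item   = Get-OBRecoverableItem -Source $source | Select-Object -First 1
$option = New-OBRecoveryOption -DestinationPath "D:\Restore" -OverwriteType Overwrite

Start-OBRecovery -RecoverableItem $item -RecoveryOption $option      # then run the "checks" phase on D:\Restore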

     

… in the next blog post, I will walk you through the script I used and explain the purpose of each command.

Note: while working on this scenario, I noticed that the online documentation is sometimes incomplete. To reach this goal, I had to read and merge several technical documents.

     

