The Defender Tales, part two: Granular deployment
By Rob Litjens
For me, this was an awesome year. I have done many new things. I found that, after all the Corona disruption, I had to change some of the things I am doing. Just before Corona I tried getting into the world of public speaking. It is difficult to get involved.
I reached out to some people and together we made a first attempt to get into it. A few good fellows have helped me with this. My first attempts were at a company-internal event. My talks there were about Defender for SQL and later also about Data API Builder (DAB). Judging by the feedback I received, these sessions were quite well received.
A few months ago someone asked if I could look into the Festive Tech Calendar. If it was something for me…
I needed to think about it. How does it work, how does it relate to what I like to do, and how does it fit my interests? Then I read that they are going to donate money to RaspberryPi.org. I have Raspberry Pis running around for various tasks. They are awesome…
It made me think…
I have done some work on Ansible and Data API Builder. Both can run in Docker containers, and both can run on a Raspberry Pi. The latest models are quite powerful, at least if you take the ones with plenty of memory and the ability to add an SSD…
The end result should be a managed Defender for SQL on-premises environment running on a Raspberry Pi that has Ansible and DAB configured. It should look like this:
The most interesting part of this layout is that, next to Azure and the servers you already have, DAB and Ansible Tower run on Raspberry Pis!
What is the logic behind this?
The Windows servers have a SCOM agent installed. This Microsoft Monitoring Agent (MMA) is connected to a Log Analytics workspace (LAW) in Azure. This workspace is used throughout the whole estate, meaning Prod, UAT, and Dev.
Defender for SQL requires a LAW too. We could have reconfigured the SCOM workspace, but our requirement was that only Prod and some C=3 level systems needed Defender for SQL installed. That made the SCOM workspace unsuitable, because Defender for SQL would then have been deployed to all servers. The MMA, however, supports multiple LAWs. So we create a dedicated workspace and configure it only for Defender for SQL. This way we can achieve granular deployment of Defender for SQL.
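You can check this multi-homing on any server that has the agent. A minimal sketch, assuming the MMA is installed locally, using its documented AgentConfigManager COM object:

```powershell
# List the Log Analytics workspaces this server's MMA currently reports to.
# Requires the Microsoft Monitoring Agent to be installed on the server.
$mma = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg'
$mma.GetCloudWorkspaces() | Select-Object workspaceId, ConnectionStatusText
```

If the dedicated Defender for SQL workspace shows up next to the SCOM one, the multi-homing is in place.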
But how can we make that work with automation? Doing over 500 servers by hand is not an option. We require this to be automated, for existing and new deployments. Reporting on the number of servers and the regions is important as well.
That is where our database comes in. Our configuration database contains the servers we manage, plus a lot of additional information about them. We have processes updating that database, adding data from the CMDB and Active Directory. The same database also keeps records of automation requests. Another thing we use this database for is tagging. A tag is a label that stands for an action executed by Ansible.
Our deployments use Ansible playbooks. They are quite extensive and the full install runs from Ansible. In just over two hours a full SQL Server is deployed, including the OS, SQL Server, and all required settings. When a server is delivered, a tag is added to the database for our Daily Configuration. This tag is bound to certain playbooks and templates that run on a schedule.
Ansible Tower gets its inventory dynamically from our configuration database, bringing a kind of desired state control to the estate.
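Ansible Tower accepts any executable that prints JSON as a dynamic inventory source. The sketch below illustrates that idea in PowerShell; the server name, database, and the dbo.GetAnsibleInventory procedure are hypothetical:

```powershell
# inventory.ps1 - hypothetical dynamic inventory source for Ansible Tower.
# Ansible invokes it with --list; this minimal sketch always emits the full inventory.
param($Arguments)  # receives '--list'; ignored here

Import-Module dbatools

# Hypothetical stored procedure returning one row per managed instance.
$rows = Invoke-DbaQuery -SqlInstance 'cfgdb01' -Database 'Config' `
    -Query 'EXEC dbo.GetAnsibleInventory'

$inventory = @{
    all   = @{ hosts = @($rows.HostName) }
    _meta = @{ hostvars = @{} }
}
foreach ($row in $rows) {
    $inventory._meta.hostvars[$row.HostName] = @{ domain = $row.Domain }
}
$inventory | ConvertTo-Json -Depth 5
```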
Detail: What’s in the database?
The database is the centre of our technology stack. As said, it contains all kinds of data about the instances in the estate. Next to instance information it also contains automation information. Every automation request run by any platform is registered in this database as well.
In the database you will find tables, views, and stored procedures that address:
- Instance table
- CMDB data
- Request table
- Tags table
- Cross-reference tables (for example between Tags and Instances)
There are no direct connections to the tables; clients only get access to the stored procedures they need. These stored procedures are permissioned per user, and they are called by Data API Builder.
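Exposing such a stored procedure through DAB is one CLI command per entity. A hedged sketch with the dab CLI, using a hypothetical dbo.GetInstances procedure and a placeholder connection string:

```powershell
# Create a DAB configuration for SQL Server (connection string is a placeholder).
dab init --database-type mssql `
    --connection-string "Server=cfgdb01;Database=Config;Integrated Security=true;"

# Expose a hypothetical stored procedure as an entity; for procedures only 'execute' applies.
dab add Instances --source dbo.GetInstances --source.type "stored-procedure" `
    --permissions "authenticated:execute"
```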
Detail: Where is PowerShell Universal used?
PowerShell Universal (PU) is the entrance to our automation. It accepts requests from the automation portals. Authentication is handled by the domain. Authorization can be done through roles in the application, or directly on the API. PU runs PowerShell files as APIs. All our PowerShell lives in a module, supported by community modules like dbatools, SqlServerDsc, and more. If an automation request is made, the first action is to register it in the database with all the parameters we need. The second step is either a call to Ansible or a call to the required PowerShell file.
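In PU an API is essentially a script bound to a route. A minimal sketch of such an endpoint, with hypothetical route, procedure, and parameter names:

```powershell
# Hypothetical PU endpoint: register the request first, then trigger the follow-up.
New-PSUEndpoint -Url '/defender/tag' -Method POST -Endpoint {
    param($InstanceName, $Domain)

    # Step 1: register the request and its parameters in the configuration database.
    Invoke-DbaQuery -SqlInstance 'cfgdb01' -Database 'Config' `
        -Query 'EXEC dbo.AddRequest @Instance, @Domain, @Tag' `
        -SqlParameter @{ Instance = $InstanceName; Domain = $Domain; Tag = 'DefenderForSQL' }

    # Step 2: call Ansible or the required PowerShell file (omitted in this sketch).
}
```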
Detail: Install DAB on the Pi
We need to configure the Raspberry Pi with 64-bit emulators and Docker. Manuals are here. When you are ready, install Docker.
Data API Builder is built for the AMD64 Linux architecture, so you need to run x64 binaries. That looks like a piece of cake, but the Raspberry Pi does not support x86_64 binaries natively. There is a solution, though: install Box64 on the Pi. For this I suggest you look at How to Install Box86, Box64 and Wine on Raspberry Pi OS Bullseye 64-bit | Medium, which describes how to get x86_64 binaries working on your Pi. It is not complex. Installing the binaries is not enough, however. You also need to follow Run AMD64 Docker Images On An ARM Computer | Enlear Academy to make sure you can run AMD64 images in Docker on arm64. When Docker is installed, you need to create a user-defined bridge. After this you can install DAB on the Pi. Instructions are on Microsoft Learn: first pull the image, then convince Docker to run DAB using those AMD64 binaries. That part is covered on the Enlear website as well.
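A rough sketch of those steps, under my own assumptions (the network name, port, and mount path are illustrative; the emulation trick is the one the Enlear article describes):

```powershell
# Register QEMU emulation so Docker can run amd64 images on the arm64 Pi.
docker run --privileged --rm tonistiigi/binfmt --install amd64

# Create a user-defined bridge so containers can reach each other by name.
docker network create dab-net

# Pull the DAB image and run it with an explicit amd64 platform.
docker pull mcr.microsoft.com/azure-databases/data-api-builder:latest
docker run --platform linux/amd64 --network dab-net -p 5000:5000 `
    -v "${PWD}/dab-config.json:/App/dab-config.json" `
    mcr.microsoft.com/azure-databases/data-api-builder:latest
```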
Detail: Ansible setup
Ansible is another important step in our automation flows. It is called from PU, but it also reads and writes data and configurations itself. Ansible holds a lot of credentials, playbooks, and schedules for SQL. Next to deployments, we also use it to control the desired state of the servers. Ansible can connect to any domain we have, using different credentials.
What is the technical flow to provision a server with Defender for SQL?
Looking at the processes above, you can already imagine how we set up the flow to install Defender for SQL. First we set up a separate LAW in our Azure subscription. From that we copied over the details we need later, like the workspace ID and key.
The second step was preparing the tags. We needed them for various domains and environments, because the domains run on separate time schedules. Then we created scripts to install, validate, and remove the settings needed to configure Defender for SQL. The configuration itself is quite simple: you just add the workspace ID and its key to the local workspace configuration on the server, using the variables from the LAW. The scripts are added to libraries in Ansible.
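The heart of the install script is only a few lines against the agent's documented COM interface. A minimal sketch, with placeholder workspace values:

```powershell
# Attach this server's MMA to the dedicated Defender for SQL workspace.
# The ID and key are placeholders; in our flow they come from the separate LAW.
$workspaceId  = '00000000-0000-0000-0000-000000000000'
$workspaceKey = '<primary key of the Log Analytics workspace>'

$mma = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg'
if (-not ($mma.GetCloudWorkspaces() | Where-Object workspaceId -eq $workspaceId)) {
    $mma.AddCloudWorkspace($workspaceId, $workspaceKey)
    $mma.ReloadConfiguration()
}
```

The validate and remove scripts are variations on the same object, using GetCloudWorkspaces() and RemoveCloudWorkspace().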
Within Ansible we created a role to run the script. The role can be used on its own via a separate script, but also as part of deployments. The separate script is connected to a template that runs on a schedule. The script also generates output about successful and failed installs, and this output is forwarded to our database. Within the LAW we can see the servers and whether there are alerts (the alerts live in Defender for Cloud, which is configured to forward events to Sentinel).
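Forwarding that output is a plain REST call against DAB. A hedged sketch, assuming a hypothetical DefenderLog entity exposed by DAB on the Pi:

```powershell
# Write one install result to the logging table through DAB's REST endpoint.
# 'DefenderLog', its fields, and the host name are hypothetical.
$result = @{
    InstanceName = 'SQLPROD01'
    Status       = 'Changed'   # Changed | Ok | Failed
    CheckedAt    = (Get-Date).ToString('o')
}
Invoke-RestMethod -Method Post -Uri 'http://dab-pi:5000/api/DefenderLog' `
    -ContentType 'application/json' -Body ($result | ConvertTo-Json)
```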
Monitoring of Defender for SQL is also done with Power BI. This is important for the users of the other domains: they use this dashboard to see how many servers are registered and which ones fail. The data for this is pulled from our database.
How does OPS add a server to the Defender for SQL configuration?
One of the APIs we have created adds a tag to the table. That means a user just needs to call the endpoint (at this moment this is done by a PowerShell script; a sketch follows after the steps below). When the API has been called, the tag set for the environment is matched with the server/instance on which Defender must be configured. It looks like this:
The steps are:
- A tag is added based on the instance and the domain
- Ansible reads the inventory through DAB
- When the time comes, Ansible validates whether the server has been provisioned with Defender for SQL. Three statuses are important: Changed (the Defender for SQL settings were applied to the instance), Ok (the settings already match the configuration), and Failed (the configuration could not be applied).
- The results are then written to the logging table via DAB
- The dashboard in Power BI connects to the logging table and refreshes at the required intervals.

Ops only adds ONE reference, which is then picked up automagically… Ain’t this nice?
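For illustration, the OPS call can be as small as this sketch (the route and parameters are the hypothetical ones from the PU example earlier):

```powershell
# Ask the automation to enroll one instance into Defender for SQL.
Invoke-RestMethod -Method Post -Uri 'https://psu.contoso.local/defender/tag' `
    -UseDefaultCredentials `
    -Body @{ InstanceName = 'SQLPROD01'; Domain = 'PROD' }
```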
Conclusion
This configuration outline works in real life, except that we do all the requests through one API server, simply because DAB was not available at the time. But that is outside the scope of this blog.
The configuration of Defender for SQL is a real-life scenario, but this mechanism can be the input for almost any kind of granular deployment, as long as it is worth investing the time to set up the automation behind it.
I rebuilt the setup in my private lab using DAB on a Raspberry Pi, Ansible in a Docker container, a database in Azure, and a trial version of PowerShell Universal. I used my own PowerShell scripts and snippets of the Ansible code we have at work. On various lab machines I had many SQL instances running.
I was surprised by the speed of DAB in a container, and of Ansible and Tower running in containers on the same Raspberry Pi 4. My expectation was that it would merely run, and nothing more. In reality the Raspberry Pi runs the containers without problems and without overcommitment.
This is food for thought on whether it can be used in future sessions too. I like the idea of playing with multiple types of hardware for this. For example, I could have built Ansible in the cloud or used DAB in the cloud. In this case it would not open up various security risks.
Another piece of food for thought: if we use a Raspberry Pi, you should consider buying one. If you like the Festive Tech Calendar or Raspberry Pi, please consider donating.