Azure Arc-enabled infrastructure components deliver on the promise of simplified governance and a single management control plane for the following services:
- Azure Arc-enabled servers
- Azure Arc-enabled Kubernetes
- Azure Arc-enabled data services
In many instances, all three are part of the solution: enable Azure Arc on the servers that act as Kubernetes cluster nodes, then deploy the data services controller to the Kubernetes cluster. Let’s dive into a few of those scenarios.
It is worth noting that some of the tools used by Azure Arc are familiar and some may be new to you, but no worries, you’ll be perfectly comfortable with them in no time. Here’s a list of some of the tools and platforms used with Azure Arc, which will instantly transform you into an IT polyglot. I believe that, in time, additional Microsoft services will be delivered as portable containers.
- Azure Portal – managing through Azure blades
- Azure Data Studio – including Jupyter notebooks
- Visual Studio Code
- Linux command line and shell scripting
- Kubernetes control plane – via Linux SSH
- PuTTY, MobaXterm, or my favorite, mRemoteNG
- Hyper-V / vCenter or another hypervisor
- Potentially Red Hat OpenShift management
- Windows Admin Center
- SQL Server – SSMS
- PGAdmin for PostgreSQL DB management
- Windows and Azure CLI
- And, of course, Notepad or Notepad++
One of the best starting points for a deep dive into all things Azure Arc is the Azure Arc Jumpstart reference content. Many detailed Arc onboarding and deployment scenarios are documented there, and you can also create demos and test various offerings. I highly recommend this Jumpstart page for diving in! Additionally, Microsoft recently announced Azure ArcBox, a sandbox environment for testing your technical scenarios. Here you can practice creating and managing new testing and PoC environments within a single resource group.
Servers – Azure Arc-enabled servers
You can manage servers, virtual or physical, while applying consistent RBAC, tagging, and identity policies to these resources for both on-premises and public cloud deployments. You can fold in the additional functionality of Azure Security Center to ensure these machines meet various compliance requirements, and you can even connect machines to Azure Arc using Windows Admin Center within the Azure hybrid center blade.
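As a sketch of what onboarding a server looks like, the Connected Machine agent (azcmagent) is installed on the machine and registered with your subscription. The resource group, tenant, subscription, and region values below are placeholders:

```shell
# Run on the target on-premises server after installing the
# Azure Connected Machine agent (azcmagent).
azcmagent connect \
  --resource-group "myArcServersRG" \
  --tenant-id "<tenant-id>" \
  --subscription-id "<subscription-id>" \
  --location "eastus"

# The machine now appears in the Azure portal as an
# Azure Arc-enabled server resource.
```

Once connected, the server can be tagged, governed with Azure Policy, and monitored like any native Azure resource.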
Kubernetes – Azure Arc-enabled Kubernetes
You can manage any Kubernetes cluster from Azure, similarly to how you would manage an Azure Kubernetes Service (AKS) cluster. For example, you can also manage a VMware vSphere Tanzu Kubernetes Grid (TKG) environment that is registered with Azure Arc. This is where the conversation gets interesting. For me, this advancement is all about client choice and what the client would like to maintain long-term as their supported stack solution. Azure Arc provides true flexibility. And, with the Kubernetes portion of Azure Arc, you can quickly manage any and all Kubernetes clusters, no matter where they are or which Kubernetes distribution they are running. That is pretty powerful!
Databases – Azure Arc-enabled data services
I have been waiting patiently for this functionality to become GA! You can read about the announcement from Microsoft here. I am going to ramble a bit longer here.
Specifically, how do Azure Arc-enabled data services add value? I am asked this countless times. This offering brings Azure data services to any client, preferably on a Dell Technologies platform, in your data center, at the edge, or on other hyperscalers, with common Kubernetes orchestration. Additional benefits include:
- Always current. Updates and upgrades are fully automated and controlled by your policies. Think about this for a moment… No more SQL version tracking, no more migrations, no more end-of-support deadlines; everything is simply always current. PostgreSQL adds a hugely beneficial hyperscale deployment option.
- Cloud-style elasticity. Optimize application performance of backend database workloads. Scale up, down, or out, with no application downtime. Is this really something you can do today?
- Consistent management experience. Single control plane, or “single pane of glass” control, along with the handful of the toolsets I listed above.
- Disconnected support. Even if the connection to Azure is indirect due to network or security constraints (which is common), you can still have an automated process to indirectly connect to Azure for telemetry and the latest container images. Therefore, you are covered if there are any data sovereignty or governance concerns, and you can still benefit from Azure Arc-enabled data services.
Now on to the cool data-enablement offerings
Let’s talk about the Azure Arc-enabled data services offerings.
There are a few prerequisites for Azure Arc-enabled data services:
- One or more Kubernetes clusters
- The Azure Arc Data Controller deployed into the cluster. Visit the Azure Arc Data Controller documentation to reference the current list of supported Kubernetes distributions. Additionally, check the Release Notes listed here monthly to make sure your platform is current.
Azure Arc Data Controller
As mentioned, deploying the Azure Arc Data Controller is a prerequisite for using the data services listed below. This controller is responsible for deploying and managing the database instances and for reporting telemetry to Azure. Think of it as an extension of the Azure Resource Manager control plane. You can deploy the Azure Arc Data Controller with Azure Data Studio or with the azdata CLI – your choice. Personally, I recommend that you use the azdata CLI first, understand the required parameters, break a few things, expose a few errors, and then you will understand the process fully. It is worth noting the two different CLIs: the Azure Data CLI is azdata, and the Azure CLI is az.
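A minimal sketch of a data controller deployment with the azdata CLI might look like the following. The profile name, namespace, and resource identifiers are placeholders, and exact flags can vary by tool version (in newer tooling the same commands live under the az arcdata extension):

```shell
# Deploy the Azure Arc Data Controller in indirect connectivity mode.
# Profile, names, and IDs below are placeholders -- adjust for your cluster.
azdata arc dc create \
  --profile-name azure-arc-kubeadm \
  --namespace arc \
  --name arc-dc \
  --subscription "<subscription-id>" \
  --resource-group myArcDataRG \
  --location eastus \
  --connectivity-mode indirect

# Check the controller status once the pods come up.
azdata arc dc status show
```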
The Data Controller has two connectivity modes:
- Direct mode – The Data Controller is constantly connected to Azure via the Azure Arc Kubernetes agent.
- Indirect mode – Upload data manually using the azdata CLI. This mode is preferred when connectivity to the public cloud is limited.
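In indirect mode, the periodic upload to Azure can be scripted with the export/upload command pair. The file names here are placeholders:

```shell
# Export metrics and usage data from the data controller to local files...
azdata arc dc export --type metrics --path metrics.json
azdata arc dc export --type usage --path usage.json

# ...then push the exported files to Azure when connectivity allows.
azdata arc dc upload --path metrics.json
azdata arc dc upload --path usage.json
```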
SQL Managed Instance (MI) — Similar to Azure SQL Managed Instance, this offering brings support for the operating system (in this case, a container) and SQL Server with the latest security updates. Deployed to a Kubernetes cluster of your choice, it provides container flexibility that lets you easily manage and migrate data across cloud platforms. You can deploy SQL MI with the azdata CLI, Azure Data Studio, or the Azure portal. As the admin, you define the policy for auto-updates that download the latest patches and updates, with no downtime.
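As a hedged example, a minimal SQL MI deployment from the azdata CLI could look like this. The instance name and sizing values are placeholders, and defaults apply for anything omitted:

```shell
# Log in to the data controller namespace first.
azdata login --namespace arc

# Create a SQL Managed Instance with explicit resource requests.
azdata arc sql mi create \
  --name sqldemo \
  --cores-request 2 \
  --memory-request 4Gi

# List instances and grab the external endpoint for the connection string.
azdata arc sql mi list
```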
PostgreSQL Hyperscale — Powered by the hyperscale extension called Citus (from Citus Data, now part of Microsoft), PostgreSQL Hyperscale distributes (shards) table rows across multiple PostgreSQL servers, even on different nodes, providing highly scalable queries. If you need to add more Kubernetes nodes, the Hyperscale shard rebalancer will automatically redistribute data to the new nodes. This is an online operation, meaning the data remains available for queries.
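To make the sharding concrete, here is a sketch using the standard Citus functions. The database, table, and column names are hypothetical, and the coordinator host is a placeholder you would pull from your data controller endpoints:

```shell
# Distribute an existing table across worker nodes by a shard key
# (create_distributed_table is a standard Citus function).
psql -h <coordinator-endpoint> -U postgres -d mydb \
  -c "SELECT create_distributed_table('events', 'device_id');"

# After scaling out, rebalance shards onto the new workers online.
psql -h <coordinator-endpoint> -U postgres -d mydb \
  -c "SELECT rebalance_table_shards('events');"
```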
Onboarding the various tooling that Azure Arc provides
First, let’s investigate Arc for Servers and a Kubernetes cluster
Azure Arc-enabled Kubernetes is a collection of tools you can use to connect, configure, operate, monitor, govern, and secure your clusters and data. If your organization and teams are already running Kubernetes in production, you are already well down the maturity curve. However, if you have yet to truly embrace K8s, Arc for Kubernetes is the answer.
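Connecting an existing cluster is a short exercise with the connectedk8s extension to the Azure CLI. The cluster and resource group names below are placeholders, and the command assumes your kubeconfig already points at the target cluster:

```shell
# Add the Azure CLI extension for Arc-enabled Kubernetes.
az extension add --name connectedk8s

# Register the current kubeconfig cluster with Azure Arc.
az connectedk8s connect \
  --name myTkgCluster \
  --resource-group myArcK8sRG

# Verify the connected cluster resource.
az connectedk8s show --name myTkgCluster --resource-group myArcK8sRG
```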
Cluster health (although clusters are very much self-healing) and wrangling container sprawl are important. Even more important is cluster security. This is where Azure Monitor, driven by Azure Policy, is the solution for Kubernetes cluster governance and compliance, managed with Azure Arc. Container sprawl is the new VM sprawl, potentially multiplied by the thousands. Governance is paramount.
GitOps – Desired State
Next, let’s align GitOps configurations with the Kubernetes cluster to simplify desired state configuration. If the desired state experiences drift, the process either self-heals or rolls back. With GitOps, you declare your environment’s configuration with your scripting tool of choice and push it to a Git repo. The repo stores the current state and the history of previous versions, with the appropriate markdown comments, I hope!
As part of GitOps, you will deploy the Flux operator. Think of this operator as a “listener” that listens between the Kubernetes environment and the Git repo. The Flux operator “listens” for any changes that are checked in or changes that have caused a desired state drift in production.
Think source control for your operations. Yes! The Git repo is now your single source of truth. Full stop. Additions, changes, and deletions are all tracked, and these updates should also be tied to projects. I love the fact that Git management is now mainstream! Git repos and historical tracking have saved me many times and have proven to be invaluable tools for many years.
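As a sketch, attaching a GitOps configuration (which deploys the Flux operator) to an Arc-enabled cluster might look like this. The repo URL, cluster name, and namespaces are placeholders:

```shell
# Create a source control (GitOps) configuration on the Arc-enabled cluster.
# Flux is deployed as the operator that watches the repo for changes.
az k8s-configuration create \
  --name cluster-config \
  --cluster-name myTkgCluster \
  --resource-group myArcK8sRG \
  --cluster-type connectedClusters \
  --operator-instance-name flux \
  --operator-namespace cluster-config \
  --repository-url https://github.com/<org>/<repo> \
  --scope cluster
```

From that point on, commits to the repo drive the cluster's desired state.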
An excellent tutorial on using GitOps with an Azure Arc-enabled Kubernetes cluster is referenced here.
Use the logs – Intelligently
All services and machines create logs, and much of this data is challenging to mash up and consume. All resources that are registered with Azure Arc can send logs to cloud-based Azure Monitor. Insight from even highly distributed and disparate infrastructure is now available for your consumption, and you can interactively slice the results with multi-dimensional queries as needed.
You can also very easily "wire up" Azure Sentinel with Azure Arc. Azure Sentinel can be described with a few more acronyms:
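Once logs land in a Log Analytics workspace, they can be queried from the CLI as well as the portal. The workspace ID is a placeholder, and the Kusto query is just an illustrative heartbeat roll-up:

```shell
# Query the Log Analytics workspace that Arc resources report into.
az monitor log-analytics query \
  --workspace "<workspace-guid>" \
  --analytics-query "Heartbeat | summarize LastSeen=max(TimeGenerated) by Computer" \
  --output table
```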
- SIEM – Security Information and Event Management
- SOAR – Security Orchestration, Automation, and Response
The Azure Sentinel solution provides many connectors for Microsoft solutions, as well as REST APIs, that provide real-time integration for physical and/or virtual machines. The Log Analytics agent collects the logs and forwards them to Azure for analysis. You can deploy the Log Analytics agent in two different ways:
- Using the VM extensions framework – to a non-Azure Windows or Linux machine
- Using Azure Policy
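The VM-extension route can be sketched with the connectedmachine extension to the Azure CLI. The machine name, workspace ID, and key are placeholders; the Linux agent is shown (Windows uses the MicrosoftMonitoringAgent extension type instead):

```shell
# Install the Log Analytics agent on an Arc-enabled Linux server
# via the VM extensions framework.
az connectedmachine extension create \
  --machine-name myArcServer \
  --resource-group myArcServersRG \
  --location eastus \
  --name OmsAgentForLinux \
  --type OmsAgentForLinux \
  --publisher Microsoft.EnterpriseCloud.Monitoring \
  --settings '{"workspaceId": "<workspace-guid>"}' \
  --protected-settings '{"workspaceKey": "<workspace-key>"}'
```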
Azure Sentinel provides a powerful information and activity response hybrid solution. Perfect!
Database insights – OOTB
Additionally, for every Kubernetes distribution served from the main control plane to the Kubernetes namespace, dashboarding is available out of the box with Kibana and Grafana. Kibana converts logs into intelligent insights for the environment, while Grafana excels at database metrics: CPU usage, memory, container, node, database table level, and so on. These metrics are extremely helpful for managing databases within an orchestrated, containerized world.
For both dashboarding tools and for database insights, you connect directly to the Azure Arc Data Controller. You can pull these service endpoints from Azure Data Studio (ADS) or with the azdata CLI.
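Pulling the endpoints from the CLI is one short command. The namespace is a placeholder:

```shell
# Log in to the data controller, then list the service endpoints,
# including the Grafana and Kibana dashboard URLs.
azdata login --namespace arc
azdata arc dc endpoint list --output table
```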
Evolving DBaaS offerings
We have talked about the Azure Arc-enabled data services offerings, but what can we really do with them? Well, let’s talk about that. Over my tech career, many products have claimed to solve for DBaaS, and I have questioned most, if not all, of them. To truly be DBaaS, the offerings need to be clean and simple for business users. Meaning: business users need a database, and with a few clicks, the business has that database connection string in a very short amount of time. Not days or weeks, but minutes. Let’s call this a database vending machine, brought to you by Kubernetes, complete with optional tagging and maybe chargeback/showback. Additionally, the DBA should not be the blocker in the process of provisioning a database; the DBA has many other things to attend to. DBA policies and procedures are provisioned for the user database in real time, without DBA interaction. All the DBA internals are folded into the process, including always-current offerings. YES!
Azure Arc validation program
The Azure Arc validation program brings the consistency of a conformance test framework that tests and validates that Arc functionality offerings are correctly configured and operating as they should.
During the validation testing, top-notch teams from Microsoft and Dell worked together in a simple and streamlined fashion. I reviewed the early release validation process, tested a bit in my small lab environment, and mentioned to all our product teams that the validation was not a heavy lift. Microsoft did an excellent job of compiling all the validation testing into a set of Jupyter Notebooks. The Dell engineer began the validation process with limited Azure Portal, GitHub, Azure Data Studio, and Notebook experience, but with very solid SQL Server and Kubernetes experience. In a few hours, his entire solution offering set was validated and ready to upload to Microsoft. I was very impressed. As I mentioned, the testing is solid, streamlined, and very polished. Here is the Azure Arc validation program that our Dell product engineering teams have been following.
Our Services teams are the north star
Our Dell clients all have a strong desire to innovate faster. Our Dell Technologies Services teams can help across multiple modern innovative workstreams. Partnering with you for success! Here are just a handful of the modern services our teams engage in. Find more here.
- Application cloud suitability
- Modern apps on containers; standardized management for Arc services in Azure and on-premises
- App Modernization (re-platforming to cloud native)
- Services to assist with Azure Arc workflows, app movement, or re-platforming – GitOps, Helm, FluxCD, Azure Data Studio, Windows Admin Center, etc.
- SQL modernization
- Dell Technologies kit for migration, consolidation, and upgrade
- Kubernetes services
- Clusters, Policy Management, Monitoring, Distros, Services Application containers
- DevOps services
- Azure Pipeline services and tools with Azure Arc
Also, it is awesome to see Dell EMC listed for cloud native storage and Dell Technologies Consulting listed as a Kubernetes Certified Service Provider on the CNCF reference page.
Until the next time
In my opinion, there is so much more on the horizon that will be delivered with Azure Arc: a single control plane for Azure-native and Azure Arc resources, including containerization and serverless architectures that enable fluid and flexible offerings.
Until the next blog…. Innovate, collaborate, and never stop learning!
You should also check out my blog post from a business perspective “Fuel Azure hybrid cloud with a validated infrastructure”.