{"id":52516,"date":"2022-10-03T15:36:12","date_gmt":"2022-10-03T15:36:12","guid":{"rendered":"https:\/\/dbtut.com\/?p=52516"},"modified":"2022-10-03T15:44:00","modified_gmt":"2022-10-03T15:44:00","slug":"azure-arc-enabled-data-services","status":"publish","type":"post","link":"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/","title":{"rendered":"Azure Arc Enabled Data Services"},"content":{"rendered":"<p>We need a Data Controller running on Kubernetes to be able to deploy Azure Arc-enabled data services to other cloud providers, on-premises, or in a hybrid cloud scenario.<\/p>\n<p>With the Data Controller, you can bring Azure Arc integration and core functions such as management services into your own environment.<\/p>\n<p>For this reason, the Azure Arc Data Controller is of vital importance to us.<\/p>\n<p>To summarize, you can use Azure Arc-enabled data services wherever you have infrastructure that can run Kubernetes, as long as you deploy the Azure Arc Data Controller there.<\/p>\n<p>So, how do you install and run the Data Controller, which is so important for using Azure Arc-enabled data services? Let me answer&#8230;<\/p>\n<p>It is important to start by knowing that the Data Controller has two different connection types.<\/p>\n<p>The connection type lets you choose how much data is sent to Azure and how users interact with the Arc data controller; depending on the connection type you select, some features of Azure Arc Data Services may not be available.<\/p>\n<p><strong>1. Directly Connected Mode: <\/strong>You can manage and use the management services through the Azure Portal.<\/p>\n<p><strong>2. 
Indirectly Connected Mode:<\/strong> It allows you to perform most management operations in your own environment without needing Azure.<\/p>\n<p>If you choose the directly connected type, you can use the Azure Portal, the Azure Resource Manager APIs, and the Azure CLI with Azure Arc data services. If you choose the indirectly connected type, you only need to send a minimal amount of data to Azure for inventory and billing purposes.<\/p>\n<p id=\"JwMIfYU\"><img loading=\"lazy\" decoding=\"async\" width=\"1082\" height=\"383\" class=\"size-full wp-image-52517 aligncenter\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2022\/10\/img_633af0e2b3db3.png\" alt=\"\" \/><\/p>\n<p>After all this theoretical background, it&#8217;s time to deploy the Data Controller to our Azure Kubernetes cluster. We can do this in two different ways.<\/p>\n<p>1. Using PowerShell<\/p>\n<p>2. Using the Azure Portal<\/p>\n<p>In this article, we will perform our operations using both methods. The first thing to do is open PowerShell and log in to our Azure account with az login. 
You can use the address below to install the Azure PowerShell (Az) module.<\/p>\n<pre class=\"lang:default decode:true \">https:\/\/docs.microsoft.com\/tr-tr\/powershell\/azure\/install-az-ps?view=azps-5.7.0<\/pre>\n<p>Since we will use our AKS information in Azure Data Studio, we will need PowerShell at some point; that is why we do not do everything through the Azure Portal.<\/p>\n<p><code><strong>az login<\/strong><\/code><\/p>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim1.png\" alt=\"\" width=\"602\" height=\"159\" \/><figcaption class=\"wp-caption-text\">Picture1: Login with az login<\/figcaption><\/figure>\n<p>When you run the az login command, a web page opens in your default browser and asks you to verify your account; after these steps you will see the screen above.<\/p>\n<p>Next, we will need a resource group for our work. I use the following code block to create a Resource Group.<\/p>\n<pre class=\"lang:default decode:true \">az group create --name rg-dmc-k8s --location eastus\r\n\r\n<\/pre>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim2.png\" alt=\"\" width=\"602\" height=\"180\" \/><figcaption class=\"wp-caption-text\">Picture2: Creating a Resource Group<\/figcaption><\/figure>\n<p>Now that the Resource Group is ready, I&#8217;m creating the Azure Kubernetes Cluster.<\/p>\n<p>For this, I log in to my Azure account, go to the resource group named rg-dmc-k8s, and create a Kubernetes cluster named dmc-aks-arc in it. 
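<\/p>\n<p>As a side note, the same cluster can also be created from the CLI instead of the portal. The block below is only an illustrative sketch; the node count and VM size are assumptions for demonstration and were not part of this walkthrough.<\/p>\n<pre class=\"lang:default decode:true \"># Illustrative sketch: create an AKS cluster pinned to a single availability zone,\r\n# since the Data Controller should not be spread across zones.\r\naz aks create --resource-group rg-dmc-k8s --name dmc-aks-arc --node-count 3 --node-vm-size Standard_D4s_v3 --zones 1 --enable-addons http_application_routing --kubernetes-version 1.19.6\r\n<\/pre>\n<p>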
Here we will make a few changes as we create the AKS.<\/p>\n<figure style=\"width: 528px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim3.png\" alt=\"\" width=\"528\" height=\"639\" \/><figcaption class=\"wp-caption-text\">Picture3: Creating AKS \u2013 Basic<\/figcaption><\/figure>\n<p>As you can see in Picture3, I changed the Availability Zone to Zone 1 and the Kubernetes version to 1.19.6.<\/p>\n<p>The reason for the change in the zone setting is that the Data Controller does not yet support spreading its PVCs across different zones.<\/p>\n<p>For this reason, if you want to deploy the Data Controller successfully, you should choose a single zone.<\/p>\n<p>I set the Kubernetes version to 1.19+; you can change it according to your needs, but you need to check Azure Arc support (the minimum supported Kubernetes version is 1.16). We switch to the \u201cNetworking\u201d tab and continue with our adjustments.<\/p>\n<figure style=\"width: 432px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim4.png\" alt=\"\" width=\"432\" height=\"579\" \/><figcaption class=\"wp-caption-text\">Picture4: AKS \u2013 Networking<\/figcaption><\/figure>\n<p>As seen in Picture4, we enable the HTTP application routing option. For the final AKS configuration step, I use the &#8220;Integrations&#8221; tab.<\/p>\n<figure style=\"width: 392px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim5.png\" alt=\"\" width=\"392\" height=\"495\" \/><figcaption class=\"wp-caption-text\">Picture5: AKS \u2013 Integration Tab<\/figcaption><\/figure>\n<p>As seen in Picture 5, I turn off the Container Monitoring part. The reason is that I will do the monitoring myself with AKS or Grafana. 
I will take advantage of the benefits that come with Azure Arc.<\/p>\n<p>I did not make any adjustments on the other tabs such as &#8220;Node Pools&#8221; and &#8220;Authentication&#8221;; I left them at their defaults. Now it&#8217;s time to do the final checks and create my Azure Kubernetes Cluster. I get the cluster I want in about 10 minutes.<\/p>\n<p>Then, as the first step, I get the credentials of my AKS with the help of PowerShell. The following code block does this for me.<\/p>\n<pre class=\"lang:default decode:true \">az aks get-credentials --resource-group rg-dmc-k8s --name dmc-aks-arc\r\n\r\n<\/pre>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim6.png\" alt=\"\" width=\"602\" height=\"87\" \/><figcaption class=\"wp-caption-text\">Picture6 : AKS \u2013 I am getting my credentials.<\/figcaption><\/figure>\n<p>Second step: I retrieve the context name with the code block below.<\/p>\n<pre class=\"lang:default decode:true \">kubectl config view -o jsonpath='{.contexts[*].name}'\r\n\r\n<\/pre>\n<p>Third step: now that I know the context names, I can confirm which context I am working with using the code block below.<\/p>\n<p>You may see more than one context, and that&#8217;s why I use the following command to see which one is active.<\/p>\n<pre class=\"lang:default decode:true \">kubectl config current-context<\/pre>\n<p>As seen in Picture7, I have only one config in this example. 
For this reason, I do not have to change the context, but if there were more than one config, I could switch with the command &#8220;kubectl config use-context &lt;context_name&gt;&#8221;.<\/p>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim7.png\" alt=\"\" width=\"602\" height=\"85\" \/><figcaption class=\"wp-caption-text\">Picture 7 : Sample output<\/figcaption><\/figure>\n<p>Now that we are connected to the cluster we want, we need to learn the storage classes of that cluster. We will need this information later.<\/p>\n<p><code><strong>kubectl get storageclass<\/strong><\/code><\/p>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim8.png\" alt=\"\" width=\"602\" height=\"92\" \/><figcaption class=\"wp-caption-text\">Picture 8 : Storage class list<\/figcaption><\/figure>\n<p>If you are working in a k8s cluster with more than one storage class, you need to decide which one you want to use.<\/p>\n<p>By the way, you can host the log and data files in different storage classes.<\/p>\n<p><strong>Publishing the Data Controller via Azure Data Studio<\/strong><\/p>\n<p>When you open Azure Data Studio, Azure Arc Controllers is located in the Connection section of the menu on the left. 
Click on the \u201c+\u201d sign in that section.<\/p>\n<figure style=\"width: 441px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim9.png\" alt=\"\" width=\"441\" height=\"200\" \/><figcaption class=\"wp-caption-text\">Picture9: Azure Data Studio Add Controller<\/figcaption><\/figure>\n<figure style=\"width: 347px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim10.png\" alt=\"\" width=\"347\" height=\"312\" \/><figcaption class=\"wp-caption-text\">Picture10: Data Controller that appears in Preview is selected.<\/figcaption><\/figure>\n<figure style=\"width: 517px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim11.png\" alt=\"\" width=\"517\" height=\"374\" \/><figcaption class=\"wp-caption-text\">Picture11: Prerequisites for deployment<\/figcaption><\/figure>\n<p>As you can see in Picture 11, there are some prerequisites, but we already installed these requirements while installing SQL Server on Azure Kubernetes. For this reason, we proceed quickly.<\/p>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim12.png\" alt=\"\" width=\"602\" height=\"585\" \/><figcaption class=\"wp-caption-text\">Picture12: It asks us to choose a cluster.<\/figcaption><\/figure>\n<p>For this article, we created an Azure Kubernetes Cluster named dmc-aks-arc. 
For this reason, we continue by selecting it.<\/p>\n<figure style=\"width: 425px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim13.png\" alt=\"\" width=\"425\" height=\"413\" \/><figcaption class=\"wp-caption-text\">Picture13: We choose the config profile.<\/figcaption><\/figure>\n<figure style=\"width: 439px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim14.png\" alt=\"\" width=\"439\" height=\"362\" \/><figcaption class=\"wp-caption-text\">Picture14: It asks us to connect to our Azure Account.<\/figcaption><\/figure>\n<p>We log in to our Azure account using the sign-in shown in Picture 14. Then it asks us for our Kubernetes information.<\/p>\n<figure id=\"attachment_52531\" aria-describedby=\"caption-attachment-52531\" style=\"width: 449px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-52531 size-full\" src=\"https:\/\/dbtut.com\/wp-content\/uploads\/2022\/10\/img_633af93f6cfe8.png\" alt=\"\" width=\"449\" height=\"437\" \/><figcaption id=\"caption-attachment-52531\" class=\"wp-caption-text\">Picture14_1: The definitions have been entered.<\/figcaption><\/figure>\n<p>If you choose direct as the connection mode, it will ask you for an Azure service account. 
I&#8217;ll explain later how the Azure service account is created.<\/p>\n<figure style=\"width: 458px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim16.png\" alt=\"\" width=\"458\" height=\"378\" \/><figcaption class=\"wp-caption-text\">Picture15: Controller Configuration<\/figcaption><\/figure>\n<p>As you can see in Picture 15, it asks us for some definitions for our data controller.<\/p>\n<figure style=\"width: 407px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim17.png\" alt=\"\" width=\"407\" height=\"396\" \/><figcaption class=\"wp-caption-text\">Picture15_1 : The data we entered for the controller configuration.<\/figcaption><\/figure>\n<figure style=\"width: 404px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim18.png\" alt=\"\" width=\"404\" height=\"393\" \/><figcaption class=\"wp-caption-text\">Picture16: The configuration has been checked.<\/figcaption><\/figure>\n<p>As you can see in Picture 17, Azure Data Studio shows a final summary of the configuration we made, and we can either deploy it directly or export it as a notebook. I exported the script as a notebook for our example.<\/p>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim19.png\" alt=\"\" width=\"602\" height=\"463\" \/><figcaption class=\"wp-caption-text\">Picture18: Azure Arc Data Controller Deploy<\/figcaption><\/figure>\n<p>We can run the notebook code blocks you see in Picture 18 one by one, or run all of them with Run All at the top. 
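<\/p>\n<p>At its core, the generated notebook boils down to a single azdata deployment command. The block below is only a hedged sketch of that command for the indirect connectivity mode; the subscription placeholder and the profile value are assumptions based on the choices made above, not output copied from the notebook.<\/p>\n<pre class=\"lang:default decode:true \"># Sketch only: deploy the Arc data controller into the arc namespace\r\nazdata arc dc create --profile-name azure-arc-aks-premium-storage --name arc-dc --namespace arc --subscription &lt;subscription-id&gt; --resource-group rg-dmc-k8s --location eastus --connectivity-mode indirect\r\n<\/pre>\n<p>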
However, one thing to note here is that the variables defined in the Set variables section came through with the Turkish dotless &#8220;\u0131&#8221; instead of the letter &#8220;i&#8221;.<\/p>\n<p>In the code block they are all written with &#8220;\u0131&#8221;, so I replace those characters in the variable names with &#8220;i&#8221; before executing the code. If you run the code after completing the changes, you will see Picture19.<\/p>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim20.png\" alt=\"\" width=\"602\" height=\"329\" \/><figcaption class=\"wp-caption-text\">Picture19: Azure Arc Data Controller Deployment<\/figcaption><\/figure>\n<p>As you can see in Picture 19, even though we performed the process using Azure Data Studio, it produces azdata output for us and shows how to follow the process via PowerShell.<\/p>\n<pre class=\"lang:default decode:true \">kubectl get pods -n arc\r\n<\/pre>\n<p>We can observe what is going on using the above command.<\/p>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim21.png\" alt=\"\" width=\"602\" height=\"130\" \/><figcaption class=\"wp-caption-text\">Picture20: Kubectl get pods Arc<\/figcaption><\/figure>\n<p>We checked as shown in Picture 20, but when we looked at Picture 19 again, we saw that the process was still running.<\/p>\n<p>Instead of repeatedly running the command in Picture20, I can add --watch to the same command and keep observing.<\/p>\n<figure style=\"width: 504px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim22.png\" alt=\"\" width=\"504\" height=\"285\" \/><figcaption class=\"wp-caption-text\">Picture21: kubectl get pod arc watch<\/figcaption><\/figure>\n<p>The deployment of Azure Arc 
Data Controller took about 30 minutes. The time will vary depending on the size of your Azure Kubernetes Cluster, the number of nodes, and your internet speed.<\/p>\n<p>If you see the screen in Picture 22 in Azure Data Studio, you have successfully deployed the Azure Arc Data Controller.<\/p>\n<figure style=\"width: 503px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim23.png\" alt=\"\" width=\"503\" height=\"160\" \/><figcaption class=\"wp-caption-text\">Picture22: Deploy Arc Data Controller<\/figcaption><\/figure>\n<p>After a successful deployment, you can log in using the code block below. It will ask you for the Data Controller credentials you entered in ADS (Azure Data Studio).<\/p>\n<pre class=\"lang:default decode:true \">azdata login -ns arc\r\n\r\n<\/pre>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim24.png\" alt=\"\" width=\"602\" height=\"69\" \/><figcaption class=\"wp-caption-text\">Picture23: Login arc<\/figcaption><\/figure>\n<p>Once we have successfully logged in to Arc, we can list its endpoints using the code block below.<\/p>\n<pre class=\"lang:default decode:true \">azdata arc dc endpoint list -o table\r\n\r\n<\/pre>\n<figure style=\"width: 489px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim25.png\" alt=\"\" width=\"489\" height=\"111\" \/><figcaption class=\"wp-caption-text\">Picture24: Endpoint list<\/figcaption><\/figure>\n<p>As you can see in Picture24, Azure gave us the IP addresses that the Arc Data Controller uses for its different functions. 
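<\/p>\n<p>Before connecting, you can also check the controller&#8217;s overall state from the same azdata session; this is an optional check, and it assumes you are still logged in with azdata as shown above.<\/p>\n<pre class=\"lang:default decode:true \"># Optional: show the data controller status; it should eventually report Ready\r\nazdata arc dc status show\r\n<\/pre>\n<p>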
Let&#8217;s connect to the Cluster Management Service endpoint address we obtained, using ADS.<\/p>\n<figure style=\"width: 297px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim26.png\" alt=\"\" width=\"297\" height=\"289\" \/><figcaption class=\"wp-caption-text\">Picture25: Connect Data Controller<\/figcaption><\/figure>\n<p>When the connection succeeds, you can see it in the lower left part of ADS, as shown in Picture26.<\/p>\n<figure style=\"width: 134px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim27.png\" alt=\"\" width=\"134\" height=\"292\" \/><figcaption class=\"wp-caption-text\">Picture26: Connected Data Controller<\/figcaption><\/figure>\n<p>When you right-click the Data Controller, the Manage option appears, and it&#8217;s time to use the features provided by Azure Arc-enabled data services.<\/p>\n<figure style=\"width: 602px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/www.sqlekibi.com\/wp-content\/uploads\/2021\/04\/Resim28.png\" alt=\"\" width=\"602\" height=\"311\" \/><figcaption class=\"wp-caption-text\">Picture27: Azure Arc Data Controller ADS view<\/figcaption><\/figure>\n<p>In the next article, I will explain the Azure Arc-enabled Managed Instance setup. 
Next, I&#8217;ll be telling you how to create a PostgreSQL HyperScale Server Group with Azure Arc.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>We need a Data Controller running on Kubernetes to be able to deploy Azure Arc-enabled data services to different cloud providers on-premises or in a hybrid cloud scenario. With the Data Controller, you can implement Azure Arc integration and core functions such as management services into your own structure. 
For this reason, Azure Arc Data &hellip;<\/p>\n<div 
class=\"pvc_clear\"><\/div>\n","protected":false},"author":1414,"featured_media":52546,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[10403,3],"tags":[],"class_list":["post-52516","post","type-post","status-publish","format-standard","has-post-thumbnail","","category-azure","category-mssql"],"aioseo_notices":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v24.9 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Azure Arc Enabled Data Services - Database Tutorials<\/title>\n<meta name=\"description\" content=\"We need a Data Controller running on Kubernetes to be able to deploy Azure Arc-enabled data services to different cloud providers on-premises\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Azure Arc Enabled Data Services - Database Tutorials\" \/>\n<meta property=\"og:description\" content=\"We need a Data Controller running on Kubernetes to be able to deploy Azure Arc-enabled data services to different cloud providers on-premises\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/\" \/>\n<meta property=\"og:site_name\" content=\"Database Tutorials\" \/>\n<meta property=\"article:published_time\" content=\"2022-10-03T15:36:12+00:00\" \/>\n<meta property=\"article:modified_time\" 
content=\"2022-10-03T15:44:00+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dbtut.com\/wp-content\/uploads\/2022\/10\/Ekran-goruntusu-2022-10-03-183539.png\" \/>\n\t<meta property=\"og:image:width\" content=\"590\" \/>\n\t<meta property=\"og:image:height\" content=\"388\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"\u00c7a\u011flar \u00d6zen\u00e7\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"\u00c7a\u011flar \u00d6zen\u00e7\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"14 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/\"},\"author\":{\"name\":\"\u00c7a\u011flar \u00d6zen\u00e7\",\"@id\":\"https:\/\/dbtut.com\/#\/schema\/person\/92baa6fd666fb707d903177fed07d6ab\"},\"headline\":\"Azure Arc Enabled Data 
Services\",\"datePublished\":\"2022-10-03T15:36:12+00:00\",\"dateModified\":\"2022-10-03T15:44:00+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/\"},\"wordCount\":1789,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/dbtut.com\/#organization\"},\"image\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/dbtut.com\/wp-content\/uploads\/2022\/10\/Ekran-goruntusu-2022-10-03-183539.png\",\"articleSection\":[\"Azure\",\"MSSQL\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/\",\"url\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/\",\"name\":\"Azure Arc Enabled Data Services - Database Tutorials\",\"isPartOf\":{\"@id\":\"https:\/\/dbtut.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/dbtut.com\/wp-content\/uploads\/2022\/10\/Ekran-goruntusu-2022-10-03-183539.png\",\"datePublished\":\"2022-10-03T15:36:12+00:00\",\"dateModified\":\"2022-10-03T15:44:00+00:00\",\"description\":\"We need a Data Controller running on Kubernetes to be able to deploy Azure Arc-enabled data services to different cloud providers 
on-premises\",\"breadcrumb\":{\"@id\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/#primaryimage\",\"url\":\"https:\/\/dbtut.com\/wp-content\/uploads\/2022\/10\/Ekran-goruntusu-2022-10-03-183539.png\",\"contentUrl\":\"https:\/\/dbtut.com\/wp-content\/uploads\/2022\/10\/Ekran-goruntusu-2022-10-03-183539.png\",\"width\":590,\"height\":388},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/dbtut.com\/index.php\/2022\/10\/03\/azure-arc-enabled-data-services\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/dbtut.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Azure Arc Enabled Data Services\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/dbtut.com\/#website\",\"url\":\"https:\/\/dbtut.com\/\",\"name\":\"Database Tutorials\",\"description\":\"MSSQL, Oracle, PostgreSQL, MySQL, MariaDB, DB2, Sybase, Teradata, Big Data, NOSQL, MongoDB, Couchbase, Cassandra, Windows, 