tag:blogger.com,1999:blog-26359790416803111672024-03-18T19:03:06.891-04:00It's all about DataWhat I am passionate about!Unknownnoreply@blogger.comBlogger105125tag:blogger.com,1999:blog-2635979041680311167.post-12316065742333153662023-11-29T08:14:00.005-05:002023-12-05T08:16:38.256-05:00How to solve available workspace capacity exceeded error (Livy session) in Azure Synapse Analytics?<h3 style="text-align: left;">Livy session errors in Azure Synapse with Notebook</h3><p>If you are working with Notebook in Azure Synapse Analytics and multiple people are using the same Spark cluster at the same time then chances are high that you have seen any of these below errors: </p><div style="text-align: left;">"Livy Session has failed.
Session state: error code: AVAILABLE_WORKSPACE_CAPACITY_EXCEEDED. Your job
requested 24 vcores. However, the workspace only has 2 vcores available out of
quota of 50 vcores for node size family [MemoryOptmized]."</div><p class="MsoNormal" style="text-align: justify;"><o:p></o:p></p><p>"Failed to create Livy session for executing notebook. Error: Your pool's capacity (3200 vcores) exceeds your workspace's total vcore quota (50 vcores). Try reducing the pool capacity or increasing your workspace's vcore quota. HTTP status code: 400."</p><p>"InvalidHttpRequestToLivy: Your Spark job requested 56 vcores. However, the workspace has a 50 core limit. Try reducing the numbers of vcores requested or increasing your vcore quota. Quota can be increased using Azure Support request"</p><p>The below figure:1 shows one of the error while running the Synapse notebook.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3VR-Eob0gKQsIriPEdVlx_2Bhlu6kfQnhR1XCt0urMiI1gWJ19D1CAGtSLh3h7WG_szWfnMln7ur2RJzN4SUYHO9PfYorDKg79IyYwMz61uYHvpNtB7a3o7221euYH6a1oD2dtwrxRnm00gyGZgkRG_6G9EF67p8y1G0qsGcZRxy1J0VtMd7fnT9ppl9Y/s1878/Workspace_Capacity_Exceed.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="423" data-original-width="1878" height="90" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3VR-Eob0gKQsIriPEdVlx_2Bhlu6kfQnhR1XCt0urMiI1gWJ19D1CAGtSLh3h7WG_szWfnMln7ur2RJzN4SUYHO9PfYorDKg79IyYwMz61uYHvpNtB7a3o7221euYH6a1oD2dtwrxRnm00gyGZgkRG_6G9EF67p8y1G0qsGcZRxy1J0VtMd7fnT9ppl9Y/w400-h90/Workspace_Capacity_Exceed.png" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: Error Available workspace exceed</div><h3 style="text-align: left;">What is Livy?</h3><div style="text-align: justify;">Your initial thought will likely be, "What exactly is this Livy session?"! <span style="background-color: white; text-align: justify;"><span face="Helvetica, sans-serif" style="color: #333333;"><span style="font-size: 10.5pt;">Apache Livy is a service that enables easy interaction with a Spark
cluster over a REST interface </span></span><a href="https://livy.apache.org/" style="color: #333333; font-family: Helvetica, sans-serif; font-size: 10.5pt;" target="_blank">[1]</a>.<span face="Helvetica, sans-serif" style="color: #333333;"><span style="font-size: 10.5pt;"> Whenever we execute a notebook from Azure Synapse Analytics Livy helps </span><span style="font-size: 14px;">interact</span><span style="font-size: 10.5pt;"> with Spark engine.</span></span></span></div><p class="MsoNormal" style="text-align: justify;"><o:p></o:p></p><div><br /></div><h3 style="text-align: left;">Will it fix if you increase vCores?</h3><p>By looking at the error message you may attempt to increase the vCores; however, increasing the vCores will not solve your problem. </p><p>Let's look into how Spark pool works, by definition of Spark pool; when Spark pool instantiated it's create a Spark instance that process the data. Spark instances are created when you connect to a Spark pool, create a session and run the job. And multiple users may have access to a single Spark pool. When multiple users are running a job at the same time then it may happen that the first job already used most vCores so the another job executed by other user will find the Livy session error . </p><h3 style="text-align: left;">How to solve it?</h3><div>The problem occurs when multiple people work on same Spark pool or same user running more than one Synapse notebook in parallel. When multiple Data Engineer works on same Spark pool in Synapse they can configure the session and save into their DevOps branch. Let's look into it step by step:</div><div><br /></div><div>1. You will find configuration button (a gear icon) as below fig 1 shown:</div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQAEphi5r9wnOC268z0Qi6Y2Bj-thvCG8xk4a6UCtz3CLH-TCqeMtOKNu2AfrFQPFnbfKveBQ69_8xZmZ3W4_YIWPA1SmsQG7YdSYBtikoRu-ZKwaQVn4S0e0TOHQ-5J9wt4wXLDNlC3uHYy5ChHsBAOJWru5ef1Y_pSswy55EwqRJE1lfCtgDj5b753pi/s1783/Find%20configuration.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="525" data-original-width="1783" height="94" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQAEphi5r9wnOC268z0Qi6Y2Bj-thvCG8xk4a6UCtz3CLH-TCqeMtOKNu2AfrFQPFnbfKveBQ69_8xZmZ3W4_YIWPA1SmsQG7YdSYBtikoRu-ZKwaQVn4S0e0TOHQ-5J9wt4wXLDNlC3uHYy5ChHsBAOJWru5ef1Y_pSswy55EwqRJE1lfCtgDj5b753pi/s320/Find%20configuration.png" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2: configuration button</div><div class="separator" style="clear: both; text-align: center;"><br /></div>By clicking on the configuration button you will find details about Spark pool configuration details as shown in the below diagram:<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhdDNJwMOBR6FOZLM53FLJt080VXNZavbkKbbqgjuXNxT80p2RqCtUl3ca-glU_5peWAM_t7JOluddJr4HL_z_25qBDjGc22JUTXl7N-ZcDgAY60pbkC7j3dYeKT2dnzPkHqmKP7UyXIS7zKwxs36gQH9RLaKdjlzYgEga6LJf-oZOt9zdJl7c1MxPFiIOM/s1353/No%20active%20session.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="937" data-original-width="1353" height="222" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhdDNJwMOBR6FOZLM53FLJt080VXNZavbkKbbqgjuXNxT80p2RqCtUl3ca-glU_5peWAM_t7JOluddJr4HL_z_25qBDjGc22JUTXl7N-ZcDgAY60pbkC7j3dYeKT2dnzPkHqmKP7UyXIS7zKwxs36gQH9RLaKdjlzYgEga6LJf-oZOt9zdJl7c1MxPFiIOM/s320/No%20active%20session.png" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3: Session details</div>And this is your session, as you find in the above Fig 3. And there is no active session. <div><br /></div><div>2. Activate your session by attaching the Spark pool and assigning right resources. Please find the below fig 4 and details steps to avoid the Livy session error.</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMhIyHG1M7GW1XAA_GApPoWrcY2gq8XZSCGDaVBV4RqVOio-VDtG8Y-AJWHhLPFfaWc6dyRpE5yiZ292g1fq2TtDtyGHwg6VP6i1HXatl9vdHgO-WNW3Uf6q8db27T4rjIuATHYFQGZ7aoJma8yLWUPIAHsX03e6mvOsKWGTdhkXwRWnNJGX8X58Oga5FD/s923/Configure%20properly.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="923" data-original-width="606" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMhIyHG1M7GW1XAA_GApPoWrcY2gq8XZSCGDaVBV4RqVOio-VDtG8Y-AJWHhLPFfaWc6dyRpE5yiZ292g1fq2TtDtyGHwg6VP6i1HXatl9vdHgO-WNW3Uf6q8db27T4rjIuATHYFQGZ7aoJma8yLWUPIAHsX03e6mvOsKWGTdhkXwRWnNJGX8X58Oga5FD/s320/Configure%20properly.png" width="210" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 4: Fine-tune the Settings</div><br /><div>a) Attach the Spark pool from available pools (if you have more than one). I got only one Spark pool.</div><div>b) Select session size, I have chosen small by clicking the 'Use' button</div><div>c) Enable Dynamically allocate executor, this will help Spark engine to allocate the executor dynamically.</div><div>d) You can change Executors, e.g. by default small session size have 47 executors however; I have chosen 3 to 18 executors to free up the rest for other users. Sometimes you may need to get down to 1 to 3 executors to avoid the errors.</div><div>e) And finally apply the changes to your session.</div><div><br /></div><div>And you can commit this changes to your DevOps branch for the particular notebook you are working on so that you don't need to apply the same settings again.</div><div><br /></div><div><div>In addition, since the maximum Synapse workspace's total vCores is 50, you can create <a href="https://learn.microsoft.com/en-us/answers/questions/1306112/increasing-your-vcore-quota" target="_blank">request</a> to Microsoft to increase the vCores for your organization.</div><div><br /></div></div><div>In summary, the Livy session error is a common error when multiple data engineers working in the same Spark pool. 
So it's important to understand how you can have your session setup correctly so that more than one session can be executed at the same time.</div><div><br /></div><div><br /></div><div><div><div><br /></div></div></div>Unknownnoreply@blogger.com0Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-2093469584110613392023-07-01T11:22:00.004-04:002023-07-03T10:49:37.964-04:00Provisioning Microsoft Fabric: A Step-by-Step Guide for Your Organization<div style="text-align: left;"><span style="color: #222222; font-family: Times New Roman, serif; font-size: medium;">Learn how to provision Microsoft Fabric with ease! Unveiled at Microsoft Build 2023, Microsoft Fabric is a cutting-edge Data Intelligence platform that has taken the industry by storm. This blog post will cover a simple step-by-step guide to provisioning Microsoft Fabric using the efficient Microsoft Fabric Capacity.</span></div><div style="text-align: left;"><span style="color: #222222; font-family: Times New Roman, serif; font-size: medium;"><br /></span><span style="background-color: white; color: #222222; font-family: "Times New Roman", serif; font-size: medium;">Before we look into the Fabric Capacity, let's get into different licensing models. There are three different types of licenses you can choose from to start working with Microsoft Fabric:<br /></span><span style="color: #222222; font-family: Times New Roman, serif;"><span style="background-color: white; font-size: medium;">1. Microsoft Fabric trial<br /></span></span><span style="font-size: medium;"><span style="color: #222222; font-family: Times New Roman, serif;"><span style="background-color: white;">It's free for two months and you need to use your organization email. Personal email doesn't work. </span></span><span style="background-color: white; color: #222222; font-family: "Times New Roman", serif;">Please find how you can <a href="https://learn.microsoft.com/en-us/fabric/get-started/fabric-trial?WT.mc_id=DP-MVP-5004718" target="_blank">activate your free trial<br /></a></span></span><span style="color: #222222; font-family: Times New Roman, serif;"><span style="background-color: white; font-size: medium;">2. Power BI Premium Per Capacity (P SKUs)<br /></span></span><span style="color: #222222; font-family: Times New Roman, serif;"><span style="background-color: white; font-size: medium;">If you already have a Power BI premium license you can work with Microsoft Fabric. <br /></span></span><span style="color: #222222; font-family: Times New Roman, serif;"><span style="background-color: white; font-size: medium;">3. Microsoft Fabric Capacity (F SKUs)<br /></span></span><span style="font-size: medium;"><span style="background-color: white; color: #222222; font-family: "Times New Roman", serif;">With your organizational Azure subscription, you can create Microsoft Fabric capacity. 
</span><span style="background-color: white; color: #222222; font-family: "Times New Roman", serif;">You will find more details about </span><a href="https://learn.microsoft.com/en-us/fabric/enterprise/licenses#capacity-and-skus?WT.mc_id=DP-MVP-5004718" style="font-family: "Times New Roman", serif;" target="_blank">Microsoft Fabric licenses<br /></a></span><span style="color: #222222; font-family: Times New Roman, serif;"><span style="background-color: white; font-size: medium;">We will go through the steps in detail on how Microsoft Fabric capacity can be provisioned from Azure Portal.</span></span></div><div style="text-align: left;"><span style="color: #222222; font-family: Times New Roman, serif;"><span style="background-color: white; font-size: medium;"><br /></span></span><span style="background-color: white; color: #222222; font-family: "Times New Roman", serif; font-size: medium;">Step 1: Please login to the <a href="https://portal.azure.com?WT.mc_id=DP-MVP-5004718" target="_blank">Azure Portal</a> and find Microsoft Fabric from your organization's Azure Portal as shown below in Fig 1.</span></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYWPfJSfZg9To-WFHq0uBPPR2lCdmFJ23Gl-3rnlA1poQFOObvtPYOJzsIA0LFxJoj8X9En884aYz46H-B5z5AVI-W_ndPdgICG6hxSyMKMftH40w-5ju-x5oGuu_Cu22B6oOJUC77gyHAeF_boVj65XHvwy0nCDLYeZ0n0fk8RkDNvdySoWw1WQ6zknQF/s1357/Finding%20Microsoft%20Fabric.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="662" data-original-width="1357" height="195" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYWPfJSfZg9To-WFHq0uBPPR2lCdmFJ23Gl-3rnlA1poQFOObvtPYOJzsIA0LFxJoj8X9En884aYz46H-B5z5AVI-W_ndPdgICG6hxSyMKMftH40w-5ju-x5oGuu_Cu22B6oOJUC77gyHAeF_boVj65XHvwy0nCDLYeZ0n0fk8RkDNvdySoWw1WQ6zknQF/w400-h195/Finding%20Microsoft%20Fabric.jpg" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: Finding Microsoft Fabric in Azure Portal</div><br /><p><span style="background-color: white; color: #222222; font-family: "Times New Roman", serif; font-size: 18px;">Step 2: To create the Fabric capacity, the very first step is to create a resource group as shown in Fig 2</span></p><p><span style="background-color: white; color: #222222; font-family: "Times New Roman", serif; font-size: 18px;"><br /></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhqbWGhqyBzVO-IwCGlwJqahe4wsiM9rtGFStCpzNJypKJGYAylI2RxiKc9H8uRVhk1yKBIZyOL_Q_OhfXBKGPeL_RPQCLR8uLX4XLtoT-iu3kgjBabOADsfJiMovKc6oCBRqqKWT0DEmDFjk-uqO94ADzOlUSbXxQVAstJUHd409HD6I0_a57TVn3172wt/s1328/Creating%20Fabric%20Capacity.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="817" data-original-width="1328" height="246" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhqbWGhqyBzVO-IwCGlwJqahe4wsiM9rtGFStCpzNJypKJGYAylI2RxiKc9H8uRVhk1yKBIZyOL_Q_OhfXBKGPeL_RPQCLR8uLX4XLtoT-iu3kgjBabOADsfJiMovKc6oCBRqqKWT0DEmDFjk-uqO94ADzOlUSbXxQVAstJUHd409HD6I0_a57TVn3172wt/w400-h246/Creating%20Fabric%20Capacity.jpg" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2: Creating resource group</div><p></p><p><span style="color: #222222; font-family: Times New Roman, serif;"><span style="background-color: white; font-size: 18px;">Step 3: To create the right capacity you need to choose the resource size that you 
require. e.g. I have chosen F8 as shown in the below figure 3</span></span></p><p><span style="color: #222222; font-family: Times New Roman, serif;"></span></p><div class="separator" style="clear: both; text-align: center;"><span style="color: #222222; font-family: Times New Roman, serif;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggjogrL5w9e3epYLYkYDp7K7vREK1crV2o6DO_ZRZ4JwO6XxBw6ka1i3cN0Y1FRDdL7pP42RCDIM22YLFXTeEMeadr_fS8qdacm2uamnn1OerqIffSAc8XkvIsbUKWbPTT9mseIbZUv8RBJnOEZESwjSIR-_bQQqj0sTfhS-VcHn8fS4ZZZroCGhmfvBdg/s1817/Select%20the%20right%20resource%20size.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="801" data-original-width="1817" height="176" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggjogrL5w9e3epYLYkYDp7K7vREK1crV2o6DO_ZRZ4JwO6XxBw6ka1i3cN0Y1FRDdL7pP42RCDIM22YLFXTeEMeadr_fS8qdacm2uamnn1OerqIffSAc8XkvIsbUKWbPTT9mseIbZUv8RBJnOEZESwjSIR-_bQQqj0sTfhS-VcHn8fS4ZZZroCGhmfvBdg/w400-h176/Select%20the%20right%20resource%20size.jpg" width="400" /></a></span></div><div class="separator" style="clear: both; text-align: center;"><span style="color: #222222; font-family: Times New Roman, serif;">Fig 3: Choose the right resource size</span></div><span style="color: #222222; font-family: Times New Roman, serif;"><br /><span style="background-color: white; font-size: 18px;"><br /></span></span><p></p><p><span style="font-size: medium;">You will find more capacity details in this <a href="https://blog.fabric.microsoft.com/en-CA/blog/announcing-microsoft-fabric-capacities-are-available-for-purchase?WT.mc_id=DP-MVP-5004718" target="_blank">Microsft blog post</a>.</span></p><p><span style="font-size: medium;">Step 4: As shown below in Fig 4, before creating the capacity please review all the information you have provided including resource group, region, capacity size, etc., and then hit the create button. 
</span></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyL4KP1Y5qmSva3lgtqkiaOA488NaPEu6pMsAgavCTgOw7Ty9qQSOgmzARAbqjzNtPnsoDHERKUjIlgI-XdFQ5NfwnyFUB8hj1w-AGd08IS0KhzOV5Z26i4K2HDpL_npXn44j7p_2fdC-aOhEoMnFc1PsVNcmsAgvPUMOJ2Ud8jXh_V7ks8ioUZRV-KaJU/s1113/review%20fabric%20capacity.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="821" data-original-width="1113" height="295" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyL4KP1Y5qmSva3lgtqkiaOA488NaPEu6pMsAgavCTgOw7Ty9qQSOgmzARAbqjzNtPnsoDHERKUjIlgI-XdFQ5NfwnyFUB8hj1w-AGd08IS0KhzOV5Z26i4K2HDpL_npXn44j7p_2fdC-aOhEoMnFc1PsVNcmsAgvPUMOJ2Ud8jXh_V7ks8ioUZRV-KaJU/w400-h295/review%20fabric%20capacity.jpg" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 4: Review and create</div><br /><span style="font-size: medium;"><br /></span><p></p><p><span style="font-size: medium;">When it's done you will able to see Microsoft Fabric Capacity is created (see below fig 5)</span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh84hqMNHHDST2CtS7-lsQm0hcVDl1aIx-BH4QLWXVyZ89FL5nI5djTkXjIYEIqggFkKrMJXgXdIzgMxc0fvVztthUxAvIn_5EOrRCvPuVhEcgqWKgqipMif0KwoxOZbKn4ZRkFCkz2RVXqRuvADYnr6zsOeZX-VgeI67-9wBTNk8RV-Kdndi0KBUHfnGGE/s1397/Fabric-go%20to%20resource.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="780" data-original-width="1397" height="224" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh84hqMNHHDST2CtS7-lsQm0hcVDl1aIx-BH4QLWXVyZ89FL5nI5djTkXjIYEIqggFkKrMJXgXdIzgMxc0fvVztthUxAvIn_5EOrRCvPuVhEcgqWKgqipMif0KwoxOZbKn4ZRkFCkz2RVXqRuvADYnr6zsOeZX-VgeI67-9wBTNk8RV-Kdndi0KBUHfnGGE/w400-h224/Fabric-go%20to%20resource.jpg" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 5: Microsoft Fabric capacity created</div><br /><div style="text-align: left;"><span style="font-size: medium;"><span>You can go to the </span><a href="https://app.fabric.microsoft.com/admin-portal/" target="_blank">admin portal </a><span>to validate your recently created Fabric capacity too. 
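<p><span style="font-size: medium;">If you prefer to script the provisioning instead of clicking through the portal, the capacity can also be created through the Azure Resource Manager REST API. The sketch below is a hedged Python example: the subscription ID, resource group, capacity name, and admin account are placeholders, and the Microsoft.Fabric API version shown should be checked against the current documentation before you rely on it.</span></p><pre>
# Hedged sketch: create a Microsoft Fabric capacity (F SKU) via the ARM REST API.
# All names below are placeholders - replace them with your own values.
import requests
from azure.identity import DefaultAzureCredential

subscription_id = "00000000-0000-0000-0000-000000000000"
resource_group = "rg-fabric-demo"
capacity_name = "fabricdemocapacity"      # must be unique and lowercase
api_version = "2023-11-01"                # verify the current Microsoft.Fabric API version

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}"
    f"/providers/Microsoft.Fabric/capacities/{capacity_name}"
    f"?api-version={api_version}"
)

body = {
    "location": "canadacentral",                      # same region chosen for the resource group
    "sku": {"name": "F8", "tier": "Fabric"},          # F8, as chosen in Fig 3
    "properties": {"administration": {"members": ["admin@contoso.com"]}},
}

response = requests.put(url, json=body, headers={"Authorization": f"Bearer {token}"})
print(response.status_code, response.json())
</pre><p><span style="font-size: medium;">A successful call leaves the capacity provisioning in the background, after which it appears in the portal just as in the screenshots above.</span></p>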
Fig 6, shows the Fabric capacity under the admin panel.</span></span></div><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjUS2KPChoOTYRgsc7Hblb2h4QuG13wbfdawpI0j7gBSfb7WvJrI2k01HwzU6RPHTN-UHrIATUOxCRDx0U7TSm4vJVAGpwGCMpOGxFSQR1HpNDetLezDx6hkcaUdCh7GqZAdyXJUBwSv8Tuv-uXi3U7ds5gll5IrHMxPZkNDriVmgVJzqz2m-5mzTwriY6B/s1817/Capacity%20under%20Admin.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="687" data-original-width="1817" height="151" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjUS2KPChoOTYRgsc7Hblb2h4QuG13wbfdawpI0j7gBSfb7WvJrI2k01HwzU6RPHTN-UHrIATUOxCRDx0U7TSm4vJVAGpwGCMpOGxFSQR1HpNDetLezDx6hkcaUdCh7GqZAdyXJUBwSv8Tuv-uXi3U7ds5gll5IrHMxPZkNDriVmgVJzqz2m-5mzTwriY6B/w400-h151/Capacity%20under%20Admin.jpg" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 6: Fabric capacity under admin portal</div><span><p style="font-size: x-large;"><span style="font-size: large;"><br /></span></p><span style="font-size: medium;">To explore the Microsoft Fabric please browse through <a href="https://app.fabric.microsoft.com/home" target="_blank">the site</a> and you will find the capacity you just created finally and the home page will look like below in Fig 7</span></span><p></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEioTfigyaQF-MJGQOuFY3OLtYDuWsydn8pJtJ-LGgwkDWoWuwVDedIw4mAMNdDAfe15G663cpcsm046zArCtEqxDrai3P95XB77FT_BFwBZxXQgos7qopIFJd9ZOZ1DK4NNtPC9JnzaYzJbuLl2MIUEnHrzifSIq0PC3vUPTIO6xh3mIkBNuHz1JzxydS0U/s1838/Home%20page%20fabric.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="968" data-original-width="1838" height="211" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEioTfigyaQF-MJGQOuFY3OLtYDuWsydn8pJtJ-LGgwkDWoWuwVDedIw4mAMNdDAfe15G663cpcsm046zArCtEqxDrai3P95XB77FT_BFwBZxXQgos7qopIFJd9ZOZ1DK4NNtPC9JnzaYzJbuLl2MIUEnHrzifSIq0PC3vUPTIO6xh3mIkBNuHz1JzxydS0U/w400-h211/Home%20page%20fabric.jpg" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 7: Fabric Home page</div><br /><span style="font-size: large;"><br /></span><p></p><div style="text-align: left;"><span style="font-size: medium;"><span>In summary, you have a few ways to use Microsoft Fabric and learned provisioning Fabric by using Fabric Capacity. However, it's important to remember you must need to </span><a href="https://learn.microsoft.com/en-us/fabric/admin/fabric-switch?WT.mc_id=DP-MVP-5004718" target="_blank">Enable Microsoft Fabric for your organization.</a></span></div><p><br /></p>Unknownnoreply@blogger.com0Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-14132683534362392082023-05-24T08:10:00.002-04:002023-05-24T08:10:18.954-04:00What is OneLake in Microsoft Fabric?<p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">Get ready to be blown
away! The highly anticipated Microsoft Build 2023 has finally unveiled its
latest and greatest creation: the incredible Microsoft Fabric - an unparalleled
Data Intelligence platform that is guaranteed to revolutionize the tech world!<o:p></o:p></span></p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgatjnLHq03gmo8lS0woGuUjWWJs_RnjtMOc7ix2zzsEpT182AJhoq5BahDTXki-HBgxGBoh3OQBeVJ1RaCOu6fBOY-KsfAHmNHwTxIxh7DzHmF2iYwS74_9piOX7W6vRO76xmVYcmJDJMP1D1lCgQ8ZPdOCyMuiJ13LSNIxZkT-S_13U0AEH6t1seZTQ/s807/OneLake.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="529" data-original-width="807" height="210" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgatjnLHq03gmo8lS0woGuUjWWJs_RnjtMOc7ix2zzsEpT182AJhoq5BahDTXki-HBgxGBoh3OQBeVJ1RaCOu6fBOY-KsfAHmNHwTxIxh7DzHmF2iYwS74_9piOX7W6vRO76xmVYcmJDJMP1D1lCgQ8ZPdOCyMuiJ13LSNIxZkT-S_13U0AEH6t1seZTQ/s320/OneLake.PNG" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;">fig 1: OneLake for all Data</div><div><br /></div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">One of the most exciting things in Fabric I found is OneLake. I was amazed to discover how OneLake is simplified just like
OneDrive! It's a single unified logical SaaS data lake for the whole
organization (no data silos). Over the past couple of months, I've had the
incredible opportunity to engage with the product team and dive into the
private preview of Microsoft Fabric. <o:p></o:p></span><span style="font-family: Times New Roman, serif;"><span style="font-size: 18px;">I'm sharing my learning through the Private Preview via this blog post, emphasizing that it is not an exhaustive list of what OneLake encompasses.</span></span></p>
<p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;"> </span></p>
<p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">I got OneLake installed on my PC and can easily
access the data in the OneLake like OneDrive as shown in Fig 2:<o:p></o:p></span></p><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;"><br /></span></p>
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtt4Offj5Az1jaGC674Apr_arNeO1CO0zQ4UcHFEucKsnUpq-jyIweUC62VelpiroP5iYMNzrabkYX3nCLJHk3ODC04Jwi5K_ORHDpMEQOBwPfDrTzkYHfkib5C0dsQEYhkrMEI3odDbUzqanb2LneJtcPu9zxZX8VBNNfvJ5LFR4EUZCTiDstahG8Sw/s1544/OneLake%20folder%20in%20Public%20Preview.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="444" data-original-width="1544" height="115" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtt4Offj5Az1jaGC674Apr_arNeO1CO0zQ4UcHFEucKsnUpq-jyIweUC62VelpiroP5iYMNzrabkYX3nCLJHk3ODC04Jwi5K_ORHDpMEQOBwPfDrTzkYHfkib5C0dsQEYhkrMEI3odDbUzqanb2LneJtcPu9zxZX8VBNNfvJ5LFR4EUZCTiDstahG8Sw/w400-h115/OneLake%20folder%20in%20Public%20Preview.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;"><span style="font-family: "Times New Roman", serif; font-size: 18px; text-align: left;">Fig 2: OneLake is like OneDrive on a PC</span></div><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><br /></p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">Single unified Managed
and Governed SaaS Data Lake </span></b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;"><o:p></o:p></span></p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">All
Fabric items keep their data in OneLake, so there are no data silos. OneLake is fully
compatible with Azure Data Lake Storage Gen2 at the API layer, which means it
can be accessed as if it were an ADLS Gen2 account.<o:p></o:p></span></p><p>
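<p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: Times New Roman, serif; font-size: 13.5pt;">To illustrate that ADLS Gen2 compatibility, here is a small, hedged Python sketch that lists files in a lakehouse through the regular Azure Data Lake Storage SDK. The workspace and lakehouse names are placeholders, and it assumes the OneLake DFS endpoint and an identity that already has access to the workspace.</span></p><pre>
# Hedged sketch: OneLake exposed through the ADLS Gen2 API surface.
# "MyWorkspace" and "MyLakehouse" are placeholder names.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service_client = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

# The Fabric workspace behaves like a file system (container) ...
file_system_client = service_client.get_file_system_client("MyWorkspace")

# ... and each item is just a folder, e.g. the Files area of a lakehouse.
for path in file_system_client.get_paths(path="MyLakehouse.Lakehouse/Files"):
    print(path.name)
</pre>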
</p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">Let's
investigate some of the benefits of OneLake:</span><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;"><o:p></o:p></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWN2t6vqIHM03miDuF_I9noMzdldMUocZ_ZT_Vf1F0ROYsBndfjy3srjRiBUp6eO34abIrA7EphMKScLtrqnZAOWR8uNDeaI5TF9afNiFoqw04Ve-d3um-ByDQnbTMiGDY98Off6XDkyRodTEW-tj7hxd8oE5V1B6PgXHQup7DPGXApt44R-tqQCgg9w/s946/Unified%20governance.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="598" data-original-width="946" height="202" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWN2t6vqIHM03miDuF_I9noMzdldMUocZ_ZT_Vf1F0ROYsBndfjy3srjRiBUp6eO34abIrA7EphMKScLtrqnZAOWR8uNDeaI5TF9afNiFoqw04Ve-d3um-ByDQnbTMiGDY98Off6XDkyRodTEW-tj7hxd8oE5V1B6PgXHQup7DPGXApt44R-tqQCgg9w/s320/Unified%20governance.png" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3: Unified management and governance</div><div class="separator" style="clear: both; text-align: center;"><ul type="disc">
<li class="MsoNormal" style="line-height: normal; text-align: left;"><span style="font-family: "Times New Roman",serif; font-size: 13.5pt; mso-fareast-font-family: "Times New Roman";">OneLake comes automatically
provisioned with every Microsoft Fabric tenant with no infrastructure to
manage.<o:p></o:p></span></li>
</ul><p></p><ul type="disc"><li class="MsoNormal" style="line-height: normal; text-align: left;"><span style="font-family: "Times New Roman",serif; font-size: 13.5pt; mso-fareast-font-family: "Times New Roman";">Any data in OneLake works with out-of-the-box
governance such as data lineage, data protection, certification, catalog
integration, etc. </span><span style="font-size: medium;">Please note that this feature is not part of the public preview.</span></li></ul><p></p><ul type="disc">
<li class="MsoNormal" style="line-height: normal; text-align: left;"><span style="font-family: "Times New Roman",serif; font-size: 13.5pt; mso-fareast-font-family: "Times New Roman";">OneLake enables distributed ownership. Different
workspaces allow different parts of the organization to work independently
while still contributing to the same data lake.<o:p></o:p></span></li>
<li class="MsoNormal" style="line-height: normal; text-align: left;"><span style="font-family: "Times New Roman",serif; font-size: 13.5pt; mso-fareast-font-family: "Times New Roman";">Each workspace can have its own administrator, access
control, region, and capacity for billing.<o:p></o:p></span></li>
</ul></div><div><br /></div><div><b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt; line-height: 107%;">Do you have requirements that data must reside in those countries?</span></b></div><div><b><br /></b></div><div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi2cKpxo7yzwLu_AjDTwkLwVMsqTXzKNvf-sYKjHMR5S52FmM3d2wbk1ny1zOfMymlxP8U987n2Sl8qVuPvpW0JDahCTr7jKt3-EzPuarPTInkkNTsugFGZnU2PJ1HZM7U8vVLZ4E2yqWVVJ_UDamnfkIVVwZ5Yw9Y9931crpRhm7y-BLxMJ2-r6Mrjag/s1463/Logically%20span%20the%20world.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="871" data-original-width="1463" height="191" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi2cKpxo7yzwLu_AjDTwkLwVMsqTXzKNvf-sYKjHMR5S52FmM3d2wbk1ny1zOfMymlxP8U987n2Sl8qVuPvpW0JDahCTr7jKt3-EzPuarPTInkkNTsugFGZnU2PJ1HZM7U8vVLZ4E2yqWVVJ_UDamnfkIVVwZ5Yw9Y9931crpRhm7y-BLxMJ2-r6Mrjag/s320/Logically%20span%20the%20world.png" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 4: OneLake covers data residency</div><br /></div><div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">Yes, your requirement is covered through OneLake. If
you're concerned about how to effectively manage data across multiple countries
while meeting local data residency requirements, fear not - OneLake has got you
covered! With its global span, OneLake enables you to create different
workspaces in different regions, ensuring that any data stored in those
workspaces also resides in their respective countries. Built on top of the
mighty Azure Data Lake Storage Gen2, OneLake is a powerhouse solution that can
leverage multiple storage accounts across different regions, while virtualizing
them into one seamless, logical lake. So go ahead and take the plunge - OneLake
is ready to help you navigate the global data landscape with ease and
confidence!<o:p></o:p></span></p></div><div><b><br /></b></div><div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">Data Mesh as a Service:</span></b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;"><o:p></o:p></span></p></div><div><div><b><br /></b></div><div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">OneLake gives a true data mesh as a service. Business
groups can now operate autonomously within a shared data lake, eliminating the
need to manage separate storage resources. The implementation of the data mesh
pattern has become more streamlined. OneLake enhances this further by
introducing domains as a core concept. A business domain can have multiple
workspaces, which typically align with specific projects or teams.<o:p></o:p></span></p></div><div><br /></div><div><b><br /></b></div><div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">Open Data Format</span></b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;"><o:p></o:p></span></p></div><div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">Simply, no matter which item you start
with, they will all store their data in OneLake similar to how Word, Excel, and
PowerPoint save documents in OneDrive.<o:p></o:p></span></p>
<p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;"> </span></p>
<p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">You will see files and folders just like you
would in a data lake today. All workspaces are going to be folders, each data
item will be a folder. Any tabular data will be stored in Delta Lake format.
There are no new proprietary file formats for Microsoft Fabric. Proprietary
formats create data silos. Even the data warehouse will natively store its data in Delta
Lake parquet format. While Microsoft Fabric data items will standardize on
delta parquet for tabular data, OneLake is still a Data Lake built on top of
ADLS gen2. It will support any file type, structured or unstructured.<o:p></o:p></span></p></div><div><b><br /></b></div><div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">Shortcuts/Data Virtualization</span></b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;"><o:p></o:p></span></p></div><div><b><br /></b></div><div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">Shortcuts virtualize data across domains and
clouds. A shortcut is nothing more than a symbolic link that points from one
data location to another. Just like you can create shortcuts in Windows or
Linux, the data will appear in the shortcut location as if it were physically
there.<o:p></o:p></span></p></div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEinihBUTNauTcJAEDHMBvkC9j52zBtYnHwo3m8As3JN7LMgvxt3Dc3LbmU5QT3U7LbmGYFr3pDUUpsp_02X46RZuXyZPzSvAa7bwYGo5pFHBbpThmqimQW94-6udvczs1pVIxRBcn032zq6MMag2EtAAffn8_JB1tHrTwLv_XUcqYD-mqWQ8L1ldX5o5Q/s928/shortcut.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="598" data-original-width="928" height="206" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEinihBUTNauTcJAEDHMBvkC9j52zBtYnHwo3m8As3JN7LMgvxt3Dc3LbmU5QT3U7LbmGYFr3pDUUpsp_02X46RZuXyZPzSvAa7bwYGo5pFHBbpThmqimQW94-6udvczs1pVIxRBcn032zq6MMag2EtAAffn8_JB1tHrTwLv_XUcqYD-mqWQ8L1ldX5o5Q/s320/shortcut.png" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">fig 5: Shortcuts</div><div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">As shown in above fig 4, if you have existing
data lakes stored in ADLS Gen2 or in Amazon S3 buckets, these lakes can
continue to exist and be managed externally while still appearing in OneLake in Microsoft Fabric.</span></p><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">Shortcuts will help to avoid data movement or
duplication. It’s easy to create Shortcuts from Microsoft Fabric as shown in Figure
6:<o:p></o:p></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgawZ_CU82IKJHRZ0O0IAxUhYzd_EW5l0EEeMscH9zdyxW45YCCTFjb8xaPlPl9sszMvYjKK8vjEAPaIVydU4O1lRSmtYN-T4rO1Tb3Fa5iF0OdOyPzNoA55P5ad3wnTQCEaSe90Uk9pwVQCuR5podBSY9FIXRfDODreFF12nAk8V-RsLFCzaCGrK6vUw/s1243/Create%20Shortcut.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="760" data-original-width="1243" height="245" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgawZ_CU82IKJHRZ0O0IAxUhYzd_EW5l0EEeMscH9zdyxW45YCCTFjb8xaPlPl9sszMvYjKK8vjEAPaIVydU4O1lRSmtYN-T4rO1Tb3Fa5iF0OdOyPzNoA55P5ad3wnTQCEaSe90Uk9pwVQCuR5podBSY9FIXRfDODreFF12nAk8V-RsLFCzaCGrK6vUw/w400-h245/Create%20Shortcut.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;"><span style="font-family: "Times New Roman", serif; font-size: 18px; text-align: left;"> Fig 6: Shortcuts from Microsoft Fabric</span></div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><br /></p></div></div><div><div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">OneLake Security</span></b><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;"><o:p></o:p></span></p></div><div><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;"><br /></span></div><div><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">In the current preview, data in OneLake is
secured at the item or workspace level. A user will either have access or
not. Additional engine-specific security can be defined in the T-SQL
engine. These security definitions will not apply to other engines. Direct
access to the item in the lake can be restricted to only users who are allowed
to see all the data for that warehouse. </span></div><div>
<p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: "Times New Roman", serif; font-size: 13.5pt;">In addition, Power BI
reports will continue to work against data in OneLake as the analysis services
can still leverage the security defined in the T-SQL engine through DirectQuery
mode and can sometimes still optimize to Direct Lake mode depending on the
security defined.<o:p></o:p></span></p></div><div style="font-weight: bold;"><br /></div></div><div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0cm;"><span style="font-family: Times New Roman, serif;"><span style="font-size: 13.5pt;">In summary, OneLake is a revolutionary
advancement in the data and analytics industry, surpassing my initial
expectations. It transforms Data Lakes into user-friendly OneDrive-like
folders, providing unprecedented convenience. The delta file format is the
optimal choice for Data Engineering workloads. With OneLake, Data Engineers,
Data Scientists, BI professionals, and business stakeholders can collaborate
more effectively than ever before. To find out more about Microsoft Fabric and
OneLake, please visit </span></span><span style="color: #0000ee; font-family: Times New Roman, serif;"><span style="font-size: 18px;"><u><a href="https://azure.microsoft.com/en-us/blog/introducing-microsoft-fabric-data-analytics-for-the-era-of-ai/" target="_blank">Microsoft Fabric Document</a></u></span></span><span style="font-family: Times New Roman, serif;"><span style="font-size: 13.5pt;"><a href="https://azure.microsoft.com/en-us/blog/introducing-microsoft-fabric-data-analytics-for-the-era-of-ai/" target="_blank">.</a><o:p></o:p></span></span></p></div><div><b><br /></b></div><p></p>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-2635979041680311167.post-86651324875495653902023-04-23T17:08:00.009-04:002023-04-23T21:32:28.483-04:00How to solve Azure hosted Integration Runtime (IR) validation error in Synapse Analytics?<p>This blog post shares recent learning while working with Data flows in Azure Synapse Analytics.</p><p>The ETL pipeline was developed using Azure Synapse Data Flows. However, when attempting to merge the code changes made in the feature branch into the main branch of the DevOps code repository, a validation error occurred, as shown below in Figure 1:</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhYzrmtUnBkZ30DXDnVBLkEOMP117LPQmPEqLreIbMpEktYY4ULluxIQD_u-rh1EaqcMefw6tvs5Q6onjjioPqrLZK1pydxzIlxushftjOKzOzNm92cDeGa_WchnIjKxfs8yf_XY23xf2qEzQGitV4rmSgi2x2LpSzogpN0NoBmiiUuRF9EklU_9e7EJw/s886/validation%20error%20Azure%20IR.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="438" data-original-width="886" height="198" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhYzrmtUnBkZ30DXDnVBLkEOMP117LPQmPEqLreIbMpEktYY4ULluxIQD_u-rh1EaqcMefw6tvs5Q6onjjioPqrLZK1pydxzIlxushftjOKzOzNm92cDeGa_WchnIjKxfs8yf_XY23xf2qEzQGitV4rmSgi2x2LpSzogpN0NoBmiiUuRF9EklU_9e7EJw/w400-h198/validation%20error%20Azure%20IR.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: Validation error</div><div class="separator" style="clear: both; text-align: center;"><br /></div><p>It is worth noting that the same pipeline was executed in Debug mode, and it ran successfully without encountering any errors, as depicted in Figure 2.:</p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEhxcRnBWyu3HJK6k0vOiDAPdpSCT_thmM6pohaJExSYR2BwzOSiXs8E95g0eTBNSVhFu-U68vnwCukuijYpzABaMXU-lee2t9Vz8aMlVIqp4__9AqLSCWV5cMLbRImfsEfdRg7YzWg4x2l09TIlx7Xv6U55VS0mufh6n8s2oGkO1XUYCHk9S0KTrRT8ag" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="490" data-original-width="1008" height="195" src="https://blogger.googleusercontent.com/img/a/AVvXsEhxcRnBWyu3HJK6k0vOiDAPdpSCT_thmM6pohaJExSYR2BwzOSiXs8E95g0eTBNSVhFu-U68vnwCukuijYpzABaMXU-lee2t9Vz8aMlVIqp4__9AqLSCWV5cMLbRImfsEfdRg7YzWg4x2l09TIlx7Xv6U55VS0mufh6n8s2oGkO1XUYCHk9S0KTrRT8ag=w400-h195" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2: Successfully run on debug mode</div><br /><br /><p></p><p>On the one hand, when trying to merge the code into the main branch from the feature branch it throws a validation error, on the other hand, the pipeline executed successfully in the debug mode. 
It seems a bug and reported it to the Microsoft Synapse team.</p><p>The validation error needed to be fixed so that the code can be released to the Production environment. The Azure Integration run time (IR) was used in the Data flows created by using the Canada Central region. However, IR must need to use 'Auto Resolve' as shown in Fig 3. </p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEipF_3q-Ueb7hLgqMZsUPPujQHoKv7RtbU60bfRCPJEsN3PgH54Cyw-F_oAeQn54BMwJ5G0VC8MYqYVVCu88cX4walSLyOOH3VJOd5ELEP6-L1mmWXCpMgDUxjV8Lyujtjryz2d9KH4KIr3vzN8Ro14zclsaRy99_8mglrbZyBMnCFBm69MudEIoa5tFw" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="673" data-original-width="719" height="240" src="https://blogger.googleusercontent.com/img/a/AVvXsEipF_3q-Ueb7hLgqMZsUPPujQHoKv7RtbU60bfRCPJEsN3PgH54Cyw-F_oAeQn54BMwJ5G0VC8MYqYVVCu88cX4walSLyOOH3VJOd5ELEP6-L1mmWXCpMgDUxjV8Lyujtjryz2d9KH4KIr3vzN8Ro14zclsaRy99_8mglrbZyBMnCFBm69MudEIoa5tFw" width="256" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3: Region as 'Auto Resolve'</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: left;">And used the newly created IR in the pipeline which resolved the issue.</div><div class="separator" style="clear: both; text-align: left;"><br /></div>In summary, though your Azure resources can be created under one particular region e.g. all our resources are created under the region Canada Central, but; for Synapse Data Flows activity you need to create Azure IR by choosing Auto Resolve as the region.<p></p><p><br /><br /></p>Unknownnoreply@blogger.com0Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-4902306467814810382023-02-05T11:13:00.364-05:002023-02-19T11:40:26.513-05:00How to implement Continuous integration and delivery (CI/CD) in Azure Data Factory<p>Continuous Integration and Continuous Deployment (CI/CD) are integral parts of the software development lifecycle. As a Data Engineer/ Data Professional when we build the ETL pipeline using Azure Data Factory (ADF), we need to move our code from a Development environment to Pre-Prod and Production environment. One of the ways to do this is by using Azure DevOps.</p><h2 style="font-weight: 400;">Target Audience:</h2><p>This article will be helpful for below mentioned two types of audiences.</p><p>Audience Type 1: Using DevOps code repository in ADF but CI/CD is missing</p><p>Audience Typ 2: Using ADF for ETL development but never used DevOps repository and CI/CD</p><p>For Audience Type 1 you can follow the rest of the blog to implement CI/CD. And for audience type 2, you need to first connect ADF with the DevOps repository, and to do so please follow this <a href="https://www.allaboutdata.ca/2022/01/how-to-integrate-azure-devops-with.html" target="_blank">blog post</a> and then follow the rest of this blog post for the CI/CD.</p><p>This blog post will describe how to set up CI/CD for ADF. 
There are two parts to this process one called 1) Continuous Integration (CI) and 2) Continuous Deployment (CD).</p><h2 style="font-weight: 400;">1) Continuous Integration (CI)</h2><p>First, you need to log in to the Azure DevOps site from your organization and click the pipelines as shown in fig 1.</p><p><img alt="Pipeline creation" class="alignnone size-medium wp-image-4151608" height="144" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/1_1_Pipeline-creation-step1-300x144.png" width="300" /></p><p>Fig 1: DevOps pipelines</p><p>And then click “New pipeline” as shown in fig 2</p><p><img alt="New Pipeline" class="alignnone size-medium wp-image-4151609" height="41" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/1_2_New-Pipeline-300x41.png" width="300" /></p><p>Fig 2: New pipeline creation</p><p>And then follow the few steps to create the pipeline:</p><p>Step 1: Depending on where your code is located, choose the option. In this blog post, we are using the classic editor as shown below in fig 3.</p><p id="KfFzMAq"><a href="https://www.sqlservercentral.com/wp-content/uploads/2023/02/img_63f237d325bcd.png"><img alt="" class="alignnone wp-image-4151607 size-medium" height="151" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/img_63f237d325bcd-300x151.png" width="300" /></a></p><p>Fig 3: Choose the right source or use the classic editor</p><p>Step 2: In this step, you need to choose the repository that you are using in ADF and make sure to select the default branch. We have chosen ADF_publish as a default branch for the builds.</p><p><img alt="" class="alignnone size-medium wp-image-4151615" height="183" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/ADF-Publish-branch-300x183.png" width="300" /> </p><p> Fig 4: selecting the branch for the pipeline<br /><br /></p><p>Step 3: Create an Empty job by clicking the Empty job as shown in below figure 5</p><p><img alt="" class="alignnone size-medium wp-image-4151628" height="68" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/select-empty-job-300x68.png" width="300" /></p><p>Fig 5: Selected Empty job</p><p>Step 4: You need to provide the agent pool and agent specification at this step. Please choose Azure Pipelines for the agent pool and windows latest for the Agent specification as shown in figure 6.</p><p><img alt="" class="alignnone size-medium wp-image-4151617" height="146" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/Agent-pool-information-300x146.png" width="300" /></p><p>Fig 6: Agent pool and Agent specification</p><p> </p><p>Now click "Save and Queue" to save the build pipeline with the name “Test_Build Pipeline” as shown below figure:</p><p><img alt="" class="alignnone size-medium wp-image-4151619" height="109" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/Build-pipeline-name-300x109.png" width="300" /></p><p>Fig 7: Saved the build pipeline</p><p>The build pipeline is completed, and the next part is creating the release pipeline which will do the continuous deployment (CD) means the movement of the code/artifacts from Development to the pre-prod and Production environment.</p><h2 style="font-weight: 400;">2) Continous Deployment (CD)</h2><p>The very first step of Continuous Deployment is to create a new release pipeline. 
And this release pipeline will have a connection with the previously created build pipeline named "Test_Build Pipeline".</p><p><img alt="" class="alignnone size-medium wp-image-4151622" height="217" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/Create-New-Release-pipeline-300x217.png" width="300" /><br /> Fig 8: Release Pipeline</p><p>As soon as you hit the new release pipeline following step will appear. Please close the right popup screen and then click +Add on the artifact as shown below figure:</p><p> </p><p><img alt="" class="alignnone size-medium wp-image-4151640" height="133" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/Artifacts-300x133.png" width="300" /><br /> Fig 9: Create the artifact</p><p>Next step to connect the build pipeline and the release pipeline, you will find out the build pipeline that you created at the CI process earlier and choose the item “Test_Build Pipeline”</p><p><img alt="" class="alignnone size-medium wp-image-4151621" height="123" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/Connecting-build-and-release-pipeline-300x123.png" width="300" /> </p><p>Fig 10: connecting the build pipeline from the release pipeline<br /><br /></p><p>Click on Stage 2 and select an empty job as shown in fig 10</p><p><img alt="empty job" class="alignnone size-medium wp-image-4151627" height="145" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/Release-PL-choose-empty-job-300x145.png" width="300" /><br /> Fig 11: Empty job under release pipeline</p><p>After the last steps the pipeline will look like the below, now please click on 1 job, 0 tasks. We need to create an ARM template deployment task</p><p><img alt="job and tasks" class="alignnone size-medium wp-image-4151625" height="186" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/Release-pipeline-1-job-0-task-300x186.png" width="300" /></p><p>Fig 12: Release pipeline job and task</p><p>Search for ARM template deployment, as shown in fig 13. ARM template Deployment task will create or update the resources and the artifacts e.g. Linked services, datasets, pipelines, and so on.</p><p><img alt="search for ARM template" class="alignnone size-medium wp-image-4151613" height="124" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/Add-ARM-deployment-task-300x124.png" width="300" /><br /> Fig 13: search for the ARM template</p><p>After adding the ARM template, you need to fill in the information for 1 to 12 as shown in the below diagram:</p><p><img alt="ARM template configuration" class="alignnone size-medium wp-image-4151618" height="278" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/ARM-template-configuration-300x278.png" width="300" /></p><p>Fig 14: ARM Template information</p><ol><li>Display Name: Simply put any name to display for the deployment template</li></ol><ol start="2"><li>Deployment Scope: You have options to choose the deployment scope from Resource Group, Subscription, or Management Group. Please choose Resource Group for ADF CI/CD.</li></ol><ol start="3"><li>Azure Resouce Manager Connection: This is a service connection; if you already have a service principal you can set up a service connection or create a completely new one. The Cloud Infrastructure team in your organization can set it up for you. 
You will also find details about Service Connection in Microsoft<a href="https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml" target="_blank"> Learn</a>.</li></ol><ol start="4"><li>Subscription: Please choose the right subscription where the resource group for the ADF instance resided for the pre-Prod or Production environment.</li></ol><ol start="5"><li>Action: Please Choose "Create or Update resource group" from the list.</li></ol><ol start="6"><li>Resource Group: The resource group name where the ADF instance is lying.</li></ol><ol start="7"><li>Location: Resource location e.g. I have used Canada Central</li></ol><ol start="8"><li>Template Location: Please choose either "Linked Artifact" or "URL of the file". I have chosen "Linked Artifact" which will connect the ARM template which is already built via Continuous Integration (CI) process.</li></ol><ol start="9"><li>Template: Please choose the file "ARMTemplateForFactory.json"</li></ol><ol start="10"><li>Template parameters: Please choose the template parameter file: "ARMTemplateParametersForFactory.json"</li></ol><ol start="11"><li>OverrideTemplate Parameters: Make sure to overwrite all the parameters including pre-prod or production environment's database server details and so on. Go through all the parameters that exist in the Development environment you need to update those for the upper environments.</li></ol><ol start="12"><li>Deployment mode: Please choose "Incremental" for the Deployment mode.</li></ol><p>After putting all the above information please save it. Hence your release pipeline is completed.</p><p>However, to make this pipeline runs automated, means when you push the code from the master to publish branch you need to set up a Continuous deployment trigger as shown in fig 15</p><p><img alt="release trigger" class="alignnone size-medium wp-image-4151629" height="120" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/Trigger-1-300x120.png" width="300" /></p><p>fig 15: Continuous deployment Trigger</p><p> </p><p>The last step of the CD part is creating a Release from the Release pipeline you just created. To do so please select the Release pipeline and create a release as shown in fig 16. </p><p><img alt="Create release" class="alignnone size-medium wp-image-4151623" height="100" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/Create-release-300x100.png" width="300" /></p><p>Fig 16: Create a Release from the release pipeline</p><p> You are done with CI/CD. Now to test it please go to your ADF instance choose the master branch and then click the 'Publish' button as shown below in figure 17. The C/CD process starts and the codes for pipelines, Linked Services, and Datasets move from the Development environment to Pre-Prod and then to Production.</p><p><img alt="publish from ADF" class="alignnone size-medium wp-image-4151614" height="112" src="https://www.sqlservercentral.com/wp-content/uploads/2023/02/ADF-instance-300x112.png" width="300" /></p><p>Fig 17: Publish from Azure Data Factory (ADF)</p><p>The blog post demonstrates step by step process of implementing CI/CD for ADF by using ARM templates. 
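<p>A note on step 11 above (OverrideTemplate Parameters): the value of that field is a single string of "-parameterName value" pairs. The parameter names come from the ARMTemplateParametersForFactory.json generated for your own factory, so the example below is purely hypothetical; the "$(...)" tokens are Azure DevOps pipeline variables, which are a convenient place to keep environment-specific values and secrets.</p><pre>
-factoryName "adf-myorg-prod" -AzureSqlDatabase_connectionString "$(ProdSqlConnectionString)" -AzureKeyVault_properties_typeProperties_baseUrl "https://kv-myorg-prod.vault.azure.net/"
</pre>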
However, in addition to ARM template deployment; there is another ARM template named "Azure Data Factory Deployment (ARM)" which can also be used to implement CI/CD for ADF.</p><p> </p><div><div><div><p></p></div></div></div>Unknownnoreply@blogger.com0Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-6673998977772451662022-12-02T13:06:00.000-05:002023-01-17T09:58:44.557-05:00Step-by-step guidelines to provision Azure Synapse Analytics for your organization<p>This blog post will cover end-to-end deployment guidelines including the right roles and permission required to provision Azure Synapse Analytics for your organization.</p><p>Prerequisites: You need to have an Azure Portal login.</p><p>After logging into the Azure Portal search with the keyword "Synapse", you will find Azure Synapse Analytics as it is shown in the below diagram 1:</p><p><br /></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiIHYPEbImfaW-fYW9B2L9gjos5bs8WlrM0cDsgL6zyShu3YTtCv5rZYWtKqYKXQi48InLlrgpv77HBglY1qh0mSrcpWOK7t8gJpe7cZ6wxqI3vKW0ojR6noS4-afRD_pCc_1e-AO_262Bz50dEVKiq_u3a7Jql5qDGlG7Ayo5x85jL64rPVpzfPp_Z2g/s1450/Finding%20Azure%20Synapse%20Analyrics.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="722" data-original-width="1450" height="199" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiIHYPEbImfaW-fYW9B2L9gjos5bs8WlrM0cDsgL6zyShu3YTtCv5rZYWtKqYKXQi48InLlrgpv77HBglY1qh0mSrcpWOK7t8gJpe7cZ6wxqI3vKW0ojR6noS4-afRD_pCc_1e-AO_262Bz50dEVKiq_u3a7Jql5qDGlG7Ayo5x85jL64rPVpzfPp_Z2g/w400-h199/Finding%20Azure%20Synapse%20Analyrics.jpg" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: Finding out Azure Synapse Analytics</div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div>After you find it click the item and hit the "Create" button which will take you to the next screen which will look like below (Fig 2).<br /><br /><div>There are five different sections (tabs) you need to fill up to provision Azure Synapse Analytics. You will find each section elaborated on details below. <br /><p></p><p><u style="font-weight: bold;"> Basics:</u> You need to fill up the basic information about the Synapse workspace.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhnJed_ohnKp5Bh0TdI9kYPQoBVr_VbYgnw2oDPaldQvjX6qj8etn2cnFG3Qma5Lw0jMYfEFowTiXJO9ifOs28nnEkLNf1w-LMcNF0T79ZlXQgKbhLYJVfjnkUVRrSOxfpYANfiKpIMkbvqMRVR9TSYuPnAkBuwhchJIiUPOncTdwEC-va4YHq_8zB-ag/s1061/Synapse%20information.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="936" data-original-width="1061" height="353" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhnJed_ohnKp5Bh0TdI9kYPQoBVr_VbYgnw2oDPaldQvjX6qj8etn2cnFG3Qma5Lw0jMYfEFowTiXJO9ifOs28nnEkLNf1w-LMcNF0T79ZlXQgKbhLYJVfjnkUVRrSOxfpYANfiKpIMkbvqMRVR9TSYuPnAkBuwhchJIiUPOncTdwEC-va4YHq_8zB-ag/w400-h353/Synapse%20information.PNG" width="400" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;">Fig 2: Basic steps of Synapse configuration</div><br /><p></p><p>1. 
<u>Choose your subscription</u>: Please choose the subscription where you want to provision the Azure Synapse Analytics resource</p><p>2. <u>Choose or create a Resource Group</u>: If you already have a resource group then you can choose the resource group or create a new one.</p><p>3. <u>Managed resource group</u>: Managed resource group will be automatically created, if you want to give a name please fill up this field otherwise it will be chosen automatically.</p><p>4. <u>Workspace name</u>: You need to pick up a globally unique name, it will suggest if the name is not unique.</p><p>5. <u>Region</u>: Please pick the region where you want to provision the Azure Synapse Analytics resource. Since I am in the Canadian region so put the region "Canada Central"</p><p>6.<u> Data Lake Storage</u>: Please choose Data Lake from the subscription or put it manually.</p><p>7. <u>Data Lake Storage Account</u>: This is the Data Lake Storage you have already created or if you need to create one please do so by choosing "Create new"</p><p>8. <u>Data Lake Storage File System</u>: The file System name is a container name in the Data Lake storage.</p><p><br /></p><p>This is how it looks after filling up the Basics information (shown in Fig 3)</p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhW1xhrZtGYkMFj0dhCuwSPeFSmrJSOcLMg856vhh8_p_ZRtCgVTRSBb-hGNXu8pucJ_uUGmFYXjipyftUZB9VIex9NXrJiQA55gWp-OVIJSUhEtowUakIN5y79bn5OIq0fIbj2e8228wbV9pFHSyyLmgVf0pco90YKl4anYAe4jiGoEVV29rYvF5uGKQ/s968/after%20filling%20up.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="825" data-original-width="968" height="341" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhW1xhrZtGYkMFj0dhCuwSPeFSmrJSOcLMg856vhh8_p_ZRtCgVTRSBb-hGNXu8pucJ_uUGmFYXjipyftUZB9VIex9NXrJiQA55gWp-OVIJSUhEtowUakIN5y79bn5OIq0fIbj2e8228wbV9pFHSyyLmgVf0pco90YKl4anYAe4jiGoEVV29rYvF5uGKQ/w400-h341/after%20filling%20up.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3: After filling up the basic information</div><br /><p><br /></p><p><b><u>Security:</u></b></p><p>Let's move to the Security tab as shown below in figure 4:</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggTqca3z9WRuPPrISn4B7MjMYVUHkrv5lE_ntq7VG7rSM2xpuvfLkyKd6i_cYzmLwDq_B6QLojRaXCFR-3TLWlTznMHRVUPuODQLfKzMqrMSglDGBiYpmgMI1wakVrmgWUtXzQTaCpbMPDmDkYWUNebNIGXCNuym81L17Gu4zEEO5ufVbu4g9RJbMzIw/s852/Security%20Tab.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="810" data-original-width="852" height="380" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggTqca3z9WRuPPrISn4B7MjMYVUHkrv5lE_ntq7VG7rSM2xpuvfLkyKd6i_cYzmLwDq_B6QLojRaXCFR-3TLWlTznMHRVUPuODQLfKzMqrMSglDGBiYpmgMI1wakVrmgWUtXzQTaCpbMPDmDkYWUNebNIGXCNuym81L17Gu4zEEO5ufVbu4g9RJbMzIw/w400-h380/Security%20Tab.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 4: Security </div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div>The Security part is to connect with both serverless and dedicated SQL Pools. You can choose either local user and AAD login or only ADD login. I have chosen SQL and AAD login like in the old days when you provision SQL database instances. 
So you have both options available whenever or if required. <div><br /></div><div>And the check box "Allow network to Data Lake Storage Gen2 account" will be automatically chosen if you put the Data Lake Storage information the under the "Basics" tab. Synapse Serverless SQL pool required communication with Data Lake Storage and in this step Synapse workspace network allows to access a Data Lake Storage account.<br /><p><br /></p><p><u style="font-weight: bold;">Networking:</u> You need to choose the right networking options for your organization. If you are provisioning for demo purposes then you can allow the public network or allow outbound traffic. However, if data security is top of your mind I would suggest following the below setup (fig 5) for the networking.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg1L8LYe5as_-nuqYX3NXXuMtilBAxntD4v4dKBslgSs_-aJeRLv_0B1ibu0yVuOtgE5HcY5UhKy3VvOwk9aVHILnynCDO75GHbPvO7O2a2MDo-rzQBqqeI6oiEhb4nkyXRKdaGPVgxQ__nVxfCq004552JfqD_KTZW9z3AxdeR34BV548Qx-cgnwkVqw/s1156/Networking.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="883" data-original-width="1156" height="305" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg1L8LYe5as_-nuqYX3NXXuMtilBAxntD4v4dKBslgSs_-aJeRLv_0B1ibu0yVuOtgE5HcY5UhKy3VvOwk9aVHILnynCDO75GHbPvO7O2a2MDo-rzQBqqeI6oiEhb4nkyXRKdaGPVgxQ__nVxfCq004552JfqD_KTZW9z3AxdeR34BV548Qx-cgnwkVqw/w400-h305/Networking.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 5: Networking</div><p><b><u>1. Managed virtual network:</u></b> Enable the Managed Virtual network so that communication between resources inside Synapse happens through the private network.</p><p><b style="text-decoration-line: underline;">2. Managed Private End Point: </b>Create a managed private endpoint for the primary storage account (we did a storage account under the Basic tab and step #6)</p><p><b><u>3. </u></b><b style="text-decoration-line: underline;">Allow outbound Traffic:</b> I have set this "No" for not limiting only the approved Azure AD tenants. However, the data security is tightened through the next point #4</p><p><b style="text-decoration-line: underline;">4. Public Network Access:</b> Public network access has been disabled, which means there is no risk of exposing the data to the public network, and communication between resources will take place via private endpoints.</p><p><u style="font-weight: bold;">Tags:</u> It's the best practice to have Tags. 
The tagging helps identify resources and deployment environments and so on.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3bQx6pOpvoZ9HB0T5pOx7EkuyYFGp5jxTW97ZrGWymqmfX-znCgucynn2mysS3k--UuDaK3ul-6Nywt8FScNRnleZP7iHtuuMXQwpa46uYN9cd18wS4RvRffdbS-UNSEePZuw4Nh74uHt7vhU2DeyI0f4iPY77KGvQ3hV516kaQnm2_lwDuFBD9iybg/s1173/Tags.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="503" data-original-width="1173" height="171" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3bQx6pOpvoZ9HB0T5pOx7EkuyYFGp5jxTW97ZrGWymqmfX-znCgucynn2mysS3k--UuDaK3ul-6Nywt8FScNRnleZP7iHtuuMXQwpa46uYN9cd18wS4RvRffdbS-UNSEePZuw4Nh74uHt7vhU2DeyI0f4iPY77KGvQ3hV516kaQnm2_lwDuFBD9iybg/w400-h171/Tags.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 6: Tags</div><br /><p><u style="font-weight: bold;">Review and Create:</u> It's the final steps that show you a summary for you to review. Please verify your storage account, database details, security, and networking settings before you hit the create button (shown below fig 7)</p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi99pS2hOSxZ4sb75CLB9OWuzJpDJ_JgAOJ77lQo0eIgZJ_zrokpcy6284p_WtQ3EZTf4Zy7FOUl407LQ5s1GsBXLldVJH-iJ72IMQOo6V7L0DSAEHwlPTPTvk4NRCMU7sz82uMsrIa945Uwga3EncspJjtQSB0j4U_f90BQFO_2TUsV06Vgbb5LwFj6w/s1308/Review%20and%20Create.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1308" data-original-width="1290" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi99pS2hOSxZ4sb75CLB9OWuzJpDJ_JgAOJ77lQo0eIgZJ_zrokpcy6284p_WtQ3EZTf4Zy7FOUl407LQ5s1GsBXLldVJH-iJ72IMQOo6V7L0DSAEHwlPTPTvk4NRCMU7sz82uMsrIa945Uwga3EncspJjtQSB0j4U_f90BQFO_2TUsV06Vgbb5LwFj6w/w395-h400/Review%20and%20Create.PNG" width="395" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 7: Review and Create</div><br /><p><br /></p><p><br /></p><p>You have done with provisioning the Azure Synapse Analytics, as an Admin, you can work with the Azure Synapse Analytics. 
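If you prefer to script the same provisioning, the portal steps above map roughly to a single Azure CLI call. The sketch below uses hypothetical resource names, and some flags (for example the networking and managed virtual network options) vary between CLI versions, so treat it as a starting point rather than a definitive command:</p><div style="background-color: #fffffe; font-family: Consolas, 'Courier New', monospace; font-size: 12px; line-height: 16px; white-space: pre;"># hypothetical names; adjust to your subscription, region and Data Lake storage account
az synapse workspace create \
  --name syn-myproject-dev \
  --resource-group rg-analytics-dev \
  --location canadacentral \
  --storage-account stmyprojectdatalake \
  --file-system synapsefs \
  --sql-admin-login-user sqladminuser \
  --sql-admin-login-password "YourStrongPassword123!"</div><p>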
</p><p>However, if any additional team members want to work with the Azure Synapse Analytics tool you need to do a few more steps.</p><p>You need to add a user to the Synapse workspace, as shown in fig 8</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEid5RZ4m0-zq0FBr7qKU9u3N1bKcijUULjNck38Cd0TZ26wnsbGXIXLz6uS4m8zMjYscALR56j9SDgnQqWLaPxXmUwEylc_a6oKWu_oMqDhaPoacjjoQ77Wm-79-hqHXOLB4GS97UCPN6klalqspqthUBihYznER4_M9c3LIchgq5rusyyIPrPqmSH0HQ/s1576/Add%20user%20to%20the%20Synapse%20Workspace.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="650" data-original-width="1576" height="165" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEid5RZ4m0-zq0FBr7qKU9u3N1bKcijUULjNck38Cd0TZ26wnsbGXIXLz6uS4m8zMjYscALR56j9SDgnQqWLaPxXmUwEylc_a6oKWu_oMqDhaPoacjjoQ77Wm-79-hqHXOLB4GS97UCPN6klalqspqthUBihYznER4_M9c3LIchgq5rusyyIPrPqmSH0HQ/w400-h165/Add%20user%20to%20the%20Synapse%20Workspace.PNG" width="400" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;">Fig 8: Synapse workspace</div><div class="separator" style="clear: both; text-align: left;">After clicking "Access control" you will find "+Add" button to add user with the right role. At first you need to choose role as shown in below figure 9.</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEip6zbCd-k6i9blq3UQ6hCru23zvFgFZ5OclsZcKDwo8mdlqDtv5SMjhIJIttbz5CzkWsUqHvupieAKp5XdmKpKLHVZ9wmXgkOQYyxgxGsqV_h1OOtPWAVgQsyTXbY-lbJJ1zO5jNVOXBNa6NKuUw6ch28fHVxEjs9paVRB4QpqFQfuMwWcwpR_tw2Ocw/s1324/Add%20role%20to%20synapse%20workspace.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1324" data-original-width="1153" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEip6zbCd-k6i9blq3UQ6hCru23zvFgFZ5OclsZcKDwo8mdlqDtv5SMjhIJIttbz5CzkWsUqHvupieAKp5XdmKpKLHVZ9wmXgkOQYyxgxGsqV_h1OOtPWAVgQsyTXbY-lbJJ1zO5jNVOXBNa6NKuUw6ch28fHVxEjs9paVRB4QpqFQfuMwWcwpR_tw2Ocw/w349-h400/Add%20role%20to%20synapse%20workspace.PNG" width="349" /></a></div><br /></div> Fig 9: Choosing the right role</div><div>If users are data engineers or developers you may want to choose "contributor" role which I have chosen as shown in fig 9. 
After choosing the role you need to choose members, it can be individual members or AD group members.</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhrnZ7SMPruXAyeCXeVbjjZ_VKaZC58RXryRDuAnsSiquhFke5P1cWOxj9qinQn9IBdH8fBzi80g1cCWV6nScSrhHVL9STvTQAJKjFH2ROgUuCTZJMAwoi6ntMxodcOiaEa-tPVnt2Iih_k9LzMa27FUdZkojB4piQwbNdzAIzIoKMHXhYjsFdYak9Sxw/s2474/Add%20member%20to%20synapse%20workspace.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1303" data-original-width="2474" height="211" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhrnZ7SMPruXAyeCXeVbjjZ_VKaZC58RXryRDuAnsSiquhFke5P1cWOxj9qinQn9IBdH8fBzi80g1cCWV6nScSrhHVL9STvTQAJKjFH2ROgUuCTZJMAwoi6ntMxodcOiaEa-tPVnt2Iih_k9LzMa27FUdZkojB4piQwbNdzAIzIoKMHXhYjsFdYak9Sxw/w400-h211/Add%20member%20to%20synapse%20workspace.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 10: Choosing the right member</div><br /><div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div><div><br /></div>The above fig 10 shown I have chosen a member and then click "Next' button to review and assign the role to the members. You have completed the steps for adding right role and members to the Synapse workspace.<br /><p>After adding the right role and member to the Synapse workspace, you also need to add the user to the Azure Synapse Portal as shown in below fig 11. At first click "Access Control" and then by clicking "+Add" button you can assign members or AD group to the right role. If you are giving access to Data Engineers or Developers they will require Contributor role. In below fig 9, I have given Contributor role to the member.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSvn5V2Q-bbRaT6IrdTDKB2_MQr9NMzm35JaoB334nRBvdeN1myTlkO8T54tCNyd-8Wem7FedEeVn-6x_lSvkNfkZd_qncPZ1NaJSQRVfLEGynOWHTBmqcIq2GKOmLsvBTrHYY_9p-cec6mE5nrjP3KcKgAI4efs2qxILTPlFQTfm63suU7sOG8tS-iQ/s1770/Synapse%20Portal%20Access.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="831" data-original-width="1770" height="188" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSvn5V2Q-bbRaT6IrdTDKB2_MQr9NMzm35JaoB334nRBvdeN1myTlkO8T54tCNyd-8Wem7FedEeVn-6x_lSvkNfkZd_qncPZ1NaJSQRVfLEGynOWHTBmqcIq2GKOmLsvBTrHYY_9p-cec6mE5nrjP3KcKgAI4efs2qxILTPlFQTfm63suU7sOG8tS-iQ/w400-h188/Synapse%20Portal%20Access.PNG" width="400" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;">Fig 11: Synapse administrator from the Synapse Portal</div><br />Hpwever, to have access to Serverless SQL Pool and Linked Service creation the members will require more permission. To know more about Synapse roles please go through this <a href="https://learn.microsoft.com/en-us/azure/synapse-analytics/security/synapse-workspace-synapse-rbac-roles?WT.mc_id=DP-MVP-5004718" target="_blank">Microsoft documentation</a>.</div><div><br /></div><div>In summary, by following up the above step by step guidelines you can provision Azure Synapse Analytics for your organization. 
And please make sure through this process work closely with your organization's cloud infrastructure team who can guide you through all networking and security questions you may have.<br /><p><br /></p><p><br /></p><span face=""Segoe UI", SegoeUI, "Helvetica Neue", Helvetica, Arial, sans-serif" style="background-color: #d7eaf8; color: #171717; font-size: 16px;"></span><p></p></div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div><br /></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-2635979041680311167.post-81165706254809796812022-02-12T21:38:00.008-05:002022-03-27T11:39:30.424-04:00How do you secure sensitive data in a modern Data Warehouse?<p>In 2019 Canadian Broadcasting Corporation (CBC) news reported a massive data breach at the Desjardins Group, which is a Canadian financial service cooperative and the largest federation of credit unions in North America. The report indicated, a "malicious" employee copied sensitive personal information collected by Desjardins from their data warehouse. The data breach compromised the data of nearly 9.7 million Canadians.</p><p>Now the question is, how do we secure data warehouses so that employees of the same organization can't breach the data? We need to make sure sensitive data is protected inside the organization.</p><p>When any IT solution is built in an organization, there are two types of environment that exist, one is called non-production and the other is a production environment. Production and non-production environments are physically separated. A non-Production Environment means an environment for development, testing, or quality assurance, and the solution is not consumed by end-users daily basis from a non-production environment. However, the Production environment is a place where the latest version of the software or IT solution is available and ready to be used by the intended users.</p><p>As stated at the beginning, a rogue employee was involved in the massive data breach at the Desjardins Group. Hence; an organization building a data-driven IT solution needs to work on setting up both Production and Non-Production environments secure way. This article will describe how sensitive data can be protected in both the Production and Non-Production environments.</p><p><b><u>A. Protecting Sensitive Data in Non-Production Environment in a Data Warehouse:</u></b></p><p>In general, a Non-Production environment is not well guarded with security. Different personas can have access to a Non-Production environment in a data warehouse e.g. Developers, Testers, business stakeholders. </p><p>So it's important to protect sensitive data inside the organization. The very first thing we need to do is whenever copying data from any application to a data warehouse (non-production environment) sensitive data need to be scrambled. 
</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEhnyN5t-uGuVF7mVTo4sRUgeF0r7wjI3Xmn4xzB52fjCOkjLY_MHGtkx36ITZrNeyhUbOyXuYDn2fK6CVt0TBEfpQ2YNp_hYKjcsgbHGZuJR2Ehugj1d5uwTfneaVU7Kw4zz1_bLsfhn11mfgERKxO1N5y_mpZSLx2XuWMaB4fVr5t0dlHlEdHZyPgKFA=s845" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="390" data-original-width="845" height="185" src="https://blogger.googleusercontent.com/img/a/AVvXsEhnyN5t-uGuVF7mVTo4sRUgeF0r7wjI3Xmn4xzB52fjCOkjLY_MHGtkx36ITZrNeyhUbOyXuYDn2fK6CVt0TBEfpQ2YNp_hYKjcsgbHGZuJR2Ehugj1d5uwTfneaVU7Kw4zz1_bLsfhn11mfgERKxO1N5y_mpZSLx2XuWMaB4fVr5t0dlHlEdHZyPgKFA=w400-h185" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: Moving data from IT application to non-prod data warehouse</div><br /><p>There are a few steps that can help us to scramble the data in Non-Production Environment in a Data Warehouse:</p><p>Step 1: Business or data steward find the list of sensitive or Personal Identifiable Information (PII) data</p><p>Step 2: Data Engineer or ETL Developer will use any standard tool like Azure Data Factory (ADF) to mask the data and store it in the data warehouse.</p><p>Step 3: Either Test Engineer or Business Stakeholder will verify all the sensitive columns in the database before it releases to the rest of the team.</p><p><br /></p><p><b><u>B. Protecting Sensitive Data in Production Environment in a Data Warehouse:</u></b></p><p>In a Production environment, we can't scramble the data in such a way that is irreversible. We need to keep the original data intact but make sure only the intended users have access to the data. So if we can mask the columns that hold sensitive or PII data in such a way so that only privileged users get access to the data. Below figure shows what is expected from the solution:</p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEjazD_XfKu3z7ZabLW5l1RY9ZZSvzHIQW6LYkvkmGYoEgks-jpC7TIcAMfO4cyWKjZPF84sQ8ho7VQIFKU7EkKYAxAQOrzVPZFh9A9Mc-5xrCpNH1lwhghfPKhmna9XBjN3JCOhWwQ6JVs4sa_R7bRlpi1t-bBeSELGODPd6C_QeRUSSpPMZN_KGLHcYw=s1527" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="624" data-original-width="1527" height="164" src="https://blogger.googleusercontent.com/img/a/AVvXsEjazD_XfKu3z7ZabLW5l1RY9ZZSvzHIQW6LYkvkmGYoEgks-jpC7TIcAMfO4cyWKjZPF84sQ8ho7VQIFKU7EkKYAxAQOrzVPZFh9A9Mc-5xrCpNH1lwhghfPKhmna9XBjN3JCOhWwQ6JVs4sa_R7bRlpi1t-bBeSELGODPd6C_QeRUSSpPMZN_KGLHcYw=w400-h164" width="400" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;">Fig 2: Protect sensitive data from Data Warehouse (PROD)</div><br /><p><br /></p><p>As shown in Fig 2, when users try to access the data via an application such as Power BI, only intended users will be able to see the intact data. Non-intended users will find the data obfuscated. The above-explained process can be done by using dynamic data masking provided by Microsoft Databases. The process only masks the data on the fly at the query time. 
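As an illustration, a minimal T-SQL sketch of dynamic data masking on a hypothetical customer table could look like this (table, column, and role names are placeholders):</p><div style="background-color: #fffffe; font-family: Consolas, 'Courier New', monospace; font-size: 12px; line-height: 16px; white-space: pre;">-- hypothetical table and column names
ALTER TABLE dbo.Customer
  ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

ALTER TABLE dbo.Customer
  ALTER COLUMN PhoneNumber ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');

-- only privileged users or roles see the real values
GRANT UNMASK TO [ReportingAdmins];</div><p>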
If you would like to learn about dynamic data masking, please follow the <a href="https://docs.microsoft.com/en-us/azure/azure-sql/database/dynamic-data-masking-overview?WT.mc_id=DP-MVP-5004718">Microsoft document</a>.</p><p>In Summary, whenever PII data is taken from the operational system to the Non-Production environment to build any analytics solution data need to be scrambled. And in a Production environment, though dynamic data masking can prevent viewing the data by unintended users, however; it's important to properly manage database permission on the Data Warehouse. As well as, make sure to have auditing enabled to track all activities taking place in the Data Warehouse in the Production environment.</p><div class="separator" style="clear: both; text-align: center;"><br /></div><br /><p></p>Unknownnoreply@blogger.com1Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-54930033131769604682021-12-16T23:23:00.005-05:002022-01-15T22:04:20.962-05:00How to integrate Azure DevOps with Azure Synapse Studio?<p>There are two ways you can develop and execute code in Azure Synapse Studio:</p><ol><li>Synapse live development</li><li>Git enabled development.</li></ol><p>By default, Synapse Studio uses Synapse live, as shown in Fig 1. With Synapse live you can't work in a group for the same codebase whereas by enabling Git collaboration, this becomes easy. This article will demonstrate a step-by-step guide to set up Git-enabled development in Synapse Studio.</p><p><img alt="" class="alignnone wp-image-3969317 size-full" height="250" src="https://www.sqlservercentral.com/wp-content/uploads/2022/01/1_synapse-live.png" width="400" /></p><p><strong>Fig 1: Synapse live</strong></p><p>With the Git enabled development approach either you can use Azure DevOps Git or GitHub. This article will guide you using Azure DevOps Git integration.</p><h2>Prerequisites</h2><p>There are two prerequisites before following along with this article:</p><ol><li>Permissions - You must need to have contributor or higher role in the Synapse workspace to connect, edit or delete the source code repository.</li><li>Git Repository - You also need to create the Git repository. You will find more details about <a href="https://www.blogger.com/blog/post/edit/2635979041680311167/5493003313176960468">creating an Azure DevOps repository in this link.</a></li></ol><h2>Choose from Two Different Options</h2><p>There are two ways you can connect Azure DevOps Git from Synapse Studio, either from the global bar<strong> </strong>or from manage hub. You will find details below how to choose from the two options.</p><h3><strong>Option 1: The global bar</strong></h3><p>If you follow the figure 2, select the "Synapse live" drop down menu then you will find "Set up code repository". Choose this option.</p><p><img alt="" class="alignnone wp-image-3969318 size-full" height="166" src="https://www.sqlservercentral.com/wp-content/uploads/2022/01/2.setupcode-repository.png" width="400" /></p><p><strong>Fig 2: Setup code repo from global bar</strong></p><h3>Option 2: The Manage hub</h3><p>From Synapse studio look at the left bottom menu, as shown in figure 3. Those the last icon that looks like a toolbox. This is the Manage selection. Then choose the Git configuration item int eh menu that is shown to the right of this icon. 
In the main pane, select configure.</p><p><img alt="" class="alignnone wp-image-3969319 size-full" height="234" src="https://www.sqlservercentral.com/wp-content/uploads/2022/01/3_setupcode-repository.png" width="400" /></p><p><strong>Fig 3: setup code repo from manage hub</strong></p><p>Either of the the above two options will take you to the next step, which look like Fig 4. By selecting Azure DevOps Git you connect Azure DevOps Git with the Synapse Studio.</p><p><img alt="" class="alignnone wp-image-3969320 size-full" height="206" src="https://www.sqlservercentral.com/wp-content/uploads/2022/01/4configrepo_repository.png" width="400" /></p><p><strong>Fig 4: Choose either DevOps Git or Github</strong></p><p>At the next step you will find one more attribute populated, as shown in the below figure 5. Please select the appropriate Azure Active Directory from your organization.</p><p><img alt="" class="alignnone wp-image-3969321 size-full" height="234" src="https://www.sqlservercentral.com/wp-content/uploads/2022/01/5_configrepo_repository.png" width="400" /></p><p><strong>Fig 5: Connect the AD tenants</strong></p><p>After clicking "Next" you will enter all the necessary information to choose your Git repository which is already created in your organization. Each item shown in Fig 6 is explained below:</p><ol><li>Azure DevOps Organization: In the dropdown you may find more than one organization please select the appropriate organization. It's organization name which have been created when Azure DevOps repository is configured.</li><li>ProjectName: There are more than one project in the list, select the relevant one. This is Azure DevOps repos project name which you created earlier.</li><li>RepositoryName: Please select the right repository from the list or you can also create a repository.</li><li>Collaboration branch: By default, it's master. This is the branch where all other branch will be created from. Code will be merged to this branch from other branches as well as you will publish the code from this branch.</li><li>Publish branch: The Publish branch is the branch in your repository where publishing related ARM templates are stored and updated. In general adf_publish is your publish branch but you can also make any other branch as a publish branch.</li><li>Root folder: Your root folder in your Azure Repos collaboration branch.</li></ol><p> </p><p><img alt="" class="alignnone wp-image-3969322 size-full" height="328" src="https://www.sqlservercentral.com/wp-content/uploads/2022/01/6_final_configrepo_repository.png" width="400" /></p><p><strong>Fig 6: Configure repository</strong></p><p>After completing all the above steps, click "Apply". When this process is successfully completed you should able to see the Git repository branches, as shown in Fig 7.</p><p><img alt="" class="alignnone wp-image-3969325 size-full" height="276" src="https://www.sqlservercentral.com/wp-content/uploads/2022/01/After-connecting-Git.PNG.jpg" width="400" /></p><p><strong>Fig 7: Synapse Studio after connecting with Azure DevOps Git</strong></p><h3>How to disconnect from Azure DevOps Git</h3><p>To disconnect from Azure DevOps Git repo, you need to go to Manage-> Git configuration, as shown in Fig 8. 
There is a Disconnect menu item at the top.</p><p><img alt="" class="alignnone wp-image-3969324 size-full" height="239" src="https://www.sqlservercentral.com/wp-content/uploads/2022/01/8_Disconnect-from.png" width="400" /></p><p><strong>Fig 8: Disconnect from Azure DevOps Git Repo</strong></p><p>Please note that "Disconnect" option will be disabled if you on any other branch than master. So, make sure you choose master branch if you need to disconnect the Azure DevOps Git Repo.</p><p>In summary, there are two ways to Develop and execute code in Azure Synapse and collaboration is only possible with Git enablement. The article depicted how to connect Azure DevOps Git with Azure Synapse Studio as well as how to disconnect them whenever required.</p>Unknownnoreply@blogger.com0Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-74796141119763546102021-09-25T23:22:00.004-04:002021-10-06T09:21:37.749-04:00How to recover if Azure Data Factory AutoResolveIntegrationRuntime become corrupted?<p>I would like to share my recent experience with Azure Data Factory (ADF) where AutoResolveIntegrationRuntime become corrupted and how to fix it. I still don't know how the Integration Runtime (IR) is corrupted and don't expect this may happen to you but if it happens then this article will help you to solve the issue.</p><p class="xmsonormal"><b>Problem statement:</b></p><p class="xmsonormal">In general, the ADF AutoResolveIntegrationRuntime should look like below fig 1.</p><p class="xmsonormal"><br /></p><p class="xmsonormal"></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqRTt5ftLGNSbW6UNgRQzdfk4iQklYJsd8qM8FyuXO36Gl8O6wlY97pPPgYUsaaHmFsBVoP_pSQMxqDvD72C2fCEuZJS_8e_6q_14JhDwxKrLdmvCn7vqaPYrxX2dESVA3Jba3HO_Qo-Ft/s1485/How+IR+should+look+like.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="542" data-original-width="1485" height="146" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqRTt5ftLGNSbW6UNgRQzdfk4iQklYJsd8qM8FyuXO36Gl8O6wlY97pPPgYUsaaHmFsBVoP_pSQMxqDvD72C2fCEuZJS_8e_6q_14JhDwxKrLdmvCn7vqaPYrxX2dESVA3Jba3HO_Qo-Ft/w400-h146/How+IR+should+look+like.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: AutoResolveIntegrationRuntime in Azure</div><br /><b><br /></b><p></p><p class="xmsonormal">As shown in figure 2, I found in ADF AutoResolve IR has been changes from ‘Public’ to ‘Managed Virtual Network” and Status of the IR said "Failed to get status" under the master branch.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0-CV4CGzjoFIjdQLhiWfrImYVrSKN5qAHcMYreidCsl7YZDyLBh0jyljnB1xVszc9QtUIwDGbdAyQib7W0kRObKTcjYUmZdTinI2B_IErwfU50gCMMhh5Qi31rvki2-eI6xhBJS2mZz_i/s1444/IR+corrupted+2.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="569" data-original-width="1444" height="158" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0-CV4CGzjoFIjdQLhiWfrImYVrSKN5qAHcMYreidCsl7YZDyLBh0jyljnB1xVszc9QtUIwDGbdAyQib7W0kRObKTcjYUmZdTinI2B_IErwfU50gCMMhh5Qi31rvki2-eI6xhBJS2mZz_i/w400-h158/IR+corrupted+2.PNG" width="400" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;">Fig 2: 
Corrupted AutoResolveIntegrationRuntime</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: left;">I was shocked, was not aware of any code changes that may impact AutoResolve IR. Due to AutoResolve IR corruption release pipeline stopped working, hence we were not able to push new changes to PROD.</div><div class="separator" style="clear: both; text-align: center;"><br /></div><p class="xmsonormal"><b>Identify the Issue:</b></p><p class="xmsonormal">After looking into the DevOps code repo, as found below fig 3 is shown extra code has been added to the original code.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgP8IlqwShW2ArH2sF6onQCJE__JmUkNl7p3tEh68cMzxc0RVLaLTbFfMvJBPX7NdJrVTjhvMsx0BCMasPxTAmwf7CRfY46yedVhQ61giIUUn8oirB_G91fnPjxoudbmpBPcyaaQ0ZKLdCQ/s1158/AutoResolveIR.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="549" data-original-width="1158" height="190" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgP8IlqwShW2ArH2sF6onQCJE__JmUkNl7p3tEh68cMzxc0RVLaLTbFfMvJBPX7NdJrVTjhvMsx0BCMasPxTAmwf7CRfY46yedVhQ61giIUUn8oirB_G91fnPjxoudbmpBPcyaaQ0ZKLdCQ/w400-h190/AutoResolveIR.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3: Managed virtual network section has been added</div><br /><p class="xmsonormal"><br /></p><p class="xmsonormal"><b>Resolution:</b></p><p class="xmsonormal">Delete the below code as shown above fig 3 from the DevOps. This part of code changed the AutoResolve IR's Sub-type from 'Public' to 'Managed Virtual Network'.</p><div style="background-color: #fffffe; font-family: Consolas, "Courier New", monospace; font-size: 12px; line-height: 16px; white-space: pre;"><div><i><span style="color: #a31515;">"managedVirtualNetwork"</span>: {</i></div><div><i> <span style="color: #a31515;">"type"</span>: <span style="color: #0451a5;">"ManagedVirtualNetworkReference"</span>,</i></div><div><i> <span style="color: #a31515;">"referenceName"</span>: <span style="color: #0451a5;">"default"</span></i></div><div><i> }</i></div></div><p class="xmsonormal"><i><br /></i></p><p class="xmsonormal">After deleting the part of the code from master branch, the issue seems resolved but not completely. As shown below fig 4, the IR changes back from 'Managed Virtual Network' to 'Public', however; still the status is showing error message.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_IdyLogfY77zR4N-WmQjGqPDdhs89Z01sng1a96wYulzeZWAPF3HpGncFgH5DRilZGlnwdsRop8VaujMBCXLJLgJRWoooBE40XdLA-d5jQtuXqhj0m712Ud-pn31jQ0toHiS6LiIJvy4-/s1086/updated+from+managed+to+Public.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="427" data-original-width="1086" height="158" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_IdyLogfY77zR4N-WmQjGqPDdhs89Z01sng1a96wYulzeZWAPF3HpGncFgH5DRilZGlnwdsRop8VaujMBCXLJLgJRWoooBE40XdLA-d5jQtuXqhj0m712Ud-pn31jQ0toHiS6LiIJvy4-/w400-h158/updated+from+managed+to+Public.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 4: Status still showing error </div><br /><p class="xmsonormal">At this stage, release pipeline started working means I was able push the changes to PROD. However; I wanted to see error message disappear. 
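For comparison, a healthy AutoResolve IR definition in the repository normally carries no managedVirtualNetwork reference at all; a simplified sketch of what the file typically looks like is shown below (your file may contain additional properties):</p><div style="background-color: #fffffe; font-family: Consolas, 'Courier New', monospace; font-size: 12px; line-height: 16px; white-space: pre;">{
    "name": "AutoResolveIntegrationRuntime",
    "properties": {
        "type": "Managed",
        "typeProperties": {
            "computeProperties": {
                "location": "AutoResolve"
            }
        }
    }
}</div><p class="xmsonormal">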
To clean the error message I had to delete the AutoResolve IR code as shown below fig 5. To do so, logged into the Azure DevOps and have chosen the master branch and then under integrationRuntime folder there were two files one is AutoResolve IR and other one is selfhosted IR, I have deleted AutoResolve IR file.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCL9G9KcSFbMPv4dwZBNTh3qBD5vv0ffBZzLtur3em_nsymyvD2zHrZVA7cxjTf8IY6qYibTH02BfIFuUk4DA962ZEXBSvA9quN4HCvDgrn7uq6Xhzw7-7UTGGlNMkG15gnZLPTDMZoh1m/s1368/delete+Autoresolve+IR.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="577" data-original-width="1368" height="169" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCL9G9KcSFbMPv4dwZBNTh3qBD5vv0ffBZzLtur3em_nsymyvD2zHrZVA7cxjTf8IY6qYibTH02BfIFuUk4DA962ZEXBSvA9quN4HCvDgrn7uq6Xhzw7-7UTGGlNMkG15gnZLPTDMZoh1m/w400-h169/delete+Autoresolve+IR.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 5: Remove AutoResolveIntegrationRuntime from DevOps</div><br /><p class="xmsonormal">After the file is deleted, checked ADF portal and then refresh it found the error is completely gone. So anytime you find AutoResolve IR is corrupted from your master branch you know how to fix it.</p><p class="xmsonormal"><br /></p>Unknownnoreply@blogger.com0Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-51594600415212235662021-08-28T21:39:00.002-04:002021-09-06T12:07:32.424-04:00How to Flatten JSON in Azure Data Factory?<p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: "Times New Roman",serif; font-size: 12.0pt; mso-fareast-font-family: "Times New Roman";">When you work with ETL and the
source file is JSON, the documents often contain nested attributes.
Your requirements will often dictate that you flatten those nested attributes.
There are many ways you can flatten the JSON hierarchy; however, I am going to
share my experiences with Azure Data Factory (ADF) to flatten JSON.<o:p></o:p></span></p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: "Times New Roman",serif; font-size: 12.0pt; mso-fareast-font-family: "Times New Roman";">The ETL process involved taking a
JSON source file, flattening it, and storing it in an Azure SQL database. The
attributes in the JSON files were nested, which required flattening them. The
source JSON looks like this:<o:p></o:p></span></p><div style="background-color: #fffffe; font-family: Consolas, 'Courier New', monospace; font-size: 12px; line-height: 16px; white-space: pre;">{
  "id": "01",
  "name": "Tom Hanks",
  "age": 20.0,
  "email": "th@hollywood.com",
  "Cars":
    {
      "make": "Bentley",
      "year": 1973.0,
      "color": "White"
    }
}</div><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: "Times New Roman",serif; font-size: 12.0pt; mso-fareast-font-family: "Times New Roman";">The above JSON document has a nested attribute, Cars. We would like to flatten these values to produce a final outcome that looks like this:<o:p></o:p></span></p><div style="background-color: #fffffe; font-family: Consolas, 'Courier New', monospace; font-size: 12px; line-height: 16px; white-space: pre;">{
  "id": "01",
  "name": "Tom Hanks",
  "age": 20.0,
  "email": "th@hollywood.com",
  "Cars_make": "Bentley",
  "Cars_year": "1973.0",
  "Cars_color": "White"
}</div><p><b>How do we do it by using ADF?</b></p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: "Times New Roman",serif; font-size: 12.0pt; mso-fareast-font-family: "Times New Roman";">Let's create a pipeline that
includes the Copy activity, which has the capability to flatten the JSON
attributes. Let's do that step by step.<o:p></o:p></span></p><p>
</p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: "Times New Roman",serif; font-size: 12.0pt; mso-fareast-font-family: "Times New Roman";">First, create a new ADF Pipeline and
add a copy activity.<o:p></o:p></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhG8xzMbTt20PDFG_QoD-uQlYreHXMud5fUjbAfAUbf6hPC4J4_J1EovE-6FOK2UyFk_JQDZfEqMFo19p8z3ygNZTU9YJ5zBGSlurKIVrDe3yskD-zrVD8SAWcrjOQwtLXd7XQkGu0tAfF_/s1403/Copy+Data+activitiy.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="638" data-original-width="1403" height="183" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhG8xzMbTt20PDFG_QoD-uQlYreHXMud5fUjbAfAUbf6hPC4J4_J1EovE-6FOK2UyFk_JQDZfEqMFo19p8z3ygNZTU9YJ5zBGSlurKIVrDe3yskD-zrVD8SAWcrjOQwtLXd7XQkGu0tAfF_/w400-h183/Copy+Data+activitiy.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: Copy Activity in ADF</div><br /><p><br /></p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: "Times New Roman",serif; font-size: 12.0pt; mso-fareast-font-family: "Times New Roman";">Next, we need datasets. You need to
have both source and target datasets to move data from one place to another. In
this case the source is Azure Data Lake Storage (Gen 2). The target is an Azure SQL
database. The below figure shows the source dataset. We are using a JSON file
in Azure Data Lake.<o:p></o:p></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhC1edGrArpwi6p5y-aoujlisUiCMCw7z-esVdBpjEDPC8CQakNftuTHy0wIELOPZHRGK3W92MLKXUJduMrFGciSz4ZOAHgnOOojhEWpdW07NN9uB_b__x7ccRTABUCTuV5J5BKYLYCf1j2/s1161/get+the+source+dataset.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="801" data-original-width="1161" height="221" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhC1edGrArpwi6p5y-aoujlisUiCMCw7z-esVdBpjEDPC8CQakNftuTHy0wIELOPZHRGK3W92MLKXUJduMrFGciSz4ZOAHgnOOojhEWpdW07NN9uB_b__x7ccRTABUCTuV5J5BKYLYCf1j2/s320/get+the+source+dataset.PNG" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2: Source dataset</div><br /><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-family: "Times New Roman",serif; font-size: 12.0pt; mso-fareast-font-family: "Times New Roman";">We will insert data into the target
after flattening the JSON. The below figure shows the sink dataset, which is an
Azure SQL Database.<o:p></o:p></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9XJ20OlcrYbr9-ZHZPKJUoPHaDlEu0sx9brMGDMx2EkOi6Xbn0ZLMLanhArYbs7XdksM8ysxtlF4-orB7gKmmUMejxDpKCxt20copQk4r_pf3hW_LDh7MQPx9unlvV0nJE8muz12xHPgL/s932/Sink+dataset.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="581" data-original-width="932" height="199" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9XJ20OlcrYbr9-ZHZPKJUoPHaDlEu0sx9brMGDMx2EkOi6Xbn0ZLMLanhArYbs7XdksM8ysxtlF4-orB7gKmmUMejxDpKCxt20copQk4r_pf3hW_LDh7MQPx9unlvV0nJE8muz12xHPgL/s320/Sink+dataset.PNG" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3: Sink dataset</div><br /><p></p><p>Please note that, you will need Linked services to create both the datasets, this article will not go into details about Linked Services, to know details you can look into the <a href="https://docs.microsoft.com/en-us/azure/data-factory/concepts-linked-services" target="_blank">Microsoft document</a>.</p><p><br /></p><p>3. Flattening JSON</p><p>After you create source and target dataset, you need to click on mapping as shown below figure 4 and follow the steps:</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgjYb64Y7KDqOpS1EGBuZI9D1rjkfWTVgXN4tnXhkrYz9DHxbDtCM83jg8c5mFJ0R4DbyHk4Ul40mlEHmInwgYJ-s6qskIDyX-V5eqZlQoMNwus1lcuWDr5gjxnaA-RjZp2sfUqjXWZb8qi/s1188/What+the+changes+you+make+for+mapping.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="551" data-original-width="1188" height="185" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgjYb64Y7KDqOpS1EGBuZI9D1rjkfWTVgXN4tnXhkrYz9DHxbDtCM83jg8c5mFJ0R4DbyHk4Ul40mlEHmInwgYJ-s6qskIDyX-V5eqZlQoMNwus1lcuWDr5gjxnaA-RjZp2sfUqjXWZb8qi/w400-h185/What+the+changes+you+make+for+mapping.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 4: Flattening JSON</div><div class="separator" style="clear: both; text-align: center;"><br /></div>a) At first import schemas<div>b) Make sure to choose value from Collection Reference</div><div>c) Toggle the Advanced Editor</div><div>d) Update the columns those you want to flatten<br /><p>After you have done above, then save it and execute the pipeline. 
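Under the hood, the mapping tab simply generates a TabularTranslator definition inside the Copy activity; a simplified sketch of what it may look like for this example is shown below (the column names follow the sample JSON above, and the exact JSON that ADF generates can differ slightly):</p><div style="background-color: #fffffe; font-family: Consolas, 'Courier New', monospace; font-size: 12px; line-height: 16px; white-space: pre;">"translator": {
    "type": "TabularTranslator",
    "mappings": [
        { "source": { "path": "$['id']" },    "sink": { "name": "id" } },
        { "source": { "path": "$['name']" },  "sink": { "name": "name" } },
        { "source": { "path": "['make']" },   "sink": { "name": "Cars_make" } },
        { "source": { "path": "['year']" },   "sink": { "name": "Cars_year" } },
        { "source": { "path": "['color']" },  "sink": { "name": "Cars_color" } }
    ],
    "collectionReference": "$['Cars']"
}</div><p>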
You will find flatten records are inserted to the database as shown in fig 5.</p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgku6bs52Op3xumrcDicfn90Z5V_dgDwXUh8RD_dPMhaL3H_oV99xFzWna0qBP9N5GWZobj1csPfnGszLeIpDSfhwSJLU-0oHraF08amAaQVJUagqpB6WspCDy38wjo35dd6WapfrXTV2ck/s901/Output+after+database+save.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="171" data-original-width="901" height="76" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgku6bs52Op3xumrcDicfn90Z5V_dgDwXUh8RD_dPMhaL3H_oV99xFzWna0qBP9N5GWZobj1csPfnGszLeIpDSfhwSJLU-0oHraF08amAaQVJUagqpB6WspCDy38wjo35dd6WapfrXTV2ck/w400-h76/Output+after+database+save.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 5: Saved data into the table after flattening</div><br /><p><br /></p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto; mso-outline-level: 2;"><b><span style="font-family: "Times New Roman",serif; font-size: 18.0pt; mso-fareast-font-family: "Times New Roman";">Be
cautious<o:p></o:p></span></b></p><p>Make sure to choose "Collection Reference" as mentioned 3.b, if you forget to choose that then the mapping will look like below Fig 6:</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjleWdtBGbirg0ifpa68a6S5O5VQfnCrua4ladf-HYJ6ZDLDA7fpnTmrsb6YKFw0MiTgOuDw8u9SYlncwyhL7fGjG_msyzeJFktb3YfphIebG9QsWAaU0hL6ZNb97hWzo4jJkJn5fORGwsW/s1521/withoutcollectionreference.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="635" data-original-width="1521" height="168" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjleWdtBGbirg0ifpa68a6S5O5VQfnCrua4ladf-HYJ6ZDLDA7fpnTmrsb6YKFw0MiTgOuDw8u9SYlncwyhL7fGjG_msyzeJFktb3YfphIebG9QsWAaU0hL6ZNb97hWzo4jJkJn5fORGwsW/w400-h168/withoutcollectionreference.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 6: Without putting collection reference</div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div>If you look at the mapping closely from the above figure 6, the nested item in the JSON from source side is: 'result'][0]['Cars']['make'] which means it will only take very first record from the JSON. If you execute the pipeline you will find only one record from JSON file is inserted to the database. So it's important to choose Collection Reference.</div><div><br /><p>In summary, I found Copy Activity in Azure Data Factory make easier to flatten the JSON, you don't need to write any custom code which is super cool.</p><p><br /></p><p><br /></p></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-2635979041680311167.post-49144349874772960622021-07-25T12:07:00.006-04:002021-08-07T14:01:16.571-04:00Step by step guideline to install PostgreSQL in Azure cloud and Client tool to administrate the PostgreSQL<h4 style="text-align: left;">What is PostgreSQL?</h4><p>PostgreSQL, also known as Postgres, is a free and open-source relational database management system. The official PostgreSQL site mentioned, "The World's Most Advanced Open Source Relational Database". PostgreSQL as Open Source database gained huge popularity in past few years, this article post will focus how to install PostgreSQL in Azure cloud and tools to interact with the database.</p><p class="MsoNormal"><span face=""Segoe UI",sans-serif" style="font-size: 10.5pt; line-height: 107%; mso-fareast-font-family: "Times New Roman";"></span></p><p class="MsoNormal"><span lang="EN-CA" style="mso-ansi-language: EN-CA;"><br /></span></p><h4 style="text-align: left;"><span lang="EN-CA" style="mso-ansi-language: EN-CA;">Installation of </span>PostgreSQL in the Azure Cloud environment</h4><p class="MsoNormal"><span lang="EN-CA" style="mso-ansi-language: EN-CA;">At first, login to
your Azure Portal and search for PostgreSQL. You will find different services to
choose from, I have chosen “Azure Database for PostgreSQL flexible servers” from the below list as shown in Fig 1. This particular service will allow to add any extension you want to add to your database in future.</span></p><div class="separator" style="clear: both; text-align: center;"><br /><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQ50fA-g2luva50fDPAwamDSR3JCkbZMiw77gJmBR1up1Y9DZges-bfcWMFPGtCi-C7fwXX5WlBTGNx3j4nnWQRTHUBYLsGBzAY0r1L9rjFG1gX-URXPQZ33obNgZ3hrpRgBrAIm0y4JWW/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="367" data-original-width="872" height="169" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQ50fA-g2luva50fDPAwamDSR3JCkbZMiw77gJmBR1up1Y9DZges-bfcWMFPGtCi-C7fwXX5WlBTGNx3j4nnWQRTHUBYLsGBzAY0r1L9rjFG1gX-URXPQZ33obNgZ3hrpRgBrAIm0y4JWW/w400-h169/Choose+right+services.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: PostgresSQL services in Azure Cloud</div><br /><br /><p></p><p class="MsoNormal"><span lang="EN-CA" style="mso-ansi-language: EN-CA;">As soon as
you choose the option you will find below figure 2, which will allow to create the postgreSQL flexible server.<o:p></o:p></span></p><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirblhl_akQvh_YCE-7rwlyqiCdgHaQQeN9ST7JzZNlQkPXfLyLVqAOTUu4AilvkUcTk17IvARCUQFItO_kbbMaASYtqexJ8quctOBc2en0I4zEHuTXHMV3vCFVVrzRg3bxil8BXzcOeNyq/" style="margin-left: 1em; margin-right: 1em;"><img data-original-height="691" data-original-width="1801" height="154" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirblhl_akQvh_YCE-7rwlyqiCdgHaQQeN9ST7JzZNlQkPXfLyLVqAOTUu4AilvkUcTk17IvARCUQFItO_kbbMaASYtqexJ8quctOBc2en0I4zEHuTXHMV3vCFVVrzRg3bxil8BXzcOeNyq/w400-h154/Choose+PostgreSQL+Flex.PNG" title="Fig 2: PostgreSQL flexible server" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;"><div class="separator" style="clear: both;"> Fig 2: PostgreSQL flexible server</div><div class="separator" style="clear: both;"><br /></div><div class="separator" style="clear: both;"><br /></div><div class="separator" style="clear: both; text-align: left;">After clicking "Create Azure Database for PostgreSQL flexible server" as shown in above figure 2, you will have options to choose from four different plans as shown in figure 3. As per your need you can choose from anyone of them. "Single server" was best fit for my requirements since it's enterprise ready, fully managed and I can add extension to it.</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEispRROmxn7YJvGOEx27WRZ69xulyC4Q-k7NCPwSonHxe3H1lx7ug0uuneHhoZYrdkI9AqfdUO6Pqn3JFb5FW4uBOMfilPneGk1YzWyvBCGKU1M0FfonUvVsOmNaLPh38xSIVYPGc2cFNDf/s940/select+plan.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="701" data-original-width="940" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEispRROmxn7YJvGOEx27WRZ69xulyC4Q-k7NCPwSonHxe3H1lx7ug0uuneHhoZYrdkI9AqfdUO6Pqn3JFb5FW4uBOMfilPneGk1YzWyvBCGKU1M0FfonUvVsOmNaLPh38xSIVYPGc2cFNDf/s320/select+plan.PNG" width="320" /></a></div>Fig 3: Choose right plan for your database<br /><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"> </div><div class="separator" style="clear: both; text-align: left;">As soon as you hit the 'Single server' as shown above figure 3, you will find details information to fill up as shown in figure 4.</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;">Please follow the below steps, figure (4) indicates each step listed.</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><div class="separator" style="clear: both;">1) Choose the right subscription for your resource group</div><div class="separator" style="clear: both;">2) Please select resource group where you want to install the database server, if no resource group created then you need to create a resource group. 
Please find details how to create <a href="https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/manage-resource-groups-portal" target="_blank">azure resource group</a></div><div class="separator" style="clear: both;">3) Put the server name for PostgreSQL</div><div class="separator" style="clear: both;">4) Choose the location where you would like to install the PostgreSQL, I have chosen Canada Central, however; you can choose which best fit for you.</div><div class="separator" style="clear: both;">5) Choose the version of PostgreSQL that you would like deploy in Azure</div><div class="separator" style="clear: both;">6) At this step fill up the administrator account information and save this credential; you will need this when you log into the database server.</div></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEim-_yPLJFeKdTpf8alutWDW8xZXkXLN1owqtUo_fAH2LQiiPfkKj3s9akE4nYxfI9XkfG79OATg7B1bnYvnTSTyG0ZJozXxIskOSZ-DhXEkuQTUSqLK9_3BKSKH7WqXIP9vSt32q9RovZ6/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="1079" data-original-width="1011" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEim-_yPLJFeKdTpf8alutWDW8xZXkXLN1owqtUo_fAH2LQiiPfkKj3s9akE4nYxfI9XkfG79OATg7B1bnYvnTSTyG0ZJozXxIskOSZ-DhXEkuQTUSqLK9_3BKSKH7WqXIP9vSt32q9RovZ6/w375-h400/Config+for+Postgress+SQL.PNG" width="375" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 4: PostgreSQL deployment config input</div><br /><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;">After filling up the above information, please click 'Review + Create'. 
It will take a few minutes to complete the installation and you will find below message when deployment is completed as shown in figure 5.</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSfy_UJsiNireB59CbvHCd3nlRXiYadKHsgQxFQUYOvGic0xPbJIQSMIDaj8Q_D5aI9iOgjXRsQURf3VyieQA4H6IEDsWyzvJvKpaehxBHr9Fvqrg1dvTnWtuAXICl6oypB_Gq_q4wNKHv/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="403" data-original-width="1182" height="136" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSfy_UJsiNireB59CbvHCd3nlRXiYadKHsgQxFQUYOvGic0xPbJIQSMIDaj8Q_D5aI9iOgjXRsQURf3VyieQA4H6IEDsWyzvJvKpaehxBHr9Fvqrg1dvTnWtuAXICl6oypB_Gq_q4wNKHv/w400-h136/Deployment+completed.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 5: Deployment is completed</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;">After the deployment if you click Go to Resource (as shown bottom link at Fig 6), you will find out more details about the resource that you just created. We will need these information when database server need to connect from On-Premise IDE.</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi4p9ub4kP8a4gwh4DBy8lfa3PE2sq9K8JNGvARkdB1J3U29H_dyAKrjaC14HhNzDcShELkpm8nSI9-xenj92_Kty8TFoXYj75QH8hkfMGYdgoPlJAr-Tu1mFhNm2Qa98Dw5b8mpVRnfYRU/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="475" data-original-width="1857" height="103" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi4p9ub4kP8a4gwh4DBy8lfa3PE2sq9K8JNGvARkdB1J3U29H_dyAKrjaC14HhNzDcShELkpm8nSI9-xenj92_Kty8TFoXYj75QH8hkfMGYdgoPlJAr-Tu1mFhNm2Qa98Dw5b8mpVRnfYRU/w400-h103/PostgreSQL+deployment+completed+with+details.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 6: resource details</div><br /><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><h4 style="clear: both; text-align: left;">How to connect <span style="text-align: center;">PostgreSQL from On-Premise GUI?</span></h4><div class="separator" style="clear: both; text-align: left;"><br /></div><span style="text-align: center;">PostgreSQL deployment is completed in Azure Cloud, however; </span>Now we need to find out how to connect this PostgreSQL database server with a Graphical User Interface (GUI) and create any new databases. One of the popular GUI for <span style="text-align: center;">PostgreSQL is </span>pgAdmin.</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;">Let's start installing pgAdmin to connect the database server and do rest of the operation. Please follow the link to install <a href="https://www.pgadmin.org/download/pgadmin-4-windows/" target="_blank">pgAdmin </a>for Windows. 
You can choose latest version to of pgAmin, download it and then use wizard to install it.</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;">When pgAdmin installation is completed, you will find below (Fig 7) if you search for the app from your computer.</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-foW_SK3kV4xmk6ciirtjOXBZZrYx72Xd7YOcrsZeLCnRbWjjTHlxcGx5bZ9mKHK84ChN5FzQK8vpR979hAgdl8WRphgODnE6HX6GpBjBfzOOgJ_9CJJw_h34An9IwChCC3FHNZD39a1w/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="222" data-original-width="681" height="130" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-foW_SK3kV4xmk6ciirtjOXBZZrYx72Xd7YOcrsZeLCnRbWjjTHlxcGx5bZ9mKHK84ChN5FzQK8vpR979hAgdl8WRphgODnE6HX6GpBjBfzOOgJ_9CJJw_h34An9IwChCC3FHNZD39a1w/w400-h130/PgAdming+at+your+PC.PNG" width="400" /></a></div><br /> Fig 7: pgAdmin installed in my PC</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;">Now, we are going to use pgAdmin 4 to connect the deployed PostgreSQL database server. Open the app pgAdmin 4 and right click under server as below figure 8 is shown.</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-bWjZ2jKSiyyS00RAHaLVnXGKToVXXAQPJ5MDtAjEEVufbmeqcOp0DOhqoEQopS7ewkZzM6XHZHa0ydi9HoJxfZvWmEHxbP5EyCl5sJ1txW_xkYRC_Uk7qPzsU2pNFeIR9xBuaLazaNGw/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="319" data-original-width="722" height="176" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-bWjZ2jKSiyyS00RAHaLVnXGKToVXXAQPJ5MDtAjEEVufbmeqcOp0DOhqoEQopS7ewkZzM6XHZHa0ydi9HoJxfZvWmEHxbP5EyCl5sJ1txW_xkYRC_Uk7qPzsU2pNFeIR9xBuaLazaNGw/w400-h176/Create+server+connection+from+pgAdmin.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 8: Create connection</div><br />And then you need to fill up the details to connect PostgreSQL database server which we deployed previously (fig 4). Details are shown in below figure 9, and fill up the information as suggested below:</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><div class="separator" style="clear: both;">1. Host name/Address: This is server name which can be found under the resource details (as shown in figure 5.)</div><div class="separator" style="clear: both;">2. Port by default should be set 5432, in case it's not then please put 5432.</div><div class="separator" style="clear: both;">3. 
Maintenance database: It's like master database if you are coming from SQL DB experiences, it should fill up automatically, if not then put: postgres</div><div class="separator" style="clear: both;">4. User Name: It's admin user name (see figure 4 or 6)</div><div class="separator" style="clear: both;">5. Password: The password you entered (fig 4)</div></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;">As well as, under General tab, please give any name you like for the connection then hit Save button.</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><div class="separator" style="clear: both; text-align: center;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhqKpAggpA6dIVgCAyHEOc6xoR0UeDssIwI1Ncjk6qhugpNfAlgYUQgPq9YQejixmglnubF8rm94jDF_HmK9Ew8cjpSkOW7nhovVKbwTpguhTUdaiLvQATlhKdzrkvEzVRkkAT9bTqj3JBC/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="726" data-original-width="743" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhqKpAggpA6dIVgCAyHEOc6xoR0UeDssIwI1Ncjk6qhugpNfAlgYUQgPq9YQejixmglnubF8rm94jDF_HmK9Ew8cjpSkOW7nhovVKbwTpguhTUdaiLvQATlhKdzrkvEzVRkkAT9bTqj3JBC/" width="246" /></a></div><br /><br /></div><div class="separator" style="clear: both; text-align: center;">Fig 9: connection details need to fill up</div><br /><br /></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;">Now you are connected your PostgreSQL database server in the Azure Cloud environment from PgAdmin GUI as shown in below figure 10. Everything is set, you can create new database, add new extension to it and whatever operations you want to make. 
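<br /><br />If you prefer working from code rather than the pgAdmin GUI, the same connection details can be reused from Python. The sketch below is a minimal, hypothetical example using the psycopg2 library; the server name, admin user and password are placeholders, so replace them with the values from figure 6 and figure 9 (for a Single server deployment the user name usually takes the form adminuser@servername).
<pre class="prettyprint lang-python"># Minimal sketch with assumed names: connect to the Azure PostgreSQL server and run a query.
import psycopg2

conn = psycopg2.connect(
    host="myserver.postgres.database.azure.com",  # server name from the resource details (Fig 6)
    dbname="postgres",                            # maintenance database
    user="adminuser@myserver",                    # admin user created during deployment (Fig 4)
    password="your-admin-password",               # placeholder
    sslmode="require",                            # Azure enforces SSL connections by default
)
conn.autocommit = True                            # CREATE DATABASE cannot run inside a transaction
with conn.cursor() as cur:
    cur.execute("SELECT version();")              # quick sanity check of the connection
    print(cur.fetchone()[0])
    cur.execute("CREATE DATABASE mytestdb;")      # same operation you would do from pgAdmin
conn.close()</pre>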
</div><div class="separator" style="clear: both; text-align: left;"><div class="separator" style="clear: both; text-align: center;"><br /></div></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><div class="separator" style="clear: both; text-align: center;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDU6FUVCTy86SHDSSX_2v2Ncg0Sbnj6AEsgr0Sjm_I_O5q5f_lPlkmiB6ORteHvdvX7sUs_Dl-2SzN89ogRmBds4vdHf3UDsSCcouTQku-s8ncviZoZPkuh_E2t-Hl1I6-CEz57OXFMHcx/s1217/last+step.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="678" data-original-width="1217" height="223" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDU6FUVCTy86SHDSSX_2v2Ncg0Sbnj6AEsgr0Sjm_I_O5q5f_lPlkmiB6ORteHvdvX7sUs_Dl-2SzN89ogRmBds4vdHf3UDsSCcouTQku-s8ncviZoZPkuh_E2t-Hl1I6-CEz57OXFMHcx/w400-h223/last+step.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 10: PgAdmin GUI connected with <span style="text-align: left;">PostgreSQL in the Azure Cloud</span></div></div><br /><br /></div><div class="separator" style="clear: both; text-align: left;">We learned how to deploy PostgreSQL in the Azure Cloud environment as well as how we can connect the database server from on-premise GUI called PgAdmin.</div></div><p></p>Unknownnoreply@blogger.com1Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-27248680070999363612021-06-06T22:48:00.005-04:002021-06-07T22:27:35.830-04:00Why Power query as a transformation activity in Azure Data factory and SSIS?<p>This blog post will describe how power Query activity in ADF and SSIS can be useful. As well as, I will share the differences of Power Query activity between SSIS and ADF.</p><p>Why Power Query and When to use it?</p><p>When data engineer works for transformation pipeline they get different activities like lookup, merge, data conversion etc. in their preferred ETL tool. ETL tools like Azure data factory (ADF) got Dataflow and Databricks to solve complex transformation. In addition, ADF introduced 'Power Query' (previous name data wrangling) as an activity. Please note that, Power query is still in preview for both Azure Data Factory (ADF) and SSIS. </p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9NXevb_Y_G8ylAfVK53bXLEEAx6lLYEZo14DPAeqA0mHM0Fn1ak0-WWyE3psEbpZIK3ILVek5xYJqB0EU4UElO6Fr8vDMXCMZv0gqEojNOcVicereL68k8lKQ7ispitx5u-h4MGxobBRU/s941/Power+query+in+ADF.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="419" data-original-width="941" height="178" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9NXevb_Y_G8ylAfVK53bXLEEAx6lLYEZo14DPAeqA0mHM0Fn1ak0-WWyE3psEbpZIK3ILVek5xYJqB0EU4UElO6Fr8vDMXCMZv0gqEojNOcVicereL68k8lKQ7ispitx5u-h4MGxobBRU/w400-h178/Power+query+in+ADF.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: Power Query in ADF</div><br /><p>Despite having many activities in Azure data factory why we need Power Query? Let's share my experience when Power Query have chosen as an activity in the pipeline. 
The task was to get data from complex excel files with many calculation and more than 1000 columns which is used by business as an application. Yes! you got it right, it's an excel application, organization still uses excel as an application!! A few transformed and calculated columns need to go to the modern data warehouse from the excel files. </p><p>In this scenario, thought about what would be the best activity to choose from: DataFlow, Databricks or Power Query? well, I would say all of them may work but Power Query was the best choice.</p><p>Let me explain, why? Since the source file is excel and it's got more than 1000 columns with many calculation inside, It's almost impossible for a Data Engineer to find out how to derive the expected outcome where no mapping or transformation logic is provided. By using Power Query visual transformation, business expert and I were able to work closely and produce the output in a very short period of time. </p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhPMoDFN2eN-ZXDp1dyE4ja5Qdm4iz7sQwUeVHkgUw07_Su992rmtS63MD2ZSe63Pajy-AZilCaU1TsD1Cenx5md6IBye-KHehxVY1ZtScEzaFb8kTwMkmkh44hGGvkKtLkMiAvetMHplbn/s1789/PowerQuery+Transform.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="632" data-original-width="1789" height="141" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhPMoDFN2eN-ZXDp1dyE4ja5Qdm4iz7sQwUeVHkgUw07_Su992rmtS63MD2ZSe63Pajy-AZilCaU1TsD1Cenx5md6IBye-KHehxVY1ZtScEzaFb8kTwMkmkh44hGGvkKtLkMiAvetMHplbn/w400-h141/PowerQuery+Transform.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2: Power Query transformation in ADF</div><br /><p> As a Data Engineer, when you work with dataflow or Databricks or any other transformation activity in ETL tool, you follow the documented mapping logic and build the pipeline. It means transformation rules and mappings are predefined. However, when transformation rules are yet to discover then best to start with Power Query. You can simply start with Power BI desktop to work together with business to produce the expected outcome. And when output is verified and accepted then then copy the M Query to ADF Power Query activity or SSIS Power Query source. 
In fact, now you have the transformation rules in the M Query so if you like to use other transformation activity like dataflow or Databricks you can use that too.</p><p><br /></p><p>What works in SSIS but not in ADF?</p><p>Power Query activity is in Preview for both SSIS and ADF, however; if you choose ADF then you need to convert the source file from .excel to .csv since Power Query for ADF doesn't support .excel as source dataset.</p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg8F-gPElrkH3e-eqidivK40KnUy8HZpEhp5zT0D3qNKZnQziatVxqQLVcUfaWfkXCVxg_YFAHZBey4ig2_a6Niyzk18qZ_w44nkUpvPkgoVzPvD3AtpTSeXpYI3bTlmjeGh43LPknmMOw6/s1608/Power+Query+dataset+CSV.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="526" data-original-width="1608" height="131" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg8F-gPElrkH3e-eqidivK40KnUy8HZpEhp5zT0D3qNKZnQziatVxqQLVcUfaWfkXCVxg_YFAHZBey4ig2_a6Niyzk18qZ_w44nkUpvPkgoVzPvD3AtpTSeXpYI3bTlmjeGh43LPknmMOw6/w400-h131/Power+Query+dataset+CSV.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3: Source dataset for Power Query</div><br /><p>However, if you work with Power Query in SSIS then it support excel as source. On Contrary, in SSIS; when you are working with Power Query Source, it doesn't have user interface to make the transformation like ADF. The obvious reason is, you can use Power BI desktop to do the transformation and then copy the M query (Power Query generate M syntax which called M Query) from Power BI and paste it to Power Query Source in SSIS.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg8HH_rf0RRwNSgeUpbV8nTNB7ivimFnsw5vlwjxn_it1Jk9qh_RP8yb1KO1VAieXvgR7T8ED0a6wRwMToyK3FDorSWaMr60aOnBPIOiVvyRcBm1ckAWr-1L5NMCHh8nwfdlzdkfP1Gky88/s1505/MQuery+in+SSIS.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="822" data-original-width="1505" height="219" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg8HH_rf0RRwNSgeUpbV8nTNB7ivimFnsw5vlwjxn_it1Jk9qh_RP8yb1KO1VAieXvgR7T8ED0a6wRwMToyK3FDorSWaMr60aOnBPIOiVvyRcBm1ckAWr-1L5NMCHh8nwfdlzdkfP1Gky88/w400-h219/MQuery+in+SSIS.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 4: Power Query in SSIS</div><br /><p><br /></p><p><br /></p><p>In summary, Power Query in both SSIS and ADF is useful Activity and new feature which still in preview, hence there might be many different scenarios where you want to use Power Query activity, however; this article is based on my experiences with Power Query activity in ADF and SSIS. 
It's also interesting to know that, The user interface you get under ADF Power Query is identical to Power BI, however, not all M query is supported by ADF Power Query yet.</p><p><br /></p><p><br /></p><p><br /></p>Unknownnoreply@blogger.com0Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-68090936849245024472021-04-25T13:23:00.007-04:002021-05-02T08:49:41.175-04:00Handling SQL DB row-level errors in ADF (Azure data factory) Data Flows<p>If you are working with ADF (Azure data factory) data flows then you may have noticed there is a new feature released in Nov 2020 which is useful to capture any error while inserting/updating the records to the SQL database.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh1SARoTh4CRImvBP4AGNLoJnhwQdRJhNgxi8PczrxjLFRzYumCMrt90gycDEN5Csarv9zLUpr3DB8spyMGq-BscGm0DnU825ZysvIPvsQSdU7dnAHzYk6kWgZk8bbIWCR8eckLaPivGxWw/s836/Error+raw+handling.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="635" data-original-width="836" height="304" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh1SARoTh4CRImvBP4AGNLoJnhwQdRJhNgxi8PczrxjLFRzYumCMrt90gycDEN5Csarv9zLUpr3DB8spyMGq-BscGm0DnU825ZysvIPvsQSdU7dnAHzYk6kWgZk8bbIWCR8eckLaPivGxWw/w400-h304/Error+raw+handling.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: Error row handling at sink database</div><br /><p>For error handling there are two options to choose from:</p><p>1) Stop on first error (default)</p><p>2) Continue on error</p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8b_dUwso8WloE0x4jDghVVCnMMOm7FvV_YzK6W2pM9Zxd8awmHDtgaOwxu-nJvbux3xISRbOlSKK_6lfkdJ0vMq3Fv00LoxSmQltTFq6fKG94FRRiRdGWkVyuoHJnGWSQ3S9H2r2qmjEn/s620/two+options.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" data-original-height="203" data-original-width="620" height="131" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8b_dUwso8WloE0x4jDghVVCnMMOm7FvV_YzK6W2pM9Zxd8awmHDtgaOwxu-nJvbux3xISRbOlSKK_6lfkdJ0vMq3Fv00LoxSmQltTFq6fKG94FRRiRdGWkVyuoHJnGWSQ3S9H2r2qmjEn/w400-h131/two+options.png" width="400" /></a></div><p></p><p> Fig 2: Error row handling options</p><p>By default, ADF pipeline will stop at the error. However, the main purpose of this feature to use option "Continue on error" to catch and log the error so that we can look at later and take action accordingly. 
</p><p>Let's fill up the settings to catch errors rows, below figures show the settings and will also describe each setup (Please follow the numbering in the figure 3).</p><p>1) Error row handling: Since we wanted to catch the error so we have chosen "Continue of error" at Error row handling.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipQCv3H5qvGZtb3w-tN8nehxViVHBLRmabmhboCeJvFgEPPlloGEXkyrx4vy196XaYypDRpE_F0Pq75Eo-7rMqJm3COgaeWbJEeXQkoLhODPcdBmfXoprFJ1_7gJulAzp1LDmTXp0-CLdA/s983/Learn+the+settings.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="378" data-original-width="983" height="154" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipQCv3H5qvGZtb3w-tN8nehxViVHBLRmabmhboCeJvFgEPPlloGEXkyrx4vy196XaYypDRpE_F0Pq75Eo-7rMqJm3COgaeWbJEeXQkoLhODPcdBmfXoprFJ1_7gJulAzp1LDmTXp0-CLdA/w400-h154/Learn+the+settings.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3: Settings Continue on error</div><br /><p>2) Transaction Commit: Choose whether the data flow will be written in a single transaction or in batches, I have chosen single, it means whenever there is failure it will store the record on the other hand batch will store error records when full batch is completed.</p><p>3)Output rejected data: You need to make this check mark TRUE to store the error rows. The whole point of error row handling is you want to know the error records; if so, please tick check mark. Though you can avoid this, in that case pipeline will run but if there is any error you will not know which records causes the error.</p><p>4) Linked Service: Put the linked service and test the connection</p><p>5)Storage folder path: Storage path need to mention here, it's the path where you would like to store the error records in a file.</p><p>6)Report success on error: I don't put report on success checkbox to TRUE since I wanted to know if there is a failure.</p><p><br /></p><p>After the settings, when you run the pipeline and if there is any error in the dataset, it will be stored in your storage folder as you have provided at point no 5 in the settings.</p><div class="separator" style="clear: both; text-align: center;"><span style="text-align: left;"><br /></span></div><div class="separator" style="clear: both; text-align: justify;"><span style="text-align: left;">In general, when there is a failure at the time of inserting records to the database it takes sometime to find out the reason of failure. You may have to go through large chunk of dataset and look for miss match of data types or NULL value etc. to find out the root cause. Through this feature the error records will be captured and stored in the storage so you will be able to identify the reason for any error very quickly. And if you would like to ingest those error rows then you can fix those records and re-run the pipeline.</span></div>Unknownnoreply@blogger.com0Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-86066124493362701812021-03-28T11:01:00.006-04:002021-04-21T23:17:36.929-04:00Step by step guideline to install Jupyter Notebook<p>Whether you work as a Data Engineer or a Data Scientist, a Jupyter Notebook is a helpful tool. One of the projects I was working required a comparison of two parquet files. 
This is mainly a schema comparison, not a data comparison. Though the two .parquet were created from two different sources, the outcome should be completely alike, schema wise. At the beginning I was manually comparing them then I thought there must be a tool to do that. Well, that's how I found a Jupyter notebook can be useful to compare two .parquet files' schema.</p><p>The Jupyter Notebook can be used for data cleaning and transformation, data visualization, machine learning, statistical modeling and much more. This post will describe the step by step installation process of Jupyter notebook.</p><h2>Step 1: Install python version 3.7.9</h2><p>Python is a prerequisite for running a Jupyter notebook, so we need to install python first. Please follow this URL and choose right version to install: <a href="https://www.python.org/downloads/">https://www.python.org/downloads/</a>.</p><p>I have chosen 'Windows x86-64 executable installer' for my Windows 64 bit OS. Please choose the version as per your computer Operating system.<img alt="" class="wp-image-3862944 size-large" height="195" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/Python-executible-file-1024x500.png" width="400" /></p><p>Fig 1: Windows Executable</p><p>You can download the executable file and save in any location at your computer.</p><p>Now next step is to create a 'Python' folder under the C: drive, we will use this folder as installation location at later step.<br /><img alt="" class="alignnone wp-image-3862945 size-large" height="116" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/python-folder-under-C-1024x297.png" width="400" /></p><div><p>Fig 2: Python folder under C</p><p> </p><p>Find out the downloaded executable file, I have saved the executable file under Downloads folder (shown in below figure 3). Now double click the executable file to initiate the installation process.<br /><img alt="" class="alignnone wp-image-3874153 size-full" height="68" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/Pyhton-downloaded-1.png" width="400" /><br />Fig 3: Python Execution file</p><p>Make sure to choose 'Customize Installation' and check mark 'Add Python 3.9 to PATH' as shown in figure 4. I followed the customization method to avoid setting up environment variable.</p><p><img alt="" class="alignnone wp-image-3862947 size-large" height="247" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/pyhtons-installation-select-customize-1024x633.png" width="400" /></p><p>Fig 4: Python Installation wizard</p><p>As below figure 5 shown, the Customize installation location, where make sure you put the installation location folder C:\Python\Python39. We have created 'Python' folder in C drive in earlier step (Fig 2)</p><img alt="" class="wp-image-3862948 size-large" height="246" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/python-installation-setup-step-2-1024x631.png" width="400" /></div><div> Fig 5: choose the location<p>Now hit the Install button. Installation will complete in a minute or two.</p><p>Let's test if python installed successfully, open command prompt and type "python". 
If python is installed correctly then you should able to see the python version number and some key help, as shown below in Fig 6.</p><img alt="" class="wp-image-3862949 size-large" height="102" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/test-if-python-works-1024x261.png" width="400" /></div><div> Fig 6: Python installed successfully.</div><div><br /></div><div><h2 style="text-align: left;">Step 2: Install the Jupyter Notebook</h2><p>Let's move to the next step, which is to install the Jupyter notebook software. Open command prompt and type the below code:</p><pre class="prettyprint lang-plain">>pip install jupyter</pre><img alt="" class="wp-image-3862950 size-large" height="204" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/installation-jupyter-3--1024x522.png" width="400" /> </div><div>Fig 7: Jupyter Notebook installation started<p>When installation is complete, let's run the Jupyter Notebook web application. To do this, you need to go to a cmd prompt and execute this command, as shown in below figure 8:</p><pre class="prettyprint lang-plain">Jupyter notebook</pre><img alt="" class="wp-image-3862951 size-large" height="164" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/open-jupyter-notebook-command-1024x420.png" width="400" /> </div><div>Fig 8: Opening Jupyter Notebook</div><div> </div><div><i></i>As soon as you hit the above command button, It will open a browser with jupyter notebook as shown in figure 9.</div><div><img alt="" class="wp-image-3862953 size-large" height="160" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/Jupyter-notebook-on-browser-first-1024x410.png" width="400" /> </div><div>Fig 9: Jupyter Notebook on browser<p>Now you can create a Notebook by choosing 'New' and choose Python 3 as show in fig 10. This will open a new browser tab where you will write the code.</p><p><img alt="" class="wp-image-3862954 size-large" height="102" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/Jupyter-notebook-on-browser-2nd-1024x261.png" width="400" /></p><p>Fig 10: Open Notebook</p><div><p>Let's write hello world program in the Jupyter notebook. The browser will look like figure 11 if you enter this code:</p><pre class="prettyprint lang-python">print('Hello world')</pre><p>The output is shown after clicking the 'Run' button.</p><div><img alt="" class="wp-image-3862955 size-large" height="73" src="https://www.sqlservercentral.com/wp-content/uploads/2021/03/HelloWorld-python-1024x187.png" width="400" /> </div><div>Fig 11: Hello world in Jupyter Notebook<div><p>Now you can write and run other notebooks.</p><p>In this article, we learned how to install python and Jupyter Notebooks and have also written a simple hello world program. There are different ways you can install Jupyter Notebook, but I followed this approach and found simple.</p></div></div></div></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-2635979041680311167.post-29310415589221415262021-02-23T20:51:00.005-05:002021-04-20T21:08:31.991-04:00How to add local timestamp end of the files in ADF?<p>This article will describe how to add your local timestamp at the end of the each file in Azure Data Factory (ADF). In general, ADF gets a UTC timestamp, so we need to convert the timestamp from UTC to EST, since our local time zone is EST.</p><p>For example, if the input Source file name in SFTP is "_source_Customer.csv", then the expected outcome will be, "_source_Customer_2021-02-12T133751.csv". 
This means that the pipeline should add '_2021-02-12T133751' to the end of each file. This will work dynamically, which means that any file you pass from the source will have the timestamp added to it by using an ADF regular expression.</p><p>Let's set a simple pipeline and explain the scenario in a few steps. In this example, we receive files from an event based trigger and hold the file name in a parameter. The main part of this article is how to append the current date and time to the end of the file name we received. Please note that event based triggers will not be discussed here. If you like to know more about how to create trigger, please follow this <a href="https://docs.microsoft.com/en-us/azure/data-factory/how-to-create-event-trigger" target="_blank">link</a>.</p><p><strong>Step 1: Add Copy Activity</strong></p><p>Create a simple pipeline with at least one Copy activity that connects a source and a sink, similar to what is shown in Fig 1.</p><div class="separator"><p> </p><p><i> <img alt="" class="alignnone size-medium wp-image-3861406" height="194" src="https://www.sqlservercentral.com/wp-content/uploads/2021/02/All-about-data-ADF-pipeline_1-1-300x194.png" width="300" /></i></p></div><div class="separator"> <em>Fig 1: Pipeline with Copy Activity</em></div><div class="separator"> </div><p><strong>Step 2: Adding a parameter to receive the file name </strong></p><div class="separator">Before adding the expression, we need to have a parameter in the pipeline that will catch the filename from a trigger. For the time being assume that the trigger will give us the file name and the parameter is to hold the filename.</div><div> </div><div class="separator"> <img alt="" class="alignnone size-medium wp-image-3861402" height="221" src="https://www.sqlservercentral.com/wp-content/uploads/2021/02/All-about-data-ADF-parameter_2-300x221.png" width="300" /></div><div class="separator"> <em>Fig 2: Adding parameter</em></div><p><strong>Step 3: Prepare the sink dataset</strong></p><p>In the Copy data activity there is a Sink dataset that needs a parameter. Click on the Sink dataset and when it opens, you will find the view similar to Fig 3.</p><div class="separator"> <img alt="" class="alignnone size-medium wp-image-3861403" height="153" src="https://www.sqlservercentral.com/wp-content/uploads/2021/02/All-about-data-ADF-pipeline_3-300x153.png" width="300" /></div><div class="separator"> <em>Fig 3: Adding parameter to the dataset</em></div><div class="separator"> </div><div class="separator">To add parameter to the dataset, click New and add the parameter name. Select the type, which should be a string. Now you you will see the sink dataset looks like Fig 4. The value edit box is where you need to add the dynamic content.</div><div class="separator"><div class="separator"> <img alt="" class="alignnone size-medium wp-image-3861404" height="205" src="https://www.sqlservercentral.com/wp-content/uploads/2021/02/All-about-data-ADF-pipeline_4-300x205.png" width="300" /></div><div class="separator"> <em>Fig 4: Dynamic content to add the timestamp</em></div><p> </p><p><strong>Step 4: Setting up a dynamic expression</strong></p></div><p>Now, let's create the dynamic expression. 
As soon you hit the 'Add dynamic content', shown in Figure 5, you will able to write the expression that will convert the UTC timestamp to EST and then pad end of the file.</p><div class="separator"> <img alt="" class="alignnone size-medium wp-image-3861405" height="102" src="https://www.sqlservercentral.com/wp-content/uploads/2021/02/All-about-data-ADF-pipeline_5-300x102.png" width="300" /></div><div class="separator"> <em>Fig 5: expression language</em></div><div class="separator"> </div><div class="separator">We apply a number of functions to the <em>pTriggerFile</em> parameter from Step 1. Let's have closer look at the expression:</div><pre class="prettyprint lang-mssql">@concat(replace(pipeline().parameters.pTriggerFile,'.csv',''), '_', </pre><pre class="prettyprint lang-mssql">formatDateTime(convertTimeZone(utcnow(),'UTC','Eastern Standard Time'),'yyyy-MM-ddTHHmmss'), '.csv')</pre><p>Find out the explanation of the above expression.</p><ol><li>First we need to get the filename from the parameter, pTriggerFile. The value here will be: _source_Customer.csv</li><li>Next we use REPLACE() to replace the .csv with empty string : replace(pipeline().parameters.pTriggerFile,'.csv',''). This case, we get: _source_Customer</li><li>We need to get the timestamp. To do that, we convert utcnow() to EST with this function: convertTimeZone(utcnow(),'UTC','Eastern Standard Time')</li><li>We want to format the date, and use this: formatDateTime(convertTimeZone(utcnow(),'UTC','Eastern Standard Time'),'yyyy-MM-ddTHHmmss'), which will return a value like: 2021-02-12T133751</li><li>We put this all together with <i>@concat<b>(</b></i>Step1,'_', 'Step4','.csv'), which will return _source_Customer_2021-02-12T133751.csv</li></ol><p>We learned how to add local timestamp end of any file, though in this case, the source file was a .csv. However, you can follow the same process for .txt file where you only need to change '.csv' to '.txt' in the expression.</p>Unknownnoreply@blogger.com1Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-64456505875995235452021-01-30T21:28:00.004-05:002021-03-28T08:31:45.712-04:00How to work with SQL Store procedure output parameters in Azure data factory<p>To facilitate native SQL code we have stored procedure support in Azure Data Factory (ADF). When we work with stored procedures, mostly we use input parameters, however, a stored procedure can also have output parameters. In this case, we need to deal with return values, similar to how a value is returned by a function. This article will describe how you can work with stored procedure output parameters in ADF.</p><p>ADF has the <a href="https://docs.microsoft.com/en-us/azure/data-factory/transform-data-using-stored-procedure">SQL Server Stored Procedure Activity</a>, which is used for any stored procedure you have in a SQL Server database. ADF also has a <a href="https://docs.microsoft.com/en-us/azure/data-factory/control-flow-lookup-activity">Lookup activity</a> in which you can use a stored procedure. The example will use the Lookup activity to execute a stored procedure in a database.</p><p>Let's start by creating a stored procedure (SP) with an output parameter:</p><pre class="prettyprint lang-mssql">CREATE PROCEDURE [ETL].[sp_testprocOutParm]
(
    @input VARCHAR(10),
    @Iambit BIT OUTPUT
)
AS
BEGIN
    IF @input >= '1'
    BEGIN
        SET @Iambit = 1;
        RETURN;
    END
END;</pre><p>Let's see if the SP returns the expected value. How do we execute the SP in SQL Server Management Studio (SSMS)? Run the syntax below to see the outcome.</p><pre class="prettyprint lang-mssql">DECLARE @Iambit bit
EXEC ETL.sp_testprocOutParm '1', @Iambit OUTPUT
SELECT @Iambit Iambit
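-- Expected output: Iambit = 1 (as shown in Fig 1)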
</pre><p>You should have outcome like below Fig 1.</p><p><img alt="" class="alignnone wp-image-3860873" height="203" src="https://www.sqlservercentral.com/wp-content/uploads/2021/02/storeprocedureoutput-300x137.png" width="445" /></p><div class="separator"><p><em>Fig 1: Execution of output parameter in SSMS</em></p><p>The SP is created in the database and it has returned the expected outcome. Now we need to move to ADF.</p><p>This article assume you know how to add Lookup activity to ADF pipeline. In your pipeline get the Lookup activity as like figure 1.1.Now, Go Settings of the Lookup activity and choose stored procedure from theUse query selections (as shown in Figure 2). Then you should able to see the stored procedure under the dropdown list which you just created in the database. If you are not able to see the stored procedure then it most likely a problem with your access in ADF. Please remember, though you can execute the SP in SSMS, this doesn't mean you will have access to the stored procedure from ADF, You will need to talk with your portal admin and find out if your user has been given enough permissions to execute the stored procedure.</p></div><p><img alt="" class="alignnone wp-image-3860968" height="305" src="https://www.sqlservercentral.com/wp-content/uploads/2021/02/Connect-with-Storeprocedure-300x194.png" width="472" /></p><div class="separator"><p><em>Fig 2: Connect stored procedure via Lookup in ADF</em></p><p>If you find out the stored procedure in the list, you can continue to the next step. The next step is to import parameters by clicking the button, import parameter, as shown in Fig 3.</p><p><img alt="" class="alignnone wp-image-3860969" height="169" src="https://www.sqlservercentral.com/wp-content/uploads/2021/02/import-parameter-300x79.png" width="642" /></p><p><em>Fig 3: import parameter</em></p><p>The import parameter will load all the SP parameters, both input and output parameters. In this case the single output parameter will be shown. Having an output parameter means you want to return some value from the stored procedure and use it in ADF. In this example, the return value will control the next activities in the pipeline. Let's store the return value into a variable.</p><p>Drag and drop the Set Variable activity and connect it with the with the Lookup, as shown below in Fig 4</p><p><img alt="" data-original-height="182" data-original-width="643" height="114" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgBo5VCH0WQX9Bu_4F9sjUDO1DKQww_fyKZC9Kq8wiys8LqjZJqaZ4Qr4M48J2OTrlynbsp49_oP6IMmDEyonVGDqyhjaJbC766cMWN75255YrS-cMO_IB9AnfHWYOopXEPyyNI_siTvOxO/w400-h114/image.png" width="400" /></p><p><em>Fig 4: Add Set variable</em></p><p>We need a variable as shown in Fig 5, put variable name and select type of the variable. Since variable is at pipeline scope, so make sure you have selected pipeline itself not any activities in the pipeline.</p><p><img alt="" class="alignnone wp-image-3860970" height="309" src="https://www.sqlservercentral.com/wp-content/uploads/2021/02/Create-variable-300x157.png" width="590" /></p><p><em>Fig 5: Creating variable</em></p><p>While creating variables, you will find there are three types of variables: string, Boolean and Array. 
We have chosen Boolean type for the variable since our stored procedure returns Boolean.</p><p><img alt="" class="" data-original-height="298" data-original-width="638" height="252" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhztg3gNd-dw08S92Sh_FE2lyY2wQyCxPtPjXQZMwMcLDGkfLnU96gX3zM98BYW6HOat86_xgImRjiLb7zX92LPHjkhPz4jPrl4ZRxvyKLgS2Z4DNuVA0KpSwsr3HDeoEw6nhLDlyintsxk/" width="541" /></p><p><em>Fig 6: Creating variable</em></p><p>Now, let's get to the pipeline and select 'set variable' activity (as shown in Fig 7). In the Set variable activity, please click the variable tab. Here is where you will find the recently created variable under the name drop down. Select it, as shown in Fig 7.</p></div><p> </p><div class="separator"><div class="separator"><p><img alt="" class="alignnone wp-image-3860977" height="371" src="https://www.sqlservercentral.com/wp-content/uploads/2021/02/ChooseVariable-300x180.png" width="618" /></p><p><em>Fig 7: Choose the variable</em></p><p>To bind the value returned from the stored procedure to the variable, you need write the expression under the 'value' of the variable, as shown in Fig 8</p><p><img alt="" class="" data-original-height="435" data-original-width="1017" height="270" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMnRSxi-30uAmXZ0t8s-gN-ni-xCFtiq834dSiVvnnakyq9Kvr4sXMDu2w11_2AFl4xMXMGy8QhkDpC9PftfH1O9wFgO9eMTdtDnXd_5HxHNrAQkFOCuA4rlFHkHXAEP-LeevkOJkVnYQt/w400-h171/image.png" width="632" /></p><p><em>Fig 8: expression to hold return value</em></p><p>The expression to bind return value from stored procedure is:</p><pre class="prettyprint lang-mssql">@activity('Lookup1').output.firstRow.Iambit</pre><p>Let's explain what expression means. The above expression will return first row from the lookup activity, named Lookup1, and return the column name from the stored procedure, which is 'Iambit'. Here, 'Iambit' will return the result as 1 (a Boolean TRUE).</p><p><strong>Be Cautious: </strong>Though the variable is set to be Boolean and the stored procedure returned the value as a bit, I found an error in the ADF with the above expression. I had to modify the expression to explicitly convert the return value to Boolean, like the code below:</p><pre class="prettyprint lang-mssql">@bool(activity('Lookup1').output.firstRow.Iambit)</pre><p>In summary, output parameter in stored procedure a great facility when in need and happy to see that works in ADF. Please note that, the example we depicted here in this article by using very simple stored procedure, however; in business scenario your stored procedure could be a bit complex. For example, you may have logging tables in the data warehouse and you want any activity in ADF will only execute when you pass particular pipeline name as input parameter as a return if the stored procedure returns TRUE.</p></div></div><div><div><p class="MsoNormal"><o:p></o:p></p></div></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-2635979041680311167.post-56448007314988551082020-12-30T11:22:00.006-05:002020-12-30T11:23:27.640-05:00ADF data flow: can particular date format generate NULL value?<p>I am going to share recent finding in ADF data flow where my source data in .csv got correct date. However,
after I did some transformation and saved the output as .parquet, those
dates all came out empty. This blog post describes the issue and the
resolution.</p><p class="MsoNormal"><br /></p><p class="MsoNormal">The source data in .csv has StartDate and CloseDate columns, as shown in figure 1.0 below, where the date format is MM/dd/yyyy.</p><p class="MsoNormal"><o:p></o:p></p>
<p class="MsoNormal"></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCscloJoLLV5s4Gds535wuoiuuGomJVeNw47_A8_ZyKtbclK8LjAYircsC76ZGYXIZ7zL6dMBXm62rZ_a1z2FFja0mgcI4Lq70_3ME_Pd2_3lyclTa4wbv0vVvQTcMACimKReFXQvBfSzu/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="618" data-original-width="195" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCscloJoLLV5s4Gds535wuoiuuGomJVeNw47_A8_ZyKtbclK8LjAYircsC76ZGYXIZ7zL6dMBXm62rZ_a1z2FFja0mgcI4Lq70_3ME_Pd2_3lyclTa4wbv0vVvQTcMACimKReFXQvBfSzu/w101-h320/date+format.PNG" width="101" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1.0: Date in the source .csv</div><br /><br /><p></p><p class="MsoNormal"><br /></p>I have used ADF data flow to cleanup/transform the files and saved into .parquet format. However, in the .parquet file these two date columns 'StartDate' and 'CloseDate' become empty.<br />
<p class="MsoNormal"><span style="mso-no-proof: yes;"><v:shapetype coordsize="21600,21600" filled="f" id="_x0000_t75" o:preferrelative="t" o:spt="75" path="m@4@5l@4@11@9@11@9@5xe" stroked="f">
<v:stroke joinstyle="miter">
<v:formulas>
<v:f eqn="if lineDrawn pixelLineWidth 0">
<v:f eqn="sum @0 1 0">
<v:f eqn="sum 0 0 @1">
<v:f eqn="prod @2 1 2">
<v:f eqn="prod @3 21600 pixelWidth">
<v:f eqn="prod @3 21600 pixelHeight">
<v:f eqn="sum @0 0 1">
<v:f eqn="prod @6 1 2">
<v:f eqn="prod @7 21600 pixelWidth">
<v:f eqn="sum @8 21600 0">
<v:f eqn="prod @7 21600 pixelHeight">
<v:f eqn="sum @10 21600 0">
</v:f></v:f></v:f></v:f></v:f></v:f></v:f></v:f></v:f></v:f></v:f></v:f></v:formulas>
<v:path gradientshapeok="t" o:connecttype="rect" o:extrusionok="f">
<o:lock aspectratio="t" v:ext="edit">
</o:lock></v:path></v:stroke></v:shapetype><v:shape id="Picture_x0020_1" o:spid="_x0000_i1030" style="height: 463.5pt; mso-wrap-style: square; visibility: visible; width: 146.5pt;" type="#_x0000_t75">
<v:imagedata o:title="" src="file:///C:/Users/dpaul/AppData/Local/Temp/msohtmlclip1/01/clip_image001.png">
</v:imagedata></v:shape></span><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi50-4SITONyMTJUJ7zcO5-TNvM6qPQY1R9oCzUjhvzFXD_xggM9T3OKGdYtyhQr_4gdi3HnySsV8omCBxcC9zEU5p5i8OAELvi_-exUqQpXQcdZW_D_zulBQr7-3Ec0lgavzFMGIXly2rD/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="668" data-original-width="936" height="228" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi50-4SITONyMTJUJ7zcO5-TNvM6qPQY1R9oCzUjhvzFXD_xggM9T3OKGdYtyhQr_4gdi3HnySsV8omCBxcC9zEU5p5i8OAELvi_-exUqQpXQcdZW_D_zulBQr7-3Ec0lgavzFMGIXly2rD/" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1.1: Dates become empty in .parquet</div><br /><p></p>
<p class="MsoNormal"><br /></p><p class="MsoNormal">After looking into the dataflow and specific looking at the projection of the source found auto
detect date format ‘MM/dd/YYYY’ which is original source date format.</p><p class="MsoNormal"><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8dP-edzIYLQOHJeXqarxHPpkuGxzt-l9D37pBFl3Jrl6cK1j1FCt_guj4uOobrLMVPbm0UmUGcjotzakcnedl1iEplYItYwPU-HY2ES0LpGbEQCJRxQ7OgayyhTcrfADwChocQoE39nHV/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="125" data-original-width="838" height="60" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8dP-edzIYLQOHJeXqarxHPpkuGxzt-l9D37pBFl3Jrl6cK1j1FCt_guj4uOobrLMVPbm0UmUGcjotzakcnedl1iEplYItYwPU-HY2ES0LpGbEQCJRxQ7OgayyhTcrfADwChocQoE39nHV/w400-h60/Projection+format.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1.3: Date format auto detected in the data flow</div><br /><p></p><p class="MsoNormal"><o:p></o:p></p>
<p class="MsoNormal"><o:p> And when previewed the data those date shown as NULL which was kind of weird.</o:p></p><p class="MsoNormal"><o:p> </o:p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgAIPx-JZbyKsl4mtn5g8sqzATWbpdZM3GIYJ86qSOfptflHr2Xb9w-Bq3kd2fun-tVxcfaO3Qx0ncHO_forTKf5AAS6kFGGgYSM1R6q1OqpIYp6fVnlVlSLqReDAJj2_EBlsWJ5xk_r9i-/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="581" data-original-width="496" height="313" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgAIPx-JZbyKsl4mtn5g8sqzATWbpdZM3GIYJ86qSOfptflHr2Xb9w-Bq3kd2fun-tVxcfaO3Qx0ncHO_forTKf5AAS6kFGGgYSM1R6q1OqpIYp6fVnlVlSLqReDAJj2_EBlsWJ5xk_r9i-/w346-h313/Data+flow+as+it+is+fomat.PNG" width="346" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1.4: Date shown as NULL in the data preview</div><br /><p></p>
<p class="MsoNormal"><span style="mso-no-proof: yes;"><v:shape id="Picture_x0020_3" o:spid="_x0000_i1028" style="height: 70pt; mso-wrap-style: square; visibility: visible; width: 468pt;" type="#_x0000_t75">
<v:imagedata o:title="" src="file:///C:/Users/dpaul/AppData/Local/Temp/msohtmlclip1/01/clip_image003.png">
</v:imagedata></v:shape></span><o:p></o:p></p>
<p class="MsoNormal"><o:p><b><u>How to solve it? </u></b></o:p></p>
<p class="MsoNormal">To fix this issue, what you need to do is, go under projection and change the date format to ‘yyyy-MM-dd’ as like below figure 2.0<o:p></o:p></p>
<p class="MsoNormal"><o:p><br /></o:p></p><p class="MsoNormal"><o:p></o:p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiV1TRmUtoowTD3QTWtj_iujvxcRWCF18pM97KPa6ktP67bcSFRPigVa3uplIFzeGPS75TsVnxfwgmMBSQzjgOm9EWTcc7PEyUslCi9KPstT81DJhK6eNJBv-7L5bGTCfcqstDYSunQRMu6/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="92" data-original-width="624" height="59" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiV1TRmUtoowTD3QTWtj_iujvxcRWCF18pM97KPa6ktP67bcSFRPigVa3uplIFzeGPS75TsVnxfwgmMBSQzjgOm9EWTcc7PEyUslCi9KPstT81DJhK6eNJBv-7L5bGTCfcqstDYSunQRMu6/w400-h59/image.png" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2.0: Change date format</div><br /> <p></p>
<p class="MsoNormal"><span style="mso-no-proof: yes;"><v:shape id="Picture_x0020_5" o:spid="_x0000_i1027" style="height: 69pt; mso-wrap-style: square; visibility: visible; width: 468.5pt;" type="#_x0000_t75">
<v:imagedata o:title="" src="file:///C:/Users/dpaul/AppData/Local/Temp/msohtmlclip1/01/clip_image004.png">
</v:imagedata></v:shape></span><o:p></o:p></p>
<p class="MsoNormal"><o:p> And you can go and see the preview, it looks good now.</o:p></p>
<p class="MsoNormal"><o:p> </o:p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjq5Hv7HJHkitTqq1F4wFXGnm-l9BywzeWVxB751S5tbDe4DSxXb2rZ0W3VWtAvBSMPsg-JghS0-GHpbZIsIFIvdHUn8AUQYp7JHvFIxuZFKAJY2OcL5hncnpfxaudLdfXp8IZfDSqw3khy/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="432" data-original-width="348" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjq5Hv7HJHkitTqq1F4wFXGnm-l9BywzeWVxB751S5tbDe4DSxXb2rZ0W3VWtAvBSMPsg-JghS0-GHpbZIsIFIvdHUn8AUQYp7JHvFIxuZFKAJY2OcL5hncnpfxaudLdfXp8IZfDSqw3khy/" width="193" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2.1: After changing the date format preview looks perfect</div><p></p><br />
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal"><span style="mso-no-proof: yes;"><v:shape id="Picture_x0020_4" o:spid="_x0000_i1026" style="height: 324pt; mso-wrap-style: square; visibility: visible; width: 261pt;" type="#_x0000_t75">
<v:imagedata o:title="" src="file:///C:/Users/dpaul/AppData/Local/Temp/msohtmlclip1/01/clip_image005.png">
</v:imagedata></v:shape></span><o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p>Note that, I have tried with other format from projection such as yyyy/MM/dd
and so on but those did not resolve the issue.</p><p class="MsoNormal"></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjN862w0UUUSQkxwiBiBUZZcF4PbX-lpqq4fYlhL42srnRz5xvO_1ciPRKM8kpkSjVYm8P4ljEFiIXD00vLdlY8O5A0uN4pWRnlh_P_MVoQ5zfpeNzUuEPBlAeh4Mi72amsG5-N-wGERTXp/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="228" data-original-width="591" height="123" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjN862w0UUUSQkxwiBiBUZZcF4PbX-lpqq4fYlhL42srnRz5xvO_1ciPRKM8kpkSjVYm8P4ljEFiIXD00vLdlY8O5A0uN4pWRnlh_P_MVoQ5zfpeNzUuEPBlAeh4Mi72amsG5-N-wGERTXp/" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2.2: </div><br /><br /><p></p><p class="MsoNormal"><b><u>Other Solution?</u></b></p><p class="MsoNormal"><o:p></o:p></p>
<p class="MsoNormal"><o:p> You can also take other approach, change the format from 'date' to 'string' under the projection</o:p></p><p class="MsoNormal"><o:p></o:p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgCgEgBehx4KoYrJO5W0J_pGe6XSv66PB4hpLQI5idVzh3Tqbrgy5EZy1bx2i3001Rgb2Nu38C7GrJFug7RiN9VKEUNQU11myNRv-YX4RNjnANrZ7Ri2O9qj2gj-fFAI5bKfP2rZwkwtXw9/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="486" data-original-width="775" height="201" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgCgEgBehx4KoYrJO5W0J_pGe6XSv66PB4hpLQI5idVzh3Tqbrgy5EZy1bx2i3001Rgb2Nu38C7GrJFug7RiN9VKEUNQU11myNRv-YX4RNjnANrZ7Ri2O9qj2gj-fFAI5bKfP2rZwkwtXw9/" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3.0: change data type from date to string</div><br /><br /><p></p><p class="MsoNormal"><o:p>And then use derive column activity </o:p></p><p class="MsoNormal"><o:p></o:p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhs6dlb3KUWWIgOFIUx6apxDirrKjD1mWgCUo5CDIrPjZQXYn0X5spsgdaaF5eG7_escNUp7ylyoOvfzN_L7JQWUfTVR7kYxB3y6AhHbyvMoguur7pcZWXB71ynsqLNz4Di9cLvLbbq9Adr/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="463" data-original-width="646" height="229" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhs6dlb3KUWWIgOFIUx6apxDirrKjD1mWgCUo5CDIrPjZQXYn0X5spsgdaaF5eG7_escNUp7ylyoOvfzN_L7JQWUfTVR7kYxB3y6AhHbyvMoguur7pcZWXB71ynsqLNz4Di9cLvLbbq9Adr/" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3.1: get 'derived column' in the data flow</div><div class="separator" style="clear: both; text-align: center;"><br /></div><br />Now, use expression to convert into the correct date format :<i> toDate(OpenDate,'yyyy-MM-dd') </i><p></p><p class="MsoNormal"><o:p></o:p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDx90pOPOYDEIecqzXkTtZJE3YNgAFolXt0mSFHYChf0K_XZa3wOZ-gcUJdj2FlAC7VCTYyL2hjufl1pKbQ_-0MxnYPPelhyphenhyphenabEu9G2bwnHUPP5wKmcZjWmtjQf6ZHh49b0pYAiWE0H4Lm/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="624" data-original-width="1404" height="142" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDx90pOPOYDEIecqzXkTtZJE3YNgAFolXt0mSFHYChf0K_XZa3wOZ-gcUJdj2FlAC7VCTYyL2hjufl1pKbQ_-0MxnYPPelhyphenhyphenabEu9G2bwnHUPP5wKmcZjWmtjQf6ZHh49b0pYAiWE0H4Lm/" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3.2: expression to convert from string to date</div><br /><br /><p></p><p class="MsoNormal">In summary, if you find any discrepancy at output file's date columns then look closely the date format, preview the data in the ADF data flow then either change the format from source projections or use derived column activities to fix it.</p>
<p class="MsoNormal"><span style="mso-no-proof: yes;"><v:shape id="Picture_x0020_6" o:spid="_x0000_i1025" style="height: 171pt; mso-wrap-style: square; visibility: visible; width: 443.5pt;" type="#_x0000_t75">
<v:imagedata o:title="" src="file:///C:/Users/dpaul/AppData/Local/Temp/msohtmlclip1/01/clip_image006.png">
</v:imagedata></v:shape></span><o:p></o:p></p>Unknownnoreply@blogger.com2Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-90285835713731211602020-11-14T09:26:00.004-05:002020-12-16T22:58:22.556-05:00How to deal with NULL in ADF dataflow compared with SSIS?<p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><span style="font-size: 12pt;">When you
are working with ETL/ELT, you sometimes need to transform a NULL into a meaningful value. If you have worked with SSIS, you know how to handle that. This blog post describes how it is done in SSIS and how the very
same task can be done in ADF Dataflow.</span><span style="font-size: 12pt;"><o:p></o:p></span></p><p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><span style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"> Consider, we have a .csv file where Name columns have NULL value for 2nd record (figure: 1.0)</span></p><p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><span style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"><br /></span></p><p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><span style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh2yYazoEhc3cfYeKU2JqHk_KnSnKmMgpwZn-SzUhOU7mJD_M4mT746uYPHErH1dwaaNvU-46tXgobnfNmqa1h1rdv8ZPLV_MlY1AlNO_XPjWBObcTBiBlZEcWfydGKHqKUVAAP3wgj-szt/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="139" data-original-width="220" height="202" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh2yYazoEhc3cfYeKU2JqHk_KnSnKmMgpwZn-SzUhOU7mJD_M4mT746uYPHErH1dwaaNvU-46tXgobnfNmqa1h1rdv8ZPLV_MlY1AlNO_XPjWBObcTBiBlZEcWfydGKHqKUVAAP3wgj-szt/" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;"><span style="font-size: 12pt;"><br /></span></div><div class="separator" style="clear: both; text-align: center;"><span style="font-size: 12pt;">Fig 1.0: Sample .csv file with NULL record</span></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><p></p><p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><span style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;">After connecting the .csv file through flat file source in SSIS data flow, we can debug and view the record through data viewer which will look like below figure 1.1</span></p><p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><span style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWtXD3ZdWRiK8K8rUJWWj76aTuPRm8pZjt-gtGeqavhLtEaCsSOANgnfhnwwokeGf9cCEOuiGsoEDWWC2xCS84ygz3ERZNtCNAkyMHlA9_kDi85Yu_wb4_EMUuIQjyMZfi6pJrFQxYlpqW/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="411" data-original-width="1112" 
height="148" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWtXD3ZdWRiK8K8rUJWWj76aTuPRm8pZjt-gtGeqavhLtEaCsSOANgnfhnwwokeGf9cCEOuiGsoEDWWC2xCS84ygz3ERZNtCNAkyMHlA9_kDi85Yu_wb4_EMUuIQjyMZfi6pJrFQxYlpqW/w400-h148/SSIS+Null.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1.1: Result in SSIS data flow - data viewer</div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><span style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"><br /></span></p><p></p><p>If you would like to replace the NULL value with meaning value, in that case you need to use derive column activity and use expression.</p><p style="tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt;">SSIS data flow expression got REPLACENULL function, which will replace NULL to the expected value that you want.</p><pre class="prettyprint lang-mssql">The expression: REPLACENULL(Name,"Unknown")</pre><p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><span style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"></span></p><p style="tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt;">The above expression will return 'Unknown' when Name is NULL otherwise it will return the original value.</p><p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><span style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEia7k-r7L2ecYe7eCUIgeLFqT5ee3iZhuQffk6qf7TxclgbbTgo1ASn4AXnxvQF091w88bpR2zxpEI8NFWIxVq90eAtKoAkbfPQ5U1uM39rOY01xje0-WSfUiT5AW9D5hvVo-1hh6grWsm7/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="806" data-original-width="805" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEia7k-r7L2ecYe7eCUIgeLFqT5ee3iZhuQffk6qf7TxclgbbTgo1ASn4AXnxvQF091w88bpR2zxpEI8NFWIxVq90eAtKoAkbfPQ5U1uM39rOY01xje0-WSfUiT5AW9D5hvVo-1hh6grWsm7/w320-h320/SSIS+Null+handles+with+expression.PNG" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1.2: Expression in SSIS Data flow to replace NULL with 'Unknown'</div><p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><br /></p><p class="MsoNormal" style="line-height: normal; margin-bottom: 0in; tab-stops: 45.8pt 91.6pt 137.4pt 183.2pt 229.0pt 274.8pt 320.6pt 366.4pt 412.2pt 458.0pt 503.8pt 549.6pt 595.4pt 641.2pt 687.0pt 732.8pt; vertical-align: baseline;"><span style="font-size: 12pt;">When
it comes to ADF data flow regular expression similar like SSIS expression; </span><span style="font-size: 16px;">isNull </span><span style="font-size: 12pt;">only give you true or false. And isNull function take only one argument, e.g. below fig 2.1 took the argument Name and return True (</span><span face=""Open Sans", Arial, sans-serif" style="background-color: white; color: #333333; font-size: small;">✓) if the value is NULL.</span></p><h2 id="method-1-copy-and-paste-xa0-x2713-xa0-x2714-xa0-xa0-x2611-xa0-xa0-x2705-xa0-x2715-xa0-x2716-xa0-x2717-xa0-x2718" style="background-color: white; border: 0px; color: #333333; font-family: "Open Sans", Arial, sans-serif; font-stretch: inherit; font-variant-east-asian: inherit; font-variant-numeric: inherit; height: auto; line-height: 30px; margin: 0px 0px 0.7em; padding: 10px 0px 0px; vertical-align: baseline; width: 602px;"><br /></h2><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhX9dHtqy7VgJ1DPCw0B96WjQaNyHjJXw1VXGaKhoN1pByeJQ8MDnJnDwbVMCf4b1KF4VyClJdwSMNXUhdMYbjxRWZz0d0DIkiax4e-SmDHMuHffW0qgtVHc0PF2ndF2eMKH5p8aBV9262c/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="519" data-original-width="495" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhX9dHtqy7VgJ1DPCw0B96WjQaNyHjJXw1VXGaKhoN1pByeJQ8MDnJnDwbVMCf4b1KF4VyClJdwSMNXUhdMYbjxRWZz0d0DIkiax4e-SmDHMuHffW0qgtVHc0PF2ndF2eMKH5p8aBV9262c/w381-h400/image.png" width="381" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2.0: ADF dataflow isNull function</div><p><br /></p>Now, let's find out how to transform NULL value into something meaningful in ADF data flow. <span face="Calibri, sans-serif" style="font-size: 12pt;">ADF doesn’t have the same function </span><span face="Calibri, sans-serif" style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;">REPLACENULL which used in SSIS, rather there are two ways you can replace the NULL values in ADF dataflow.</span><p></p><p><span face="Calibri, sans-serif" style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"><br /></span></p><p><span face="Calibri, sans-serif" style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"><b>Approach 1: Combination of iif and isNULL function</b></span></p><p><span face="Calibri, sans-serif" style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"><b><br /></b></span></p><p><span face="Calibri, sans-serif">Expression: </span><span style="font-size: 16px;">iif(isNull(Name), 'Unknown', Name)</span></p><p><span face="Calibri, sans-serif" style="border: 1pt none windowtext; font-size: 12pt; padding: 0in;"></span></p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;">The function iif will check the condition isNull(Name), if <i>Name</i> have Null value it will return 'Unknown' otherwise original value will be returned.</p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><br /></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhcyu5vqIeqvXV_bKhqi0aFBQ_TIUMLfglkYGa1Y4ISfM0M1th_2iIIBcmHbXtCjZQDLVtWHHhHqD3c9dWIZfvjipCKSwtBcFUqSCAe8JACfGGuiynJwAoEVpONJwsftif_w5Ko2plN_ZHB/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="500" data-original-width="623" height="321" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhcyu5vqIeqvXV_bKhqi0aFBQ_TIUMLfglkYGa1Y4ISfM0M1th_2iIIBcmHbXtCjZQDLVtWHHhHqD3c9dWIZfvjipCKSwtBcFUqSCAe8JACfGGuiynJwAoEVpONJwsftif_w5Ko2plN_ZHB/w400-h321/image.png" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2.1: using iif and isNull in ADF dataflow</div><br /><p></p><p><span face="Calibri, sans-serif" style="font-size: 16px;"><b>Approach 2: By using iifNull function</b></span></p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;">The smartest solution is to use <span style="font-size: 16px;">iifNull which will return exactly the same result we found via approach 1.</span></p><p class="MsoNormal" style="line-height: normal; mso-margin-bottom-alt: auto; mso-margin-top-alt: auto;"><span style="font-size: 12pt;">expression: iifNull(</span><span style="font-size: 16px;">Name</span><span style="font-size: 12pt;">,
'Unknown') will return 'Unknown' if Name have NULL values otherwise it will return original value.<o:p></o:p></span></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgUPzuudMTqYFw9W0TGLtex9mrkknMLLARsPlSZT8QzP8XSnm8uGVfm_sUQIB3uliJ3o8wAN8kJnLiBs5IATLPyfO1YLnKf5cM5J-FD1yg_gxEbikn4LC7iKLQyXiNv0_vvbZU9pRrrhKpf/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="511" data-original-width="557" height="366" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgUPzuudMTqYFw9W0TGLtex9mrkknMLLARsPlSZT8QzP8XSnm8uGVfm_sUQIB3uliJ3o8wAN8kJnLiBs5IATLPyfO1YLnKf5cM5J-FD1yg_gxEbikn4LC7iKLQyXiNv0_vvbZU9pRrrhKpf/w400-h366/image.png" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2.2:iifNULL function to replace NULL value</div><br />In summary, expression is similar to replace NULL values for both SSIS and ADF data flow, however, the function you need to use is different for two different tools.<p></p><p></p>Unknownnoreply@blogger.com1Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-341491221762941832020-10-18T10:13:00.005-04:002020-10-29T20:55:57.126-04:00How to handle Case statement in Azure Data Factory (ADF) compare to SSIS?<p>This post will describe how do you use CASE WHEN statement in Azure data factory(ADF). If you are coming from SSIS background, you know a piece of SQL statement will do the task. However let's see how you do it in SSIS and the very same thing can be achieved in ADF.</p><p><strong>Problem statement:</strong></p><p>For my simple scenario, In case PortfolioTypeCode is either 'Mutual Fund' or 'Pooled Fund' it should return 1 Else it should return 0.</p><p> </p><p><b>How do you do in SSIS?</b></p><p>In SSIS, under data flow you will have OLEDB source like below fig 1:</p><div class="separator"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgj5O8vCXmCpO4DraPgKyaxmcPXoQuSmuOjwIb-dkFdPAosVJhPWTqZ6ahF3-vbORVLRNcr5OfLUdvpgu_d55N0p-L8_ZV9uXsfD0NNEtKjKrYLAyUCrrdOaRLLt_17ECq5JKTbI9gdyx9d/"><img alt="" data-original-height="517" data-original-width="885" height="187" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgj5O8vCXmCpO4DraPgKyaxmcPXoQuSmuOjwIb-dkFdPAosVJhPWTqZ6ahF3-vbORVLRNcr5OfLUdvpgu_d55N0p-L8_ZV9uXsfD0NNEtKjKrYLAyUCrrdOaRLLt_17ECq5JKTbI9gdyx9d/" width="320" /></a></div><div class="separator">Fig 1: SSIS OLEDB source</div><p> </p><p>And open the OLEDB source and then Write SQL command like below and you are done:</p><blockquote><p><i>SELECT Col1,</i></p><p><i> CASE WHEN PortfolioCode IN('Mutual fund','Pooled fund')</i></p><p><i> THEN 1</i></p><p><i> ELSE 0</i></p><p><i>END IsFund,</i></p><p><i>Col2</i></p><p><i>From Table1</i></p></blockquote><div class="separator"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2oJgRlw-hMLEE_PL6_DU1UVeiNsqF1GgQ2EcjmTiC-wTjGgKepzfJw90UshrA6JSbT_N3aHE_GFAX83b_fODN6cwcgsV-e79b08WFC8ErTgWUAYYqFEHIgyDK7_DSZ3lGfbxD49mpAjNO/"><img alt="" data-original-height="498" data-original-width="679" height="235" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2oJgRlw-hMLEE_PL6_DU1UVeiNsqF1GgQ2EcjmTiC-wTjGgKepzfJw90UshrA6JSbT_N3aHE_GFAX83b_fODN6cwcgsV-e79b08WFC8ErTgWUAYYqFEHIgyDK7_DSZ3lGfbxD49mpAjNO/" width="320" /></a></div><div class="separator">Fig 2: CASE WHEN under SQL command in SSIS</div><p> </p><p><b>How 
do you implement in ADF?</b></p><p>However in ADF, to achieve the same you need to use Expressions. ADF have very same concept of data flow like SSIS. In the data flow, after the source dataset is established you can add 'Derived Column' activity like below Fig 3:</p><div class="separator"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhkzDzl6r6dQPVPlUOfIDIRZKexO8QLXr4aPf5oTbuxb_xKg1ImvBgN5eb-BDKYtI0a4eFTkBSlrtCDGXceJamrdDe4ZCYYaSqMy_9k69TYkc-cjwZ3o6BBWC-TWzDXWHBfwOmJbpGQ8PQZ/"><img alt="" data-original-height="648" data-original-width="959" height="216" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhkzDzl6r6dQPVPlUOfIDIRZKexO8QLXr4aPf5oTbuxb_xKg1ImvBgN5eb-BDKYtI0a4eFTkBSlrtCDGXceJamrdDe4ZCYYaSqMy_9k69TYkc-cjwZ3o6BBWC-TWzDXWHBfwOmJbpGQ8PQZ/" width="320" /></a></div><div class="separator">Fig 3: Adding derive column under data flow</div><p> </p><p>Now you can give a new column name and then add the expression (Fig 4):</p><div class="separator"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKslHStKY5VUD2wGhZMawhnaO4ykOPh9MSpnc27QN7KVrbj7pj7K49yiFvXVAXlli-gl-g3kqMl7WpA3yRqnd6OtoR6-44H1eSsyGNSwqADsPzerrvYRaMHYwyLU8wqKHuPhVtZfucVoW5/"><img alt="" data-original-height="486" data-original-width="1010" height="193" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKslHStKY5VUD2wGhZMawhnaO4ykOPh9MSpnc27QN7KVrbj7pj7K49yiFvXVAXlli-gl-g3kqMl7WpA3yRqnd6OtoR6-44H1eSsyGNSwqADsPzerrvYRaMHYwyLU8wqKHuPhVtZfucVoW5/w400-h193/image.png" width="400" /></a></div><div class="separator">Fig 4: Derived column expression</div><p> </p><p>Let's see how Case expression works: it takes 3 arguments, those are condition, true and false. However, it can have alternating condition which describe in the figure 5:</p><div class="separator"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_xcvRr8rjEZi56GktWHMYR_i_um9dcqCAsO2OHnmezXkfVYJQJF2CLc8InintY86DAXghQ52VNR582poxARSg2c2zwscIJ2yZZK7B0HBmyGt7c6-EetLuL_-PidjJ988h405IhVsMYlUb/"><img alt="" data-original-height="476" data-original-width="1364" height="140" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_xcvRr8rjEZi56GktWHMYR_i_um9dcqCAsO2OHnmezXkfVYJQJF2CLc8InintY86DAXghQ52VNR582poxARSg2c2zwscIJ2yZZK7B0HBmyGt7c6-EetLuL_-PidjJ988h405IhVsMYlUb/w400-h140/image.png" width="400" /></a></div><div class="separator">Fig 5: Case in Expression</div><p> </p><p>For my simple scenario, If PortfolioTypeCode is either 'Mutual Fund' or 'Pooled Fund' it should return 1 Else it should return 0.</p><p>Since you can't have CASE WHEN rather using case as Expression, the code will look like below:</p><blockquote><p> <i>case( PortfolioTypeCode=='Mutual Fund',1,</i></p><div><i> PortfolioTypeCode=='Pooled Fund',1,0)</i></div></blockquote><div class="separator"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWRugryhE17XpBPVVa83oocH5rcweyX9kidn5qnCfgQhC5VOCc3q10w-Kg1ytIdGSNTprj97kpghmrZTeX2okup7A5PssNSpr6-b2kztoomeqFnNILhLrV7nFmC_9O-KGd9rsD73KSU-n0/"><img alt="" data-original-height="491" data-original-width="1031" height="190" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWRugryhE17XpBPVVa83oocH5rcweyX9kidn5qnCfgQhC5VOCc3q10w-Kg1ytIdGSNTprj97kpghmrZTeX2okup7A5PssNSpr6-b2kztoomeqFnNILhLrV7nFmC_9O-KGd9rsD73KSU-n0/w400-h190/CASE+in+ADF.PNG" width="400" /></a></div><p> </p>Unknownnoreply@blogger.com2Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 
-44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-18415303487556555502020-09-19T16:33:00.048-04:002020-10-04T19:31:04.786-04:00Azure data Factory: What is not allowed in the file name while using Data Flow (spark engine)?<p>I was working with a pipeline where data needs to move from source and load into RAW zone in ADLS. As soon as loaded the file it got date time stamp appended end of the file. e.g. source file name is: customer.csv and when it's landed to <span style="font-family: inherit;">RAW </span>zone, then file name will be : customer_2020-09-28T12:15:32.csv</p><p>How do you add date time stamp end of the file?</p><p>Adding dynamic content at the sink pipeline like below (Fig 1) will do the task.</p><p><br /></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjg72J59COpFKpXEm2lf2yX3x_oMJB3fNS01CfdGcovzyafMQZwKKB0jUBdV26nKbHHdL4gtF0SManMdKN_ETAw4zeNTnIllKNxve4SCpd0BW1o3c4_MnRLhWWyXX3DpmIlttw0X9jUWzpV/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="194" data-original-width="636" height="98" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjg72J59COpFKpXEm2lf2yX3x_oMJB3fNS01CfdGcovzyafMQZwKKB0jUBdV26nKbHHdL4gtF0SManMdKN_ETAw4zeNTnIllKNxve4SCpd0BW1o3c4_MnRLhWWyXX3DpmIlttw0X9jUWzpV/" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: dynamic content to</div><br /><br /><p></p><p><br /></p>When I run the pipeline it was appending timestamp end of the file and saving in the blob storage And I was able to view the data from the file: customer_2020-09-28T12:15:32.csv<div><br /></div><div><br /><div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWYyVjluZ8WBWUgNb7OtiX1YfUvUws_8MK_nKlQMUqQ3qUIhsyIx0jyZ4Mbn_wcPpOWb-3Y4rel2ThNMkZZC5iYrzkc_iGZB54GeEIhIUbD-233T-mmzLnF0UEIV3qza1Lurt3CQmBQk8m/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="377" data-original-width="773" height="156" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWYyVjluZ8WBWUgNb7OtiX1YfUvUws_8MK_nKlQMUqQ3qUIhsyIx0jyZ4Mbn_wcPpOWb-3Y4rel2ThNMkZZC5iYrzkc_iGZB54GeEIhIUbD-233T-mmzLnF0UEIV3qza1Lurt3CQmBQk8m/" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2: Preview of the data</div><br /><br /></div><div><br /><p><span color="rgba(0, 0, 0, 0.9)" style="background-color: white;"><span style="font-family: inherit;">In the azure data lake storage file name with ':' did not give any issue while creating the file name as well viewing it. 
</span></span></p><p style="text-align: left;"><span color="rgba(0, 0, 0, 0.9)" style="background-color: white;"><span style="font-family: inherit;">However, as soon as I use that file in the Data Flow activity and debug the code (when apache spark engine fire) then below error : </span></span><span color="rgba(0, 0, 0, 0.9)" style="background-color: white; text-align: center;">java.lang.illegalArgumentException:java.net.URISyntaxException:....</span><span style="background-color: white; font-family: inherit;"> </span></p><p><span color="rgba(0, 0, 0, 0.9)" style="background-color: white;"><span style="font-family: inherit;"><br /></span></span></p><p><span color="rgba(0, 0, 0, 0.9)" style="background-color: white;"><span style="font-family: inherit;"></span></span></p><div class="separator" style="clear: both; text-align: center;"><span style="font-family: inherit;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjdymy_EY43lOpQBWD5bsczfAh2LI9myrPktKyCk5we9ilnblLh8JnbrLapZAIqWNxcJW4gqcA_yLZQsMJV8fEVBmwNJ9HWmNxVT8BC2BEK3QiqAAJ3ErjyCfe7xY3XBDhUc5bMnsCYeXt7/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="206" data-original-width="1218" height="54" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjdymy_EY43lOpQBWD5bsczfAh2LI9myrPktKyCk5we9ilnblLh8JnbrLapZAIqWNxcJW4gqcA_yLZQsMJV8fEVBmwNJ9HWmNxVT8BC2BEK3QiqAAJ3ErjyCfe7xY3XBDhUc5bMnsCYeXt7/" width="320" /></a></span></div><div class="separator" style="clear: both; text-align: center;"><span style="font-family: inherit;">Fig 3: java.lang.illegalArgumentException</span></div><span style="font-family: inherit;"><br />How to resolve the error?</span><p></p><p><span color="rgba(0, 0, 0, 0.9)" style="background-color: white;"><span style="font-family: inherit;">You can't have file with ':', which also true if you try to create a file in Windows OS with ':' in it. It will throw error, however, interestingly that's not the case when you create a file with same name in Azure data lake storage (HDFS is working behind).</span></span></p><p><span color="rgba(0, 0, 0, 0.9)" style="background-color: white;"><span style="font-family: inherit;"></span></span></p><div class="separator" style="clear: both; text-align: center;"><span style="font-family: inherit;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTFFUnrRpmwcc6ZxO6HhyG6wtnYN_fVr3_EBZabuTlaIExB3kVOPQUM8a5WSsnk_UDQ2cUJ7-8Y7DqCT-0D0us0VXlh0V3XiWIJhibiLtyLVpOMz3KRKaFhade8DLpcyMSYx_tcGGYR9iJ/" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="213" data-original-width="828" height="82" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTFFUnrRpmwcc6ZxO6HhyG6wtnYN_fVr3_EBZabuTlaIExB3kVOPQUM8a5WSsnk_UDQ2cUJ7-8Y7DqCT-0D0us0VXlh0V3XiWIJhibiLtyLVpOMz3KRKaFhade8DLpcyMSYx_tcGGYR9iJ/" width="320" /></a></span></div><div class="separator" style="clear: both; text-align: center;"><span style="font-family: inherit;">Fig 4: Error in Windows machine</span></div><span style="font-family: inherit;"><br />To resolve this, I updated the format of timestamp part while adding end of the each file, instead of using </span>yyyy-MM-ddTHH:mm:ss , I have used yyyy-MM-ddTHHmmss. 
so I get the file name as: customer_2020-09-28T121532.csv<p></p></div></div>Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-2635979041680311167.post-71095139137244863802020-08-22T20:31:00.010-04:002020-09-09T21:35:49.742-04:00How to solve Azure data factory Error: Sink dataset filepaths cannot contain a file name ?<p>I was building a pipeline which was taking data from SFTP folder and sink to Cleansed layer where file format was .parquet. As soon as validated the pipeline got the below error, where the error message is: 'Sink dataset filepath can not contain a file name, please remove the file name from {DatasetName}.</p><p>You will find error like this:</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiP9SrRSe6S0xva-so05GYtWjtkbfhuNkwuQKFupy9T4DKrvc1a0142Ymq8VqS2o2zn4_I23WeznUUJ4DIDsSQ3rdN2Ao31CUPm0Q5_c9PKWR92mTZ0gEOvsXf1GLB3VSc1tsV_A-zzLU5u/s692/Azure+error+with+sink+dataset+filepath+name.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="488" data-original-width="692" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiP9SrRSe6S0xva-so05GYtWjtkbfhuNkwuQKFupy9T4DKrvc1a0142Ymq8VqS2o2zn4_I23WeznUUJ4DIDsSQ3rdN2Ao31CUPm0Q5_c9PKWR92mTZ0gEOvsXf1GLB3VSc1tsV_A-zzLU5u/s320/Azure+error+with+sink+dataset+filepath+name.PNG" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1: Sink dataset error</div><br /><p><br /></p><p>If you find this error, may get confuse at first since you have file name at two different places. One is at Sink activity settings and other one is at the data set.</p><p>Let's guide you to how to find those settings and fix this error. If you look at the Settings for Sink activity, you will find like below fig: 2.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEga1MK72DHzVpu5UD3GX5Yq0C0WZwA6K7oQM6GpGyN7xEZJeXgscBTslAtzBtDMnZfMx8nWFw5jMVergUFcIJYU4b4uP_lI0vuLBzb1bDCS02MtJIPEnYWgFdnvgy2WTRF2tXayMoRfPG-M/s1344/sink+activity+setting.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="442" data-original-width="1344" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEga1MK72DHzVpu5UD3GX5Yq0C0WZwA6K7oQM6GpGyN7xEZJeXgscBTslAtzBtDMnZfMx8nWFw5jMVergUFcIJYU4b4uP_lI0vuLBzb1bDCS02MtJIPEnYWgFdnvgy2WTRF2tXayMoRfPG-M/s320/sink+activity+setting.PNG" width="320" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;">Fig 2: Settings of Sink activity</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div>The error message said filepath can't contain file name but remember this is not data set, it's setting of the Sink activity so you don't need to remove the file name. 
And if you want to keep it in a single file then choose 'Output to single file'.<div><br /><div>So fix the issue you need click Sink and open data set as like below fig 3:<div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEigKuEyv2Gb3cFQtuI8MlOqV6M922Flg1VPHXt-KCREYg63qg7oL4x66veg-xudmIDML5-x02EnLhe-Us_BPzWOGEo0Rrk6oIIqAE0COA9Wqyg2KILw3ltA3Yh3aAyPG7EhvGbTWbSH4jlj/s772/Open+dataset+from+sink.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="398" data-original-width="772" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEigKuEyv2Gb3cFQtuI8MlOqV6M922Flg1VPHXt-KCREYg63qg7oL4x66veg-xudmIDML5-x02EnLhe-Us_BPzWOGEo0Rrk6oIIqAE0COA9Wqyg2KILw3ltA3Yh3aAyPG7EhvGbTWbSH4jlj/s320/Open+dataset+from+sink.PNG" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3: open data set</div><div class="separator" style="clear: both; text-align: center;"><br /></div><br /><div>Now you can find the file name under data set as like below fig 4, Please remove file name, save and validate.</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhU8JpgI4gqiR6c04wwtJEehukfWLZHHJQmCMmMyDhQuijiTi-qDedFywpk9Ja9QMliPObsGYhDB-oFiFKjRYg5SvbLocWFG1mOqs4sK-U7Yn0rrM9IZJ5Yo5Cexp8cDiPVs4wRNCeucthJ/s855/remove+file+name+from.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="406" data-original-width="855" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhU8JpgI4gqiR6c04wwtJEehukfWLZHHJQmCMmMyDhQuijiTi-qDedFywpk9Ja9QMliPObsGYhDB-oFiFKjRYg5SvbLocWFG1mOqs4sK-U7Yn0rrM9IZJ5Yo5Cexp8cDiPVs4wRNCeucthJ/s320/remove+file+name+from.PNG" width="320" /></a></div></div></div><div class="separator" style="clear: both; text-align: center;">Fig 4: remove file name from Data set</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div><div><div class="separator" style="clear: both; text-align: left;">Now the error will disappear. Lesson learned, look at the error message more than once and locate the correct place to fix the issue.</div></div></div><div><div><div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div></div></div></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-2635979041680311167.post-64168851224140196642020-07-26T13:54:00.000-04:002020-08-30T17:16:14.093-04:00How to deploy SQL Server big data cluster by using Azure data studio?There are different ways to deploy SQL server 2019 big data cluster. 
However, this blog post will use Azure data studio to deploy big data cluster.<div><br /></div><div>The very first thing you need to have is Azure data studio, if you don't have installed Azure data studio then please find it: <a href="https://docs.microsoft.com/en-us/sql/azure-data-studio/download-azure-data-studio?view=sql-server-ver15">https://docs.microsoft.com/en-us/sql/azure-data-studio/download-azure-data-studio?view=sql-server-ver15</a></div><div><br /></div><div><br /></div><div>Step 1: After you open Azure data studio you will find below UI:</div><div><br /></div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhRmmpJR4k-X3rleq1SH29i97hRB_4MauZ0g1RaJ8C04m8OP65zn0IwEiYhLWjTSLsEwUQTEYZ1b2e8_l08khBTg3bbNgi4P4Df0k00s2e65y6do-HWtoAQ47coU89mqi6wMXhPqFs47gSU/s1696/Azure+big+data+cluster_0.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="628" data-original-width="1696" height="148" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhRmmpJR4k-X3rleq1SH29i97hRB_4MauZ0g1RaJ8C04m8OP65zn0IwEiYhLWjTSLsEwUQTEYZ1b2e8_l08khBTg3bbNgi4P4Df0k00s2e65y6do-HWtoAQ47coU89mqi6wMXhPqFs47gSU/w400-h148/Azure+big+data+cluster_0.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;">Fig 1: Azure data studio</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div>Step 2: Choose SQL server big data cluster</div><div><br /></div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgwTqkcHj6J1vckaz-V-ro4jiaQCeVrUdL1xMCakEtE1HQ3BSLGcjPxeY6WtThQx_x4IKHnSQrWrzZDjWns7F7fohLGYVwB93yD-lbwOr3GJIEyxaMEld_ML4kr6ej33ejR2zTuYRiXXzCI/s1516/Azure+big+data+cluster_1.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="964" data-original-width="1516" height="254" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgwTqkcHj6J1vckaz-V-ro4jiaQCeVrUdL1xMCakEtE1HQ3BSLGcjPxeY6WtThQx_x4IKHnSQrWrzZDjWns7F7fohLGYVwB93yD-lbwOr3GJIEyxaMEld_ML4kr6ej33ejR2zTuYRiXXzCI/w400-h254/Azure+big+data+cluster_1.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2: Choose SQL Server Big Data Cluster<span> </span></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div>Step 3: Missing prerequisite</div><div><br /></div><div><ul style="text-align: left;"><li>kubectl</li><li>Azure CLI</li><li>azdata</li></ul></div><div>Please install those.</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5ETuCAJSTjTRpAp4hxAn5Ms1IAKG85u7XhsGP_c7cMxEGsn43hp44OoGZtv1KTT1kgiSzOs5VTaj88VbuD7tz8TNFnbAJDljZQAvIAqLU5lvti3HLPA9Lrkk6eIaWqn-YwMRaJEHXDV0h/s841/Azure+big+data+cluster+prerequisite+tools_2.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="196" data-original-width="841" height="94" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5ETuCAJSTjTRpAp4hxAn5Ms1IAKG85u7XhsGP_c7cMxEGsn43hp44OoGZtv1KTT1kgiSzOs5VTaj88VbuD7tz8TNFnbAJDljZQAvIAqLU5lvti3HLPA9Lrkk6eIaWqn-YwMRaJEHXDV0h/w400-h94/Azure+big+data+cluster+prerequisite+tools_2.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: 
center;">Fig 3: Missing prerequisite</div><div><br /></div><div><br /></div><div><br /></div><div>After prerequisite have been installed you will find below UI</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhs4ScBAiU1ndoc4qk-p3ktCqTnm1ikpEsQJ4BxPh4lH_HUd23OQcCzrT5MNyQ1G8tm6GI9cEW_FUUpLDqmwdw0HTIewDH3w-wFjq3F4tjcWi1YU1Uq4PSd-ywpGNoNh3vCrk8mH-_Odl3A/s1344/After+prerequisite+installed.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="743" data-original-width="1344" height="221" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhs4ScBAiU1ndoc4qk-p3ktCqTnm1ikpEsQJ4BxPh4lH_HUd23OQcCzrT5MNyQ1G8tm6GI9cEW_FUUpLDqmwdw0HTIewDH3w-wFjq3F4tjcWi1YU1Uq4PSd-ywpGNoNh3vCrk8mH-_Odl3A/w400-h221/After+prerequisite+installed.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 4: After prequisite is installed</div><div><br /></div><div><br /></div><div><br /></div><div>Step 4: Deployment Configuration Profile</div><div><br /></div><div><br /></div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhRFx8lGR41Ky32Eooe483CiMiSVSkwhqRQq62DqGlXd1BkguZI7tIBFinb8tqtdMwekqhuBkQ4ua3u7LTA_J9FgbVOqo0sdw_eL5CYz7leqdk1KTlU3n1No1mZtZtbMA1fLisa2IJdFwhp/s1720/Deployment+Configuration+profile+4.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="940" data-original-width="1720" height="274" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhRFx8lGR41Ky32Eooe483CiMiSVSkwhqRQq62DqGlXd1BkguZI7tIBFinb8tqtdMwekqhuBkQ4ua3u7LTA_J9FgbVOqo0sdw_eL5CYz7leqdk1KTlU3n1No1mZtZtbMA1fLisa2IJdFwhp/w500-h274/Deployment+Configuration+profile+4.PNG" width="500" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 5: Configuration profile</div><div><br /></div><div><br /></div><div>Step 5.A: Azure Settings:</div><div> Azure settings where resource group will be created, make sure you have enough permission to create resource group, if you don't have permission then it will fail.</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhJAFZ9yNOlihs3IxYELAlnKGQgDOWgJkm4rhwyj0kZuCd20qqHC5QGWBi4573ret9OQbs193_brXJahNcAlq8rb70MZXOmbNV_sEdNUJB3lSVnkQ3mnQH3GhU9QTM9CClljYwNldfU4F6Y/s1369/Azure+Settings+6.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="762" data-original-width="1369" height="223" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhJAFZ9yNOlihs3IxYELAlnKGQgDOWgJkm4rhwyj0kZuCd20qqHC5QGWBi4573ret9OQbs193_brXJahNcAlq8rb70MZXOmbNV_sEdNUJB3lSVnkQ3mnQH3GhU9QTM9CClljYwNldfU4F6Y/w400-h223/Azure+Settings+6.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 5: Azure settings</div><div><br /></div><div>If failed then the error message will show like below:</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjSpFJ0MXWkCbQ3yGtofw11oWiJXwqMdr0u-_HpLeJ0_8yBKmENIs_ycGhX0royIKELSFJzLaMyjMD9HMbHwktNcUW7tvsab5xfNLceO3nWZGo7BWy1SXPipYNNYitWnu3YkAmZsS4wBGr_/s1441/Azure+resource+group+creation+failed+7.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="439" 
data-original-width="1441" height="121" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjSpFJ0MXWkCbQ3yGtofw11oWiJXwqMdr0u-_HpLeJ0_8yBKmENIs_ycGhX0royIKELSFJzLaMyjMD9HMbHwktNcUW7tvsab5xfNLceO3nWZGo7BWy1SXPipYNNYitWnu3YkAmZsS4wBGr_/w400-h121/Azure+resource+group+creation+failed+7.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 5.1: Error while creating resource group</div><div><br /></div><div>Step 5.B: Cluster Settings: Your setup cluster name and credential which you will require later to connect the cluster, so please keep it safe.</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjGDTyiCcPkxZF0qEqNoN7OGKcOkCN53tfRbLoL3r0F9nlrC1HreJyaaKAkusJGI9s05-Za3_rKpQcdvQbffcB3lFtfUa39AT00fOSBKXlUHl3IuhnxYiUCpbWUrtmh5pcZI39qyvDJ_7_o/s1197/Cluster+Settings+7.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="852" data-original-width="1197" height="285" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjGDTyiCcPkxZF0qEqNoN7OGKcOkCN53tfRbLoL3r0F9nlrC1HreJyaaKAkusJGI9s05-Za3_rKpQcdvQbffcB3lFtfUa39AT00fOSBKXlUHl3IuhnxYiUCpbWUrtmh5pcZI39qyvDJ_7_o/w400-h285/Cluster+Settings+7.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 7: SQL Server cluster settings</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div><br /></div><div>Step 5.C: Service and storage settings: It will fill automatic but please adjust as per your need.</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEitfOFcbL6Ws8Tn2FCTy3cStfj8sxP3GQJTHKzNGAYiIFM6bIkJNd59AigA1ANcU7YGTTAZV8C6G1jN0pVBaMjkjTxOCjDIvCAdieuW3jzRRK8CHpB6Sv9aLlTji10wyrdif5_GSjcSbbCj/s1756/service+and+storage+settings.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="938" data-original-width="1756" height="214" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEitfOFcbL6Ws8Tn2FCTy3cStfj8sxP3GQJTHKzNGAYiIFM6bIkJNd59AigA1ANcU7YGTTAZV8C6G1jN0pVBaMjkjTxOCjDIvCAdieuW3jzRRK8CHpB6Sv9aLlTji10wyrdif5_GSjcSbbCj/w400-h214/service+and+storage+settings.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 8: Service settings.</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div><br /></div><div><br /></div><div>Step 6: Script to Notebook</div><div><br /></div><div>Last step of the wizard, where as soon as click the 'Script to Notebook' it will open Notebook in Azure studio.</div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiyEjR0Pd6Rzp93zirWFBvQQDluEkhduOAeBLbOfxdhjv7vzu-6im_SMChduB8_LnFCc2aOFZSMygIGtyZMG-pcY-bwElmu8RrG5LP0Nb2V6pWiSMIbR0WJIcUMAQqsaWP4rmWgNKbP2zGP/s1714/Script+to+notebook.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1026" data-original-width="1714" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiyEjR0Pd6Rzp93zirWFBvQQDluEkhduOAeBLbOfxdhjv7vzu-6im_SMChduB8_LnFCc2aOFZSMygIGtyZMG-pcY-bwElmu8RrG5LP0Nb2V6pWiSMIbR0WJIcUMAQqsaWP4rmWgNKbP2zGP/w400-h240/Script+to+notebook.PNG" width="400" /></a></div><div class="separator" 
style="clear: both; text-align: center;">Fig 9: Script to Notebook</div><div><br /></div><div><br /></div><div>However, if this is the first time you are deploying big data cluster then python need to be installed, so you will find below UI:</div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiAT39n7kmrk30BtZUXsZc1HtA0yh2C1zbXVO_azRg1Q0E7opC5MMJaSGUXm0Ra7FvqtLw7nUPEj9tm36cH0N6XpQuOTX1QlliayhXq6h2HN0Ww-Hc5fvP80pJ1aA5os6aCBGVHQ8i6KbIQ/s605/Python+installation.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="280" data-original-width="605" height="185" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiAT39n7kmrk30BtZUXsZc1HtA0yh2C1zbXVO_azRg1Q0E7opC5MMJaSGUXm0Ra7FvqtLw7nUPEj9tm36cH0N6XpQuOTX1QlliayhXq6h2HN0Ww-Hc5fvP80pJ1aA5os6aCBGVHQ8i6KbIQ/w400-h185/Python+installation.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 10: install python</div><div><br /></div><div><br /></div><div>When you are done then go and hit 'Run All'</div><div><br /></div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSP_I4f-hBOUCoyKPMF7W10iUUiReJppEZ25sOQ0zDRtj73GjlpwW57Sn-Z8TKE4IbrxmKcP_qenHsWQXOOZAqn4bkarI6LJY9eE8JGzi3qMQNNDhiR5LGQy9P3bthZZooH6B_xm68lcMz/s1020/Notebook+with+python.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="664" data-original-width="1020" height="260" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSP_I4f-hBOUCoyKPMF7W10iUUiReJppEZ25sOQ0zDRtj73GjlpwW57Sn-Z8TKE4IbrxmKcP_qenHsWQXOOZAqn4bkarI6LJY9eE8JGzi3qMQNNDhiR5LGQy9P3bthZZooH6B_xm68lcMz/w400-h260/Notebook+with+python.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 11: Execute the script</div><div><br /></div><div><br /></div><div>You may find error where pandas is still missing</div><div><br /></div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgUdL354urUlBB8a06y_pSrtXzNBOsM3LWSmQ3zbQrmDOdb9KLth2NJ8Kg_OwxlbaRWG1Jwi87LmWdQIgP79WfKZlA6GCkHWa_2gTAYORdQ_0reRJPQbWra7BFt80sDyj-Evfhxd1DwwMHK/s726/Pandas+not+foung.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="360" data-original-width="726" height="199" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgUdL354urUlBB8a06y_pSrtXzNBOsM3LWSmQ3zbQrmDOdb9KLth2NJ8Kg_OwxlbaRWG1Jwi87LmWdQIgP79WfKZlA6GCkHWa_2gTAYORdQ_0reRJPQbWra7BFt80sDyj-Evfhxd1DwwMHK/w400-h199/Pandas+not+foung.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 12: Pandas is missing</div><div><br /></div><div>So you need to install Pandas package from Azure data studio, Go to Package manager</div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggjDl7mezwQJCeB-MXih2i6IiZ4PIs4A6b2ZalxIQpjmZSDwaazUenuwA2PC2mHggxlLRKzX8sh-hjMDj_eDoGPA7sRdhv3CM3_e3hUVGUKjCnE-WO-GqSs7zwaEPDRzLAEX01jcRSpAKj/s1131/Manage+package+and+adding+new+package.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="438" data-original-width="1131" height="155" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggjDl7mezwQJCeB-MXih2i6IiZ4PIs4A6b2ZalxIQpjmZSDwaazUenuwA2PC2mHggxlLRKzX8sh-hjMDj_eDoGPA7sRdhv3CM3_e3hUVGUKjCnE-WO-GqSs7zwaEPDRzLAEX01jcRSpAKj/w400-h155/Manage+package+and+adding+new+package.png" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 13: Finding package manager in Azure data studio</div><div><br /></div><div>And then find pandas under package manager, as soon as you find them , hit the install button.</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjyTQTn7StueuPfwnAI51mT3vqQuAyWYT3RFv5KIZ5KFkvIfE2uQbzwe2urtloJX-8yBwyfzcYi6Hi_7LNpxiSwf9Osl5qZv9p38EqzXM6ge7_v7QY_4ZL3O0eDKl2Xw1vkDeaBx0XaQARM/s1346/Install+pandas.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="774" data-original-width="1346" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjyTQTn7StueuPfwnAI51mT3vqQuAyWYT3RFv5KIZ5KFkvIfE2uQbzwe2urtloJX-8yBwyfzcYi6Hi_7LNpxiSwf9Osl5qZv9p38EqzXM6ge7_v7QY_4ZL3O0eDKl2Xw1vkDeaBx0XaQARM/s320/Install+pandas.PNG" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 14: Install pandas</div><div>When installation is done, then hit the 'Run All' from Notebook again, this time it will successfully start running and you will find , one of the step will login to Azure portal where automatically redirect to the portal and you will find like below UI where you don't need to do anything.</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgOk5Q1YcvAQ9VvohSty6pEhdIrKgna38nzstu3Ffktlyj_HHxHbOcxkushyphenhyphenlgp5wrua7tkI06ChHU8HnIy9dWzORUNZ05Ah-PhvajTSBX7jPGQ7fVPIgGsT5DwOHRKi-HDYQryijju4YO3/s1050/verify+login+for+Azure+08.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="152" data-original-width="1050" height="58" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgOk5Q1YcvAQ9VvohSty6pEhdIrKgna38nzstu3Ffktlyj_HHxHbOcxkushyphenhyphenlgp5wrua7tkI06ChHU8HnIy9dWzORUNZ05Ah-PhvajTSBX7jPGQ7fVPIgGsT5DwOHRKi-HDYQryijju4YO3/w400-h58/verify+login+for+Azure+08.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 15: Azure login</div><div>When deployment is over , your big data cluster should be ready to use, so you must need to connect that. 
You will find all the end points after deployment is completed, please take the end point for 'SQL Server Master Instance Front-End'</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgatIH2aollfqNrPIerGSSG2hYLKbsZcitlqwQkC0o8YPsoupo_Zn4aAXxlFMXni3HvHNRy1e-E4niVeDuEj2M6NaBUWp9bRWczovo-Zxwgm8GsuaoISSisKCFKrX895MsX8MAgMKlYl76k/s1043/Click+to+connect+Master+instance.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="416" data-original-width="1043" height="160" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgatIH2aollfqNrPIerGSSG2hYLKbsZcitlqwQkC0o8YPsoupo_Zn4aAXxlFMXni3HvHNRy1e-E4niVeDuEj2M6NaBUWp9bRWczovo-Zxwgm8GsuaoISSisKCFKrX895MsX8MAgMKlYl76k/w400-h160/Click+to+connect+Master+instance.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 16: Click to connect the Cluster</div><div><br /></div><div>After connecting the big data cluster, it will look like below:</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWAPlPJzfvvKx0M5HAymQCd-rpJY9YWQ-wAfSva-hP6qDvnNRldxB1va5f0nwB2_q1z6QdcJKmo3dW-k2NFmUoZA_WcpK_7TsXRedeFLuoaeY5j10Bs8jfsJx-FeEhAYRyyFWxU_LtjgDC/s1478/Connect+with+big+data+cluster.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="645" data-original-width="1478" height="175" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWAPlPJzfvvKx0M5HAymQCd-rpJY9YWQ-wAfSva-hP6qDvnNRldxB1va5f0nwB2_q1z6QdcJKmo3dW-k2NFmUoZA_WcpK_7TsXRedeFLuoaeY5j10Bs8jfsJx-FeEhAYRyyFWxU_LtjgDC/w400-h175/Connect+with+big+data+cluster.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 17: Connect with big data cluster</div><div><br /></div><div>If you would like to remove the cluster then either you delete it by using azdata command in the command shell.</div><div><br /></div><div><br /></div><div><div><i>Delete cluster:</i></div><div><i><br /></i></div><div><i>azdata bdc delete -n mssql-cluster</i></div></div><div><i><br /></i></div><div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEha62nUApBkzd1Rd-Vjzu7MYp7FiXwCURS5hF9q61cOSJOpXpOwMjEPwqrdPNhcgwlKPnagw5JtjjMxzTyalutMYi5sPK9Y5INW4yhCW4G2idayfVPUQ44w4i7vUKsDEfBQADhVXPUcxGeB/s921/Delete+cluster.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="458" data-original-width="921" height="199" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEha62nUApBkzd1Rd-Vjzu7MYp7FiXwCURS5hF9q61cOSJOpXpOwMjEPwqrdPNhcgwlKPnagw5JtjjMxzTyalutMYi5sPK9Y5INW4yhCW4G2idayfVPUQ44w4i7vUKsDEfBQADhVXPUcxGeB/w400-h199/Delete+cluster.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 18: delete cluster</div><i><br /></i></div><div><i><br /></i></div><div><i><br /></i></div><div>Or you can completely remove the the resource group from Azure portal.</div><div><br /></div>Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-2635979041680311167.post-54877370067660649352020-06-20T15:54:00.000-04:002020-07-27T15:49:53.935-04:00Adding Active Directory users to Azure SQL databasesIf you are working with Azure SQL Database you will find adding user a bit different than general SQL DB, especially; adding Active Directory users. 
Though it's easy and a few steps, off course if you know the steps. While I was working on it and went through different routes so thought about sharing the easy one which worked like Gem.<div><br /></div><div>There are two authentication approach for Azure SQL DB and SQL Managed Instance.</div><div><br /></div><div>1) SQL Authentication</div><div>2) Azure Active directory Authentication</div><div><br /></div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh7r58s2jNgZ4zvvbHQpUtZoKtjbnyBSGdVqQ_Vpof60zoV4BMgfTjNvyV07kpHUE_BTWp9Enytk6NtV-WdkQ-EjYpbnArmJDZiTRHeUOdJnHeZWIQWw_vgKYOdhQkS8Rm6OHjKvypUyDW9/s767/Authentical+type.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="495" data-original-width="767" height="259" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh7r58s2jNgZ4zvvbHQpUtZoKtjbnyBSGdVqQ_Vpof60zoV4BMgfTjNvyV07kpHUE_BTWp9Enytk6NtV-WdkQ-EjYpbnArmJDZiTRHeUOdJnHeZWIQWw_vgKYOdhQkS8Rm6OHjKvypUyDW9/w400-h259/Authentical+type.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1.0: Authentication type in Azure</div><div><br /></div><div><br /></div><div>SQL authentication is straight forward, while you are creating database in Azure, you must have to put SQL server credential which will become your Azure database credential if you don't create database credential separately. As long as you have administrative credential to login to the database then you create all necessary users. However, you <b>can't</b> create Active directory users in Azure by logging as SQL authenticate user.</div><div><br /></div><div>For example, I have logged in with the SQL admin user and then tried to execute below query to add Active directory user:</div><div><br /></div><div><div><i>CREATE USER [user1@domain.com] </i></div><div><i>FROM EXTERNAL PROVIDER </i></div><div><i>WITH DEFAULT_SCHEMA = dbo; </i></div><div><i> </i></div><div><i>--add user to role(s) for the particular database</i></div><div><i>ALTER ROLE dbmanager ADD MEMBER [user1@domain.com]; </i></div><div><i>ALTER ROLE loginmanager ADD MEMBER [user1@domain.com]; </i></div></div><div><br /></div><div>However, found below error:</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_IO5eB5MnVCBZeRSXsTcIrGYVS4C_b92OpewXBmXUnfp8JlOZpGRxUdvqVIS0w48y1u_E81HTracH2KbqciD0jPpjOFUBg-iP4EJe9IlkXW833Dv1qTfYVjjlGxiC0krmuAO10gNAvlXc/s1586/only+established+connection+create+error.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="481" data-original-width="1586" height="121" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_IO5eB5MnVCBZeRSXsTcIrGYVS4C_b92OpewXBmXUnfp8JlOZpGRxUdvqVIS0w48y1u_E81HTracH2KbqciD0jPpjOFUBg-iP4EJe9IlkXW833Dv1qTfYVjjlGxiC0krmuAO10gNAvlXc/w400-h121/only+established+connection+create+error.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2.0: Error creating AD user</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div>The error message is self explanatory: "Principal 'user1@domain.com' could not be created. <b>Only connections established with Active Directory accounts can create other Active Directory users</b>."</div><div><br /></div><div>So, how to solve it? 
Or How to add the AD user to Azure SQL DB ?</div><div><br /></div><div><b>Step 1:</b> Please login to you azure portal and find out your SQL server resource and you will find 'Active directory Admin' left side under settings. Please click 'Active Directory Admin' to find out UI like below where click 'Set admin' , now your Active Directory user account become Admin to the SQL server. </div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjRi4Kah7hy8-TPlkUa6aQlcSm_AVH4390aLTbLcjbyqnbDt1g_o8or6HViUb6-LCuvl71dO6bVxJKbvhOtCUibbULAYfSeEytHnlX87Ae97rogeVnGRp_w-39zGEsWNKvQc5fTzqMYW-PG/s1328/AD+user+as+ADMIN.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="825" data-original-width="1328" height="249" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjRi4Kah7hy8-TPlkUa6aQlcSm_AVH4390aLTbLcjbyqnbDt1g_o8or6HViUb6-LCuvl71dO6bVxJKbvhOtCUibbULAYfSeEytHnlX87Ae97rogeVnGRp_w-39zGEsWNKvQc5fTzqMYW-PG/w400-h249/AD+user+as+ADMIN.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3.0: Active Directory Admin</div><div><br /></div><div><b>Step 2: </b>Now either use SSMS or Azure data studio and login with Active directory authentication. Since my organization have Multi factor Authentication (MFA) enables so I have chosen Azure Active Directoty- Universal with MFA, but you can choose either 'Active directory - Integrated' or 'Active Directory - Password.'</div><div><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgD2IfhpUZ6Bsx40sNChbum1RFHwZOPt-cJMTsQFjWEr5FqjnhHgMOsxrrpnsBGDN4swPR9oD53KXMKC0sEbOdMgCfQuqIF-9t3nq8KDYcC5GsLGEOsLzxFqZvq-_MZPsPS77NZwR6kOoQ_/s731/Login+with+MFA.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="495" data-original-width="731" height="271" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgD2IfhpUZ6Bsx40sNChbum1RFHwZOPt-cJMTsQFjWEr5FqjnhHgMOsxrrpnsBGDN4swPR9oD53KXMKC0sEbOdMgCfQuqIF-9t3nq8KDYcC5GsLGEOsLzxFqZvq-_MZPsPS77NZwR6kOoQ_/w400-h271/Login+with+MFA.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 4: Login with AD user</div><div><br /></div><div><br /></div><div><b>Step 3:</b> And now you can run the query to add Active directory user and give permission as you want, though traditional database level roles like db_datareader, db_datawriter, db_ddladmin, etc. are the same in Azure database, but roles like sysadmin, serveradmin, etc. don’t exist in Azure SQL database. There are two admin roles, <b><i>dbmanager </i></b>(similar to dbcreator) that can create and drop databases, and <b><i>loginmanager </i></b>(similar to securityadmin) that can create new logins which you can see in the below code:</div><div><br /></div><div><i><br /></i></div><div><div><i>CREATE USER [user1@domain.com] </i></div><div><i>FROM EXTERNAL PROVIDER </i></div><div><i>WITH DEFAULT_SCHEMA = dbo; </i></div><div><i> </i></div><div><i>-- add user to role(s) for the particular database</i></div><div><i>ALTER ROLE dbmanager ADD MEMBER [user1@domain.com]; </i></div><div><i>ALTER ROLE loginmanager ADD MEMBER [user1@domain.com]; </i></div></div><div><br /></div><div><br /></div><div>As soon as you execute the above code the AD user got all the necessary access to perform activities. 
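<p>The same pattern works for an Azure AD <b>group</b>, which is often easier to manage than adding users one by one. Below is a minimal sketch; the group name 'Data Engineers' is hypothetical, and the roles are the regular database-level roles mentioned above:</p><pre class="prettyprint lang-mssql">-- 'Data Engineers' is a hypothetical Azure AD group name
CREATE USER [Data Engineers] FROM EXTERNAL PROVIDER;

-- grant the usual database-level roles to the group
ALTER ROLE db_datareader ADD MEMBER [Data Engineers];
ALTER ROLE db_datawriter ADD MEMBER [Data Engineers];</pre><p>Every member of that group can then connect using the Active Directory authentication options shown in Step 2, without a separate CREATE USER statement per person.</p>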
</div><div><br /></div><div><br /></div><div><br /></div><div><span> </span><br /></div>Unknownnoreply@blogger.com0Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-86225317259772581872020-05-23T15:18:00.003-04:002020-07-03T10:32:36.549-04:00Parameterize data source in Power BI and Tricks for source as Salesforce<div dir="ltr" style="text-align: left;" trbidi="on">
Parameters in Power BI can be used for different purposes; one useful case is to hold an environment variable. For example, while you are building a Power BI report the source can be connected to the DEV environment, and as soon as the report is ready to release you may want to connect the source to the PRE-PROD/PROD environment. If you use a parameter, this becomes easy to handle.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">How to create a Parameter?</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">Open the Power Query editor, then find 'Manage parameters' and create a 'New parameter' as shown in fig 1.0 below.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjnR9JCnnSNMeJp2YwypkaVNhxd24ZY_EL72a_2DoEopLz-RCkhIjgZAjnhr7AiTG_QP24aTeWsJkpcz7piJRo7OFm0B_y8Z-z5VXnZCyfSaEpyPQ9vO8G1mcvgsmgXqb4Dgi977o9aatud/s1133/parameter+creation.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="442" data-original-width="1133" height="156" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjnR9JCnnSNMeJp2YwypkaVNhxd24ZY_EL72a_2DoEopLz-RCkhIjgZAjnhr7AiTG_QP24aTeWsJkpcz7piJRo7OFm0B_y8Z-z5VXnZCyfSaEpyPQ9vO8G1mcvgsmgXqb4Dgi977o9aatud/w400-h156/parameter+creation.png" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 1.0: Find New Parameter</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">My source is Salesforce and it has three different environments: QA, UAT and PROD. First provide the name of the parameter (e.g. my parameter name is Environment), then choose Type as 'Text' and for Suggested values choose 'List of values', as shown in figure 2.0 below.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhfjjj8uockg5iwZPv-ZMDfUJ0JSo3TYJWqOYlHoN6YOQHiI82Ukj1xhZvGMYkVc7PROfDHtUOeSUKnaCtc3oWp61GPDSkpAcswIGcoP7f6RIcHkU4nF0C_TW5MI64brtsR51w_P3sHsOrB/s938/Environment+switch.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="938" data-original-width="839" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhfjjj8uockg5iwZPv-ZMDfUJ0JSo3TYJWqOYlHoN6YOQHiI82Ukj1xhZvGMYkVc7PROfDHtUOeSUKnaCtc3oWp61GPDSkpAcswIGcoP7f6RIcHkU4nF0C_TW5MI64brtsR51w_P3sHsOrB/w358-h400/Environment+switch.PNG" width="358" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 2.0: Creating parameter for different environment</div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">Under 'List of values' you list all your source environments, and the Default and Current values need to be set from that list, e.g. 
if you would like to use the QA environment then you need to choose the QA URL for both the default and current value.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWwBSsgIXu7A7G4SMPYiF0Afc8ZJkflmYH9XJrZnWLGEe6hIUOml4zSD1f-DeIpErumuRHOrZynOaY3FiMC9LrifU3EIn_6v1a3yxzS0l1p22Mfb2G8fXw8VM7IGl34YzR6FO9LLCo6N67/s635/Default+and+current+value.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="247" data-original-width="635" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWwBSsgIXu7A7G4SMPYiF0Afc8ZJkflmYH9XJrZnWLGEe6hIUOml4zSD1f-DeIpErumuRHOrZynOaY3FiMC9LrifU3EIn_6v1a3yxzS0l1p22Mfb2G8fXw8VM7IGl34YzR6FO9LLCo6N67/s320/Default+and+current+value.PNG" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3.0: Default and current value</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">As soon as you click the 'OK' button you are done. Yes, you are done with the current value you set up; if you are connected to QA, that works fine. However, as soon as you change the current value from QA to UAT, that also works fine if your source is a database, but not as expected if your source is Salesforce.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><u>What is special about Salesforce as a source?</u></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">After changing the source from QA to UAT and refreshing the model, I got the below error:</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8uFyL949fLgpQA9uLdAK46ADKhngM0cqNSoCWTuyIIfsFcO-TGMU8H7YZIc4ajs0Z-PH3DfqTQXH1RAZB8hZeHGBZPCTP9wqNhb-Z-ZClfT7Iy0e29TK0_aT3Tpt0Sr75ZrVLt4h6DrvZ/s771/Error+after+manually+change+data+source+to+environment.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="371" data-original-width="771" height="193" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8uFyL949fLgpQA9uLdAK46ADKhngM0cqNSoCWTuyIIfsFcO-TGMU8H7YZIc4ajs0Z-PH3DfqTQXH1RAZB8hZeHGBZPCTP9wqNhb-Z-ZClfT7Iy0e29TK0_aT3Tpt0Sr75ZrVLt4h6DrvZ/w400-h193/Error+after+manually+change+data+source+to+environment.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 3.1: Error after changing environment</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><u>How to fix it, or what do you do after the environment update for a Salesforce source?</u></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">If you run into the same situation, go to the Power Query editor and open the source query (as in figure 4.0 below) for each table, then validate it by pressing Enter or by clicking the validation button (the tick mark shown in the figure below). You will find the error disappears. 
Power Query can now identify the correct environment to connect to.</div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjv8Fidq3ZGJ1QARqTkjz7dNNacagwwBDUsrhlQ9MpX-u5j6VvsKSZyGEo7yzm3es0vS2b42VPUrqwoYOfbAXJUdQLnEQGHfBBRkwSjLi5sHaPJYhEYbUZqv4YwBbvcYCfkvDqoInkKhZN9/s1384/How+to+fix+error.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="630" data-original-width="1384" height="183" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjv8Fidq3ZGJ1QARqTkjz7dNNacagwwBDUsrhlQ9MpX-u5j6VvsKSZyGEo7yzm3es0vS2b42VPUrqwoYOfbAXJUdQLnEQGHfBBRkwSjLi5sHaPJYhEYbUZqv4YwBbvcYCfkvDqoInkKhZN9/w400-h183/How+to+fix+error.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;">Fig 4.0: Refresh data source </div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><u>Update the Environment from the Power BI service:</u></div><div dir="ltr" style="text-align: left;" trbidi="on">Since you have created the parameter, you can update the parameter value from the Power BI service as well. First, log in to the Power BI portal and then find your published dataset as shown in figure 5.0 below.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEit490XG_yIoSqd9ASdj-p7DznvG6h_jLEHLK8K3nHnYWYR4iZlFyBjfmlii-r0kYlE3UMQon8RZZx9_IiFpzq4IBd2hprHMk1hZxgr3rwypNMjoz3tzsTeEPRqmve_gbyZv6-U0znIXNcu/s1318/PowerBI+service+settingsPNG.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="391" data-original-width="1318" height="119" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEit490XG_yIoSqd9ASdj-p7DznvG6h_jLEHLK8K3nHnYWYR4iZlFyBjfmlii-r0kYlE3UMQon8RZZx9_IiFpzq4IBd2hprHMk1hZxgr3rwypNMjoz3tzsTeEPRqmve_gbyZv6-U0znIXNcu/w400-h119/PowerBI+service+settingsPNG.PNG" width="400" /></a></div><div class="separator" style="clear: both; text-align: center;">Fig 5.0: Settings under Datasets</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">By clicking the ellipsis (...) 
you will find settings </div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"> <img alt="" height="415" src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAXoAAAIHCAYAAACR9JcSAAAgAElEQVR4Ae29abAVVZq2TXf1UP2no9/u9+3oHx3xdnX0j/71fd1BR9CEQURTEfIhASGE0IIFIhQUCCLKUAWCqIilgKIiKFAgDgUq1UcBQUA4KDKJigjIKMg8g0wyWcD64lrwbNdJcu+zzzn77CHzXhF5cp/MlWu488lrrXzWyswmTZo0cfVZ/uRP/sSFy5/+6Z86lp/85Cd++bM/+zPH8ud//ufuL/7iL/zyl3/5l47lpz/9aWb5q7/6K1fXZePGjY7lo48+0iINUm8Ddj3U9TpKe/yQQ8YmYxXcMoYZ04xxIff4XR9+luCYwoPeBEKsb7/9Vos0kA3IBirOBqKwTz3oTQBr+cKevECvhk42IBuoRBugd2+wN7YZ68JefQl65/W5i6h7jz6sJL+t8ogR9uYRyk6wc859/vnnbs6cOW727NlapIFsQDZQVjYAm2CUMSsEPVyDb8a6KAMrAPYNA71V3Fq8sDeP38tEQ0ABXg2cbEA2UO42UF1d7bkFv0LYG+OMeSHsUwX6aG8+BL315E+dOkXnXkEKSAEpUFYKwCYaoXfeeScW9Ll69akDPT4tWkEgz6i29eitFS+rM6vCSAEpIAUCBYxTcAt+xfXq1aMPplMK9IH16KcUkAIVoYBAH8y3D31TtG7mu4r659WjrwjbViGlgBS4qUA+oLdB2ZCDiXPdhJXjdy7Q80CGXDe6hqSAFKgUBULQw684100c6GFhmcO+brNu8gU9vXmBvlLMW+WUAlIABaKgh2PZZt5EWZg60Jt/XqDXxSMFpEAlKRAHengW9+CUQH9zxo1AX0kmrrJKASkg0N8cjA1bMZtmhM8qnEOvHr0uGCkgBSpRgdpAX8Fz6evvo48DfTiHXj36SjR1lVkKpFeBbKCHa9aZtcFY+Bd2fBPro289ep77as9Jd/LkCXds0yz3S3tY6l/udWOrPnGfrF6rWTfpvWZUcylQcQrkAn3op7dObt6gv32623rJOXdkoRscTFXPu3F4davX8vTywfWd3VO/Hv0/DZzv9p/+zn2350s399XX3AcfzLgJ+tvdhA+3ua82fOo+eO15gb7iTF0FlgLpVUCgj/jox6w87c6c3u7e+cWNB6YyD0t1mulWb9/mvnpnhPuZ5tGn94pRzaVABSpQO+j7u2mrt7mDuxa6h4rpumlQj34SdwH169FP+hzQf+peuvlkbAb0v5rtvti+za2a3lPz6CvQ0FVkKZBmBWoH/fNu7alT7vzBSgL9dIE+zUatuksBKVBTgWSC3nfmrUf//7hfTVvpdh045s6fO3tjOb7ffb3kWferf7Y4TdzABfvdubNn3Nkz9Oi/c9+dujEY+03VIrfi4AG359vdbgeumw1fuk/XrpGPvqYd6T8pIAXKWIFcoB9XvccdPXrU8Trj8+fPu0uXLrkffvjBnVw2yM++GVz9nbt+/TtX/fDPXP83trrTf6SiW9107zWZ7vxw6unqmoOx/9Hfzfr8iDvNQO3NcOnEVjfr/p/VHHS9xXXzczd9Mwddclun/bxm3HgvDRC/18346ibg93zmFrw+3c14/X33xTe27X334E3Y/9dDz7oZ06a7ZTtu+Og/nDLNTXtlqhv7wGNu+EuT3cTXl7pN27e5z+ZOdmNGPynQ29nTWgpIgbJXIBfo7xk20U2ZvNBtA/RHv3BzX33dvTHzDffSkJ/XBP2cNe5HbucG/fRtSHLJHdlc7aremOWqln/rTvtNW93023/sYDcpBOjvfWen78EfXDDA/d8arcH/db+qurHv+OIhvtWw6UTy0Ze9zaqAUkAK1FGBXKC/Mb0yu4/eevSnT19y37472P38XwJQN4nv0U//cI17tkvN3nubOd/6Uh+Zf8+PPfVbQB+mndfvIa76yFl3/shSN7QG5O3gG/vPHf7QjWrSJPOAgEBfRwtSdCkgBcpegYaD/rpzGya5n93C0njQx06Gebja9+przJlvMOg7/8Htwie/asyPrUekkJM/PevOnd3h/tBZoC97S1UBpYAUqLcChQD91letkxyus4H+Z+7nQye5quVfuq3bjrjT5390+rhtfrbMDS43GPT933cHz511uG1iW5cmTdyDfgB2n3u/v0BfbwvSgVJACpS9Ag0HPYOxIeDtdwzo/2WwW3jgJtj/eMmd3rPVbf282lV9sPWGj7+goO9zA/S5evRPraJHv9H9vqVAXyhLXb9+vWvatKmbN29eoZIseDrMKnjyySfdgAED/EyDgmdQ5gmePn3ade/e3Y0YMcLPsCjz4qp4BVCgmKD/5QdHfIm/feOemq6em66bwvbom4xxq77Lz0ePDz8pg7EbN250LVq0cEOHDnUXLlwogInULQmB3rmJEyf6xo4GL7qUA1wF+rrZdBJiFxP0N2bc2Kwc6/k3cU0mfHlDyoL16P37cW64Zpg7n2vWzTdv3+tdO0kA/dWrV92ECRNc3759Xbdu3dzOnTuLbqOVAPrGFgXQt27d2r3wwgtuypQpNZb58+f7OcqNXYZc6Qv0udRJ5r7aQd/fzf2W6ZXV7onIKxBs1k2+rpsboD/iFvYIIP8vg131iZvaFgr0E76E3U1ck38e4BZ8c/MhqXAe/bYb8+hPfPGS+/9uDtAmAfTHjx939957r1uxYoXv0c+YMaPoVivQ3+jR4xoBqOUYBPpyPCuNW6baQf/PbvyKGw9MHV0/173x1ifui/+JPjAVgDszseVWH/3Pp930xV864r78YJab9cEa9+1p5y4dOXJjLn1O0Of/wNTg5advgt7DvpN7quozd/D4TeAzE+fATrdqWm/3/2YKmwzXzZIlS1yPHj3cd9995+bMmeN79mfPnm1cC4qkLtAL9BGT0L9loEDtoP8z95M2T7qFW05knozd9fY9NR+YyncwtsnNJ2htos0fT7tvlz/r2uTlo68v6AOYZ5t9w3br0bO2dzLbF6Z4oKDcPzxy5coV98QTT7hJkya569eve7dNmzZt3OrVq28xMwYj8RWznDlzxs2aNcsRF39y165d3Zo1a3wa4YGkv3jxYt+QNGvWzLH079/fbd16433SFjcKevInbrbBWRonxhQYW8jl2ybO5s2bLRtn5aG8lLtly5buueeey2tw1eof7XVTxrZt27rdu3e7devWuV69evm0W7Vq5V577TV3+fLlTP65flCPaNrR+Js2bfJlfvnllx0uNwvU66mnnnL33HOPfyw93B7Vn0Z9165dFsWvDx486MaMGePTtvPJObh27Vomnnr0GSlS8yMv0P/kJxn2hTzMxc0y2Bd3m5F9W1ixSgQ9/nhgDWgJDMQyIIvPPgQJ+wx
07AeOI0eOdMuXL3dVVVWuffv2HhKAKAxAsHnz5m78+PFu5cqVbtGiRa5Lly6uQ4cObs+ePZmoUdBzR8GYQdwgpDVONnAMtABsuHBnctttt/kGzOrBGt83cAeqNEw0VpS9d+/e7sQJcwZmilXjh9U/CmMD/ZtvvunTee+997wuAwcO9MCfOXPmLQ1gjYRv/pMP6KkDjTINy7ff3nhikMOpO/VaunRpJmneQTJo0CCv/+jRo73+nK+xY8f6BtIi0kBxPmigOD/E4dyiH3WjA0AQ6E2x9KwF+ps9/koHPUA0t42ZL9u48A8cOGCb/NpAR48PYBpA2blt2zY/kBhtIIDpvn37aqRjccOxgCjogQv7o0AjIQDHdiAUFyjn448/fgu8geCdd97pduzYUeMwGgrSC8tTI8LNf6z+caBHk2hjwYueBg8e7Ae4GQepLeQDetLgRVL03J999ll/DiwfevQ0ggTODfuBP41AtkCDyh3WSy+9lDnWjqdB6dixozt06JA/XKDPpmJytwv0CQC99ZoBnPXaMFkDKe6RMBjo6JEfO3Ys3OXvBOg9Dh8+3F28eLHGvug/BgzAZiEKerZzt8EslCjQszVEHEM9iH/77bfXAJzdqZiLyvJlzRv3cFvUVnarfzbQc8cSDVwo3DHRa64toAcNRtxid1yWBo0WdeQOasGCBe6uu+6qcYdk5zB6bu14W+OeofEL7w5sHy4vXF/WUNh5i7vLsmO0TpYCAn0CQI9/Gz8y6zAY0Mw1YvtsexwQbV8Ughx78uRJt2zZMu8u6dOnj88TmIXAiAO9wTmMZ9uidw5WRnNDRO846JXSO42DqG2LK7ulyzpbHWlYssGcfaQfBXWYrv0G9NmmV+7du9ei+TU6cB64i+Dua+7cuTUaaxodxjg2bNhQ47joPzQEVv9sa2toBfqoesn/X6CvcNBzaw8ss13cbKfHuH379ow1G+hC8NpO2xfCkjzeeOMNDxwalFGjRnn3CIODwClMJw70pM1dBfCzuf2Uh//jBosNfvj28U+HwXq4zFEPffnhb6Boro/wWPsdV0f2AcI4F5PtQ8t8QR/qZ/lmW3/99dfu5z//uYc9g+NhyFWmMB6NC+eiuro6qy5y3YSKpeu3QF/hoMdnzMNRDz/8cI0Hc+xBnXHjxnn/bnjrb6ALAW1mb/tCUNnTtgzW1jZ7IxvorSeOu4aA4QHy6PRPXDZvv/22L3N0QJjjGCfARcFslfqGuDqSVi6osq8xQG+zbGhgaJDDQVjKtHDhQu92CWccxdUbPdAlOo4SF1c9+jhVkr1NoK9w0IfTE+NM1aAWDtTatnxBnw2ABu8wnWygtzsP3EgMQvbr18/DPlpmG+DNNsPl3Llz/ljGEej51ydY/cPGjHSy1dP2NQbozUfPXQiwjk6rNP+6NZDZ6otLjfLFjS9EjxHoo4ok/3+BvoJBb9MT43rGoenSGODnNTeJgS4EtMW3fSEEAaDNdbd4gJu7BuASppMN9BxH/vjAgRa9T3PjWJo2c4QZLsxAiQv0+GkEbG5+OPjMb+4+mEueK8TVkfjFBj2zoTp16uQBj572v83CoUw2loFbJqpXWEc7NjpjiDjff/+9++ijjzLRBfqMFKn5IdBXMOhtNkvcDJTQgoEAoMCNA1AMdCGgLb7tC0HPPHmOZ5467hubn83bH9keppML9DY7qF27dv7hrtCPTrmmTZvm53xjlKHP3X7zxC+BefJMJQT29gwAjRnz3bPNPLH6sY6rI9sLCfpsg7Gvvvqqw91mLpuwB09DxWAsjSrTWS0wME2DYA+F0WuPzqPnWGbt8KwD54nnCkhj6tSpvnENZ0YJ9KZsetYCfQWDnp4xsAOuuYJNOwTKQN9AFwLajrd9IejZxxOwBld7UhQXDPHCdHKBnnRsdgiuhjBs2bLFg4w7hGxLWE+mftKzp9EgPhAcNmyYfw4g7OWHedjvbHUsJOiz1cEGe3HZ8CATcA6D3dVEB6IZlH7llVcyM52svjbAShrUGx1p8AA+ZejcubOHfjhVVqAPFU/Hb4G+gkFfaSYKiAA9g8f5PHhUafVTeaVAuSog0Av0RbNNmyFUm6upaAVSRlIgJQoI9AJ9UUyd3jz+Z3zXzKxRkAJSoHgKCPQCfaNa2/79+z3gcdngV2bAlYFXBSkgBYqngEAv0DeqtdkDTgzgMhMknGnTqBkrcSkgBTIKCPQCfcYY9EMKSIFkKiDQC/TJtGzVSgpIgYwCAr1AnzEG/ZACUiCZCgj0An0yLVu1kgJSIKOAQC/QZ4xBP6SAFEimAgK9QJ9My1atpIAUyCgg0Av0GWPQDykgBZKpgEAv0CfTslUrKSAFMgoI9AJ9xhj0QwpIgWQqINAL9Mm0bNVKCkiBjAICvUCfMQb9kAJSIJkKJBb0+Z4u3qrIQmDNx6956RYf67h8+bL/DNuZM2f8+9P5aAdfdeJzdd9++61fTMB881M8KSAFpECxFTBOwS0++rNq1Sr/wSLeJMsX5PiIEB+34ROefJgH/sFB46Mxstjlri2/JrVFsP1WEf4X6E0VraWAFEiSAgK9evRJsmfVRQpIgRgFBPoGgp7vtiIitz0KUkAKSIFyUwA2wah33nnHu5vluqmHj/7zzz/3IlqLqfVs6TFbGug6KD8bqK6uFuhphevjo+c4YG89exl4+Rm4zonOSZptADbBKJtAoh59PXr05XabpvJIASkgBeIUEOgbML0yTlBtkwJSQAqUmwICvUBfbjap8kgBKVBgBQR6gb7AJqXkpIAUKDcFBHqBvtxsUuWRAlKgwAoI9AJ9gU1KyUkBKVBuCgj0An252aTKIwWkQIEVEOgF+gKblJKTAlKg3BQQ6AX6crNJlUcKSIECKyDQC/QFNiklJwWkQLkpINAL9OVmkyqPFJACBVZAoBfoC2xSSk4KSIFyU0CgF+jLzSZVHikgBQqsQAj6pUuXupdeesmNGTPGfzXPvjC1Y8cON27cOLdu3Tr/hT37MBPrcgxF+8JUOVZeZZICUkAKRBUw0A8YMMB16tTJde3a1d17772uqqrKf0rw2LFj7vnnn3cPPPCAu//++x3xeIf9uXPn/Jt9o+mVw/8CfTmcBZVBCkiBslHAQH/fffe53//+9/7VxVOnTvVA37Jli/vwww/dgw8+6PgN3HmdMbCfP3++QF82Z1EFkQJSQArkUMBA//DDD7tPPvnEfxx8/fr17te//rX71a9+5Xr16uVmzZrlLl686D8OfvjwYTds2DC3adMmgT6HrtolBaSAFCgbBQz03bt3d0uWLPGg37Ztm/vyyy/d4sWL3ccff+xOnDjhLl265EHPdhoB/Pfy0ZfNaVRBpIAUkALZFTDQ33333e6tt97KgH7Pnj0e5nxb9vz58xnQv/322+6RRx5xhw4dcvv373cM4L7yyiu+McieS3H3yEdfXL2VmxSQAmWugIF+1KhRbsWKFbWC/vjx4+7JJ5/0bh1cO7h8+vXr5335+VaVO4Fdu3a5F154wT300EOZtEhv7NixvlHJN624eAJ9nCraJgWkQGoVMNDjtlm1apUH/datW/
2agVi2AXdz3Vy9etWdPXvW4d7Bb09ghs5TTz3lLly4UKuOQP6DDz5wffv29Y3Eyy+/7F599VW/jB8/3k2cOLHeoD948KB77rnnnEBf62lQBCkgBdKkgIHePg5Or57efbdu3Rx++x49engoz50714Md0ANrW9AKF86QIUPcxo0ba5WOuNwF4O6h8ShkYBCZuwKBvpCqKi0pIAUqXoEQ9MuXL/fAZs786tWrvY+egdhFixa5/v37u/fff99FQX/t2jU/iMsx06dPd/yfKxiMWRc6WNoCfaGVVXpSQApUtAIh6N944w3fg//oo49cdDB24cKFbsSIEY7BWevN0zvnKVogv2DBAnflypVatTAYc+eQbzh58qR37ZCPjQtQHsvv9OnTfson+9Sjz1fVOsSbN2+ea9q0qffn1eEwRZUCUqBMFAhB/+yzz7rBgwd7F0wU9LwG4Te/+Y0fRDXQ0zDgmwfE+Qbi0mDgvsHVQ1q5AncUI0eO9HcauI8YM5g8ebJ3J+Hb/+Mf/+iB/9VXX/kndlMFelo4/GtAOFxatWrln3Kj9WXKVEODQN9QBXW8FCitAsUGPbX9+uuvfYPSp08fR+Oyb9++WODTY3/xxRfdM888477//vuMUDQOPJnLKxk2b96c2W53C6lx3RjoGUiZMmVKZnn66addu3btPPyBPrdPtbWoGRVjfjQU9AzGvP766/6FSYUemIkprjZJASkQUSAEfTFcN5Y9r1N47bXXfM8c4PMyteidwc6dO33HNM6fzwwb7j5gkIXUgp6pStEA2HlvBS8uatmypX8jXTROvv83FPTWIHErJ9Dnq7riSYHCKRCCPjoYe+TIEcdLzWobjGXgduDAgXkNxkZLDtx/97vfZaZbAncL1dXVNebYmw8+XE+bNs2iexdyKl03caA3VXiE+Z577vGj6cyLrU8Q6Oujmo6RAuWjQAh6/N+NPb0yW81xweC3f/TRR/2AL/GY5WOzfXhFctzCg1cW1KM3JSLrOXPmuGbNmt0ymIpvjPdc8MpSfPz0/HkQgRH3MGQDfT7H0wiF4wf2O7wVY6rW2rVrHbd2lLN58+Zu+PDhjts2BSkgBRqugIG+WA9M5SoxjQw9cnPV8GAV1z4DrfkEgT6LSrt373Zt2rRx7777biYG82Tx6wN3YLxmzRr/9rr27du73r1713inRRzo8z2elphbsw4dOviTuXLlSt9i8w4NAi4m0m/RooUbPXq0Y/97773nB5k5hrIrSAEp0DAFDPTFfAVCthIbqA309PIZcKVDmk+gQaBhSN1gbC7XDcKZjzyMx0uK7rzzTsd0qjAA5rZt27oZM2ZkNseBvi7HW/5xPnpeg3rHHXf4BzfCAWMex+7Zs6efv/vDDz9kyqIfUkAK1F0BA33nzp2L8lIz3ob52Wef3fJgFV4AfPW4b5ifT8Cl/Nhjj/ltoYuGfcTnDuDy5cuZSlMXxgoE+owkN34YaA30vKti6NChbtKkSbfMxgGqPByB68TecREFfV2Pt/yjoOeuYMKECX6+bdwgLY0Nj2hHXUmR6ulfKSAFalHAQF+s1xTjd8c9w8vM4A5z4RlQ5RUK9MaZ+h127GgUeFCKhbi4cvnCFfGjL0A7c+aM9/EL9JGTzoh6ly5dHC8WIuA26dixY6zv3HzoGASAJkRBX9fjs4GeqVe8Ec/yjFtzd4GRKkgBKVB/BQz09KSL8eEROon43umpc40DfSBOx44XpYWQt1odOHDAQ93i00gA/eh0TOLzQjaB3pS7ucYHhg+cFxoROOkAlNeHxo1ws23Dhg2ZR4+joK/r8dlAb9u5e8hWji+++KLGQxSRqulfKSAF8lDAQM+nBJnX/vnnn/sxOj4XGPcpQXrc+pTgt9+WRS/TQGkumbjzjXuEnjwDm7SYBJ5Qwz9vPfy448JtUdDX9XgrZ9R1g28OPzwDRPjiFKSAFGgcBQz09JLz/Tg4ryLQx8HLwJ1gAM0FenrKzKzBPwb0CeYyGTRoUF7vlo6Cvq7HG9CjoLfxANxKuJcUpIAUaBwFDPTc1TORgidUGYvjwSWetWEcjIkZ48aN83fXDH7iXrGlcUrVsFTlurk5Wg2geQUCAxo0ChY4eTNnzvRz1okT+sv4zUuIwjnsUdDX9XgGWoF8XMOC0TF3nqme1hBZOTHC8Ak62661FJACdVMgBD0PTDG1EV959KVmXKt0wLgWuc5tqVtuxYmdOtBH33WDKwTAA1BGrBmljgbeFsfTaMThrXE8Fs3DFExbwq2DYViIgp7tdTkeY2EGDXnxRjrm1TNfnsAMHubPMxBLeSgDZWEQhwenbK6tlUVrKSAF6q6AQH/z9ZnAiCc0aclo0bh14U1qQJI53fi36V3S2zXR6i53YY8w1010tspdd93ln3BlTir1yhYYGadnby9Aw8UzbNiwW0bF40BPmvkeT1x05A12wJt8mBtrAf88b6ljji91IQ7QZ4pVbR84sDS0lgJSILsCxiz7wpR69BUE+uynVXukgBSQAj8qINBXcI/+x9OoX1JACkiB7AoI9AJ9duvQHikgBRKhgEAv0CfCkFUJKSAFsisg0Av02a1De6SAFEiEAgK9QJ8IQ1YlpIAUyK6AQC/QZ7cO7ZECUiARCgj0An0iDFmVkAJSILsCAr1An906tEcKSIFEKCDQC/SJMGRVQgpIgewKCPQCfXbr0B4pIAUSoYBAL9AnwpBVCSkgBbIrINAL9NmtQ3ukgBRIhAICvUCfCENWJaSAFMiugEAv0Ge3Du2RAlIgEQokEvS8Pz6fhc/XsRCXNZ/UOnLkiDt8+LD/wtLevXvd7t27/fvZeQ89n+XjPeomWiIsQJWQAlIg8QoYs6qqqtzChQv9x3/WrFnjP+zDx8H51gZfm+K7G4cOHfIcND4aI/NhajHjNMk3M6uIQJ94O1cFpUCqFRDo1aNP9QWgykuBNCgg0Av0abBz1VEKpFoBgV6gT/UFoMpLgTQoINAL9Gmwc9VRCqRaAYFeoE/1BaDKS4E0KCDQC/RpsHPVUQqkWgGBXqBP9QWgykuBNCgg0Av0abBz1VEKpFoBgV6gT/UFoMpLgTQoINAL9Gmwc9VRCqRaAYFeoE/1BaDKS4E0KCDQC/RpsHPVUQqkWgGBXqBP9QWgykuBNCgg0Av0abBz1VEKpFoBgV6gT/UFoMpLgTQoINAL9Gmwc9VRCqRaAYFeoE/1BaDKS4E0KCDQC/RpsHPVUQqkWgGBXqBP9QWgykuBNCgg0Av0abBz1VEKpFoBgV6gT/UFoMpLgTQoINAL9Gmwc9VRCqRaAYFeoE/1BaDKS4E0KCDQC/RpsHPVUQqkWgGBXqBP9QWgykuBNCgg0Av0abBz1VEKpFoBgV6gT/UFoMpLgTQoINAL9Gmwc9VRCqRaAYFeoK+4C+DSpUvuySefdAMGDHCnTp2qU/lPnz7tunfv7iZOnFin4yxyQ/K2NPJZUy/qRz3Js75h3rx5rm3bto4LXSG9Cgj0CQH9lStX3KpVq9yQIUNcy5YtXdOmTV2zZs1c165d3dSpU92hQ4cSY+UNga1AnxgzUEXqoIBAn
wDQ79271/Xq1cvDHbDTW50yZYp77rnnXOfOnf32+vZgsSUaiSeeeMItXLiwDqZVnlEbCvryrFX2UtW1R08j+vrrr7tx48Y16E4ie4m0pxQKCPQVDvrdu3e7Dh06uHvvvddt2bLFXb9+vYYd8T/b//CHP9TYXpd/1q9f7xsLoFHpQaDPfQZNnxEjRgj0uaWqqL0CfQWD/vz5827w4MGud+/e7sSJE41meAJ9o0nb6AnXtUcv0Df6KSlJBgJ9BYN+9erV7rbbbnMrV66sl/Ew4PfKK6+4Vq1a+R47dwbz5893+PsJdtHj7w+XfAf3Dh486B555BE/ZsB4Qf/+/d3WrVu9a4kBUdIP84lzL1kZwn24F+hxhmn4hJzzZV+8eLHr0aOHH6MgX37v2rUrZ16bN2/2d0aDBg3KOcAbl7dto0xnzpxxs2bNcm3atPGa4Upbs2bNLXda586dczNmzHDt2rXz8TgHuNouXLhQo5xxPWvOG3FtLAa33YYNG1wc1G0bd4kJrwoAACAASURBVH7r1q1zffr08bqQ32uvveYuX75s0vnzEp5n+213clevXnUff/xxRtvmzZv7cwpEFMpbAYG+QkHPRYcftW/fvu7s2bN1tjLuALgTuPvuu11VVZVvLMaPH++4ePHvkz7AByCAi4v+hRde8LD44osv3Pfff58zT6ACTNq3b18j/U6dOnlQhJCOg7klHrfPwBqmQXwACKipw+jRo32dli9f7saOHes2btzok4xLz9xf+dwZxeVt24YOHeoBPHLkSEe+6Er9AfKmTZusSr6cnDf2oS0NAZr369fvlsYvCnorK9pyDI08a9J64IEHbplhY6B/8803/XgNZaJsAwcO9Od02rRp/lxTOBrD6upq3+DRIJA255ExGuyBuGiLnbDvvffe840td3wK5a2AQF+hoKdHCBjGjBnjfvjhhzpZGQB/6qmn3PDhw2sAG3/+3LlzXYsWLRw9XAt1dd3Q8NB7jwMn4AB8IaTj4Gt5x+0zsIZpAKJnn33Wp00e2UI0PWvwGONgULu2EJe3baMxtEbS0tm2bZtr3bq1mzBhQgaoQJK40XLSm6ceBCtnCHo7b9x5AfwwWAMQvdsC9OQVPRfm9uvSpYs7duxYJqm4fNlJHOLS2IfjQJTp4sWLmeP1ozwVEOgrFPR2QYYuDTMx28cFbksIxZ07d3rXQlxPjN5bx44d3bvvvmvJubqC3lxKrKPBoBiWx8qbqy7hvrg0MGQghzskBFE0/zAvwDpq1Cjfg42CM3qc/R+Xt22LQpNjyIO7DBpVA6KBHjdItmDlDEHPeaPRmDNnzi2HUedJkybF9uixgSVLltxyDOcYF1NY97h8OdBA/8wzz2Qao1sS1IayVUCgr3DQ476xXqBZGXDBJUDvkosfd0kI1mXLlmUaAGsIousQrHUFPbClscg2d5+0w/IYXMI8rS5x+wysYRrAE388rqZcwdLjjsYgH9695DqWfXF527YQ5paO7QvLancRuF+Yynjy5EmLnllbOUPQ11ZHc9NwUVtgWxTm4T7Oe9jgx+VLfGwMe0Ljxx9/3O3YscNdu3bNktK6zBUQ6CsU9NwyAyp6i4A9W4gDDRf/7bff7gfvcB/ELTZ4Sbp1BX0U5NGyRfcbXBoC+jjIRfPlf8uLQWyg9eKLL97SUMYdZ9vi9LRtIZRzxWcf4wm4mvB5GzwPHz5sh2XKGaZZWx3j9sdts0zYly/oOQabY7DeBpppvLCdXHdQlpfWpVVAoK9Q0GM2s2fP9v50G2iMMyWDUNij5MGnfHq/ll59QB/nxrD0GgP01Ck6tmD5hWsDPYAHdAD/jTfeyBv2cXrathDKlqftC/W3fawpD716evfceR04cMDvtnKGaVJeGujt27eHSWR+x0E9bpsdwL66gN6OA/grVqzwd2XY0dKlS22X1mWqgEBfwaAHCsCBnn22Xn0caHBVAEUainwC7hAuaMCQTzDoxjVADCI/9NBDNVw3NrBMPYBIGDBQfO9hbz9XneL812F6BlDSC90R1C2fnmlc3rYthLLlafuygd7iMfOGc2JPH1s5wzTtPMT529GNp5fjBmOj2yzPONAzkN6zZ08/fZWy5wrHjx/3cePOW67jtK/4Cgj0FQx6wMTFCoS52HAHRAMDgPiOQ9DYrJi42RsAg0HCELgG23wbBhog0o42QGF5w/IAXMYaAFLoMmI7Ywz0OmsDPQ0d0xvJl0HLbMEAaulxHOXEhbJgwYJaYR8HbtsWQtnyt31hfZnxEvVv00unt874CcHKGabJeWNaZnQGDfFxoTCbKQp17CO6zcoWB3orb9QlyHz70CZIA+2IV5+ZX1YGrYujgEBfwaDHRAAGFyygYuHCY9CMhYdqAB+gBA6AwgJgwF1g87HpUTKAy7z6EC7EN8AADGDIFMz9+/dbUreso0BftGiRn7vN/HLeyAhYQ/CRAO4hQGXz7m2uN71UymRgJq7BKJoGs0e4wyEd6s7gZT7z6HnIicaQ49AlV4jL27ZFdctWVs4XdUJv8mM+Oq6uEOBxoCc9A7rpRB2Z196tWzf3/PPP3wL1uoKec8dgOp2HyZMn+3n15AEo0DbUNV/NcumpfcVRQKCvcNCbmTDDhbdU2kvMgDtPXQLKTz/9tMYTkHZM+OSqxedJ2bg7A55opTdJPJ40DQcOLb1wTQPE9EqeDOUYGhTSBqpAOwppAENjE8bnyc2jR4/6uPmAnvwpe/i0L/AeNmxYZgaQATRMj+NsJkxtsDeoh+W3bfmCnrsWykRe2XS3ckbTRCfeXcRzCsCYNOhRcz7ioB63zc4T+8g/nHXDPs6RDRSTPv54Gns6D/YkL9upA88JUCaF8lZAoE8I6MvbzGqWLg70NWPov/ooUNu01vqkqWOSoYBAL9AX3ZIF+sJLbmMUUd964XNSipWogEAv0BfdbgX6wkqOm4xxk7rMjCpsCZRauSsg0Av0RbdRgb5+kjMW8PTTT/sF/zoDs6zx1+Nr50Vu9OwVpEBUAYFeoI/aRKP/L9DXT2Kmm/J2Sd5SaQO5zLQC9AyYRqc/1i8XHZVEBQR6gT6Jdq06SQEpECgg0Av0gTnopxSQAklUQKAX6JNo16qTFJACgQICvUAfmIN+SgEpkEQFBHqBPol2rTpJASkQKCDQC/SBOeinFJACSVRAoBfok2jXqpMUkAKBAgK9QB+Yg35KASmQRAUEeoE+iXatOkkBKRAoINAL9IE56KcUkAJJVECgF+iTaNeqkxSQAoECAr1AH5iDfkoBKZBEBQR6gT6Jdq06SQEpECgg0Av0gTnopxSQAklUQKAX6JNo16qTFJACgQICvUAfmIN+SgEpkEQFBHqBPol2rTpJASkQKCDQC/SBOeinFJACSVQgkaB/5plnXD6LfX/T4tr/v/3tbx3LmDFj/Hc4H3/8cTdy5Eg3bNgwN2TIEGeiJdEgVCcpIAWSp4Axq6qqyi1cuNB/knLNmjVu/fr1bsuWLW7nzp1uz5497sCBA+7QoUPuyJEj7tixY5nl+PHjrtyWJgbu2tYGdotn/wv0yTN01UgKpFmBRII+
35bHWizi8/vo0aO+JTt8+LA7ePCg27t3r9u9e7fbtm2b27hxo1u3bp3/CLOJlmbDUd2lgBSoHAWMWYnq0Qv0lWOAKqkUkAKNr4BAr8HYxrcy5SAFpEBJFRDoBfqSGqAylwJSoPEVEOgF+sa3MuUgBaRASRUQ6AX6khqgMpcCUqDxFRDoBfrGtzLlIAWkQEkVEOgF+pIaoDKXAlKg8RUQ6AX6xrcy5SAFpEBJFRDoBfqSGqAylwJSoPEVEOgF+sa3MuUgBaRASRUQ6AX6khqgMpcCUqDxFRDoBfrGtzLlIAWkQEkVEOgF+pIaoDKXAlKg8RUQ6AX6xrcy5SAFpEBJFRDoBfqSGqAylwJSoPEVEOgF+sa3MuUgBaRASRUQ6AX6khqgMpcCUqDxFRDoBfrGtzLlIAWkQEkVEOgF+pIaoDKXAlKg8RUQ6AX6xreyPHLga/Rt2rTxX6jPI7qiSAEpUAcFBPoKB/2VK1f8B8sfeOAB17JlS9e0adPMMm/evDqYQmmjCvSl1V+5J1sBgb6CQX/hwgU3evRo16xZM9enTx83efJkN2XKFL8G/AsXLqxo67169aqbP3++Gz58uDt9+nRF10WFlwKlVECgr2DQL1myxLVo0cItX77cXb9+vZR21Ch5X7p0yY0YMcJ1795doG8UhZVoWhQQ6CsY9BMnTkw0BAX6tGBI9WxsBQT6Cgf93Xff7Y4ePZq3nRw8eNCNGTMm48/v2rWrW716tbt27dotaVy8eNHNmjXLde7c2fv9mzdv7oYMGeKOHTvm4+ZqaOL2MWbQtm1bt3v3brdmzRrXvn1717NnT3f27FmHj57xBRtXYB2ON9hv0qW8uKssbrTgdqezcePG6C79LwVSqYBAX8GgX7dunXfd4MM+depUrQYMYDt06OB69erlFi1a5F0+I0eOdLfddpuHZuj+2bt3r7v33nt9gwBcATMAffrpp92+fft8XnEwt0LE7TPQv//++65169Ye5OaWiYL+0KFDbuXKlX7sgTJXV1c76rtr1y7fMPTt29e7dej1h4HB6SeeeMINHTrUMYahIAWkgHMCfQWDnsHKqqoqR0+7VatW7s0333Tnz5+PtWt6zf3793cvvfSSA4YWSGPSpEmuY8eODrgSSGPw4MG+UaBxyBbiYG5x4/YBeqZQAmnATd4WoqBnezbXDQ3SjBkz/N0BBhwG/ueuIVtvP4yr31IgLQoI9BUMejNSYAyYcW8AfCAXwpx4uDvuvPNO37LbcbbevHmzvzOgx0wAurhG6MHnCnEwt/hx+ygXZQTS4d2D5cm+ENDZQE/8nTt3+ruCMD7b58yZ4xuoAwcOWFG0lgKpV0CgTwDosWLAuW3bNjdw4EAP00cffdR9//33GQMHrubnzrY2aM6ePds3CuaiySQS+REHc4sSt89AT0MSDXXp0XMsbhncM8zKMfeNbZswYUKNu4VoXvpfCqRNAYE+IaA3w8UdQq+WHvm0adMywAO8oa+b3nt0MddNHKQt/XCdK17cPkCPWwWji4a6gp7juePA10/vnrB9+3b/P3cvClJACvyogECfMNBzaoH9M888U2Pq5csvv5xXL53jiRv67H80l5q/4mBuMeL2FRr0NEyUk4aNwJ0I/n/GIxSkgBT4UQGBPoGg5/RGQbts2TLvumEmS22BuNwR1NYzxh2E3z/q4snmW68r6BlnGDVqVI0GKyw7DRpuGlw4TDHt16+fh30YR7+lgBTQrBs/J/z48eN+DSyOHDniDh8+7JhvzhRDBjrxfTMnG1fHihUrMlOVSmlAwPStt97KzJQJy0I9mJ/ONEMblGVwslOnTq53797uxIkTYXTvy//oo48y29CjW7duPm6uOfo0Gvj7586dW2NwlamYPLFrUyct4bqCnuNosLp06ZKZu29p2ZrGiJk89OppdMyNY/u1lgJSQKCvaNAzEEnPu0ePHh6IvOfmt7/9rZ/7jj8+nBrJYO2CBQv8VEweVOJBKIA8depUD0qAGgYaNWbwAFHixM2jZ+4+rhKmd7744ot+3jvr+++/3/eyCwF6/PA0JjzkRSPLu2/CgJuGMrRr165GwxbG0W8pkHYF5LqpYNcNPmogzNOtAB8g8hTr9OnTYx+gAvZbtmzxM3OAs8UH+jwFGw3Rp2gB/9ixY2u8d4a7Hx7YIj0WfrMt6joi7fr06C9fvuxmzpzpGy/Sxw8fDTajCJeTghSQArcqINBXMOhvPZ3p20LjBehxNeFyUpACUuBWBQR6gf5Wq6igLTaewNO9QF9BCkiBWxUQ6AX6W62iQrYAdgaCmUvPgLmCFJAC8QoI9AJ9vGWU8db9+/d7wOOy4ata4YNhZVxsFU0KlEwBgV6gL5nx1Tdj5u0zlZLBYQaSbQppfdPTcVIg6QoI9AJ90m1c9ZMCqVdAoBfoU38RSAApkHQFBHqBPuk2rvpJgdQrINAL9Km/CCSAFEi6AgK9QJ90G1f9pEDqFRDoBfrUXwQSQAokXQGBXqBPuo2rflIg9QoI9AJ96i8CCSAFkq6AQC/QJ93GVT8pkHoFBHqBPvUXgQSQAklXQKAX6JNu46qfFEi9AgK9QJ/6i0ACSIGkKyDQC/RJt3HVTwqkXgGBXqBP/UUgAaRA0hUQ6AX6pNu46icFUq+AQC/Qp/4ikABSIOkKCPQCfdJtXPWTAqlXQKAX6FN/EUgAKZB0BQR6gT7pNq76SYHUKyDQC/SpvwgkgBRIugICvUCfdBtX/aRA6hUQ6AX61F8EEkAKJF0BgV6gT7qNq35SIPUKpAb0O3fudFOnTnWDBg1yv/rVr1yfPn3cr3/9a/fOO++4I0eOuGPHjrmjR4/634cPH3YbNmxwzz//vOvbt6/r0aOH69mzpxs+fLiPb6Kl3nokgBSQAhWhgDGrqqrKLVy40FVXV7s1a9a49evXuy1btjj4uGfPHnfgwAF36NChDBPhIsvx48fLbmmyY8eOGoVatmyZ69+/vwf8Qw895IYOHeqBD+yffPJJXzkDPZBftGiR69evn+vVq5d74IEH3MCBAz3wu3bt6n7xi184E60izrAKKQWkQOoVMGYlCvS/+93vMqDfu3evGzlypLv//vvdhx9+WKN1ogVbunSpO3jwYKZHv3r1at8oDB482Ld6u3btctu2bXNfffWVe/vtt919990n0Kf+spEAUqCyFEgk6MeMGeN76dxufPnll+7BBx/0bhqgHd6ChLcl/Oa25ZlnnvGg/+STTxyNxO7duz3oN27c6NatW+fGjh0r0FeWjau0UiD1CiQS9PjeDerffPONh3xcjz4KevxVNAqjR4/2gI8D/ZtvvinQp/6ykQBSoLIUSDzogTkDrvjjWR5//HH30UcfOXzxUdAvX77c+/F79+7tWPDR//KXv/SDsd27d3f46Lt06SLQV5aNq7RSIPUKJB70uGoAOr73Rx55xIOcWTcMyjIwgX/e4hjoGYjFR88MnYcfftgPxjIoywwcGgATLfXWIwGkgBSoCAWMWYkajA1dN1Gf/KZNm9xLL73koQ3wJ06c6KcT0RgY6JlWib8+znWzYsUKgb4iTFuFlAJ
SwBRIFehD6H/22We+p47vftWqVb7Xv3btWj+tkp4/UzQFejMTraWAFKhkBVILenrwL7zwgnfl4LPnf2bYjBgxwm97//33BfpKtmyVXQpIgYwCiQf99u3bM4OvYY+ep8F+85vf+Aei6N0Dep6Mfe+99zzoBwwY4ObNm+eYtcM8eqZXfvrpp+7dd9+V6yZjPvohBaRAJSiQeNDbPHoGUocMGeKfimXN/8zCmTVrVuZxX0CPb/7ll1/2sGfWDfF4MpbBWB6W0qybSjBrlVEKSIFQgcSDft++fe7VV1/1kAfsDMDyOgQefMInD9xt1o2964Z3PaxcudKNGjXKx+ddNyzMxuGBKhMtFFK/pYAUkALlqoAxK1GzbkIXTa7fuGtYoqBnjj3TLjUYW65mq3JJASlQFwUEeoG+LvaiuFJAClSgAgK9QF+BZqsiSwEpUBcFBHqBvi72orhSQApUoAICvUBfgWarIksBKVAXBQT6Cgb96dOnHS9ba9q0aY2lQ4cOftroxYsX62ILiisFpEBCFRDoEwB6pn5OmTLFL5MnT/Zv3QT+vLjt7NmzCTVdVUsKSIF8FRDoEwB6XswWhmvXrrm5c+e6Zs2auSVLloS79FsKSIEUKiDQJxD02PGpU6dct27d/Ns5U2jXqrIUkAKBAgJ9QkFv/vu43j5PBPOUMD3+5s2bu+HDh/sHxAK78F+Hx/3DV7d4kyfv5yd+y5Yt3XPPPefOnDkTRnfXr1/37wQaNmyYj8OxjBUsWLDAXb16NRP30qVL/sVxvDzu5MmTburUqT4+5ZgwYYL7/vvvfXzuRO6++24/9sCaL9aTRzTQoL3yyiuuVatWPi55zp8/3125ciUaVf9LgdQqINAnFPS8swfozZgxI2PcgJIXtbVo0cJ/LpHXPPASNwZ0icvbOy0AeGDNu4D42Apr4vOFLoD/1FNP1YAphtS2bVv/egkgTdxHH33UxwX2Fgz0fLJx6NCh7sUXX3S849/SHTdunP8gDF/zWrRokV94vxANDN8SCMOJEyd82WgIeLSbPMePH+8bL8YswgYmPE6/pUDaFBDoEwh6erO8mA1479mzJ2PTgPKOO+7wX9sKe8e8AqJnz56Oj6r/8MMPPr6BPtoAAM9nn33WtW7d2u3cuTOT9v79+/27gxgfsHD+/Hl/J8A7gs6dO+c3G+hpRBhHsHJYumy/5557/DuILB3eHkp+kyZNysSnjjQ23I1wF2CB9EiXxmzz5s22WWspkGoFBPoEgD6cdYOrhh5u+/bt3YYNGzLGDUhxjeAyAbbRQM8fnz6uEIKBfvbs2dGovucMkNetW3fLvugGysMdA64kgoGeXjrvGQoDPXLSjeZ54cIF/1nHsOw0Mm3atPHlDNPgNy+l69ixo3+ldHSf/pcCaVRAoE8A6IFjuOAyiU6rpEdNzzqMF/2N6wWDIBjo42Bu+3ADheHy5cv+OBoN3DJ33XWXzy9M10BPTzw6z9/SjeZpx4SgX7ZsWc66ULfo+ERYVv2WAmlSQKBPAOgNaPTa+e4tA5O8YpmesAUbnAWwgDRu+eKLLzJuEIMu62iwfSHoee8/QGdQlddAM58fXz35xYE+hLalb+lG84wDPXnffvvtfswhri5s27VrlyWttRRItQICfYJAb5bM4Odtt93mIWg+cHr4+OFpAPKZkZINuuRh+wz0pM0HWqL+cuLSCDUG6BcuXOgHekP3lNVfaykgBWoqINAnEPT0gJnFEg6kMsjKYGucb7ymSdz4z2Ae7V2z1/YZ6DEiYG7/W3qWZ2OAnoFWBlyj/nzLW2spIAV+VECgTyDoOb24LQAswAf8hKVLl/pecNzUQwY3w1k0BvO6gB7fvN1BkB/uE6ZFNgbouYvARRQ2Zr6Szvk7lo8//jivOxc7RmspkGQFBPqEgh7gvv322zXmseOzHz16tB/EBJL40PHpP/bYY963HkK9LqC3Owj888yLZ/YMjQmzgXDpNAbouShpSBiPYCE/Hqpivj+zjuLGAJJ8IatuUiCXAgJ9QkHPSbdeb6dOnfxHz9mGf54nRzt37uyBb4OnPC0bzoGvC+hJlydleWKWHjxp4q/nk4yN5aM3o+aTj4888ojPl5k27dq180/K2jRRi6e1FEizAgJ9BYM+zYaruksBKZC/AgK9QJ+/tSimFJACFalAIkF/7tP/5fJZzq79X47l3Kd/686u/Vt3Zu3futNr/s6dWv137uTKv3NHP/7f7mD1/3HfLvk/bvuCv3cbqv7erZ399/6hIoRTkAJSQApUggICvUBfCXaqMkoBKdAABQR6gb4B5qNDpYAUqAQFEgl63saYz8JLtViIy/ro0aPuyJEjfrYIszn27t3rX93L2xM3btzop/PxSl0TrRJOsMooBaSAFDBm8Tpvniqvrq7205GZXbdlyxb/DA1vuuX15rwUEA4aH42R+TC1mHGa5JuZVUSg14UgBaRAkhUQ6NWjT7J9q25SQAo4l/FCqEcv140uCCkgBRKqgHr06tEn1LRVLSkgBUwBgV6gN1vQWgpIgYQqINAL9Ak1bVVLCkgBU0CgF+jNFrSWAlIgoQoI9AJ9Qk1b1ZICUsAUEOgFerMFraWAFEioAgK9QJ9Q01a1pIAUMAUEeoHebEFrKSAFEqqAQC/QJ9S0VS0pIAVMAYFeoDdb0FoKSIGEKiDQC/QJNW1VSwpIAVNAoBfozRa0lgJSIKEKCPQCfUJNW9WSAlLAFBDoEwr6DRs2uI8++uiW5a233nKPPfZY7NK5c2f3X//1X7HLT3/6U9ekSZN6LX/zN38Tm2br1q1rlGPRokWZ8vLhAwUpIAUKo4BAX2LQA+T33nuvBvByATcKYiBaXwBXwnH/+q//6hsJGqeZM2f6huDSpUuFsX6lIgVSooBAX2TQ81lCQA7AKgG0YRnp1Ucbmlz//9M//VOj1dHuEvr27esbSe5UuIM5ffp0Si5dVVMK5K+AQF9k0N93330Fh59BLwrd9u3b17hTCF02BsY4906xesxh3s8//3ymrGhEXf7zP/+z3lpxPA0qdSYfvoepIAXSqoBAX2TQAzTrJdPjBUjA6JVXXvFAwk9t/nD2KzjHXRC6oBPwbmgD8NBDD/k0pa0USIsCAn2RQZ/LsHA7DBs2LNMQ0AgoZFeABoDeOr57awTQzBpKa1CzrWkwOLZYdzDZa6I9UqBxFRDoywD0DMj+27/9WwbwBiYGaRXqpwCNJo0Ad1D04GkAsg1csx1fP/EVpEASFRDoSwx6fMdxAGLqoULhFeAuAPD/wz/8wy0NKw0s24E+8RSkQFIUEOhLDHoGTK0Hj0+egUj15ItzeTEgHepv58HWNLYCfnHOhXJpXAUE+hKCngFGgwo9ST0k1LjGni11dGcwHPeOnY9w/fTTT2c7VNulQEUoINCXCPRRlw2+ZIXSKwD0GdiNutO401KQApWqQCJB/+GHH7pwqa6udt988407dhPqx48fdyz8b9sOHz7sPvvsM7d48WI/lW/t2rUOcXbv3u1v3zdu3OjWrVvn3n77bb+dffUNzPIIH5hi9odCeSkA8KPPPODK0QNZ5X
WeVJr8FEgk6Js2berilgceeMB9/fXXHvIh6L/44gvXvXt3f8x//Md/OFu6devmVq5cWQP0Q4YMaTDoQ4Dgl5fLJj9jLUUsXDqhG4fZUfLbl+JMKM+GKJBY0Ldp08bdc8897r//+7/9AzYG/l/+8pdu586dmR494L/77rs93Js1a+buuusuv9DLBvjAnt699egbCvrwgSnme5O2QnkrED7EBvQ5b5xHzb8v7/Om0v2oQGJB//vf/z7Tcz969KibO3euH2wD+K+++moG9DNmzPBAv+OOOzx06V3jxvnggw98fGD/0ksvFQT0UWDIL/+jIZb7L551iE7J5H8GauXOKfezp/KlAvS4aQD4E0884d0zI0eOdIcOHXL79+939NCB+W9/+1sfx0C/b98+9+ijj/p9zLtev36999HXt0cf9uTpFTJXW6GyFADo+OlDVw6/GbhlAFfAr6zzmabSpgb0wP6ZZ57xoAfWBw4ccLt27fKuGUDPHQA9fwP9wYMH3eTJkz3oH3zwQff555/XC/SkF4UD/+u2v3IvM3r3cfPvBfzKPadJL3mqQc9MHPz42UCPi4d99QU9rproND3uDhSSoUAu4PPOIg3aJuM8J6EWAn0jgZ4ee+jTBfh64jUJl8ytdQD42R62oufPfgUpUEoFBPpGAn04LY8pebhwFJKtAC9FiwM+s3R4S6aCFCiVAqkG/d69e13//v29e+b111+/xUePT7++rpvwnem4cBTSowAuG1w3UbedYJ8eGyi3mqYG9LxygAemmF7J7Bt62Ay4/PLpxwAAIABJREFUMgMHmA8fPtwP0NpgLP574tu+L7/8Mu/BWPKymRm4bxTSqQDuOz6UYrbAWo1+Om2h1LVOLOinT5/uZ9Uws4ZXFwB0IM9DUcyRtydj/+d//sc/UEUPnCmQW7du9U/PMt2SbSzvvPNOnebRM7faLm5Noyy1iZc+//BJaOxJQQoUW4HEgt6ehI2uJ0yY4HvzBnrcN7/5zW98z5249ODD5de//rWHfL5PxjKXOhyETdOTr6dOnXIDBgxwTz75pKaPRq5kc+Pgr1eQAsVWIBWgpxffs2dP9/7772cgb6DnpWa4cKZOnernuxvkmeuOT3X79u11etdN2HtjYK6xw5UrV9yKFSu8m6lly5b+rsUat3nz5jV29jXSF+hryFHjH95pZHd5NXboHylQBAUSCXogns8Svr2S3zwwxYNUPBUL/OntR99eCVRNtOj5oWGwi5meG776xgwXLlxwo0eP9u6oPn36+Ae8pkyZ4teMLyxcuLDe2TOgOGjQoNipgbn21TvDhB8o0Cf8BJd59YxZVVVVngu80XfNmjX+if8tW7b493/BK/jHWwMYqzQ+ss6Hp8WO0yTfDK0ixDfQ22BsXUGPiwa4G+iZXtnYYcmSJa5FixZu+fLl7vr16wXNjrsB7gx49UM05NoXjav/bygg0MsSSqmAQF+AD4/QOIQXMg/JFCNMnDjRv165Md6xkgvmufYVo96VmEdoH5VYfpW5shUQ6AsA+vAhGT4oUqz32AB6XrGMyynfwJ3KmDFjnPnzu3bt6lavXu2uXbvmk8Ag2rZtW8PXT8+e9/UzIJ1tH40NC/FGjBiR0QAt+J/lzJkzbtasWY5XSJMmeXP7GHc3QjmZ8tq8eXPvmho8eLB3hcU1blevXnUff/yx69Gjh4/LMTwfQV3KJQj05XIm0lkOgb6BoOetheauYWZFY/vlQzNl2iiuG4DIQGhtgfGGDh06uF69evn53Lh8mHZ62223OXrpAPf77793fIjlhRde8DAGzOTDY/zfffdd1n0MCucC/dChQ91zzz3n8yNffIXc+dDgbNq0qUbRyY/t7Cd/Pv4yfvx416lTJw9zGhO7iwHy06ZN8w0CcYjLqyaIE+d2qpFREf/RrJsiiq2sblFAoG8A6IGfQZ51sR+GAXIAkx5sq1at3JtvvunOnz9/y0lmw9mzZ30vl3frA2ULpDFp0iTXsWNHPwhj23O5Z7LtywV6evAMFJOfBQZ1md3ElFfbbuXs3bu3O3HihEX1a2sAQtAzttKlSxffMIV3BtTx4sWLNY4v1T+49sxOeB2GghQotgICfQNAH75+mJ59qQI9dVwbwBTgA+IQ5pQL98ydd94Z687YvHmzvzMApBaywZz92fblAj0wBsphYNYQM3u4IzEoU07uMFhHg7mB4kDP6yqssYgeV+r/w28R8GoEBSlQbAUE+nqCnt6p9dJ4QKpYfvlsBkJvlh7ywIEDPfD5aApuGAt8SYuGINcCwC1kgzn7s+3LBfoQ5pZHHLhnz57tGySmuMaFqI8euHMueFbi8ccfdzt27MiMN8QdX+xt1DF8gE5vsiz2GVB+KCDQ1xP0//7v/54BfTl9EhDwzZkzx4MP37X1cgEk/nnmz9Jzj1uYP2shG8zZn21fLtCHA7SWRxzooyC3uLaO28/dy/z58zODvPT4qV/oyrHji70Oe/PFmo1V7Doqv/JXQKCvB+iBiPXm8bmWujcfNTPgjisjdHG8/PLLOXvK0TSywZx42fYVCvRxbh4rXxzobR/A54E26k0Pf+nSpbarZGtmYZmtqDdfstOQ+owF+nqAfty4cZmLt1y/GBUF4rJly7zbhlkp+QSeqsXNEzdzJdu+QoCetJlJxFTOaGCglpfEhQ1YNA7/8wAcr7wYNWrULWMVcfEba9tbb72VsRMNwjaWyko3HwUE+nqAHohYL61UX43iLgKQhO4WO+HMq6eMvI7ZBmV5tJnpiXGzWfDl89GMMAB4QB/XMGTbVwjQU05cTECawVoLuGG4k6CnHoL+8uXLmTpaXBvk5XmBH374wTYXdc00W5tSia2Uk3uvqEIos7JQQKCvB+h51ayBvpjz5kOLMf824ONBIXrwDEryemXmoANLZuNYAJQLFizwUzFtfjoPK/EyNx5g4vgwGHDvvfde79fH70+PmpBtXyFAHwU6U1Ztvj8zdGgAQtBjwDRgzNGnUSIuA79ogIutFIFzE9qIXk1cirOgPEMFBPp6gP6v//qvPejpsZUy0JsH1DxhCvDpgfOhC97FH/cAFRDlBUbMzGHuvcXnoSSb3mj1IS4NAU/eEo9XOdsc/Wz7CgF68ucpXaZXUi/yZsoo7w7iydqoS4rGhwauXbt2Pi6Atw9zU85SBNx51hHgiVh0UZACpVRAoK8H6O0ilt+1uKaLGwZ3TL9+/dy5c+eKm3meueHKM/tgnabvEeQpkaKVQAGBvo6gDy/kYrxvvgQ2UbZZMsjarVs3x2C4TRstp8IyqyZ8g6kejiqns5Pusgj0dQQ9rxmwHhtuEoXiKMCgMlNEmZFTKt97rprysFr44jKemlaQAuWigEBfR9Dz/VgDvS7mwpsx/mx7ARrv2wfqPC1rc+PDh8AKn3v9UuQuL5xhg0tPfvn6aamjGkcBgb6OoOeBHAM9PTiFwipAz/0Pf/iDf8OmDRjbAOuXX35ZVq83wP8evu8Iu8AmSjUTq7BnQqklSQGBvh6g/8d//McM7HkzoUK6FOCZg/AbBNbws009+XTZQqXUVqCvB+h/8YtfZ
EBfrk/GVooBVlI5gTjPIBjYbc1Ly4rx6chK0kplLS8FBPp6gD581w2+2XJ71015mVgySgPk8b0b3FnjpuGJV53/ZJzjJNdCoK8H6BGNh3jsotc0uiRfIs773EPI07jrlQbJPudJq51AX0/Q8zoBAz1rploy+4JpdgrJUQB/fPg+eSCvt1Am5/ympSYCfT1Bj3Dho+4h9Hk1rXp8lX0JMXMm6o8X5Cv7nKa59AJ9A0CP4QD08GnIEPi8zEo9/Mq6vPC381nI6DllGqVmWFXWuVRpf1RAoG8g6JGSgbqZM2f6Hn74oQmgTy/QXrL1o+z6VW4KAPGnn366hpuG84fbhtdBK0iBSlYgkaDP94TwdkN7wyFr3prIO1R4eRbvOec97bwxkXes8GrenTt3+g9imGjZ8sGvGwW+QZ+51vfdd5/vNdJzpIEgvvy+2dRsnO087MSUSD5kEg60hndkNNCaF984+ivV4ipgzOIDRKtWrfIfFMLbgIuS71fwtlveTssdLfyDg8ZHY2RxS1x7bk1qj3IjhlWE/woJetIDEAA9BEc+v7kDoDHAP0xDgGuIhkAuoHzPanw89EPP8D3x2c4H+kvveB21tTIVEOhvvrO80KA3c6DFBDDAIxtY6roddwLpsdDrZLaP/Mem+I9rGlt67XF3V1HN0ZSGWa8V/lE//UqOAgJ9I4M+aiqAnx46C24bGgFrCBraGAA0wJ9mNxBwx6fOdNcozO1/u2tCdxpJzomCFEiyAgJ9kUGfrzFZYxBtCMJX4Rq44tbWQwV6SQYZdQPW6JTLLUMjSBy5ZPK1QMVLkgICfZmCPh8jo+dus32yDShaI2C92HAgmEbAGpRyc/0AZMrGN2OtseO5BXNZ5YJ6WGe5Y/KxJMVJugICfQWDPmqc9G7xSUdfnWvgy3dNo2FAtTUuIQOura2RyHdNw2LHhmvKa/lE56/nW2aLxx0PM2nIS++giVqI/k+rAgJ9gkAfGrH5qoEeEG0oQA2k5bSmTtSNnn7SXVThudVvKVBXBQT6hII+zhDo8VvvOxwIZuDSetT49ksNcyuL3UXYFFPKLh973JnVNimQWwGBPkWgz20K8Xu5M7DGwdY2+Bm6X+ryG7+7pWXrNM8UildeW6VA4RQQ6AX6wlmTUpICUqAsFRDoBfqyNEwVSgpIgcIpINAL9IWzJqUkBaRAWSog0Av0ZWmYKpQUkAKFU0CgF+gLZ01KSQpIgbJUQKAX6MvSMFUoKSAFCqeAQC/QF86alJIUkAJlqYBAL9CXpWGqUFJAChROAYFeoC+cNSklKSAFylIBgV6gL0vDVKGkgBQonAICvUBfOGtSSlJACpSlAgK9QF+WhqlCSQEpUDgFBHqBvnDWpJSkgBQoSwUEeoG+LA1ThZICUqBwCgj0An3hrEkpSQEpUJYKCPQCfVkapgolBaRA4RQQ6AX6wlmTUpICUqAsFRDoBfqyNEwVSgpIgcIpINAL9IWzJqUkBaRAWSog0Av0ZWmYKpQUkAKFU0CgF+gLZ01KSQpIgbJUQKAX6MvSMFUoKSAFCqeAQC/QF86aCpTS+vXrXdOmTd28efMKlGLDkpk4caLr3r27O336dMMSyuNo8iAv8swnXLp0yT355JNuwIAB7tSpU/kcojgpVECgr2DQGxSAYri0atXKPfjgg27BggXu/PnzFWfWAr1AX3FGW+YFFugTAPoePXq4KVOmZJann37atWvXzsMf6K9YscJdv1nPMrdHXzyBPn/QV8L5VBlLr4BAnwDQx93mA/YtW7a4e++917Vs2dKtW7eu9NaWZwkEeoE+T1NRtDwVEOgTCno7/0ePHnX33HOP69+/vzt79qxtLuu1QC/Ql7WBVmDhBPqEgx6bnDNnjmvWrJkDoGG4du2aW7t2rW8Emjdv7l09nTt3dlVVVe7y5cuZqDNmzHBt27Z1GEsYDh065Dp27OheeOGFGq4hBghHjBjhxowZ43744Qc/iGkDjGfOnHGTJ0/2dxmUiQZo69atYbK+nNkGYxlwfOWVVzKuKe5WHnnkEbd79+4aafDPlStX3OLFix2uLfLKlp8dePDgQZ8WaRJ38ODBbseOHX5gtL6DsQwoox3l4w6L+pI2eaDP4cOHLXu/tnEX7tIuXrzoZs6c6XC/oUfXrl3dkiVL3NWrVzPHmNZWPur81FNPxZ6vTZs2+XzffvvtGucrk5h+JFYBgT4FoAcybdq0ce+++27GkAECMDGgLV++3LGMHDnSbxs1apS7cOGCj08DQbyVK1dmjufHsmXLPID69evnzp07l9lnDYDNmjF4AaAnnnjCjR8/3qdFAwLEOnXq5A4cOJA5PluPngahQ4cOvi4AEHfUrFmzXPv27WPdU+RPA2b5LVq0yHXp0sWnsWfPnkx+/CAtykJaNHTUleMoGw2FgbTGQXn8Y6B/8803faNIudesWePHU8iP+oRlCbV67LHHfGPDeQHwNBIAf9q0aRnYR0FPkXbt2uVB//LLL98Sr5Lu7PKQV1HyVECgTwHoDR6hL3/p0qXutttu81MYw4FafgMVwM6dAOH48eOuW7dubtKkSZmeIL3KcePGuV69enlYbd++PWNyQPP22293ts3yJ03AF+YHUMO8SCQO9LidgFTv3r3diRMnMnnxg14+s4woI2W1AFD37dtn//r1tm3bXOvWrR2NjIVcaVMXet8NAT1w7tmzp8ONFgZ6+JRlwoQJGSCbVhxDg8NdlwUaZzTnmJ07d/rNcaBHX3rtxKO+BHTmnNCrV0ifAgJ9CkFPT33o0KF+sV57aPq2f9CgQb5Xb1APe+7fffedhxeNAj1e672TDhAN4xq8LL0wr2PHjvleduj+iQP96tWrfcPEOi6wPe6uIxrXyhI2ernSBq7chTQU9OgUDaYrDZTNgbfy9e3bN3ZMhcYTYNvdWRzoyccaL+6ijhw54kgvvBOIlkX/J1sBgT4FoDeYcitPMNdK2KuNmjlx8b8Tl4CbBveP+cI3btzo4YePmV4mC+CikQDoYe/f4BXC3PKzffj0gRYhDvSUNSyPHW9reu533nlnjZ46+06ePOnLDtj79OmT8XeH+dWWNsc2BPTh3Y2V19bm2uFCJJgepqfFs7Xtt4YqG+iJzx3NHXfc4R5++GHfKId3O5ae1ulQQKBPAeg3b97sWrRo4YGHWXPSGSAMe+FRc48CyI4B+ATgaDBiG716evkAF79z2POOwinMy/aF4I0DfW2wtXQMgDQ6b7zxhu/l4wtnzIEyMzhL+cL8aku7tv1hfaK/ozrWtj9aj2j86P5coGcgnB49bqAPPvggmpT+T5ECAn3CQQ/w6J0DNxvwzNb7De0eKDJwyd0AwXrqwB23AL12gz69fNKnQcEXHB7HsVE4hfnYvhC8caCP3mGEafDb7lJmz57td3HHQeMW9XPH5QfIo2UO029M0KMzdyI2lmDlI8+4YPW0u7NcoMcfT4+eh+c0CBunZnq2CfQJB70NJob+WWbI4EPHT5/LRx/dD5TouQNiHsTCeAiWHlDFZWPTKu0yygUv21cb6GlU8MGHdwqWPuuonz1bT9pAGea3
cOFC3yjQOEQDjRr+7Ya4buhR0wBGQ3QshP2mR9x4BvupZzgWkQ30oY8evz53cJpWGT0D6flfoE8o6BlEBHa4LYYMGeIBEpo1+wAG63AWjM26YUYOM3PCQI8dYIwePbqG64NjADzzzu+///5bXEIGr7hequ0LwRvXo8fXzoybXLNuwl4r9aJHH8KbuxteFQF4w/y40+GOJJxSSr2pl+nUUNAzK8gGXKNpk4cF04NzwwBueG6YbWQaoAchDvQcwzROzhUXuP0fzsKx/LROhwICfQJATy87fNcNwALwwGLs2LGOh5Sigd4kwCYOgLZ59MzdZu456QHGMOCDJy8Aai4S20+PlcYBmNi0Sttn8GoI6EnL5rozKBzOo7/77rs9qG2gmLjMTQfeNi/enhHgLY9sD0EfBfp7772XeaaA+OiZD+hpoHCVcOdj2gFxysv7h9COtNEK7dE52riYVvToiW/PAHC3RD2Z6okOFuJAH/dgFAOxTPEcPnx47F2cpad1MhUQ6BMAenqo4XLXXXe55557zj84E/YIoyZMrz/65CgzU3haNpy/bccBL3z0gJ7efRjMJRJOq7T9Bq+Ggp70eHqVAUYaMurMmidlw96y5csDVvTyacyI99prr/m57EA7BD3xqS9uEZ4+DdOlkaTcDQE9PWvmvfM2URoZ0mfNw17hE8iUI9SKGU2AmQaBBVcaT+qGIQp6c9ngbopqYs9OUA6FdCkg0Fcw6NNlqpVZW3r05kKpzBqo1ElQQKAX6JNgx2VbB4G+bE9Nqgom0Av0qTL4YldWoC+24sovTgGBXqCPswttK5ACAn2BhFQyDVJAoBfoG2RAOji3AgJ9bn20tzgKCPQCfXEsTblIASlQMgUEeoG+ZManjKWAFCiOAgK9QF8cS1MuUkAKlEwBgV6gL5nxKWMpIAWKo4BAL9AXx9KUixSQAiVTQKAX6EtmfMpYCkiB4igg0Av0xbE05SIFpEDJFBDoBfqSGZ8ylgJSoDgKCPQCfXEsTblIASlQMgUEeoG+ZManjKWAFCiOAgK9QF8cS1MuUkAKlEwBgV6gL5nxKWMpIAWKo4BAL9AXx9KUixSQAiVTQKAX6EtmfMpYCkiB4igg0Av0xbE05SIFpEDJFBDoBfqSGZ8ylgJSoDgKCPQCfXEsTblIASlQMgUEeoG+ZManjKWAFCiOAgK9QF8cS1MuUkAKlEwBgV6gL5nxKWMpIAWKo4BAL9AXx9KUixSQAiVTQKAX6EtmfMpYCkiB4iiQSNAfP37c5bMcO3bMsRCX9dGjR92RI0fc4cOH3cGDB93evXvd7t273bZt29zGjRvdunXr3IoVK5yJVpxTpFykgBSQAg1TwJhVVVXlFi5c6Kqrq92aNWvc+vXr3ZYtW9zOnTvdnj173IEDB9yhQ4c8B42Pxsh8mFrMOE3yzcwqItA3zIh0tBSQAuWtgECvHn15W6hKJwWkQIMVEOgF+gYbkRKQAlKgvBUQ6AX68rZQlU4KSIEGKyDQC/QNNiIlIAWkQHkrINAL9OVtoSqdFJACDVZAoBfoG2xESkAKSIHyVkCgF+jL20JVOikgBRqsgEAv0DfYiJSAFJAC5a2AQC/Ql7eFqnRSQAo0WAGBXqBvsBEpASkgBcpbAYFeoC9vC1XppIAUaLACAr1A32AjUgJSQAqUtwICvUBf3haq0kkBKdBgBQR6gb7BRqQEpIAUKG8FBHqBvrwtVKWTAlKgwQoI9AJ9g41ICUgBKVDeCgj0An15W6hKV6sCfCWoTZs2/stBtUauYwS+RkTa5KFQuQoI9BUM+tOnT7vu3bu7pk2buiVLluRlhStXrnTNmjXzx3G8QuUrINBX/jls7BoI9AkB/dChQ92FCxdy2gv7iUfDQAMh0OeUSzulQGIUEOgTAPq+ffu6Vq1a+Q+Y57JMPnDetm1b16FDB4E+l1DaJwUSpoBAnwDQP/roow7YT5gwwV29ejXWRNnOfnr0Tz31lEAfq5I2SoFkKiDQJwD0EydOdHPmzHGtW7d2O3fujLVUtrMfXz7xo66b69evu23btrlhw4a5li1bevcOPf8FCxbUaDwuXbrkRowY4ZczZ864WbNm+cE63EFdu3Z1a9ascaQVhitXrrjFixe7Hj16+PEBxgj69+/vtm7dGkbzv69du+ZWr17t0yJNyjJ16lR34MABX2bKHg2nTp1yr7zyir+r4RjKPX/+fEe+tQUb5yDdixcvupkzZ2bSoT7oFdd45pNnVKsZM2a45s2bu5dfftkXa968ef4Oa/fu3a66utrdfffdXncabbYRduzY4QYPHux1QwvKd/ny5RrVwkdPvUnPAufgyy+/dAMHDvR5ojn6f/755xbFn6fa4pAmaUcHY+PspV27do46njt3LpMHP0KNsZnJkyf785rNDvIpe40M9E+tCgj0CQE9IARwXGhR0PI/27t16+b27dvnIR0FPYaAWwcAAzcGbblT4GIE9hYMXtwZPPfcc27kyJFu+fLlrqqqyrVv395fwJs2bbLofg0sANz48eN9uosWLXJdunTx5d2zZ08mLkCdMmWKB4uVg7Lwu1evXr58UdCfOHHC9e7d20OSMlBu8iE/0oqDdCbDAELc5Tz22GMeqtTH8gVy06ZNq5FOvnmaVsOHD/f6kxaL1cFAP3bsWPfwww97HTlPuOHuueceXxd0YpudD44H9uE5jgM95+y2227LnB+rT9gY5BMnDvRoitbYBueF8xnqzvlAIwsGejR+4oknMnZgde3UqZNvyC1+PuWyuFrnp4BAnxDQc/HhmgHmx48fr3H2+Z/tAIJeKz3yKOj379/v1q5d6+hRWzh//rwHX79+/TK9NIMXwImClDsC7hqiLiR6+TQwYbC4XOwWaCDotUbTDRsAgyTH0GMHHoD0+++/t2Q8BOfOnetatGjhNm/enNke98MgRH2AV1h/0h83blyNO6W65Glaof2gQYPc3r17awDaIEr5w4F0oAlEWfhtwc4HPfPvvvvONvveNuU3iJMW+ZEu59sCdSMNQj5xiGdlDHv02c4T8blDiNqAaUx9SC9spKyu3JHWpVw+sv7krYBAnxDQc8ZD90xoAVxc9Pbp9Rt8oqAP44e/AWsY146np3ns2LEwalZ41Ih08x+7+A3cuRoqDrE7DovPNuqbbY73oUOHXMeOHd27774bl31mm5UDd8nZs2cz2+3H9u3b3e23355Jpy55mlY0OAyER4NBNIQ5cdAVfYF12ACwb/bs2f7OBj0sRHv0BvGHHnoo00BbXFvnE4e4VkYDfW3nyfaH9mEax9XH6vrCCy/4BiDfclk9tM5PAYE+QaCnt8mtcTjVEniFA7UGnxDeZir4frmg6WWTxl133eVdDbh0DCx2fLS3SBq2Ly7tkydPumXLlnm3RZ8+fTJ+cO4uOM4u8FGjRsX61g0WIehJj55sriWMb/UM15YuPXcgFQ2239KpS5659CAfIEpDZf54y9vyNPjZdjsmPB9si4KeHjNp04PGR8/+aN3yiWP5oa+BHv87d3hjxoxxP/zwQ1i0zG8esgrvpnLVx/aZHeRbrkxm+pG
XAgJ9gkDPGWcgE/fHunXrvAHY/9ajzAYfbrkBCL5tfOIMmOHXBeghWOx4uzBDK7N9IegBzBtvvOGhg+8ZkNOQMDjLXYalYxe8ATVMl99x+4EZvW3W1Ddu2bVrVzSpGv/HpRtGiO6vS56mh9UxTJffpBVqa/ujedr2bMdEQU88G9S2AV7GTzifIfDziUMZQ9DnKpuV08pjjUOuY2xfqFE+5bK8tM5PAYE+YaCnBw+o6dnjj4328A0+IYyt1w/UQ183JgR4QxjZ8eGFaaZm+8K0aWDo3UX939EL3P7Pt2dNnvQc6bVu2LDBilDnteWbrYExF5DNlKlLnqZHnFYUtDFBb0IATZtZA7CjA7nEyxUnCnpspWfPnjl79Nz10ADj9iLk0tj2xWmUq1xWP63zU0CgTxjoOe303BgQ4yLFNRD6gA0+IYwxAmBO/DBwa84tekNAnw1mBlC7wHE70dvP5iu3BiMEMgOtNCL4resbDDRx/mPS5I4oHBStS56mtdUxWsZs2liZwrrasXHHWA86ev7sGNa4xmjIgTSwjgtxcUgz7NGbTcQN+pMmdwzRSQG56mP7smmUb9nj6qNtPyog0CcQ9DbLBghGwWnwiQM9LpVwRgSuENxADQU95TDXEaYHDGwaZXiB00DFzcwAQDQCACeEn9294AKK+rlpOD7++ONYf/+P5v9jb5N8yT+sv02jZLogYwyEuuRpWod1DPOOgzb7DX5hXe24uGOioEdfNAsD27hbshlU+cTh+Cjo2YZdcE6js6PYZ7NuwjuHXPWxfaZRvuUK66bftSsg0CcQ9MAKaANGm7ZmpmDwCUHPtscff9z751988UV/B8BFzDQ+GoqGgJ558oAYHzHuG+aoM/d+wIABNXz0lM+ADnQpD3ci7733ni/Hs88+66eIRuEHdPD9s1BmpnLyEBe+aYOH1T1ubaBh8Pn+++/PzPGmrKQRjnfY8fnmaVpnK0cctMnDyhStK/vijomCnuPpufNcAHqjI3P1mVfP8dhHPnEsv7BHzzZgzLMFnKdwHj3PVaAXjXLY0OSqj+0zjfItF+WDjfAiAAAIOUlEQVRQyF8BgT6BoOf0Mw0QmDOlMgwGnxD07OeJRbtQGZDlNv/w4cMN9tGTNk/AMm4AGADya6+95o4ePerLZxe4lTHb06n0rilzHPwOHjzoHnnkEQ8ZoMQTmjwpy9OrtQUDDelSX+pN/VmAP0+mxoV88jSto3W09OKgzb6wTBbX1nHHREHP3QzurM6dO/vG3gbYw+ck8olDnuQXBT3b8Z+THueV9InDk8QMspN2GHLVx/aZRvmWK0xfv2tXQKCvYNDXfnqTE8N8+typFDIYaOIakELmo7SkQCkVEOgF+lLaX955RwdF8z6wlogCfS0CaXciFBDoBfqyN2TcPAyIhoOihSq0QF8oJZVOOSsg0Av0ZWOfzIfH5zt9+nQ/gMggIoPDNtjKIGihg0BfaEWVXjkqINAL9GVjl7z3hAFhe/WCDaxOmjTJ4aNvjCDQN4aqSrPcFBDoBfpys0mVRwpIgQIrINAL9AU2KSUnBaRAuSkg0Av05WaTKo8UkAIFVkCgF+gLbFJKTgpIgXJTQKCvBfS85pavIfGuFmZ9rFixwr+bHeEUpIAUkAKVoEA20H/xxRduy5Yt/kl6XlfCk/RMfDhy5Ij/AA4TJFh4f1a5LU3yLVBYCX4zX5sK8ig8j7jz6TdAzyP8An0lmLPKKAWkQJwCcaDnIcTUgh7IG+hp4QT6OLPRNikgBSpJgVyg//rrrzM9er4dnYoefTbQ80FkuW4qybRVVikgBUyB2kD/zTffODq2qQR91HXz2WefuU8++UQ+erMeraWAFKgIBQz07777rv8yG18BM9cNPfrUg54PWoSDsQJ9Rdi1CikFpECgQBT01dXV/tsN+OhTA3oblA0HY/FT0aMH9Hz7EteNevSB5einFJACFaNAFPR8kIaP9PAtA0DPWGR01g08NDbmO8GlmPHqPOvGKsM6OutGoK8YW1ZBpYAUyKKAgZ4vtn3wwQf+y2OAnk8/Mr0S1w0dW6ZXMk4JBxMDelqfEPJR0O/bt8/36PnKEB+F/vzzz/1bFU20LJpqsxSQAlKgrBQwZs2dOzcDer4Qlg/oi9lLr0teeffoc4Ee1w0j0PToBfqyslkVRgpIgToqYKDn05CLFi3K9Oh5NXjYo+f5oWiPvi7wLWbceoHeoG+uG0DPbUwIevxZjFSbaHXUWtGlgBSQAiVRwJj1/vvvuyVLlriPP/7Yf/MX0PNAqLluDPS4bcx1U0x41yWvgoGeHj0DFHxoG9cNoMevZaKV5IwpUykgBaRAHRUwZi1YsMB9+OGHHvQ8F/TVV1950DMYi6s68aA3Xz2tmD00RY8e0NPacXtD64c4JprW30qLb6WBroPKsQHcNkyt5OttzCLk1S7MKsRzAejxZCTadRMFvblu7KEpm0vPgKwMu3IMW+dK50o28KMN0Jv/6KOPvNsGDwXTxhmDBPT2VGwiQW++eUDPb/NNUVlgTyuHodiALOJ8+umn/glZWsbFixc7bodYGOhg+pItPIVWVVWlRRrIBmQDjWoDsMYW+MPsGniETx420ZMH8syfX7VqlfdM4LaxgdhwDr0xkHVdfObFjlsnH30IemBvlQT0+Kto5QA97hsGLWgB6dUzNYlXFiPc0qVL/QAH0GeO6sKFC7VIA9mAbKAkNgCDWIA7A68Ank4pPXkgT0eVaZWMO9KBxT+P5wLe0bm1OfSJA30c7Kkslaby9OoRA1FoAfFr8egwsGcWDq9FYBQbIQE/C8Lamt9apIFsQDbQWDbAu2tI2/hja7jEgk8eXjHGCORhGO5oOrB0ZMP30BvozctR7J56vvnVuUcfBb317AE9iw3KAntm4BjsGZwF+IhHK4mQzMphoQEIF9uu9Q19pIN0kA00zAZCvvA71NNYZHBn4BW3M+4aevIGeRuEtdk2eDLMq5EvcEsVr16gD2FvFQ179TbV0nr2jFTzjghaRhagz0JriaAsNALhYtu1vqGPdJAOsoH62UDIlfA3esIgFmMScIdRuJ3ppOKZoCcP5O21B3RoK2UQ1hqWBoOeHr316kPY48JBGARCKHr3iEbriP8eEYF/dGG7FmkgG5ANFNoGoqzh/zAPuMRCx9QAT2cVdw08i3PZlLtvvsGgj/bqgT2gZ7HBWYQx4CMW0Ec4wM/Cb1usMaBB0CINZAOygcawAWOP8Sf837bBKRZm11gvPhx8Nc4BebhnMC3ndb179FapsEcfunGAvQ3QAnzcOSyA3+CPkFqkgWxANlAqGwDk4WJ8glVwywAPz8xdA+grCfKwusGgJ5Eo7K13b+LYQC2iRRdrAGyNuFqkgWxANlBoGzDG2DpMP+SS8crW5qUwwFca5AsG+jjYmxh2m2NiGfxZm5Ba35ixJB2kg2yguDYQ8ij6G4YZv/htS6W4a+CyLQXp0Vti2YBv0A+FMwG1vjGuIR2kg2ygtDZgILfzYP/buhIBb2wuOOgt4ag7x8SKri2e1jdmL0kH6SAbKI4NRFmU7X/Oh3GtUteNBn
oTREZbHKOVztJZNlBYGzCGJWHd6KDPJpKMsrBGKT2lp2ygfjaQjVFJ2l4y0CdJRNXlx0EfaSEtZAPlZwMCfTAyLQMtPwPVOdE5kQ003AYEeoG+4geaBIKGg0AaJltDgV6gF+hlA7KBhNuAQJ/wE6yeWrJ7ajq/Or/52IBAL9CrNycbkA0k3AYE+oSf4Hxae8VRr1A2kGwbEOgFevXmZAOygYTbgECf8BOsnlqye2o6vzq/+diAQC/QqzcnG5ANJNwGBPqEn+B8WnvFUa9QNpBsGxDoBXr15mQDsoGE24BAn/ATrJ5asntqOr86v/nYwP8PTwZDXm/lqfYAAAAASUVORK5CYII=" width="302" /></div><div dir="ltr" style="text-align: left;" trbidi="on"> Fig 6.0: Settings menu</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">Under settings you will see the parameter, where you can update the environment, you can replace UAT by PROD URL and click Apply which will connect to PROD environment and refresh your power BI model.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiufFlAs_iRXTB3w3HGWeUohns1Zun21S9WFvz9i1c0XelNas4kPLLwHgQyxu737WrYy4Thd2sltq8wefEcv6U2SeY5yaI1XgoMQfhEqSwwqV9R_ZRyMciHdnm0Ht7zJHDcpj63vxzb1t5k/s683/Power+BI+service+Environment.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="512" data-original-width="683" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiufFlAs_iRXTB3w3HGWeUohns1Zun21S9WFvz9i1c0XelNas4kPLLwHgQyxu737WrYy4Thd2sltq8wefEcv6U2SeY5yaI1XgoMQfhEqSwwqV9R_ZRyMciHdnm0Ht7zJHDcpj63vxzb1t5k/s320/Power+BI+service+Environment.PNG" width="320" /></a></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div>
Unknownnoreply@blogger.com1Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996tag:blogger.com,1999:blog-2635979041680311167.post-55793992282397174482020-04-26T20:30:00.011-04:002020-05-06T20:49:05.900-04:00Basic differences between SQL and NoSQL.<div dir="ltr" style="text-align: left;" trbidi="on">For around three decades, the relational databases from Oracle, IBM and Microsoft were the consistent leaders for Operational Database Management Systems. However, currently all of these vendors offer a NoSQL platform in parallel with their traditional database platform, e.g. Microsoft has Azure Cosmos DB as its NoSQL database, and so on.</div><div dir="ltr" style="text-align: left;" trbidi="on"><div><br /></div><u><b>Journey of NoSQL:</b></u></div><div dir="ltr" style="text-align: left;" trbidi="on"> </div><div dir="ltr" style="text-align: left;" trbidi="on">The world of NoSQL emerged in 1998 and it gained popularity in the 2010s. In 2015, the popular NoSQL database provider MongoDB appeared in Gartner's Magic Quadrant as a leader for operational database management systems.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto;"><tbody><tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCWvIsNXzn18uYCNFF3Z09hiC9fwX4WY0NgJHP-LQ6WrpX2RKRbndDnXTEzaFaO0jYrEUF6ran43i6e6QPZ7QOE3xogx1FSw9yDOp6ZpaGfsNsJxFiOM4-J3tnihuW0gPl2vvGrqPr5bR_/" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="359" data-original-width="638" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCWvIsNXzn18uYCNFF3Z09hiC9fwX4WY0NgJHP-LQ6WrpX2RKRbndDnXTEzaFaO0jYrEUF6ran43i6e6QPZ7QOE3xogx1FSw9yDOp6ZpaGfsNsJxFiOM4-J3tnihuW0gPl2vvGrqPr5bR_/s320/mongodb-evenings-dc-mongodb-the-new-default-database-for-giant-ideas-18-638.jpg" width="320" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Fig 1: Gartner's Magic Quadrant in 2015<br /></td></tr></tbody></table><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">Recent trends show that NoSQL is becoming more popular in the big data landscape than relational database management systems.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><b><u>What is NoSQL and Why?</u></b></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">NoSQL stands for 'Not only SQL'. A NoSQL database stores both structured and semi-structured data.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><div dir="ltr" style="text-align: left;" trbidi="on">NoSQL stores data in one of four categories:</div><div dir="ltr" style="text-align: left;" trbidi="on"><ol style="text-align: left;"><li>Key-Value storage</li><li>Document storage</li><li>Wide Column storage</li><li>Graph database</li></ol></div>
<div dir="ltr" style="text-align: left;" trbidi="on">The most popular NoSQL database, MongoDB, is a document store; however, it can also be used as a key/value store, which is just a subset of its features.</div><br />
Today data has become much broader in terms of shape and size. Data is spread across documents, social media, complex web applications and IoT sources. A traditional SQL database can't handle this data well because its behavior changes continuously. A traditional database needs to know the shape of the data (the schema) beforehand in order to store it, so it fails to capture continuously evolving data. And thus NoSQL emerged and conquered the world!</div><div dir="ltr" style="text-align: left;" trbidi="on"><u><br /></u></div><div dir="ltr" style="text-align: left;" trbidi="on"><u>Fixed schema vs. No Schema</u><br />
<br />
For a structured database (a traditional SQL database) you need to have the schema ready before you insert data into a table, but for NoSQL you don't need to create a schema first; you can insert the data directly. Though it's better to say 'schema agnostic' than 'schemaless', e.g. Microsoft's NoSQL database Azure Cosmos DB is known as a schema-agnostic database, which is not bound by schemas but is aware of them. The small sketch below illustrates the difference.<br />
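<br /><br />To make the fixed-schema point concrete, below is a minimal illustrative T-SQL sketch (the table and column names are invented for the example). In a relational database the columns must be declared before the first row can be inserted, and a row with an undeclared attribute is rejected; a document store would simply accept a JSON document that carries the extra field.<br /><br /><div><div><i>-- the schema must exist before any data can be stored</i></div><div><i>CREATE TABLE dbo.Customer </i></div><div><i>( </i></div><div><i>CustomerId INT NOT NULL PRIMARY KEY, </i></div><div><i>FullName NVARCHAR(100) NOT NULL, </i></div><div><i>Email NVARCHAR(200) NULL </i></div><div><i>); </i></div><div><i> </i></div><div><i>-- this insert works because it matches the declared schema</i></div><div><i>INSERT INTO dbo.Customer (CustomerId, FullName, Email) </i></div><div><i>VALUES (1, 'Jane Doe', 'jane@example.com'); </i></div><div><i> </i></div><div><i>-- this insert fails, because the column LoyaltyTier was never declared</i></div><div><i>-- INSERT INTO dbo.Customer (CustomerId, FullName, LoyaltyTier) </i></div><div><i>-- VALUES (2, 'John Smith', 'Gold'); </i></div><div><i> </i></div><div><i>-- a document store would simply accept the same record as</i></div><div><i>-- { "CustomerId": 2, "FullName": "John Smith", "LoyaltyTier": "Gold" }</i></div><div><i>-- without any schema change up front.</i></div></div><br />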
<br />
<br /><u>
Relationship vs. No Relationship</u><br /><br />Traditional SQL databases rely on relationships between tables to perform well; that's why normalization exists. NoSQL, on the other hand, doesn't require you to maintain those relationships: you can embed several tables into a single collection in a NoSQL database.</div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on"><br /></div><div dir="ltr" style="text-align: left;" trbidi="on">The table below (fig 2) shows the basic differences between SQL and NoSQL databases:</div><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto;"><tbody><tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWPR_v0wUq7ocZtVrQbZ8hlnX6v5l_FtgPxjPWfO-_mpdZATMfGQ0T2yLogFGjAjeZNT_vzsYnhAUQ5P2LucExpdr2GWtP2wtR4vyRfn4SBnqz2_9v1XoKvXJdgUF3KEGARkE9kCtqTsvh/" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="751" data-original-width="1776" height="169" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWPR_v0wUq7ocZtVrQbZ8hlnX6v5l_FtgPxjPWfO-_mpdZATMfGQ0T2yLogFGjAjeZNT_vzsYnhAUQ5P2LucExpdr2GWtP2wtR4vyRfn4SBnqz2_9v1XoKvXJdgUF3KEGARkE9kCtqTsvh/w400-h169/Compare+SQL+vs+NoSQL.png" width="400" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Fig 2: Comparison between SQL and NoSQL<br /></td></tr></tbody></table><br /><br /><br />Conclusion: NoSQL denotes 'Not only SQL', which means a NoSQL database can do what a traditional relational database can do, as well as more. But these two types of database have different strengths, and you can choose between them based on your business requirements. One way to find out which fits best is to ask the question: do you know the shape of your data well in advance? If the answer is 'Yes', meaning you can define the schema early and build the relationships before the data arrives, then a traditional SQL database is a good choice. On the other hand, when you don't know the shape of the data and its behavior changes continuously, go with NoSQL. Nevertheless, some complex enterprise solutions can be built by using both SQL and NoSQL databases to leverage the best of each.<div><div><br /><br /><br /><div class="separator" style="clear: both; text-align: center;"><br /></div>
</div></div>Unknownnoreply@blogger.com1Toronto, ON, Canada43.653226 -79.383184315.342992163821151 -114.5394343 71.963459836178842 -44.226934299999996