{"id":88717,"date":"2020-10-20T00:30:20","date_gmt":"2020-10-20T00:30:20","guid":{"rendered":"https:\/\/www.red-gate.com\/simple-talk\/?p=88717"},"modified":"2020-10-20T00:42:53","modified_gmt":"2020-10-20T00:42:53","slug":"connecting-log-analytics-using-azure-data-studio-kql","status":"publish","type":"post","link":"https:\/\/www.red-gate.com\/simple-talk\/blogs\/connecting-log-analytics-using-azure-data-studio-kql\/","title":{"rendered":"Connecting to Log Analytics using Azure Data Studio and KQL"},"content":{"rendered":"<p><strong>Azure Data Studio<\/strong> is a great tool and supports far more than just SQL. Recently, <strong>Azure Data Studio<\/strong> added support for the Kusto language, or <strong>KQL<\/strong>.<\/p>\n<p>Kusto is a query language designed for massive amounts of streamed data, perfect for logs. That&#8217;s why Kusto was initially created for <strong>Azure Monitor<\/strong> and <strong>Azure Log Analytics<\/strong>.<\/p>\n<p>However,\u00a0<strong>Azure Data Studio<\/strong> only supports Kusto connections to\u00a0<strong>Azure Data Explorer<\/strong>, not\u00a0<strong>Log Analytics<\/strong>. Microsoft&#8217;s huge focus on <strong>Azure Data Explorer<\/strong> still puzzles me. <strong>Log Analytics<\/strong> is a ready-to-use monitoring solution for cloud and on-premises environments.<\/p>\n<p>On the other hand,\u00a0<strong>Azure Data Explorer<\/strong> is an MPP (Massively Parallel Processing) database solution for massive stream storage &#8211; logs &#8211; allowing anyone to build their own monitoring solution on top of it, but it is not a ready-to-use solution like <strong>Log Analytics<\/strong>.<\/p>\n<p>Considering these differences, the use cases for <strong>Log Analytics<\/strong> seem related to migrations to the cloud and all the monitoring work that follows.\u00a0<strong>Log Analytics<\/strong> can be a fundamental tool in the process of cloud adoption, because it gives numbers to the process. 
These numbers allow IT administrators to do in the cloud what only a few really manage to do in an organized way: keep baselines and compare results after every environment change.\u00a0<\/p>\n<p>On the other hand, software houses creating custom solutions or, in some situations, solution providers managing a large number of clients, may find\u00a0<strong>Azure Data Explorer<\/strong>\u00a0a better solution for them. It puzzles me that one receives more attention than the other. What do you think? Let&#8217;s talk about it in the comments.<\/p>\n<p>Now it&#8217;s time to go straight to the point. Of course, you already noticed how disappointed I was to discover that <strong>Azure Data Studio<\/strong> couldn&#8217;t connect to <strong>Log Analytics<\/strong>. However, with some help from the Microsoft community on Twitter, I was able to find a workaround: we can use <a href=\"https:\/\/mybinder.org\/v2\/gh\/Microsoft\/jupyter-Kqlmagic\/master?filepath=notebooks%2FQuickStartLA.ipynb\"><strong>KqlMagic<\/strong>, a <strong>Python<\/strong> package, to connect to <strong>Log Analytics<\/strong><\/a>.<\/p>\n<p>Yes, <strong>Azure Data Studio<\/strong> is so flexible it supports Python, and we can install Python packages.<\/p>\n<h2>The Challenges and What Doesn&#8217;t Work<\/h2>\n<p>If you are thirsty for the solution, you can skip this part and jump to the solution a bit further down. However, I would like to explain how I reached the solution and what didn&#8217;t work. At the very least, you will find some great links and ideas.<\/p>\n<p>What may appear easy at first is not easy at all. 
Let&#8217;s face the challenges:<\/p>\n<ul>\n<li>The article explaining how to do it only connects to a demo <strong>Log Analytics Workspace<\/strong> maintained by Microsoft<\/li>\n<li>The GitHub repo has a <a href=\"https:\/\/mybinder.org\/v2\/gh\/Microsoft\/jupyter-Kqlmagic\/master?filepath=notebooks%2FQuickStartLA.ipynb\">Get Started With KqlMagic and Log Analytics<\/a> link which redirects you to an online executable notebook with some explanations about the connection string, but it also uses the demo workspace<\/li>\n<li>If you ask KqlMagic itself for help, the information you get is the same as in the notebook, and it doesn&#8217;t help much when trying to connect to your own workspace.<\/li>\n<\/ul>\n<h3>The Existing Documentation<\/h3>\n<p>This is the documentation we have:<\/p>\n<blockquote>\n<pre><code class=\"cm-s-ipython language-python\">%kql loganalytics:\/\/code;workspace='&lt;workspace-id&gt;';alias='&lt;workspace-friendly-name&gt;'<\/code><\/pre>\n<pre><code class=\"cm-s-ipython language-python\">%kql loganalytics:\/\/tenant='&lt;tenant-id&gt;';clientid='&lt;aad-appid&gt;';clientsecret='&lt;aad-appkey&gt;';workspace='&lt;workspace-id&gt;';alias='&lt;workspace-friendly-name&gt;'<\/code><\/pre>\n<pre><code class=\"cm-s-ipython language-python\">%kql loganalytics:\/\/username='&lt;username&gt;';password='&lt;password&gt;';workspace='&lt;workspace-id&gt;';alias='&lt;workspace-friendly-name&gt;'<\/code><\/pre>\n<pre class=\"\"><code class=\"cm-s-ipython language-python\">%kql loganalytics:\/\/anonymous;workspace='&lt;workspace-id&gt;';alias='&lt;workspace-friendly-name&gt;'<\/code><\/pre>\n<p>(1) authentication with appkey works only for the demo.<br \/>\n(2) username\/password works only on corporate network.<br \/>\n(3) alias is optional.<br \/>\n(4) if credentials are missing, and a previous connection was established the credentials will be inherited.<br \/>\n(5) if secret (password \/ clientsecret) is missing, user will be prompted to provide it.<br \/>\n(6) if tenant is 
missing, and a previous connection was established the tenant will be inherited.<br \/>\n(7) a not quoted value, is a python expression, that is evaluated and its result is used as the value. This is how you can parametrize the connection string<br \/>\n(8) anonymous authentication, is NO authentication, for the case that your cluster is local.<\/p>\n<\/blockquote>\n<p>&nbsp;<\/p>\n<p>The numbered items help to exclude the options that don&#8217;t work:<\/p>\n<ul>\n<li>The connection string on the demo only works for the demo<\/li>\n<li>We can&#8217;t use anonymous, for sure<\/li>\n<li>I tried to leave some parameters missing, but it didn&#8217;t work for me; <strong>KqlMagic<\/strong> always complains about the missing parameters<\/li>\n<\/ul>\n<p>I can assure you: the solution is hidden in the documentation above, but it is still difficult to find.<\/p>\n<h3>More Documentation<\/h3>\n<p>Another, closer look at the <a href=\"https:\/\/github.com\/Microsoft\/jupyter-Kqlmagic\">KqlMagic git repo<\/a> reveals this documentation about authentication methods:<\/p>\n<ul>\n<li>\n<blockquote><p>AAD Username\/password &#8211; Provide your AAD username and password.<\/p><\/blockquote>\n<\/li>\n<li>\n<blockquote><p>AAD application &#8211; Provide your AAD tenant ID, AAD app ID and app secret.<\/p><\/blockquote>\n<\/li>\n<li>\n<blockquote><p>code &#8211; Provide only your AAD username, and authenticate yourself using a code, generated by ADAL.<\/p><\/blockquote>\n<\/li>\n<li>\n<blockquote><p>certificate &#8211; Provide your AAD tenant ID, AAD app ID, certificate and certificate-thumbprint (supported only with Azure Data Explorer)<\/p><\/blockquote>\n<\/li>\n<li>\n<blockquote><p>appid\/appkey &#8211; Provide you application insight appid, and appkey (supported only with Application Insights)<\/p><\/blockquote>\n<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<h3>We have a match<\/h3>\n<p>Finally, it&#8217;s just a matter of connecting the dots: among the sample connection strings, the one which makes more sense 
mentions an <strong>AppId<\/strong> inserted as the <strong>clientId<\/strong> key value, even against the advice that App Ids work only for the demo. On the other hand, the list of authentication methods mentioned above includes an AAD Application. We have a match!<\/p>\n<p>I don&#8217;t remember exactly how I found this great link\u00a0about the <a href=\"https:\/\/dev.loganalytics.io\/\">Log Analytics API<\/a>, but it includes information about how to <a href=\"https:\/\/dev.loganalytics.io\/documentation\/1-Tutorials\/Direct-API\">register an application with Azure Active Directory.<\/a>\u00a0The final result is not exactly what we need, because the documentation focuses on direct access to the API. However, we can still follow this link to register the application in Azure Active Directory.<\/p>\n<h2>The Solution<\/h2>\n<p>After many attempts, I finally figured out how to fit the information I have into the connection string. You need to follow the steps below:<\/p>\n<ul>\n<li>Create an <strong>App Registration<\/strong> in <strong>Azure Active Directory<\/strong><\/li>\n<\/ul>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-88761\" src=\"https:\/\/www.red-gate.com\/simple-talk\/wp-content\/uploads\/2020\/10\/Kql1.png\" alt=\"App Registration\" width=\"994\" height=\"501\" \/><\/p>\n<p>&nbsp;<\/p>\n<ul>\n<li>Create a client secret\n<ul>\n<li>The client secret value goes in the connection string, not the client secret name. 
The clientId is not the name of the client secret.<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p>\n<img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-88762\" src=\"https:\/\/www.red-gate.com\/simple-talk\/wp-content\/uploads\/2020\/10\/KQL2.png\" alt=\"App Secret\" width=\"1266\" height=\"573\" \/><\/p>\n<p>&nbsp;<\/p>\n<ul>\n<li>Set the <strong>API Permissions<\/strong> to <strong>Log Analytics Read<\/strong> as an application permission (not a delegated permission)<\/li>\n<\/ul>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-88763\" src=\"https:\/\/www.red-gate.com\/simple-talk\/wp-content\/uploads\/2020\/10\/KQL3.png\" alt=\"API Permissions\" width=\"1266\" height=\"441\" \/><\/p>\n<p>&nbsp;<\/p>\n<ul>\n<li>In the Manifest of the AAD application, copy the appId. The appId is the value for the clientId property in the connection string. This was one of the most challenging discoveries.<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-88764\" src=\"https:\/\/www.red-gate.com\/simple-talk\/wp-content\/uploads\/2020\/10\/KQL4.png\" alt=\"App Id\" width=\"976\" height=\"663\" \/><\/p>\n<p>&nbsp;<\/p>\n<ul>\n<li>In the <strong>Log Analytics<\/strong> workspace, copy the Id of the workspace. The workspace property in the connection string is the Id, not the name. This was another tough discovery.<\/li>\n<\/ul>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-88765\" src=\"https:\/\/www.red-gate.com\/simple-talk\/wp-content\/uploads\/2020\/10\/Kql5.png\" alt=\"Workspace Id\" width=\"608\" height=\"243\" \/><\/p>\n<ul>\n<li>In the <strong>Log Analytics<\/strong> workspace, go to <em>Access Control (IAM) =&gt; Add =&gt; Add Role Assignment<\/em>\n<ul>\n<li>In the next window, give the app registration permission to read the logs. 
You will find the app registration as a user.<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-88766\" src=\"https:\/\/www.red-gate.com\/simple-talk\/wp-content\/uploads\/2020\/10\/Kql6.png\" alt=\"Log Analytics Permissions\" width=\"1266\" height=\"415\" \/><\/p>\n<ul>\n<li>In Azure Active Directory, copy the tenantId<\/li>\n<\/ul>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-88767\" src=\"https:\/\/www.red-gate.com\/simple-talk\/wp-content\/uploads\/2020\/10\/Kql7.png\" alt=\"Tenant Id\" width=\"364\" height=\"364\" \/><\/p>\n<p>This is the connection string you need to use:<\/p>\n<div>\n<div>\n<pre class=\"\"><code class=\"cm-s-ipython language-python\">%kql\u00a0loganalytics:\/\/tenant='&lt;&lt;your\u00a0tenant\u00a0id&gt;&gt;';clientid='&lt;&lt;yourappid&gt;&gt;';clientsecret='&lt;&lt;your\u00a0client\u00a0secret&gt;&gt;';workspace='&lt;&lt;Your\u00a0log\u00a0analytics\u00a0workspace\u00a0id&gt;&gt;';alias='&lt;&lt;workspace\u00a0name&gt;&gt;'<\/code><\/pre>\n<\/div>\n<\/div>\n<div>\u00a0<\/div>\n<div><em>Voila!<\/em> You are connected to <strong>Log Analytics<\/strong> and you can use <strong>KQL<\/strong> to retrieve data and build charts.<\/div>\n<div>\u00a0<\/div>\n<h2>Further Studies<\/h2>\n<ul>\n<li><a href=\"https:\/\/azure.microsoft.com\/en-us\/services\/cognitive-services\/metrics-advisor\/\">Metrics Advisor in Cognitive Services<\/a> has <strong>Azure Data Explorer<\/strong> as its only <strong>Kusto<\/strong> source. It&#8217;s unable to connect to <strong>Log Analytics<\/strong>.\u00a0However, <a href=\"https:\/\/azure.microsoft.com\/en-us\/updates\/azure-monitor-log-analytics-data-export-is-in-public-preview\/\">Log Analytics Export<\/a> is also a feature in preview, so we can export from <strong>Log Analytics<\/strong> to <strong>Metrics Advisor<\/strong>. 
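As an aside, the connection string above is mostly careful string assembly of the values collected in the steps: tenant id, appId (as clientid), client secret value, and workspace id. A minimal Python sketch of that assembly (the helper function is mine, purely illustrative, and not part of the KqlMagic package):

```python
def build_loganalytics_conn(tenant: str, client_id: str,
                            client_secret: str, workspace: str,
                            alias: str = "") -> str:
    """Assemble a KqlMagic Log Analytics connection string for
    AAD-application authentication, from the values collected in
    the Azure portal."""
    parts = [
        f"tenant='{tenant}'",               # AAD tenant id
        f"clientid='{client_id}'",          # appId from the app registration manifest
        f"clientsecret='{client_secret}'",  # client secret value (not its name)
        f"workspace='{workspace}'",         # workspace id (not its name)
    ]
    if alias:
        parts.append(f"alias='{alias}'")    # optional friendly name
    return "loganalytics://" + ";".join(parts)

# Placeholder values only; substitute your own ids and secret.
print(build_loganalytics_conn("my-tenant-id", "my-app-id",
                              "my-client-secret", "my-workspace-id",
                              alias="myworkspace"))
```

In a notebook you would then paste the resulting string after the %kql magic, as shown in the connection string above.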
There is a lot to discover here.<\/li>\n<li><strong>Azure Data Explorer<\/strong> has a <a href=\"https:\/\/docs.microsoft.com\/en-us\/azure\/data-explorer\/query-monitor-data\">proxy which enables Data Explorer to connect to Log Analytics<\/a>. <strong>Log Analytics<\/strong> becomes an additional node in the cluster. I think this solution may be useful in two situations:\n<ul>\n<li>You are already using <strong>Azure Data Explorer<\/strong> for something else and you would like to keep everything in a single place.<\/li>\n<li>You are a solution provider managing hundreds of different <strong>Log Analytics<\/strong> services for different clients. The best architecture would be to create a single place where you can query everything. This second scenario seems very interesting. If I&#8217;m not mistaken, we would need to use <a href=\"https:\/\/docs.microsoft.com\/en-us\/azure\/lighthouse\/concepts\/cross-tenant-management-experience\">Azure Lighthouse<\/a> as well.<\/li>\n<\/ul>\n<\/li>\n<li><strong>Metrics Advisor<\/strong> can connect to <strong>Azure Data Explorer<\/strong>. So, if we create an <strong>Azure Data Explorer<\/strong> cluster and configure the proxy, we can use <strong>Metrics Advisor<\/strong> to analyze <strong>Log Analytics<\/strong> data through the proxy. 
This seems very interesting.<\/li>\n<li>By configuring the proxy, we can use the regular Azure Data Studio Kusto extension for Data Explorer and query <strong>Log Analytics<\/strong> data as well.<\/li>\n<\/ul>\n<h2>References<\/h2>\n<ul>\n<li><a href=\"https:\/\/docs.microsoft.com\/en-us\/sql\/azure-data-studio\/what-is?view=sql-server-ver15?WT.mc_id=DP-MVP-4014132\">Azure Data Studio<\/a><\/li>\n<li><a href=\"https:\/\/docs.microsoft.com\/en-us\/azure\/data-explorer\/?WT.mc_id=DP-MVP-4014132\">Azure Data Explorer<\/a><\/li>\n<li><a href=\"https:\/\/docs.microsoft.com\/en-us\/azure\/data-explorer\/kusto\/concepts\/?WT.mc_id=DP-MVP-4014132\">Kusto<\/a><\/li>\n<li><a href=\"https:\/\/docs.microsoft.com\/en-us\/azure\/azure-monitor\/overview?WT.mc_id=DP-MVP-4014132\">Azure Monitor<\/a><\/li>\n<li><a href=\"https:\/\/docs.microsoft.com\/en-us\/azure\/azure-monitor\/learn\/quick-create-workspace?WT.mc_id=DP-MVP-4014132\">Azure Log Analytics<\/a><\/li>\n<li><a href=\"https:\/\/mybinder.org\/v2\/gh\/Microsoft\/jupyter-Kqlmagic\/master?filepath=notebooks%2FQuickStartLA.ipynb\">KqlMagic<\/a><\/li>\n<li><a href=\"https:\/\/azure.microsoft.com\/en-us\/services\/cognitive-services\/metrics-advisor\/?WT.mc_id=DP-MVP-4014132\">Metrics Advisor<\/a><\/li>\n<li><a href=\"https:\/\/docs.microsoft.com\/en-us\/azure\/data-explorer\/query-monitor-data?WT.mc_id=DP-MVP-4014132\">Azure Data Explorer Proxy<\/a><\/li>\n<li><a href=\"https:\/\/azure.microsoft.com\/en-us\/updates\/azure-monitor-log-analytics-data-export-is-in-public-preview\/?WT.mc_id=DP-MVP-4014132\">Log Analytics Export<\/a><\/li>\n<li><a href=\"https:\/\/docs.microsoft.com\/en-us\/azure\/lighthouse\/overview?WT.mc_id=DP-MVP-4014132\">Azure Lighthouse<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Azure Data Studio is a great tool and supports way more than only SQL. Recently, Azure Data Studio included the support to Kusto language, or KQL. 
Kusto is a query language used for massive amounts of streamed data, perfect for logs. That&#8217;s why Kusto was initially created for Azure Monitor and Azure Log Analytics. However,\u00a0Azure&#8230;&hellip;<\/p>\n","protected":false},"author":50808,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[2],"tags":[126253,124955,126250,124954],"coauthors":[6810],"class_list":["post-88717","post","type-post","status-publish","format-standard","hentry","category-blogs","tag-azuredatastudio","tag-dataplatform","tag-kql","tag-loganalytics"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/posts\/88717","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/users\/50808"}],"replies":[{"embeddable":true,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/comments?post=88717"}],"version-history":[{"count":23,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/posts\/88717\/revisions"}],"predecessor-version":[{"id":88785,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/posts\/88717\/revisions\/88785"}],"wp:attachment":[{"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/media?parent=88717"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/categories?post=88717"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/tags?post=88717"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/coauthors?post=88717"}],"curies":[{"name":"wp",
"href":"https:\/\/api.w.org\/{rel}","templated":true}]}}