Understanding your Azure EA Billing Data and Building a Centralized Data Storage Solution

Many organizations are moving to Azure and other cloud providers. Understanding how resources are being used and what is spent is very important. In this article, Feodor explains his solution for automating the collection of the Enterprise Agreement billing from Azure into an Azure SQL Database where it can be analyzed.

The series so far:

  1. Understanding your Azure EA Billing Data and Building a Centralized Data Storage Solution
  2. Azure EA Financial Reporting and Granular Access to Data Through the Enterprise
  3. Building a Collaborative Platform to Speed up Data Analysis

With the current technology shift from data centers to cloud solutions, many companies are investigating the Enterprise Agreement (EA) model for using the Microsoft Azure platform. Even though one of the advantages of the EA is the preferential pricing, there is still room for resource cost optimization, based on an understanding of the internal business demands. In this article, I’ll dive into centralizing the EA billing data, showing how to set up an Azure SQL Database to store the EA billing data, and a Web App with an associated Web Job in Azure to import new data on a schedule. A subsequent article will explore reporting and data analysis.

The Enterprise Agreement

Very often, it makes sense for bigger customers requiring a cloud solution to enter into an Enterprise Agreement (EA) with Microsoft for using the Azure platform. There are many benefits to the EA such as:

  • The prices are better

  • It can make it easier to procure new cloud-based resources

  • Azure usage can be managed with subscriptions, to allow organizations to ascribe costs to departmental budgets, and do internal cross-charging, via individual ‘cost code’ tags.

  • It can challenge the entire cost model for traditional data centers.

Alongside these benefits, the EA introduces some challenges, including:

  • How do we control resources added to the cloud?

  • How do we get an overall view of all Azure resources?

  • How do we know that we are getting good value for money?

  • How do we know that the resources are being used effectively?

There are many other questions that we, as a business customer, might want to ask, and that can be answered more easily with access to the billing data. However, this billing data is available only via an API or a CSV download, so if we want to mine this data to see how costs add up, we need to get it into a form that is easily imported into a system and that makes for simple, turn-key data mining within the business.
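As a sketch of what the API route looks like, the EA usage data can be pulled from the consumption endpoint, with the enrollment number in the URL and the API key passed as a bearer token. The enrollment number and key below are placeholders for illustration:

```python
import urllib.request

# EA billing (consumption) API base endpoint.
BASE_URL = "https://consumption.azure.com/v2/enrollments"

def usage_details_url(enrollment_number: str) -> str:
    """Build the usage-details request URL for an EA enrollment."""
    return f"{BASE_URL}/{enrollment_number}/usagedetails"

def fetch_usage_details(enrollment_number: str, api_key: str) -> bytes:
    """Download the usage details; the EA API key goes in a bearer header."""
    request = urllib.request.Request(
        usage_details_url(enrollment_number),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(request) as response:
        return response.read()
```

The solution in this article wraps calls like this in an API App and a scheduled Web Job, rather than calling the endpoint ad hoc.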

Some vendors have been exploring the option of providing a SaaS application, which would allow us to hand over our EA Billing API key and receive some analysis in return. This would not appeal to the average risk-averse customer, who would consider it a security risk.

The only other viable alternatives would be to use the Azure tools provided, or to “roll your own” solution.

Accessing EA Billing Data Using Built-in Azure Tools

The Azure tools include several options that can be used to review billing information in Azure:

  • Cloudyn – a tool which, for now, is free (though, according to some specialists, it will soon become a paid offering). This tool gathers data about billing and resource costs

  • Azure Cost Reporting – which is embedded in the portal. As long as the user has the Billing Reader RBAC role, they can see the cost of the resources in their resource group.
    From the Azure portal, click on ‘Browse‘ —> ‘Subscriptions‘ and click on your subscription to see this built-in spending report and some very basic forecasting

  • Power BI – This has built-in EA reports, so if a user has a Power BI license and connects their API key for the EA portal, they will see the usage and the resource allocations (for more info, see the documentation Connect to Microsoft Azure Enterprise with Power BI)

We can use these tools to explore the EA costs, but it is a compromise that means that:

  • It is difficult to process, customize and administer the data for larger organizations with hundreds of subscriptions and resource groups;

  • These approaches aren’t sufficient to produce a unified chargeback report to be sent to the Finance department, so that costs can be mapped to cost centers throughout the organization;

  • It is impossible to calculate a chargeback (the cost distribution of utilized resources per cost center) at a Department / Subscription level, a Subscription / Resource Group level, or only at a specific resource level;

  • The EA portal retains only 12 months of data, and once it is rotated, it cannot be retrieved.

To avoid these compromises, there is no realistic alternative to a home-brew solution for downloading, consolidating and mining the EA data.

The Homebrew Solution: A Roadmap

The solution I offer will automate the chargeback reporting sent to Finance by using a mapping table per Department / Subscription level or Subscription / Resource Group level. It will also:

  • Give access to the EA billing data only to authorized users (each user should see their own data and nothing more)

  • Be entirely accessible to an appropriate manager

  • Produce a report with common shared costs (for example Express Route costs can be accumulated from different entities across the organization, and they need to be split somehow)

  • Create a daily list of newly created resources, with immediate alarms for resources which incur costs over a certain rate
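The daily alarm in the last bullet boils down to a simple filter over the newly created resources. Here is a hypothetical sketch; the record layout and cost threshold are illustrative, not the solution’s actual schema:

```python
def flag_expensive(resources: list[dict], daily_cost_limit: float) -> list[str]:
    """Return the names of resources whose daily cost exceeds the limit."""
    return [r["name"] for r in resources if r["daily_cost"] > daily_cost_limit]
```

In the full solution, a check like this would run after each daily import and feed an alert email or report to the appropriate staff.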

For example, as a preview of the reward for the setup I describe in this article, here is a screenshot of the sort of report you can expect to see. This one shows a report for subscription-level billing, the spending patterns per day, and the general trend over a month. The report is delivered only to the authorized staff, according to the security lockdown of the data, as proposed in the solution.


From here the user or manager can draw conclusions about their usage: the system is generating costs on weekdays, while on weekends, it is fairly idle, and there is an upwards trend in spending over time.

The cost of the home-brew solution should be minimal, and preferably the reporting should be free. To meet these requirements and constraints, I opted for a solution built using the following components:

  • A centralized place to store the EA data – in this case, we will use Azure SQL Database, because it supports indexing and T-SQL functionality to do reporting

  • A simple way to import the data from EA portal to the SQL Database – we can get the data either as a CSV file, or via API call. This solution uses the latter; an Azure API app, together with a Web Job to schedule the API calls and to maintain data consistency. When the API call is made, only data that is not already in the database is downloaded and stored.
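The ‘only data that is not already in the database’ part of that import can be sketched as a set difference against the keys already stored in the database. The field names here are hypothetical:

```python
def rows_to_import(api_rows: list[dict], existing_ids: set) -> list[dict]:
    """Keep only the API rows whose id is not already stored in the database."""
    return [row for row in api_rows if row["id"] not in existing_ids]
```

The Web Job applies this idea on each scheduled run, so repeated API calls never duplicate rows in the billing table.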

In Part 2 of the series we will explore the Chargeback Mapping table, the reporting views and stored procedures and most importantly, we will explore the Row-Level access to the data, so we can have a security layer allowing users to see their own data only, and a Manager role to overview all data.

Here is the solution roadmap:

Getting the Project Ready

To set up the Azure SQL Database, the API App and the Web job we will be using a Visual Studio project and C#. This will make it easy to configure, maintain and deploy. Here is a snapshot of the Solution Explorer.

The code for the solution can be downloaded from here as a zip file. After unzipping the files, the main solution file can be opened with Visual Studio. There are three Projects in the solution:

  • EABillingAPI – this is an Azure API web app which creates the environment to call the EA API and hosts the scheduling of the BillingWebJob, which calls the EA API to bring new data to the database

  • BillingDB – this is the code for the Azure SQL Database, which will contain the billing data

  • BillingWebJob – this is the code for the scheduled call to the EA API which checks periodically if there are new records in the EA portal which are not imported in the database

Configuring the Projects for Deployment

The first thing to do, to prepare for deployment, is to configure the projects. Each project has an XML file containing configuration settings. The Web job project has an app.config and the API App project has a web.config file.

First, let’s open the web.config file of the EABillingAPI project. It looks like this:

In this file, the only things we need to enter are the EA enrollment number and the EA API key. These two values can be found in the EA portal. When you log in, you will see the Enrollment number on the left, and the API key can be found under the ‘Reports‘ —> ‘Download Usage’ —> ‘API Access Key’ tab.
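Assuming these two values live under appSettings (the exact key names may differ in the downloaded project, so check the actual web.config), the relevant fragment would look something like this:

```xml
<configuration>
  <appSettings>
    <!-- Hypothetical key names for illustration; placeholder values shown. -->
    <add key="EnrollmentNumber" value="12345678" />
    <add key="EaApiKey" value="your-EA-API-access-key-here" />
  </appSettings>
</configuration>
```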

While the enrollment number most likely won’t change, the API keys will be recycled often, so make sure to refresh the API key in the web.config as well, otherwise the app will stop getting data.

Deployment of the EABillingAPI Web App

To deploy the API app, just right-click on the EABillingAPI project and click ‘Publish…‘. In the following screen, select ‘App Service’, then ‘Create New…‘, and click ‘Publish‘.

On the next screen, you’ll see the following (you might have to log in to your Azure account first):

Enter a valid App Name, select a Subscription, Resource Group and Hosting Plan, and click ‘Create’. This will take a few moments to deploy.

When the deployment is complete, a web browser will open automatically with the Web App URL. In this case it will be http://eabillingapi123.azurewebsites.net/. Take a note of this URL, because it will be used in the app.config of the Web Job later.

Setting up Authentication

We need to set up the Authentication of the API App:

Click on the API App we just created in the Azure portal, and under Authentication / Authorization, enable App Service Authentication. In this case, I have left it as ‘Allow Anonymous’, but in a real scenario this would not be recommended.

To verify that the API app is working, we can use swagger (an open source toolset which helps developers design, build and consume RESTful web services, https://swagger.io/) by adding it to the end of the URL, like this: http://eabillingapi123.azurewebsites.net/swagger

In the following web page, you can enter your EA API key and click ‘Explore’.

Then you can test the API call to verify that it returns a proper response.

Preparing for Deployment of the Web Job

There are a few things that you need to create in Azure before deploying the Web Job. Firstly, you will need a Blob storage account, which will be used to store the flat files from the EA API before they are imported into the database. The settings should be like this:

Secondly, you’ll need an AD app for authorizations of the web job call. Here is the documentation on how to create the Azure AD app: https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-create-service-principal-portal.

Thirdly, you’ll need an Azure SQL Database with the following settings:

Make sure to note down the login’s username and password, as well as the connection string for the database. For the initial load, you might want to use 50 or 100 DTUs, and after the initial load is done, you can save costs by using 20 DTUs.
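Assuming the database uses the Standard tier and is named BillingDB, switching between those DTU levels can be done with T-SQL against the logical server (S3 = 100 DTUs, S1 = 20 DTUs):

```sql
-- Scale up for the initial load:
ALTER DATABASE [BillingDB] MODIFY (SERVICE_OBJECTIVE = 'S3');  -- 100 DTUs

-- Scale back down once the initial load is done, to save costs:
ALTER DATABASE [BillingDB] MODIFY (SERVICE_OBJECTIVE = 'S1');  -- 20 DTUs
```

The same change can also be made from the database blade in the Azure portal.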

Deploying the Database

To deploy the database, right-click on the database project and click ‘Publish…‘. Then, enter the target database connection and click ‘Publish’. The database will be deployed shortly. In this version of the project, the database is very simple; there are only two tables: dbo.EaBillingData where the EA data is imported, and dbo.AuditData where the error messages are saved in case the Web job fails. In the second part of the series, we will improve the BillingDB project by adding procedures for reporting, views and Row-Level security access functionality.
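As a rough sketch of the shape of these two tables (the actual columns are defined in the BillingDB project; the ones below are illustrative, not the project’s exact schema):

```sql
-- Illustrative only: a minimal shape for the billing and audit tables.
CREATE TABLE dbo.EaBillingData (
    Id               BIGINT IDENTITY(1,1) PRIMARY KEY,
    SubscriptionName NVARCHAR(256),
    ResourceGroup    NVARCHAR(256),
    [Date]           DATE,
    Cost             DECIMAL(19, 4)
);

CREATE TABLE dbo.AuditData (
    Id           BIGINT IDENTITY(1,1) PRIMARY KEY,
    LoggedAt     DATETIME2 DEFAULT SYSUTCDATETIME(),
    ErrorMessage NVARCHAR(MAX)
);
```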

Configuring the Web Job

Open the app.config and enter the actual settings, according to the Azure storage you just created, in the following config keys: AzureWebJobsDashboard, AzureWebJobsStorage and StorageConnectionString.

Enter the database connection string in the config key named AzureAnalyticsDbModel.

Enter the Web App URL in the WebApiUrl key.

Using the values from the Azure AD app you created earlier, enter the WebApiAADClientId, WebApiAADClientSecret and WebApiAADDomainName keys.

The last setting to adjust is the start and end date of the data load: EaBillingPeriodStartDate and EaBillingPeriodEndDate. Only EaBillingPeriodStartDate is mandatory: if you enter 2018-01 as the value and leave EaBillingPeriodEndDate empty, the web job will keep importing new data indefinitely.
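Putting the settings above together, the app.config would contain entries along these lines. All values are placeholders, and whether a given setting lives under appSettings or connectionStrings may differ in the actual project:

```xml
<configuration>
  <connectionStrings>
    <!-- Storage account and database connections (placeholder values) -->
    <add name="AzureWebJobsDashboard" connectionString="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..." />
    <add name="AzureWebJobsStorage" connectionString="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..." />
    <add name="StorageConnectionString" connectionString="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..." />
    <add name="AzureAnalyticsDbModel" connectionString="Server=tcp:yourserver.database.windows.net,1433;Database=BillingDB;User ID=...;Password=..." />
  </connectionStrings>
  <appSettings>
    <add key="WebApiUrl" value="http://eabillingapi123.azurewebsites.net/" />
    <add key="WebApiAADClientId" value="00000000-0000-0000-0000-000000000000" />
    <add key="WebApiAADClientSecret" value="your-client-secret" />
    <add key="WebApiAADDomainName" value="yourtenant.onmicrosoft.com" />
    <add key="EaBillingPeriodStartDate" value="2018-01" />
    <add key="EaBillingPeriodEndDate" value="" />
  </appSettings>
</configuration>
```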

Deploying the Web Job

To deploy the Web Job, right-click on the BillingWebJob project and click ‘Publish as Azure Web Job…’. Publish to Microsoft Azure App Service:

Then, after you log in to your Azure account and select your subscription, you will see the API App that you created in the previous deployment step:

In the next step, validate the connection and Publish.

Finally, we need to configure Always On. Since the Web Job must run continuously, Always On must be enabled on the App Service; otherwise, the App Service may enter idle mode if it sees no traffic for a considerable time. To enable it, log in to the Azure portal, select the App Service we created, and go to ‘Application Settings’. Find the ‘Always On‘ setting, flip it to ‘On‘, and click ‘Save’.

Now the job is deployed and running.

Deployed and Ready

At this point, we have set up the Azure API App, the Database and the Web Job. Now it is time to monitor the web job logs and give some time for the web job to write some data from the EA API to the database.

In the next article, we will explore the loaded data in a bit more detail, and we will add the Chargeback Mapping Table functionality, as well as the Row-Level access functionality. We will also create some Excel report templates for the Manager and for the users. There are endless options for mining and visualizing the data once we have collected it. For example, the following screenshot shows the spending versus the resource quantity consumed, for a specific department, over a two-month time slice. This provides valuable insight into the costs and exposes opportunities for fine-tuning the department’s use of Azure resources.



In this article, we set up a centralized storage (Azure SQL Database) and used it as a repository for the EA billing data. We set up a Web App in Azure and a Web Job under it, so it can be scheduled to import new data as it comes to the EA portal.

This is an easy way to get access to the EA billing data and gives us the opportunity to query and mine it. Furthermore, the EA portal retains only 12 months of data, and once it is rotated, it cannot be retrieved. This way, we can retain more than 12 months of data for deeper analysis.

In the next article, we will focus on giving granular access – each entity of users will access only their own data, and a Manager can access it all. We will also explore the reporting possibilities and will create the Financial reporting for the Enterprise.