Azure Functions are a great development tool, allowing us to create serverless software. However, one detail has always bothered me when I create a function: the HTTP trigger receives an HttpRequest object, and we need to extract the parameters from the request ourselves. This leaves low-level plumbing code mixed in with our business code. It’s like going … Read more
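For illustration, here is a minimal sketch of that parameter-extraction boilerplate, written against the Azure Functions Node.js/TypeScript programming model; the function name, route and parameter names are invented for the example.

```typescript
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

// "orderId", "customer" and the route are hypothetical; the point is that every
// parameter has to be fished out of the HttpRequest by hand before the business logic runs.
export async function getOrder(request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> {
    const orderId = request.query.get("orderId"); // query-string parameter
    const customer = request.params.customer;     // route parameter
    if (!orderId) {
        return { status: 400, body: "orderId is required" };
    }
    // ...only now does the business logic start...
    return { body: `Order ${orderId} for ${customer}` };
}

app.http("getOrder", {
    methods: ["GET"],
    route: "customers/{customer}/orders",
    handler: getOrder,
});
```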
Azure SQL has a close relationship with Azure Storage. Features like PolyBase, backups, Extended Events and more make use of Azure Storage. On Azure SQL Database, probably the most common use is Extended Events: when we create a file target, we need to point it to the Azure Storage URL where the file will be stored. … Read more
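As a rough sketch of what that looks like, the T-SQL below creates a database-scoped event session whose event_file target writes to a blob container. All names and the storage URL are placeholders, the database must already hold a credential for that container, and the snippet is wrapped in the mssql Node.js package only to keep these examples in one language.

```typescript
import * as sql from "mssql";

// Hypothetical session, storage account and container names. Before this runs,
// the database needs a database-scoped credential for the container so that
// the event_file target is allowed to write to the URL below.
const createSession = `
CREATE EVENT SESSION [TrackStatements] ON DATABASE
ADD EVENT sqlserver.sql_statement_completed
ADD TARGET package0.event_file (
    SET filename = 'https://mystorageacct.blob.core.windows.net/xevents/TrackStatements.xel'
);`;

async function createXeSession(): Promise<void> {
    const pool = await sql.connect("<azure-sql-connection-string>"); // placeholder
    await pool.request().query(createSession);
    await pool.request().query("ALTER EVENT SESSION [TrackStatements] ON DATABASE STATE = START;");
    await pool.close();
}
```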
Data lakes are becoming more common every day, and the need for tools to query them is growing just as fast. While writing about querying a data lake using Synapse, I stumbled upon a Power BI feature I didn’t know was there. When reading from a data lake, each folder behaves like a table. We store in the … Read more
You may have noticed that the export feature on Azure resource groups doesn’t play well with Data Factory: we can’t completely export a Data Factory as an ARM template, because the export fails. You probably know you don’t need to worry about this too much. You can link the data factory to a GitHub repo and get … Read more
One great outcome of PASS Summit, especially when we are close to a new SQL Server release, is identifying the important technologies to study in the following year. PASS Summit 2018 was great, with sessions on many new technologies giving us very good guidance on where to focus our study for the new year. Let’s … Read more
Azure SQL Data Warehouse is a fully managed, scalable cloud service. It is still in preview, but solid. Not only is it compatible with several other Azure offerings, such as Machine Learning and Data Factory, but also with various existing SQL Server tools and Microsoft products. It talks to Power BI. Are we now seeing the final piece of the Azure jigsaw fall into place?… Read more
Some of the most intractable problems of application design concern how to store the credentials, keys and configuration settings needed to access sensitive application data. With Azure Key Vault you don't have to keep them in code. You can, instead, simply authorise an application to access a Key Vault and perform the operations that require authentication against it. Christos Matskas shows how an application can interact with the service, using a Node.js application as an example.… Read more
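For a flavour of what the client side of that looks like, here is a minimal sketch using the current Azure SDK for JavaScript (which post-dates the original article); the vault URL and secret name are placeholders, and the application's identity must already have been granted access to the vault.

```typescript
import { DefaultAzureCredential } from "@azure/identity";
import { SecretClient } from "@azure/keyvault-secrets";

// Hypothetical vault and secret names; access must be granted to the app's
// identity (access policy or Azure RBAC) before getSecret will succeed.
const client = new SecretClient(
    "https://my-vault.vault.azure.net",
    new DefaultAzureCredential()
);

async function getDbConnectionString(): Promise<string | undefined> {
    const secret = await client.getSecret("db-connection-string");
    return secret.value; // the secret never has to live in source code or config files
}
```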
The whole point of using a cloud service is being able to use it intensively for a brief period, just when it is needed, and then clear out all your work when you've finished. That calls for automation to make the process as quick and easy as possible. It is likely to mean creating a VM, provisioning it from scratch and spinning it up using PowerShell. Relax, grab the popcorn, and let Adam Bertram show you how he does it in Azure.… Read more
No longer do developers need to store sensitive application data, keys and configuration settings in code: Azure Key Vault can store them for our applications in the cloud. Christos Matskas shows how to provision a new Key Vault in Azure using the Azure PowerShell cmdlets, and how to authorise an application to access and use it.… Read more
The name 'Azure Key Vault' hides a valuable Azure service that lets us protect our cloud data by putting sound cryptography into cloud applications without having to store or manage the keys or secrets ourselves. This makes it far easier to handle sensitive data in cloud applications in a way that complies with industry standards.… Read more
DocumentDB uses SQL only for querying data. To create procedures and functions, you have to flex your JavaScript skills and write JavaScript functions that are saved to a DocumentDB collection. Robert Sheldon shows how it is done. … Read more
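As a rough illustration of the idea, this is the shape of a trivial stored procedure: the function body is saved to a collection and runs inside the database, returning its result through the server-supplied response object. The TypeScript declaration of getContext is only there to keep the sketch self-contained; the runtime provides it.

```typescript
// getContext() is supplied by the DocumentDB server-side JavaScript runtime,
// so it is only declared here, not implemented.
declare function getContext(): any;

// A minimal stored procedure: registered against a collection, executed
// next to the data, and answered through the response object.
function helloSproc(): void {
    const response = getContext().getResponse();
    response.setBody("Hello from inside the collection");
}
```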
DocumentDB is a late entrant in the document-oriented database field. However, it benefits from being designed from the start as a cloud service with a SQL-like query language, and it is intended for mobile and web applications. Its JSON document notation is compatible with the integrated JavaScript language that drives its multi-document transaction processing via stored procedures, triggers and UDFs.… Read more
The data lake is basically a large repository of data for 'big data' analytic workloads, held in its original format. Azure Data Lake adds Data Lake Analytics and Azure HDInsight. Although the tools are there for big data analysis, they will require new skills to use, and heightened attention to data governance, if the service is to appeal to the average enterprise.… Read more
Once you've built your BizTalk solution, you will need to deploy and monitor it. Matt Milner shows that, while there are several ways to monitor your solution and you can deploy it manually from Visual Studio, automated deployment of BizTalk solutions is probably best served by Azure IaaS with BizTalk Server virtual machines.… Read more
Docker technology is Linux-based. Although the concept of a container isn't new, the common toolset, packaging model and deployment mechanism of Docker have made the use of containers far simpler. As a quick introduction to the technology for Windows users, Krishna shows, step by step, how to set up and configure a Docker daemon in a VM in Azure, and how to install a MySQL image in a container from the Docker repository.… Read more
When you move applications to Azure, you can't always avoid the need to access data from applications that live within your intranet. You have a wide range of choices for doing this. Matt Milner explains several technologies that can achieve this for BizTalk, though they apply more generally.… Read more
What's the best way of providing self-service business intelligence (BI) over data that is held in an on-premises SQL Server? Not, it seems, Power BI 2.0, the hosted cloud service, but Power BI 2.0 Desktop. If moving your database to Azure isn't an option, Power BI 2.0 Desktop could still bring smiles to the faces of your BI hotshots.… Read more
Electronic Data Interchange (EDI) has been around a long time. Azure BizTalk Services supports this important standard for transmitting business transactions, and it is reasonably easy to set up, as Matt Milner explains… Read more
If you need to receive and process a large volume of packets of data, such as telemetry or event-log items, it may be worth considering Azure Event Hubs. They aren't like traditional messaging, but represent more of a stripped-down, one-way event-processing system for large volumes of data. It could be a good solution to an ever-present problem, but is it ready for production use? Rob Sheldon investigates.… Read more
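To give a sense of the programming model, here is a minimal sketch of a sender using the current @azure/event-hubs JavaScript SDK (which post-dates the article); the connection string, hub name and payloads are placeholders.

```typescript
import { EventHubProducerClient } from "@azure/event-hubs";

// Placeholder connection string and hub name; each event body is a small JSON
// payload, the kind of high-volume telemetry Event Hubs is aimed at.
const producer = new EventHubProducerClient("<event-hubs-connection-string>", "telemetry");

async function sendReadings(): Promise<void> {
    const batch = await producer.createBatch();
    batch.tryAdd({ body: { deviceId: "sensor-01", temperature: 21.4 } });
    batch.tryAdd({ body: { deviceId: "sensor-02", temperature: 19.8 } });
    await producer.sendBatch(batch); // one call pushes the whole batch to the hub
    await producer.close();
}

sendReadings().catch(console.error);
```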
Azure Stream Analytics aims to extract knowledge structures from continuous ordered streams of data by real-time analysis. These streams might include computer network traffic, social network data, phone conversations, sensor readings, ATM transactions or web searches. It provides a ready-made solution to the business requirement to react very quickly to changes in data and handle large volumes of information. Robert Sheldon explains its significance. … Read more