SQL Server Business Intelligence – Modern Tools and Technologies

This article covers the modern business intelligence (BI) tools and technologies related to the Microsoft platform. Additionally, readers will learn about the inner workings of Microsoft's cloud-powered service for building cost-effective, enterprise-level business intelligence solutions.


Before we start, it will be useful to refresh your knowledge of business intelligence concepts and some famous traditional tools that are used by business intelligence developers. We explored these topics in our previous articles. So, refer to the following materials, as they contain all the necessary information:

  1. SQL Server Business Intelligence (BI) – Basic Understanding
  2. SQL Server Business Intelligence (BI) – Traditional Tools and Technologies

By understanding the traditional tools, you get a background for comparing them with modern tools. This way, you'll find the most suitable approach for your business intelligence projects. You can also mix and match traditional and modern tools; however, doing so requires understanding and experience.

Azure Synapse Analytics – a Cloud-powered Data Warehouse and Much More

One of the most renowned technologies for developing modern Microsoft-based data warehouse BI solutions is Azure Synapse Analytics – one of the services provided by the Azure cloud platform.

Azure Synapse Analytics, the modern take on the data warehouse, can seem complicated at first. Not only does it introduce many new terms, but it also offers much more than just a cloud-powered data warehouse.

However, it is not mandatory to know every single feature of Azure Synapse Analytics. You need a sufficient understanding of how to meet your business-specific requirements using this massive technology.

According to the Microsoft documentation, Azure Synapse Analytics is a cloud-powered service developed by Microsoft for analytics.

Analytics is a field of computer science that deals with discovering insights in data to better understand, improve, and, in some cases, automate decision-making. We can also say that analytics is the process of discovering knowledge to answer business questions and support business decisions for better productivity.

Since analytics involves (among many other things) gathering data into a single location for analysis and reporting, data warehouse business intelligence solutions are an integral part of this job.

Understanding Business Intelligence in the context of Analytics

What do We Need Azure Synapse Analytics for?

Azure Synapse Analytics serves for building, testing, deploying, and managing analytics solutions, including data warehouse business intelligence solutions. It is an enterprise-level analytics mega-solution that offers three major capabilities:

  1. Big Data Analytics
  2. Machine Learning
  3. DWBI solution (Azure SQL Data Warehouse)

Services offered by Azure Synapse Analytics

The Azure Data Warehouse or DWBI part of Azure Synapse Analytics is known as SQL Pool. It is offered to customers in the following two flavors:

Dedicated SQL Pool – a SQL-based data warehouse service where you provision a pre-determined amount of the compute power your cloud data warehouse needs to run.

Serverless SQL Pool – an option where the required resources (particularly compute power) are automatically scaled up and down depending on consumption.

In other words, a Dedicated SQL Pool gives you more control but can be more costly if you do not shrink or grow your data warehouse in time. The Serverless SQL Pool resizes resources automatically, so it may be less costly, but it offers fewer options for handling data warehouse logic.
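To make the serverless option concrete, here is a minimal sketch of a serverless SQL pool query that reads a Parquet file directly from Azure Storage using the documented OPENROWSET syntax. The storage account, container, and file path are hypothetical placeholders.

```sql
-- Query a Parquet file in Azure Storage from a serverless SQL pool.
-- The storage URL below is a hypothetical placeholder.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://mystorageaccount.dfs.core.windows.net/sales/orders/*.parquet',
    FORMAT = 'PARQUET'
) AS orders;
```

With the serverless model, you are billed for the amount of data processed by such queries rather than for pre-provisioned compute.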

How Do you Measure the Power of your Azure Data Warehouse?

Generally speaking, and skipping the technical details, you only need to consider the following two factors when estimating your Azure-based data warehouse (Dedicated or Serverless SQL Pool) usage:

  1. Compute – the processing power required by the data warehouse queries to perform analysis and reporting.
  2. Storage – the space required by the data warehouse objects, such as tables, views, procedures, etc.
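As a sketch of the storage side, a Dedicated SQL Pool lets you check the space consumed by a table with the documented DBCC PDW_SHOWSPACEUSED command. The table name here is a hypothetical example.

```sql
-- Show reserved and used space for a table in a dedicated SQL pool.
-- 'dbo.FactSales' is a hypothetical table name.
DBCC PDW_SHOWSPACEUSED ('dbo.FactSales');
```

The output reports rows and reserved/used space per distribution, which helps you estimate the Storage factor above.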

Compute and Storage Independence

One of the benefits of using an Azure-powered data warehouse is the freedom to manage compute and storage independently. This means you can build a very cost-effective solution by paying only for the storage and compute power you actually need, rather than performing a sophisticated balancing act (as with coupled systems, where increasing one increases the other too).
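For example, in a Dedicated SQL Pool the provisioned compute is expressed as a service objective (Data Warehouse Units, or DWUs) and can be changed with standard T-SQL without touching the stored data. The database name below is a hypothetical placeholder.

```sql
-- Scale the compute of a dedicated SQL pool independently of its storage.
-- 'MyDataWarehouse' is a hypothetical database name; 'DW200c' is one of
-- the documented service objectives encoding the provisioned DWUs.
ALTER DATABASE MyDataWarehouse
MODIFY (SERVICE_OBJECTIVE = 'DW200c');
```

You can scale compute up for heavy loads and back down (or pause it entirely via the Azure portal) while your data stays in place.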

Synapse SQL (Azure Data Warehouse) Architecture

Synapse SQL is another name for the Azure-based data warehouse business intelligence solution.

The Synapse SQL architecture is multi-node – many nodes work side-by-side to process queries and return results. However, the processing is managed differently depending on whether you are using a Dedicated SQL Pool or a Serverless SQL Pool.

Let us go through the following major key players to understand the Synapse SQL (Azure Data Warehouse) architecture:

  1. Control Node
  2. Compute Nodes
  3. Azure Storage
  4. Data Movement Services (DMS)

Note that we are discussing the architecture mainly from the processing (computing) point of view.

Dedicated SQL Pool (Azure SQL Data Warehouse) Architecture

  • Control Node is the first point of contact between your cloud-powered data warehouse and the outside world; for example, an application querying the data warehouse interacts with this node. The Control Node accepts the data request, optimizes it, and sends it to the Compute Nodes for further processing.
  • Compute Nodes receive the data from the Control Node and run the queries in parallel to get faster results.
  • Azure Storage is where the Compute Nodes on the receiving end store the data to be processed.
  • Data Movement Services (DMS) help the Compute Nodes run queries in parallel and move data across different nodes for parallel processing.
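The way data is laid out for the Compute Nodes is controlled when you create a table. As a minimal sketch (table and column names are hypothetical), a hash-distributed table spreads rows across the pool's distributions so the nodes can scan them in parallel:

```sql
-- A hash-distributed fact table in a dedicated SQL pool: rows are spread
-- across the pool's 60 distributions by CustomerKey, so Compute Nodes
-- can process them in parallel. Names are hypothetical.
CREATE TABLE dbo.FactSales
(
    SaleId      BIGINT NOT NULL,
    CustomerKey INT    NOT NULL,
    Amount      DECIMAL(18, 2)
)
WITH
(
    DISTRIBUTION = HASH (CustomerKey),
    CLUSTERED COLUMNSTORE INDEX
);
```

A well-chosen distribution key reduces the work DMS has to do: joins and aggregations on CustomerKey can be satisfied locally on each node instead of moving rows between nodes.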

As we mentioned earlier, query processing is managed differently depending on whether you are using a Dedicated SQL Pool or a Serverless SQL Pool.

  • With a Dedicated SQL Pool, a submitted T-SQL query is ultimately turned into multiple parallel queries that run on the Compute Nodes.
  • In Serverless mode, a T-SQL query is broken down into smaller tasks that run on different Compute Nodes, and their results are then combined at the end.
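In a Dedicated SQL Pool, you can inspect how a query will be split across the nodes: prefixing a statement with the documented EXPLAIN keyword returns the distributed execution plan, including any data-movement (DMS) steps, without running the query. The table names below are hypothetical.

```sql
-- Show the distributed execution plan (as XML) without running the query.
-- dbo.FactSales and dbo.DimCustomer are hypothetical tables.
EXPLAIN
SELECT c.Region, SUM(f.Amount) AS TotalAmount
FROM dbo.FactSales AS f
JOIN dbo.DimCustomer AS c ON c.CustomerKey = f.CustomerKey
GROUP BY c.Region;
```

The returned plan makes visible the Control Node / Compute Node division of labor described above.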

Preliminary Requirements for Azure Synapse Analytics

If you decide to adopt Azure Synapse Analytics in your team, you should get familiar with some concepts to make the most of its analytics powers:

  1. Understand Azure in general
  2. Understand the basics of data warehousing
  3. Register an Azure account 
  4. Choose and start using the Azure Synapse Analytics service available in Azure Portal

Helpfully, Microsoft offers a free trial of Azure Synapse Analytics that you can use to learn and explore further.

Things to Do

Now that you are familiar with the basics of Synapse SQL (the Azure-powered data warehouse), please check your knowledge by answering the following questions:

  1. What is Azure Synapse Analytics?
  2. What is the difference between Dedicated SQL Pool and Serverless SQL Pool in data warehousing?
  3. What are the roles of Compute and Storage in an Azure-based data warehouse?
  4. Can you differentiate between the Control Node and Compute Node(s)?

Stay in touch to learn more about Azure Synapse Analytics!

Haroon Ashraf

Haroon's deep interest in logic and reasoning early in his academic career paved his path to becoming a data professional. He began his professional life as a computer programmer more than 10 years ago, working on his first data venture: migrating and rewriting a public-sector, database-driven examination system from IBM AS400 (DB2) to SQL Server 2000 using VB 6.0 and Classic ASP, along with developing reports and archiving many years of data. His work and interests revolve around database-centric architectures, and his expertise includes database and report design, development, testing, implementation, and migration, along with Database Life Cycle Management (DLM).
