Automatic gathering of SQL Server configuration by using PowerShell

At present, gathering SQL Server configuration information is no longer a problem: the opportunities to add value through updates and to build a stable database environment are always there. Most third-party tools provide the appropriate functionality as they compete for the software market. In this guide, you’ll learn ways to capture a few SQL Server configuration settings and manipulate the data using PowerShell.

Moreover, you will get the answers to the following questions:

  • How to gather SQL Server configuration
  • How to automatically discover SQL instances
  • How to manually input the SQL Server instances to speed up the process
  • How to transform data by using PowerShell
  • And more…

Read More

Overview of Data Compression in SQL Server

The database is a vital part of any business or organization. Growing trends predict that 82% of enterprises expect the number of their databases to increase over the next 12 months. A major challenge for every DBA is discovering how to tackle massive data growth, and this is going to be an increasingly important goal. How can you increase database performance, lower costs, and eliminate downtime to give your users the best experience possible? Is data compression an option? Let’s get started and see how some of the existing features can be useful in handling such situations.

In this article, we are going to learn how data compression can help us optimize our data management solution. In this guide, we’ll cover the following topics:

  • An overview of compression
  • Benefits of compression
  • An outline of data compression techniques
  • Discussion of various types of data compression
  • Facts about data compression
  • Implementation considerations
  • And more…

Read More

Importance of transaction log in SQL Server

Transaction logs are a vital component of the database architecture. In this article, we’ll discuss SQL Server transaction logs, their importance, and their role in database migration.


Let’s talk about the different options for taking SQL Server backups. SQL Server supports three different types of backups:
1. Full
2. Differential
3. Transaction-log

Before jumping into transaction-log concepts, let’s discuss other basic backup types in SQL Server. Read More

Overview of DMV Tool using DBFS FUSE with Docker Containers on Linux Machine

This article is intended for database administrators and system administrators who plan to configure the DBFS FUSE file system on a Linux machine using Docker containers. I am going to present a step-by-step procedure for setting up and configuring the DBFS FUSE file system to access local or remote SQL database instances in Linux CentOS Docker containers.

Introduction to the Docker, DBFS, FUSE and DMV Tool

To make the most of this material, you’ll need some knowledge of Docker containers and SQL Server. I would recommend referring to the previous articles that explored these topics:

Introduction to Docker and Docker toolbox

Docker and SQL Server on Linux

Now, let’s start with the basic concepts of Docker, DBFS, FUSE, and DMV Tool.


Docker allows us to package and run an application in a loosely isolated environment called a container. Docker is an open-platform engine that facilitates faster delivery, faster development, and faster deployment in an isolated environment.


DBFS (Database File System) mounts a virtual file system on top of database objects using the FUSE technique, which provides an interface between the VFS and the kernel module. This setup lets us access those objects as files using the dbfs programs.

A big part of SQL Server monitoring is ensuring that SQL Server performs optimally and is highly available. Traditionally, to get system metrics or performance information, one would rely on GUI admin tools such as SSMS, or command-line tools such as SQLCMD, to run queries.

Now, there is another option. We can query the information using a new experimental Linux tool known as DBFS. It enables us to access live DMVs mounted to a virtual file system through the FUSE file-system concept. Thus, we can explore information about the database (Dynamic Management Views) using native bash commands.

There is no need to use any other tools or run SQL queries. A metadata configuration file is used internally by the DBFS process to query the SQL Server databases. DBFS provides the following features:


  1. Access to data in JSON or CSV format
  2. Compatibility with the grep, sed, and awk programs
  3. Real-time DMV data
  4. Cross-platform support (Windows and Linux)

Note: This tool is currently only available for Ubuntu, Red Hat, and CentOS. SUSE coming soon!

DMV Tool

DMV Tool is a command-line, cross-platform tool that provides access to live data by mounting SQL query output files or DMVs (Dynamic Management Views) to a virtual directory using FUSE, so that the file structure can be queried as objects.

DBFS Concepts

In Linux, everything is a file. FUSE (Filesystem in Userspace) is a loadable kernel module available for all Linux distros. It provides a platform for non-privileged users to create their own file systems without touching the existing kernel code. This is achieved by mounting a file system in user space through the FUSE interface.

Diagram Flow Implementation

In this section, the implementation steps are represented in a diagram.

Getting Started

See the following steps to configure the DBFS:

  1. Log in to the console with the SU account (Super User mode).
  2. Install the wget utility. GNU Wget is a non-interactive file retrieval utility for downloading files from the Web or via FTP.
# yum install wget
  3. To set Docker up on CentOS, download the .rpm package for the corresponding release and install it manually using the yum command. Browse the URL to download the latest docker-ce file for the Docker version that you want to install.
  4. Download the rpm package using the wget utility.
# wget
  5. Install the Docker CE package using the following yum command:
# yum install docker-ce-18.03.0.ce-1.el7.centos.x86_64.rpm
  6. Check the Docker package information using the yum info command.
# yum info docker
  7. Use the systemctl command to start the Docker service, then verify and enable it.
# systemctl start docker
# systemctl status docker
# systemctl enable docker
  8. Verify that Docker is installed correctly by running the welcome image.
# docker run hello-world
  9. Download the latest CentOS image from the library. Use the docker pull command to fetch the CentOS image, the community edition of the RHEL Linux distros. We can also request a specific image using tags, or just append a colon ‘:’ and the word latest to get the most current CentOS image.
# docker pull centos:latest
  10. Once the pull is complete, create a container from the downloaded image. We’ll do that with docker run.
# docker run -it centos:latest bash

Here, -it requests an interactive terminal, centos:latest is the reference to the image, and bash defines the shell that will open when the container starts.

Now, we can see the status of all Docker containers with the command docker ps -a:

# docker ps -a
  • Let’s install the wget utility
# yum install wget
  • Download the DBFS (dbfs-0.2.5-0) package using the wget utility
  • Download the EPEL repository using wget utility
  • Setup the EPEL package
#rpm -ivh epel-release-latest-7.noarch.rpm
  • Install the DBFS package
#yum install dbfs-0.2.5-0.x86_64.rpm
  • Verify the DBFS package information
#yum info dbfs
  • Let’s quit the Docker container session by typing exit. Now, commit the Docker image. This commits all the previous changes made to the centos:latest image. The committed image will have the wget utility, the EPEL repository, and the DBFS package.
#docker commit 7af55818f007
  • Add the tag to the committed image using the Docker tag command. Let us name the tag centos:dbfs. We can see the new image using Docker images.
#docker tag 88cd8b6c264890547bc0fa24047f5248f2f6c40b5147a580c18a78018f8c664c centos:dbfs
#docker images

Docker image is ready with the DBFS configuration.

Let’s see how the DBFS FUSE program executes DMVs and Custom queries remotely using the configuration file.

Getting Ready

We will use the image centos:dbfs in this section. Ensure that this image is available and has the required packages and configuration to build the FUSE file system.

The following command launches a container from centos:dbfs with a bash console:

#docker run -it --device /dev/fuse --cap-add SYS_ADMIN centos:dbfs bash

To mount a FUSE-based file system, we need to combine the --cap-add and --device options with the docker run command. We can mount FUSE file systems inside a container, but FUSE must be enabled on the host system; Docker will not explicitly enable and load kernel modules on the host.

Let us take a look at the other parameters of the docker run command:

SYS_ADMIN allows the container to perform a range of system administration operations.

The --device flag grants the container access to a specific device. By default, the container can read from and write to such devices.

The DBFS mount command should not be started as the root user. To start dbfs, you need a normal user account, so go ahead and create the user “docker” with the useradd command. This is a low-level utility for adding/creating user accounts in Linux.

The useradd command in Linux performs the following things:

  • Edits /etc/passwd, /etc/shadow, /etc/group, and /etc/gshadow files.
  • Creates and populates the home directory.
  • Sets permission and ownership to the home directory.
#useradd docker
#passwd docker

Grant the user docker full access to the /tmp folder, so that it has read and write access to all files and associated directories and subdirectories under /tmp:

#mkdir /tmp/dbfs
#mkdir /tmp/dbfs/sql
#chown -R docker /tmp

The chown -R command recursively makes the user docker the owner of /tmp and everything beneath it, including the dbfs and sql directories we just created, so the dbfs process can read from and write to them.

Now, without exiting the session, log in to open a user session; dbfs should always run with user privileges. I’m using the docker user credentials created above to open the user session. We can see that pwd now reports /home/docker.

Let’s prepare the configuration file – it requires the following parameters:

  • [Server friendly name]
  • hostname=[HOSTNAME]
  • username=[DATABASE_LOGIN]
  • password=[PASSWORD]
  • version=[VERSION]
  • customQueriesPath=[PATH_TO_CUSTOM_QUERY_FOLDER]

In the following example, I’m connecting to a SQL instance on the Windows server HQDBT01.

  • Hqdbt01 is the server’s friendly name.
  • The location of the output file or dump directory would be //CustomQueries/. For example, if /tmp/dmv is the mount directory, then the path to the dump directory would be /tmp/dmv/hqdbt01/CustomQueries/.
  • Hqdbt01 is a hostname of the SQL instance.
  • The username and password are the SQL instance access credentials. I would recommend you define a dedicated database user and grant the required permissions to that user. The VIEW SERVER STATE permission must be granted so that the user can query DMVs.
  • If the version is set to 16 or higher (SQL Server 2016 or later), the data is fetched as JSON. Otherwise, the data is exported as Tab-Separated Values (TSV).
  • CustomQueriesPath is the path of the SQL file to which the custom queries will be written.
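Putting these parameters together, a minimal configuration file for the example above might look like the following sketch. The username is a hypothetical account, the password is a placeholder, and the paths simply reuse the directories created earlier:

```ini
# Sketch of a dbfs configuration file (dmvconfig.tool).
# "dbfs_user" and the password are placeholders, not real credentials.
[hqdbt01]
hostname=hqdbt01
username=dbfs_user
password=ChangeMe
version=16
customQueriesPath=/tmp/dbfs/sql
```

With version set to 16, the mounted output files will contain JSON; with a lower version, they would contain tab-separated values.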

Change the directory to /tmp and create the mount directory, dmv. Now, mount the DMVs as flat files using the dbfs client:

$cd /tmp/
$mkdir dmv
$dbfs -c ./dmvconfig.tool -m ./dmv

The DMV data can also be accessed using various data extraction commands.

$cat dm_exec_sessions

List only those databases where the database recovery model is set to SIMPLE:

$cat databases| grep SIMPLE

List the name and selected columns of the databases where the recovery model is set to SIMPLE. We can see that the output of one program is piped into another program for further processing. The output is formatted as tab-separated values.

$cat databases| grep SIMPLE | awk '{print $1,$2,$5}'| column -t
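If you want to try the same pipeline before mounting DBFS, you can exercise it on a hand-made tab-separated sample. The file name and column layout below are illustrative, not the exact schema of the mounted databases file:

```shell
# Build a small tab-separated sample that mimics a mounted "databases" file.
# Illustrative columns: name, database_id, state, compatibility level, recovery model.
printf 'master\t1\tONLINE\t140\tSIMPLE\n'  >  /tmp/databases_sample
printf 'sales\t5\tONLINE\t140\tFULL\n'     >> /tmp/databases_sample
printf 'staging\t6\tONLINE\t140\tSIMPLE\n' >> /tmp/databases_sample

# Same shape as the DBFS example: filter the rows, print columns 1, 2 and 5, align them.
grep SIMPLE /tmp/databases_sample | awk '{print $1,$2,$5}' | column -t
```

Only the two SIMPLE-recovery rows survive the grep, and awk trims each row down to the three requested columns.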

Configure DBFS to Run Custom SQL Queries

Creating a custom SQL query is pretty straightforward. As per the configuration file, go to the path /tmp/dbfs/sql and write the SQL statement in a SQL file, demo.sql.

#vi /tmp/dbfs/sql/demo.sql

Type the following SQL code in the demo.sql:

SELECT name,user_access_desc,is_read_only, state_desc,recovery_model_desc from

Save the file by pressing Escape and typing :wq.

The JSON output is then dumped into a file and mounted to a virtual directory. The thing is, the Docker container doesn’t have any tools installed on the source image. Still, we can fetch the SQL data.

Browse the directory /tmp/dmv/hqdbt01/customQueries

$cd /tmp/dmv/hqdbt01/customQueries/
$cat demo.sql|awk '{print $1,$2,$3,$4}'|column -t

Let’s update the demo.sql file with the following SQL statements or SQL of your choice:

SELECT c.session_id
, c.auth_scheme
, c.node_affinity
, s.login_name
, db_name(s.database_id) AS database_name
, CASE s.transaction_isolation_level
WHEN 0 THEN 'Unspecified'
WHEN 1 THEN 'Read Uncommitted'
WHEN 2 THEN 'Read Committed'
WHEN 3 THEN 'Repeatable'
WHEN 4 THEN 'Serializable'
WHEN 5 THEN 'Snapshot'
END AS transaction_isolation_level
, s.status
, c.most_recent_sql_handle
FROM sys.dm_exec_connections c
INNER JOIN sys.dm_exec_sessions s
ON c.session_id = s.session_id

Browse the directory /tmp/dmv/hqdbt01/customQueries and view the file

$cat demo.sql

The Docker image is now configured and tested with the dbfs FUSE technique, which mounts DMV and query output as virtual directories in the Docker container.

Adding an Existing Project to GitHub Using the Command Line

The first step is to save the Docker image to a compressed file. This is the file that we will upload to the GitHub repository.

#docker save centos:dbfs > centos7_dbfs.tar

Install Git on CentOS using yum

#yum install git

Let’s initialize or re-initialize the local directory as a Git repository

git init

Add the .tar file to the local repository. This stages it for the first commit.

git add centos7_dbfs.tar

Commit the file that we’ve staged in the local repository

git commit -m "centos7 built with dbfs configuration"

Add the URL to the remote repository where the local repository will be pushed

git remote add origin <remote repository URL>

Push the changes in your local repository to GitHub.

git push -u origin master
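The steps above can be condensed into one small script. Everything here is a sketch: the tar file is a stand-in for the saved Docker image, the committer identity is a throwaway, and the remote/push commands are left commented out because they depend on your own repository URL:

```shell
# Run the whole sequence in a throwaway directory so nothing else is touched.
set -e
repo=$(mktemp -d)
cd "$repo"

# Stand-in for the saved image (normally: docker save centos:dbfs > centos7_dbfs.tar).
echo "placeholder for the saved image" > centos7_dbfs.tar

git init -q                           # initialize the local repository
git add centos7_dbfs.tar              # stage the tar file for the first commit
git -c user.name=demo -c user.email=demo@example.com \
    commit -qm "centos7 built with dbfs configuration"

git log --oneline                     # the single commit should be listed

# Pushing requires a real remote; the URL is a placeholder:
# git remote add origin <remote repository URL>
# git push -u origin master
```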

Wrapping Up

In a nutshell, Docker can get more applications running on the same hardware than other technologies. It makes it easy for developers to create ready-to-run container applications quickly, and it makes managing and deploying applications much easier.

We have installed the Docker engine, provisioned a CentOS Linux container, and installed the DBFS FUSE package. We have seen the complete life cycle of the Docker containers. It is that simple to build, use, and test programs using an agile process. Do you agree?

Understanding Data Sharing in SQL Server Docker Containers

In the previous articles, “Introduction to Docker and Docker Toolbox” and “Docker and SQL Server on Linux”, we discussed SQL Server Docker containers at length. As you should know by now, instantiating SQL images is very simple and straightforward with Docker containers.

This article talks about shared volumes, the preferred mechanism for data persistence with Docker containers. Read More

Introduction to Docker and Docker Toolbox

This article helps you get started with Docker containers. The focus is on understanding Docker and its basics. Once you start grasping the basics and learn to use them, you will easily understand how to use Docker with other products.

In this article, we are going to review the following points:

  1. Introduction to Docker
  2. Pre-requisites to be done for the installation
  3. Installation of Docker toolbox on Windows
  4. Basics on getting started with Docker
  5. Main docker commands to be used
  6. Methods to build Ubuntu images from the docker hub repository
  7. And much more…

Read More

Docker and SQL Server on Linux

SQL Server 2017 is one of the biggest and most important releases in Microsoft’s product history, simply because of the freedom it gives to install the product anywhere. The advent of Docker containers and the growing emphasis on open-source solutions have opened the door to greater technological innovation and advancement.

This article discusses the following topics:

  1. Introduction to Docker container
  2. Pre-requisite to build Docker container
  3. A walkthrough of the setup process to build a SQL Server instance running on the Linux CentOS and in a Docker container
  4. How to connect to the database engine
  5. How to connect to the database engine externally
  6. And more…

Read More

Getting Started with Mssql-cli Command-Line Query Tool

A recent announcement of the release of several SQL Server tools has raised expectations across various groups. Product requirements and business needs are almost always a trade-off, and striking the right balance in a product’s toolset is a sign of a successful product. After testing SQL Operations Studio, I feel that it’s a promising tool for many developers, administrators, and DevOps specialists. In my opinion, the mssql-cli tool adds another feature that helps make SQL Server a leading database product.

Read More

Getting Started with SQL Server 2017 on Linux in the Azure portal

SQL Server 2017 is now considered a hybrid enterprise database solution, as it expands its market and is ported to other operating system platforms, including mainstream support for Linux machines. The cloud makes the administrator’s life much easier; configuring a SQL Server instance is no longer a daunting task. The easiest way to explore SQL Server on Linux is to provision a virtual machine through the Microsoft Azure portal: the Linux Azure virtual machine comes pre-configured with Linux and SQL Server 2017.

Read More

How to Use Default and Custom Widgets in SQL Server Operations Studio

There are already a number of articles and blog posts that reveal the benefits of the SQL Operations Studio tool. Demand for the software and its related tools is greater than ever before, and this trend will continue to grow in the coming days.

Graphical representation is in vogue today. Visualizing data helps us understand it better and make decisions. It’s no wonder that data visualization continues to attract a growing number of users. The right toolset can speed up the software development lifecycle.

This article demonstrates the advantages of using custom SQL queries or complex T-SQL to gain insight into the database, and explains how one can use this insight to build custom widgets. In this context, a SQL Operations Studio widget is a customized piece of code that personalizes the SQL Server dashboard for effective management of SQL instances.

Read More