Auto-Email System to Send Database Summary Report

Database monitoring is one of the most essential jobs of any database administrator. Large organizations and companies have multiple database servers, located either in the same data center or in geographically separate data centers. There are many standard tools for database monitoring, and those tools use the SQL Server dynamic management views (DMVs) and system stored procedures to gather the data. Using the same DMVs, we can create our own customized, automated system that collects the status of our databases and emails the report.

In this article, I am going to demonstrate how we can use system stored procedures and a linked server to collect information about databases located on different servers, and how to schedule a job that sends the report.

In this demo, I am going to perform the following tasks:

  1. Create the required stored procedures on the TTI609-VM1, TTI609-VM2, and TTI412-VM servers to collect information about the databases, database objects, and SQL Jobs.
  2. Create a stored procedure that gathers the database summary, database object summary, and SQL Job summary from the TTI609-VM1 and TTI609-VM2 servers and stores them in the related tables.
  3. Create an SSIS package which performs the following tasks:
      • Executes a stored procedure using the Execute SQL Task.
      • Exports data from the SQL tables created on TTI412-VM and stores it in individual tabs of an Excel file.
  4. Create a SQL Server job that executes the SSIS package to gather the database information and then runs the stored procedure to email the report.

The following image illustrates the demo setup:

The following is the list of stored procedures:

The following is the list of tables:

Create Stored procedures on both database servers

As I mentioned, we are going to gather data from the TTI609-VM1 and TTI609-VM2 servers. The stored procedures used to collect the data are identical on both servers.
First, I created a database named DBATools on both servers and then created the stored procedures in that database. To do that, execute the following code on the TTI609-VM1 and TTI609-VM2 servers:
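The script itself is not reproduced here. A minimal sketch of what it presumably does; the database name DBATools comes from the article, while the existence check is my addition:

```sql
-- Create the DBATools database on each monitored server, if it is missing.
IF DB_ID(N'DBATools') IS NULL
    CREATE DATABASE DBATools;
GO
```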

Create a stored procedure named Pull_Database_Information in the DBATools database. This stored procedure collects the following information about all the databases that exist on the server where it runs.

  1. Database Name.
  2. Database Compatibility Level.
  3. State of the database (ONLINE / OFFLINE / RESTORING / SUSPECT).
  4. Database recovery model (SIMPLE / FULL / BULK-LOGGED).
  5. Database Size in MB.
  6. Total Data File Size.
  7. Used Data File Size.
  8. Total Log File Size.
  9. Used Log File Size.

Execute the following code in the DBATools database of both database servers to create the stored procedure:
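The original script is not shown above. The following is a hedged sketch of how such a procedure could look, built on sys.databases and sys.master_files; the column names are my assumptions:

```sql
USE DBATools;
GO
CREATE OR ALTER PROCEDURE dbo.Pull_Database_Information
AS
BEGIN
    SET NOCOUNT ON;

    -- One row per database: state, recovery model, and file sizes.
    -- sys.master_files reports size in 8 KB pages, hence the * 8 / 1024.
    SELECT
        @@SERVERNAME          AS Server_Name,
        d.name                AS Database_Name,
        d.compatibility_level AS Compatibility_Level,
        d.state_desc          AS Database_State,
        d.recovery_model_desc AS Recovery_Model,
        SUM(mf.size) * 8 / 1024.0 AS Database_Size_MB,
        SUM(CASE WHEN mf.type = 0 THEN mf.size ELSE 0 END) * 8 / 1024.0 AS Data_File_Size_MB,
        SUM(CASE WHEN mf.type = 1 THEN mf.size ELSE 0 END) * 8 / 1024.0 AS Log_File_Size_MB
    FROM sys.databases AS d
        INNER JOIN sys.master_files AS mf
            ON d.database_id = mf.database_id
    GROUP BY d.name, d.compatibility_level, d.state_desc, d.recovery_model_desc;
END
GO
```

The used data and log file sizes listed above are omitted from this sketch because FILEPROPERTY(name, 'SpaceUsed') must be evaluated inside each database, for example by iterating through the databases.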

Secondly, create a stored procedure named Pull_Database_Objects in the DBATools database. This stored procedure iterates through all the databases on the server and counts all the database objects. It populates the following columns:

  1. Server / Host Name.
  2. Database Name.
  3. Type of database object (table / stored procedure / SQL scalar function / constraint, etc.)
  4. Total number of database objects.

Execute the following code in the DBATools database on both database servers to create the stored procedure:
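The script is not reproduced above. A sketch of one possible implementation, using the undocumented but widely available sp_MSforeachdb procedure to visit every database; the column names are assumptions:

```sql
USE DBATools;
GO
CREATE OR ALTER PROCEDURE dbo.Pull_Database_Objects
AS
BEGIN
    SET NOCOUNT ON;

    CREATE TABLE #ObjectCounts
    (
        Server_Name   SYSNAME,
        Database_Name SYSNAME,
        Object_Type   NVARCHAR(60),
        Total_Objects INT
    );

    -- Count user objects per type in every database on the instance.
    EXEC sp_MSforeachdb N'
        USE [?];
        INSERT INTO #ObjectCounts (Server_Name, Database_Name, Object_Type, Total_Objects)
        SELECT @@SERVERNAME, DB_NAME(), type_desc, COUNT(*)
        FROM sys.objects
        WHERE is_ms_shipped = 0
        GROUP BY type_desc;';

    SELECT *
    FROM #ObjectCounts
    ORDER BY Database_Name, Object_Type;
END
GO
```

A cursor over sys.databases is a fully supported alternative to sp_MSforeachdb if you prefer to avoid undocumented procedures.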

Finally, create a stored procedure named Pull_SQLJob_Information in the DBATools database. This stored procedure collects the information and status of all the SQL jobs on the server. It populates the following columns:

  1. Server / Host Name.
  2. SQL Job Name.
  3. Owner of the SQL Job.
  4. Job Category.
  5. Job Description.
  6. Job Status (Enabled / Disabled).
  7. Job Creation Date.
  8. Job Modified Date.
  9. Job Schedule Status.
  10. Schedule Name.
  11. Last Execution Date and Time.
  12. Last Execution Status.

Execute the following code in the DBATools database on both database servers to create the stored procedure:
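The script is not shown above. A hedged sketch built on the msdb job catalog tables; the output column names and status translations are my assumptions:

```sql
USE DBATools;
GO
CREATE OR ALTER PROCEDURE dbo.Pull_SQLJob_Information
AS
BEGIN
    SET NOCOUNT ON;

    SELECT
        @@SERVERNAME             AS Server_Name,
        j.name                   AS Job_Name,
        SUSER_SNAME(j.owner_sid) AS Job_Owner,
        c.name                   AS Job_Category,
        j.description            AS Job_Description,
        CASE j.enabled WHEN 1 THEN 'Enabled' ELSE 'Disabled' END AS Job_Status,
        j.date_created           AS Created_On,
        j.date_modified          AS Modified_On,
        CASE WHEN jsch.schedule_id IS NULL
             THEN 'Not scheduled' ELSE 'Scheduled' END AS Schedule_Status,
        s.name                   AS Schedule_Name,
        -- last_run_date / last_run_time are stored as integers (YYYYMMDD / HHMMSS).
        js.last_run_date         AS Last_Run_Date,
        js.last_run_time         AS Last_Run_Time,
        CASE js.last_run_outcome
             WHEN 1 THEN 'Succeeded'
             WHEN 0 THEN 'Failed'
             ELSE 'Other' END    AS Last_Run_Status
    FROM msdb.dbo.sysjobs AS j
        LEFT JOIN msdb.dbo.syscategories   AS c    ON j.category_id = c.category_id
        LEFT JOIN msdb.dbo.sysjobschedules AS jsch ON j.job_id = jsch.job_id
        LEFT JOIN msdb.dbo.sysschedules    AS s    ON jsch.schedule_id = s.schedule_id
        LEFT JOIN msdb.dbo.sysjobservers   AS js   ON j.job_id = js.job_id;
END
GO
```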

Create stored procedures, linked servers, and tables on the central server

Once the procedures have been created on the TTI609-VM1 and TTI609-VM2 database servers, create required procedures and tables on the central server (TTI412-VM).

I created a separate database named MonitoringDashboard on the TTI412-VM server. Execute the following code to create the database on the central server:
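The script is not reproduced here; a minimal sketch (the existence check is my addition):

```sql
-- Create the central reporting database, if it does not already exist.
IF DB_ID(N'MonitoringDashboard') IS NULL
    CREATE DATABASE MonitoringDashboard;
GO
```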

Once the database has been created, create linked servers so that a stored procedure on the central server can execute the procedures on the TTI609-VM1 and TTI609-VM2 database servers. Execute the following code in the “master” database of the TTI412-VM database server to create the linked servers:

Script 1: Create Linked server TTI609-VM1

Script 2: Create Linked server TTI609-VM2
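The two scripts are not reproduced above. A sketch of what they presumably contain, shown here for TTI609-VM1; repeat the same calls with TTI609-VM2 for the second script. The login mapping via @useself is my assumption:

```sql
-- Script 1 (sketch): register TTI609-VM1 as a linked server.
EXEC master.dbo.sp_addlinkedserver
    @server     = N'TTI609-VM1',
    @srvproduct = N'SQL Server';

-- Enable RPC OUT so stored procedures can be executed on the remote server.
EXEC master.dbo.sp_serveroption
    @server = N'TTI609-VM1', @optname = 'rpc out', @optvalue = 'true';

-- Map logins; here, the caller's own credentials are forwarded.
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'TTI609-VM1', @useself = N'True';
GO
```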

Create a Stored procedure and tables

Once the linked servers have been created, we need to create three tables named Database_Object_Summery, Database_Summery, and SQL_Job_List in the MonitoringDashboard database. These tables store the information gathered by the Generate_Database_Information stored procedure; later, the data stored in these tables will be used to generate the Excel report.

Execute the following code to create the SQL_Job_List table:

Execute the following code to create the lst_dbservers table:

Execute the following code to create the Database_Summery table:

Execute the following code to create the Database_Object_Summery table:
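The table scripts are not reproduced above. A sketch of two of the tables, with column lists that mirror the output of the collection procedures; the exact columns and data types are assumptions. Plain column lists, without identity columns, keep the INSERT … EXEC pattern over the linked servers straightforward:

```sql
USE MonitoringDashboard;
GO
-- Per-database summary rows collected from each monitored server.
CREATE TABLE dbo.Database_Summery
(
    Server_Name         SYSNAME,
    Database_Name       SYSNAME,
    Compatibility_Level TINYINT,
    Database_State      NVARCHAR(60),
    Recovery_Model      NVARCHAR(60),
    Database_Size_MB    DECIMAL(18, 2),
    Data_File_Size_MB   DECIMAL(18, 2),
    Log_File_Size_MB    DECIMAL(18, 2)
);
GO

-- Object counts per database and object type.
CREATE TABLE dbo.Database_Object_Summery
(
    Server_Name   SYSNAME,
    Database_Name SYSNAME,
    Object_Type   NVARCHAR(60),
    Total_Objects INT
);
GO
```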

Once the tables are created, create a stored procedure named Generate_Database_Information in the MonitoringDashboard database. Using RPC over the linked servers, it executes the stored procedures created on the TTI609-VM1 and TTI609-VM2 database servers and stores their output.

Execute the following code to create a stored procedure:
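The procedure body is not shown above. Given the linked servers, it presumably clears the reporting tables and refills them through remote procedure calls; a hedged sketch:

```sql
USE MonitoringDashboard;
GO
CREATE OR ALTER PROCEDURE dbo.Generate_Database_Information
AS
BEGIN
    SET NOCOUNT ON;

    -- Start from a clean snapshot.
    TRUNCATE TABLE dbo.Database_Summery;
    TRUNCATE TABLE dbo.Database_Object_Summery;
    TRUNCATE TABLE dbo.SQL_Job_List;

    -- RPC calls to the remote procedures; requires 'rpc out' on both linked servers.
    INSERT INTO dbo.Database_Summery
        EXEC [TTI609-VM1].DBATools.dbo.Pull_Database_Information;
    INSERT INTO dbo.Database_Summery
        EXEC [TTI609-VM2].DBATools.dbo.Pull_Database_Information;

    INSERT INTO dbo.Database_Object_Summery
        EXEC [TTI609-VM1].DBATools.dbo.Pull_Database_Objects;
    INSERT INTO dbo.Database_Object_Summery
        EXEC [TTI609-VM2].DBATools.dbo.Pull_Database_Objects;

    INSERT INTO dbo.SQL_Job_List
        EXEC [TTI609-VM1].DBATools.dbo.Pull_SQLJob_Information;
    INSERT INTO dbo.SQL_Job_List
        EXEC [TTI609-VM2].DBATools.dbo.Pull_SQLJob_Information;
END
GO
```

INSERT … EXEC requires the target table's columns to line up with the remote procedure's result set, so keep the table definitions and procedure outputs in sync.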

Once the procedure is created, create an SSIS package to export data into an Excel file.

Create SSIS Package to export data in excel file

In my past articles, I have explained the steps to configure Data Flow Tasks, OLE DB connections, and Excel connections, so I will skip those details here.

To export data into an Excel file, open SQL Server Data Tools and create a new SSIS project named Export_Database_Information.

Once the project is created, drag and drop an Execute SQL Task onto the Control Flow window and rename it Populate Data from Servers. See the following image:

Double-click the Execute SQL Task (Populate Data from Servers). The Execute SQL Task Editor dialog box opens, where you configure the SQL connection. See the following image:

In the Connection field, select the OLEDB connection string, and in the SQL Statement field, provide the following query:
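The query itself is not reproduced above. Given the package's purpose, it is presumably a single call to the central collection procedure:

```sql
-- Refresh the reporting tables before the data flow exports them.
EXEC MonitoringDashboard.dbo.Generate_Database_Information;
```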

Click OK to close the dialog box.

On the Control Flow screen, drag and drop a Data Flow Task from the SSIS Toolbox and rename it Generate Report. See the following image:

Double-click it to open the Data Flow window.

As I mentioned earlier, the Generate_Database_Information procedure inserts the output gathered from the database servers into the following tables:

  1. Database_Object_Summery
  2. Database_Summery
  3. SQL_Job_List

I created an Excel file that has three worksheets. The following table shows the mapping between the SQL tables and the Excel worksheets.


In the Data Flow window, drag and drop three ADO.NET sources and three Excel destinations. See the following image:

Double-click Object Summary (ADO.NET Source) to open the ADO.NET Source Editor.

  1. Select TTI412-VM\SQL2017MonitoringDashboard  from the ADO.NET connection manager drop-down box.
  2. Select Table or View from the Data access mode drop-down box.
  3. Select Database_Object_Summery from Name of the table or the view drop-down box.

Double-click Database Information (ADO.NET Source) to open the ADO.NET Source Editor.

  1. Select TTI412-VM\SQL2017MonitoringDashboard from the ADO.NET connection manager drop-down box.
  2. Select Table or View from the Data access mode drop-down box.
  3. Select Database_Summery from the Name of the table or the view drop-down box.

Double-click SQL Jobs (ADO.NET Source) to open the ADO.NET Source Editor.

  1. Select TTI412-VM\SQL2017MonitoringDashboard in the ADO.NET connection manager.
  2. Select Table or View from the Data access mode drop-down box.
  3. Select SQL_Job_List in the Name of the table or the view drop-down box.

Now, drag and drop three Excel destinations from the SSIS Toolbox. See the following image:

Once the destinations have been added, drag the blue arrow beneath each ADO.NET source and drop it on the corresponding Excel destination. Do the same for all three. See the following image:

Double-click Excel Object Summary (Excel Destination) to open the Excel Destination Editor.

  1. Select Excel Connection Manager from the Excel connection manager drop-down box.
  2. Select Table or View from the Data access mode drop-down box.
  3. Select Object Summery$ from the Name of the Excel sheet drop-down box.

As I mentioned, the column names of the SQL table and the Excel worksheet are the same, so the mapping is done automatically. Click Mappings to map the columns. See the following image:

Double-click Excel Database Information (Excel Destination) to open the Excel Destination Editor.

  1. Select Excel Connection Manager from the Excel connection manager drop-down box.
  2. Select Table or View from the Data access mode drop-down box.
  3. Select Database information$ from the Name of the excel sheet drop-down box.

As I mentioned, the column names of the SQL table and the Excel worksheet are the same, so the mapping is done automatically. Click Mappings to map the columns. See the following image:

Double-click Excel SQL Jobs (Excel Destination) to open the Excel Destination Editor.

  1. Select Excel Connection Manager from the Excel connection manager drop-down box.
  2. Select Table or View from the Data access mode drop-down box.
  3. Select SQL Jobs$ from the Name of the Excel sheet drop-down box.

As I mentioned, the column names of the SQL table and the Excel worksheet are the same, so the mapping is done automatically. Click Mappings to map the columns. See the following image:

Create a SQL Job to email the database report

Once the package is created, create a SQL job to perform the following activities:

  1. Execute the SSIS package to populate data from all servers.
  2. Email the database report to the required team.

In the SQL job, we need to create two steps: the first step executes the SSIS package, and the second step executes the procedure that sends the email.

To create a SQL Job, open SSMS, expand SQL Server Agent, right-click Jobs, and select New Job.

On the New Job wizard, select the Steps option and click New. In the New Job Step dialog box, provide the desired name in the Step name text box and select SQL Server Integration Services Package from the Type drop-down box. Provide the location of the SSIS package in the Package text box. See the following image:

Click OK to close New Job Step.

Create another job step which executes a stored procedure to email the report. The procedure uses the sp_send_dbmail system procedure to send the email. The SSIS package copies the database report to a specific location, so provide the full path of the Excel file in the @file_attachments parameter of the sp_send_dbmail stored procedure.

To create the stored procedure, execute the following code in the DBATools database of the central server:
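The procedure code is not shown above. A sketch of a wrapper around msdb.dbo.sp_send_dbmail; the procedure name, mail profile, recipients, and file path below are placeholders, not values from the article:

```sql
USE DBATools;
GO
CREATE OR ALTER PROCEDURE dbo.Email_Database_Report
AS
BEGIN
    -- Requires a configured Database Mail profile on the central server.
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name     = N'DBA_Mail_Profile',      -- placeholder profile
        @recipients       = N'dba-team@example.com',  -- placeholder recipients
        @subject          = N'Daily Database Summary Report',
        @body             = N'Please find the database summary report attached.',
        @file_attachments = N'C:\Reports\Database_Report.xlsx'; -- path written by the SSIS package
END
GO
```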

Once the procedure is created, add a new SQL Job step. Click New. In the New Job Step dialog box, provide a job step name and select Transact-SQL script (T-SQL) from the Type drop-down box. In the Command text box, write the following code:
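The command is not reproduced above; it is presumably a single call to the email procedure (the procedure name here is my placeholder):

```sql
EXEC DBATools.dbo.Email_Database_Report;
```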

See the following image:

Click OK to close the wizard. Now, to configure the job schedule, select Schedules in the New Job window and click New to add a schedule.

In the New Job Schedule dialog box, provide the desired name in the Name text box and choose the frequency and time. See the following image:

Click OK to close the New Job Schedule dialog box, and in the New Job window, click OK to close the dialog box.

Now, to test the SQL Job, right-click the Email Database Report SQL job and click Start Job at Step.

On successful completion of the SQL Job, you will receive an email with the database report attached. See the following image:

Summary

In this article, I covered the following:

  1. How to gather information about databases located on remote database servers.
  2. How to create an SSIS package that collects the database information and exports it to an Excel file.
  3. How to create a multi-step SQL job that generates the report by executing an SSIS package and emails the report.

Nisarg Upadhyay

Nisarg is a SQL Server Database Administrator and Microsoft certified professional who has more than 5 years of experience with SQL Server administration and 2 years with Oracle 10g database administration. He has expertise in database design, performance tuning, backup and recovery, HA and DR setup, database migrations and upgrades. He completed his Bachelor's in Information Technology at Ganpat University.