Tracking database growth with Power BI

Published on 21 May 2024 at 15:32

The guide below shows how to implement stored procedures that monitor the disk space used by your databases and their tables.  This gives an understanding of which areas are consuming storage and at what rate, allowing planning and monitoring, and helping to avoid that moment when a database grows and runs out of space.

I also cover how the captured data can be analysed easily with a simple Power BI dashboard, giving a quick illustration of database storage utilisation.

 

Step 1 - Create the database and its associated objects

  • Firstly, create a database on your instance called "DatabaseMonitor". You can use a different database name if your naming conventions dictate; however, the scripts below will have to be modified accordingly.
  • Create the tables that will store the data using the two create scripts below.
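If the database does not already exist, a minimal create statement (assuming default file locations and growth settings) is:

```sql
-- Create the monitoring database with default settings;
-- adjust file locations and sizes to suit your environment.
CREATE DATABASE [DatabaseMonitor]
GO
```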

Create table script for database size tracking

USE [DatabaseMonitor]
GO

CREATE TABLE [dbo].[TblDbFileSize](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [CreatedDate] [date] NOT NULL,
    [DatabaseName] [varchar](104) NULL,
    [FileType] [varchar](3) NOT NULL,
    [FileSizeMb] [bigint] NULL
) ON [PRIMARY]
GO

Create table script for table size tracking

USE [DatabaseMonitor]
GO

CREATE TABLE [dbo].[TblDbTableFileSize](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [CreatedDate] [date] NOT NULL,
    [DatabaseName] [varchar](104) NULL,
    [TableName] [varchar](150) NULL,
    [NumberRows] [int] NOT NULL,
    [FileSizeMB] [numeric](36, 2) NOT NULL,
    [NumberIndexes] [int] NULL
) ON [PRIMARY]
GO

  • Create the two stored procedures, using the scripts below, that will capture and write the disk consumption records to the newly created tables.

Create stored procedure to track database file size

USE [DatabaseMonitor]
GO

CREATE PROCEDURE [dbo].[spCaptureDatabaseFileSize] AS

CREATE TABLE #Databases (ID INT IDENTITY (1,1), DatabaseName VARCHAR(100))

INSERT INTO #Databases (DatabaseName)
SELECT name
FROM sys.databases
WHERE state = 0;

DECLARE @StartRecord INT = (SELECT MIN(ID) FROM #Databases)
DECLARE @StopRecord INT = (SELECT MAX(ID) FROM #Databases)
DECLARE @CurrentRecord INT = @StartRecord
DECLARE @CurrentDatabaseName VARCHAR(100)
DECLARE @SQL VARCHAR(MAX)
DECLARE @SQLResult TABLE ([database_name] VARCHAR(104), [physical_name] VARCHAR(150), filesize BIGINT)

WHILE @CurrentRecord <= @StopRecord
BEGIN

    SELECT @CurrentDatabaseName = DatabaseName
    FROM #Databases
    WHERE ID = @CurrentRecord

    SET @SQL = 'SELECT ''' + @CurrentDatabaseName + ''' AS database_name, physical_name, size FROM ' + QUOTENAME(@CurrentDatabaseName) + '.sys.database_files ORDER BY [type], [file_id]'

    INSERT @SQLResult ([database_name], [physical_name], [filesize])
    EXEC (@SQL)

    SET @CurrentRecord = @CurrentRecord + 1

END

INSERT INTO [dbo].[TblDbFileSize] (CreatedDate, DatabaseName, FileType, FileSizeMb)
SELECT
    GETDATE(),
    database_name,
    RIGHT(physical_name, 3),             -- file extension (mdf/ndf/ldf) identifies the file type
    ROUND(filesize / 128, 0)             -- size is reported in 8 KB pages, so /128 converts to MB
FROM @SQLResult

DROP TABLE #Databases
GO

Create stored procedure to track table file size

USE [DatabaseMonitor]
GO

CREATE PROCEDURE [dbo].[spCaptureDatabaseTableSize]

@Database VARCHAR(100) = '0',      -- Which database should be analysed; '0' (the default) runs against all DBs (including system databases).
@SizeThreshold VARCHAR(4) = '150', -- Minimum table size in MB to record; '0' records all tables regardless of size.
@Debug TINYINT = 0                 -- Mode 1 displays more debug detail behind the calculations.

AS

DECLARE @ProcProgress VARCHAR(250)

CREATE TABLE #Databases (ID INT IDENTITY (1,1), DatabaseName VARCHAR(100))
INSERT INTO #Databases (DatabaseName)
SELECT name
FROM sys.databases
WHERE (@Database = name OR @Database = '0') AND state = 0

IF @Debug = 1
BEGIN
    SET @ProcProgress = ('Identifying databases to analyse, ' + CAST((SELECT COUNT(*) FROM #Databases) AS VARCHAR) + ' databases identified using parameters supplied.')
    RAISERROR (@ProcProgress, 0, 1) WITH NOWAIT
END

DECLARE @StartRecord INT = (SELECT MIN(ID) FROM #Databases)
DECLARE @StopRecord INT = (SELECT MAX(ID) FROM #Databases)
DECLARE @CurrentRecord INT = @StartRecord
DECLARE @CurrentDatabaseName VARCHAR(100)
DECLARE @SQL VARCHAR(MAX)

WHILE @CurrentRecord <= @StopRecord
BEGIN

    SELECT @CurrentDatabaseName = DatabaseName
    FROM #Databases
    WHERE ID = @CurrentRecord

    IF @Debug = 1
    BEGIN
        SET @ProcProgress = ('Looping through databases, analysing database ' + @CurrentDatabaseName)
        RAISERROR (@ProcProgress, 0, 1) WITH NOWAIT
    END

    SET @SQL = 'SELECT
        GETDATE() AS datetime,
        ''' + @CurrentDatabaseName + ''' AS DB,
        t.name AS TBL,
        p.rows AS NumberRows,
        COUNT(DISTINCT i.index_id) AS NumberIndexes,
        CAST(ROUND(((SUM(a.total_pages) * 8) / 1024.00), 2) AS NUMERIC(36, 2)) AS SizeMB
    FROM ' + QUOTENAME(@CurrentDatabaseName) + '.sys.tables t
    INNER JOIN ' + QUOTENAME(@CurrentDatabaseName) + '.sys.indexes i
        ON t.object_id = i.object_id
    INNER JOIN ' + QUOTENAME(@CurrentDatabaseName) + '.sys.partitions p
        ON i.object_id = p.object_id AND i.index_id = p.index_id
    INNER JOIN ' + QUOTENAME(@CurrentDatabaseName) + '.sys.allocation_units a
        ON p.partition_id = a.container_id
    LEFT OUTER JOIN ' + QUOTENAME(@CurrentDatabaseName) + '.sys.schemas s
        ON t.schema_id = s.schema_id
    WHERE i.object_id > 255
    GROUP BY t.name, s.name, p.rows
    HAVING CAST(ROUND(((SUM(a.total_pages) * 8) / 1024.00), 2) AS NUMERIC(36, 2)) >= ' + @SizeThreshold + '
    ORDER BY SizeMB DESC, t.name'

    IF @Debug = 1
    BEGIN
        SET @ProcProgress = ('Preparing to execute SQL ' + @SQL)
        RAISERROR (@ProcProgress, 0, 1) WITH NOWAIT
    END

    INSERT INTO [DatabaseMonitor].[dbo].[TblDbTableFileSize] (CreatedDate, DatabaseName, TableName, NumberRows, NumberIndexes, FileSizeMB)
    EXEC (@SQL)

    SET @CurrentRecord = @CurrentRecord + 1

END

DROP TABLE #Databases
GO
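Once created, the procedure can be run ad hoc to check its output before scheduling it. A quick sketch (the database name below is illustrative):

```sql
-- Capture table sizes for one database, recording every table regardless of size,
-- with debug messages printed as the procedure progresses.
EXEC DatabaseMonitor.dbo.spCaptureDatabaseTableSize
    @Database = 'SalesDb',    -- hypothetical database name
    @SizeThreshold = '0',
    @Debug = 1

-- Capture table sizes across all databases using the defaults
-- ('0' = all databases, 150 MB minimum table size).
EXEC DatabaseMonitor.dbo.spCaptureDatabaseTableSize
```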

 

Step 2 - Schedule the capture of database disk consumption

Consider how frequently, and at what time, the stored procedures should run and take their file size snapshots.  I run them every other day, which gives plenty of data.  Consider also taking a look at my article on best practice for SQL Agent configuration and monitoring.

  • Open SQL Server Agent on your server.
  • Create a new job with a name that matches your naming conventions (I simply use "Housekeeping - Database File Size capture").
  • Create a step with a clear and brief name that executes the table file size stored procedure using the script "EXEC DatabaseMonitor.dbo.spCaptureDatabaseTableSize".
    • By default this captures any table 150 MB or larger; this can be adjusted by running the stored procedure with the @SizeThreshold parameter set to the minimum size of the tables required.
  • Create a step with a clear and brief name that executes the database file size stored procedure using the script "EXEC DatabaseMonitor.dbo.spCaptureDatabaseFileSize".
    • This captures database sizes across all databases, including system databases, tempdb and the DatabaseMonitor database itself.
  • Schedule the job; typically I would recommend every other day, at a quieter time such as the evening, prior to backups being performed.
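If you prefer to script the job rather than create it through the SSMS GUI, a minimal sketch using msdb's job stored procedures follows (the job name, step names and schedule are illustrative):

```sql
USE msdb
GO

-- Create the job (name is illustrative).
EXEC dbo.sp_add_job @job_name = N'Housekeeping - Database File Size capture'

-- Step 1: capture table sizes.
EXEC dbo.sp_add_jobstep
    @job_name = N'Housekeeping - Database File Size capture',
    @step_name = N'Capture table sizes',
    @subsystem = N'TSQL',
    @command = N'EXEC DatabaseMonitor.dbo.spCaptureDatabaseTableSize'

-- Step 2: capture database file sizes.
EXEC dbo.sp_add_jobstep
    @job_name = N'Housekeeping - Database File Size capture',
    @step_name = N'Capture database file sizes',
    @subsystem = N'TSQL',
    @command = N'EXEC DatabaseMonitor.dbo.spCaptureDatabaseFileSize'

-- Run every other day at 21:00.
EXEC dbo.sp_add_schedule
    @schedule_name = N'Every other day, evening',
    @freq_type = 4,            -- daily frequency type
    @freq_interval = 2,        -- every 2 days
    @active_start_time = 210000

EXEC dbo.sp_attach_schedule
    @job_name = N'Housekeeping - Database File Size capture',
    @schedule_name = N'Every other day, evening'

-- Target the local server.
EXEC dbo.sp_add_jobserver @job_name = N'Housekeeping - Database File Size capture'
```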

 

Step 3 - Allow the capture of data

The job above will need to run for around one to two weeks to capture enough information to analyse disk consumption rates.  The code below creates views over the two tables that can be plugged into a simple dashboard, with a RANK() OVER (PARTITION BY ...) in place to easily identify the latest size reading for each database or table.

Create view that shows latest recorded database file sizes

CREATE VIEW VW_DbFileSize AS

SELECT
    DBTFS.*,
    DBTFS.FileSizeMb / 1024.0 AS FileSizeGB,
    CASE
        WHEN RANK() OVER (PARTITION BY DBTFS.DatabaseName ORDER BY DBTFS.CreatedDate DESC) = 1 THEN 1
        ELSE 0
    END AS IsLatestDatabaseReading
FROM [DatabaseMonitor].[dbo].[TblDbFileSize] DBTFS

Create view that shows latest recorded table file sizes

CREATE VIEW VW_DbTableFileSize AS

SELECT
    DBTFS.*,
    DBTFS.FileSizeMB / 1024.0 AS FileSizeGB,
    CASE
        WHEN RANK() OVER (PARTITION BY DBTFS.DatabaseName, DBTFS.TableName ORDER BY DBTFS.CreatedDate DESC) = 1 THEN 1
        ELSE 0
    END AS IsLatestTableReading
FROM [DatabaseMonitor].[dbo].[TblDbTableFileSize] DBTFS
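Once the views exist and a few snapshots have been captured, the latest-reading flag can be checked with a quick query, for example:

```sql
-- Latest recorded size per database file, largest first.
SELECT DatabaseName, FileType, FileSizeMb, FileSizeGB
FROM DatabaseMonitor.dbo.VW_DbFileSize
WHERE IsLatestDatabaseReading = 1
ORDER BY FileSizeMb DESC
```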

Step 4 - Build Power BI Dashboard

If you already have experience building Power BI dashboards, you can probably skip this step and simply make use of the new data you are capturing; if not, the following details the steps to do so.

 

DAX to calculate the most recent database reading

When summarising data in tree maps and other visualisations, it's useful to be able to show disk consumption expressed as the latest reading taken for that database.  The following DAX creates a measure that produces that result.

Latest Database Reading (GB) = CALCULATE(sum(VW_DbFileSize[FileSizeGB]),VW_DbFileSize[IsLatestDatabaseReading]=1)

DAX to calculate the most recent table reading

In a similar theme, when summarising data in tree maps and other visualisations, it's useful to show disk consumption expressed as the latest reading taken for that table.  The following DAX creates a measure that produces that result.

Latest Table Reading (GB) = CALCULATE(sum(VW_DbTableFileSize[FileSizeGB]),VW_DbTableFileSize[IsLatestTableReading]=1)
