The guide below shows how to implement stored procedures that monitor the disk space used by your databases and their tables. Capturing this over time gives you a picture of where storage is being used and how quickly it is being consumed, which supports planning and monitoring and reduces the likelihood of that moment when a database grows and runs out of space.
I also cover how the captured data can be easily analysed with a simple Power BI dashboard, giving a quick visual picture of database storage utilisation.
Step 1 - Create the database and its associated objects
- Firstly, create a database on your instance called "DatabaseMonitor". You can use a different database name if your naming conventions dictate; however, the scripts below will have to be modified accordingly.
- Create the tables that will store the data using the two create scripts below.
- Create the stored procedures that capture the disk consumption figures and write them to the newly created tables, using the script below.
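The original create scripts are not reproduced here, so the following is a minimal sketch of what they might look like. The table names (DatabaseFileSize and DatabaseTableSize) and their columns are assumptions for illustration; adjust them to match the actual scripts.

-- Sketch only: assumed database and capture tables
CREATE DATABASE DatabaseMonitor;
GO

USE DatabaseMonitor;
GO

-- One row per database per capture run
CREATE TABLE dbo.DatabaseFileSize
(
    DatabaseFileSizeId INT IDENTITY(1,1) PRIMARY KEY,
    DatabaseName       SYSNAME       NOT NULL,
    FileSizeMB         DECIMAL(18,2) NOT NULL,
    CaptureDate        DATETIME2(0)  NOT NULL DEFAULT SYSDATETIME()
);

-- One row per table at or above the size threshold per capture run
CREATE TABLE dbo.DatabaseTableSize
(
    DatabaseTableSizeId INT IDENTITY(1,1) PRIMARY KEY,
    DatabaseName        SYSNAME       NOT NULL,
    SchemaName          SYSNAME       NOT NULL,
    TableName           SYSNAME       NOT NULL,
    TableSizeMB         DECIMAL(18,2) NOT NULL,
    CaptureDate         DATETIME2(0)  NOT NULL DEFAULT SYSDATETIME()
);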
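A minimal sketch of the two capture procedures follows. SpCaptureDatabaseTableSize and its 150 MB default threshold come from this guide; SpCaptureDatabaseFileSize is an assumed name for the database size procedure, and the internal queries are illustrative rather than the original code. Note the table size sketch only covers the current database, where the full procedure would iterate over every database.

USE DatabaseMonitor;
GO

CREATE PROCEDURE dbo.SpCaptureDatabaseTableSize
    @SizeThreshold INT = 150  -- minimum table size in MB to capture
AS
BEGIN
    SET NOCOUNT ON;

    -- Record every table at or above the threshold (current database only)
    INSERT INTO dbo.DatabaseTableSize (DatabaseName, SchemaName, TableName, TableSizeMB)
    SELECT  DB_NAME(),
            s.name,
            t.name,
            SUM(a.total_pages) * 8 / 1024.0  -- pages are 8 KB
    FROM    sys.tables t
    JOIN    sys.schemas s ON s.schema_id = t.schema_id
    JOIN    sys.partitions p ON p.object_id = t.object_id
    JOIN    sys.allocation_units a ON a.container_id = p.partition_id
    GROUP BY s.name, t.name
    HAVING  SUM(a.total_pages) * 8 / 1024.0 >= @SizeThreshold;
END;
GO

CREATE PROCEDURE dbo.SpCaptureDatabaseFileSize  -- assumed name
AS
BEGIN
    SET NOCOUNT ON;

    -- One row per database, covering system databases and tempdb as well
    INSERT INTO dbo.DatabaseFileSize (DatabaseName, FileSizeMB)
    SELECT  DB_NAME(database_id),
            SUM(size) * 8 / 1024.0  -- size is reported in 8 KB pages
    FROM    sys.master_files
    GROUP BY database_id;
END;
GO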
Step 2 - Schedule the capture of database disk consumption
Consider the frequency and timing at which the stored procedures should run and take their file size snapshots; I run this every other day, which gives plenty of data. You may also want to take a look at my article "Best practice for SQL agent configuration and monitoring".
- Open SQL Server Agent on your server.
- Create a new job with a name that matches your naming conventions (I simply use "Housekeeping - Database File Size capture").
- Create a step with a clear and brief name that executes the database table size stored procedure using the script "EXEC DatabaseMonitor.dbo.SpCaptureDatabaseTableSize".
- By default this captures any table 150 MB or larger; this can be adjusted by running the stored procedure with the @SizeThreshold parameter set to the minimum size for the tables required (see the example after this list).
- Create a step with a clear and brief name that executes the database file size stored procedure using the script "EXEC DatabaseMonitor.dbo.SpCaptureDatabaseFileSize".
- This captures database sizes across all databases, including system databases, tempdb and the DatabaseMonitor database itself.
- Schedule the job; typically I would recommend every other day, at a quieter time such as the evening before backups run.
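For example, to only capture tables of 500 MB or more, the table size step's command could be changed to:

EXEC DatabaseMonitor.dbo.SpCaptureDatabaseTableSize @SizeThreshold = 500;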
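If you would rather script the job than build it through the SSMS UI, a sketch using the standard msdb job procedures might look like the following. The job, step and schedule names simply mirror the suggestions above, and SpCaptureDatabaseFileSize is the assumed procedure name from Step 1.

USE msdb;
GO

EXEC dbo.sp_add_job
    @job_name = N'Housekeeping - Database File Size capture';

EXEC dbo.sp_add_jobstep
    @job_name          = N'Housekeeping - Database File Size capture',
    @step_id           = 1,
    @step_name         = N'Capture table sizes',
    @subsystem         = N'TSQL',
    @command           = N'EXEC DatabaseMonitor.dbo.SpCaptureDatabaseTableSize;',
    @on_success_action = 3;  -- go to the next step

EXEC dbo.sp_add_jobstep
    @job_name  = N'Housekeeping - Database File Size capture',
    @step_id   = 2,
    @step_name = N'Capture database file sizes',
    @subsystem = N'TSQL',
    @command   = N'EXEC DatabaseMonitor.dbo.SpCaptureDatabaseFileSize;';

-- Every other day at 21:00, ahead of a typical evening backup window
EXEC dbo.sp_add_schedule
    @schedule_name     = N'Every other day - evening',
    @freq_type         = 4,       -- daily
    @freq_interval     = 2,       -- every 2 days
    @active_start_time = 210000;  -- 21:00:00

EXEC dbo.sp_attach_schedule
    @job_name      = N'Housekeeping - Database File Size capture',
    @schedule_name = N'Every other day - evening';

EXEC dbo.sp_add_jobserver
    @job_name = N'Housekeeping - Database File Size capture';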
Step 3 - Allow the capture of data
The job above will need to run for around 1-2 weeks to capture enough information to analyse disk consumption rates. The code below creates views across the two tables that can be plugged into a simple dashboard, with an OVER (PARTITION BY ...) in place to easily identify the latest size reading for each database or table.
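The original view scripts are not reproduced here; a minimal sketch of the pattern described, using the assumed table names from Step 1, might look like this. Filtering on RowNo = 1 in the dashboard returns the latest reading for each database or table.

USE DatabaseMonitor;
GO

-- Latest reading per database: RowNo = 1 is the most recent capture
CREATE VIEW dbo.VwDatabaseFileSize
AS
SELECT  DatabaseName,
        FileSizeMB,
        CaptureDate,
        ROW_NUMBER() OVER (PARTITION BY DatabaseName
                           ORDER BY CaptureDate DESC) AS RowNo
FROM    dbo.DatabaseFileSize;
GO

-- Same pattern per table
CREATE VIEW dbo.VwDatabaseTableSize
AS
SELECT  DatabaseName,
        SchemaName,
        TableName,
        TableSizeMB,
        CaptureDate,
        ROW_NUMBER() OVER (PARTITION BY DatabaseName, SchemaName, TableName
                           ORDER BY CaptureDate DESC) AS RowNo
FROM    dbo.DatabaseTableSize;
GO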
Step 4 - Build Power BI Dashboard
If you already have experience of building Power BI dashboards you can probably skip this step and simply make use of the new data you are capturing; if not, the following details the steps to do so.