SQL Server Smart Database Backup Plan

Preparing a proper database backup strategy is critical for every DBA: the plan must meet business requirements such as the Recovery Point Objective (RPO) and Recovery Time Objective (RTO). I have consulted for many organizations on SQL Server issues, and one of the most common mistakes clients make is performing database backups (data or log) blindly, without checking how much data has actually changed. Most backup plans depend entirely on a schedule set by the DBA.

A recent client described their backup policy as follows:

  • Log Backup Every One Hour
  • Differential Backup Once a Day
  • Full Backup Once a Week

The policy above is fixed, and a fixed schedule is not suitable for every type of database; some databases have transactional peak hours while others change very little.

Let’s get back to our main topic, the ‘Smart Backup Plan’. RTO and RPO do not depend solely on the backup schedule. Taking a log backup every hour does not mean you can always recover the database to within one hour of the failure, and backing up a 1TB database in full when less than 2GB changes in a week is not realistic: it simply wastes storage capacity. Have a look at the following figure:

As you can see, the database full backup size is about 111GB, and this client takes a full backup every day even though only 30-100MB of data changes daily. That works out to roughly 777GB of backup storage per week to protect at most about 700MB of changed data.

The following script takes a log backup only when the used portion of the log reaches 200MB or more. This technique reduces the number of log backups, which in turn shortens the restore chain and the time needed to restore the database.

DECLARE @LogSize FLOAT;
DECLARE @LogSpace TABLE ([Database Name] sysname, [Log Size (MB)] FLOAT, [Log Space Used (%)] FLOAT, [Status] INT);

INSERT INTO @LogSpace EXEC ('DBCC SQLPERF (LOGSPACE);');

SELECT @LogSize = [Log Size (MB)] * ([Log Space Used (%)] / 100.0)
FROM @LogSpace
WHERE [Database Name] = 'SmartBackup';

IF (@LogSize >= 200)
	BACKUP LOG SmartBackup TO DISK = 'NUL'; --'NUL' discards the backup; change the destination as per your business requirements.
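On SQL Server 2012 and later, the same used-log figure can be read from the sys.dm_db_log_space_usage DMV instead of parsing DBCC SQLPERF output. A minimal sketch, assuming the target database is still named SmartBackup:

USE SmartBackup;

DECLARE @LogUsedMB FLOAT;

--sys.dm_db_log_space_usage reports log usage for the current database only.
SELECT @LogUsedMB = used_log_space_in_bytes / 1024.0 / 1024.0
FROM sys.dm_db_log_space_usage;

IF (@LogUsedMB >= 200)
	BACKUP LOG SmartBackup TO DISK = 'NUL'; --Replace 'NUL' with a real backup destination.

The DMV approach avoids the table variable and string-based EXEC entirely, at the cost of having to run in the context of each database you monitor.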


The following script keeps track of data-page modifications on the server to feed the ‘Smart Backup Plan’ solution:

--Create the following table in a specific database.
CREATE TABLE DB_Modified_Data (
	Database_id INT,
	[Modified DATA MB] FLOAT,
	LastUpdate DATETIME DEFAULT SYSDATETIME()
);

--Run the following script in a job with a schedule of 10-15 seconds.
WITH BufferData AS (
	SELECT database_id,
	       (COUNT(is_modified) * 8.0) / 1024.0 AS [Modified DATA MB]
	FROM sys.dm_os_buffer_descriptors
	WHERE is_modified = 1
	GROUP BY database_id
)
MERGE DB_Modified_Data AS Target
USING (SELECT * FROM BufferData) AS Source
ON (Target.Database_id = Source.database_id)
WHEN MATCHED THEN
	UPDATE SET Target.[Modified DATA MB] += Source.[Modified DATA MB],
	           Target.LastUpdate = SYSDATETIME()
WHEN NOT MATCHED BY TARGET THEN
	INSERT (Database_id, [Modified DATA MB])
	VALUES (Source.database_id, Source.[Modified DATA MB]);
--Flush dirty pages so the next run counts only newly modified pages.
EXEC sp_MSforeachdb 'CHECKPOINT;';
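To run the tracking script on such a tight schedule, it can be wrapped in a SQL Server Agent job. The sketch below is illustrative only: it assumes Agent is running, that the MERGE script above has been wrapped in a hypothetical procedure named usp_TrackModifiedPages, and that the table lives in an assumed utility database named DBA_Tools.

USE msdb;

--Hypothetical job name; rename to suit your environment.
EXEC dbo.sp_add_job @job_name = N'Track Modified Pages';

EXEC dbo.sp_add_jobstep
	@job_name      = N'Track Modified Pages',
	@step_name     = N'Collect buffer modifications',
	@subsystem     = N'TSQL',
	@database_name = N'DBA_Tools',                      --Assumed database holding DB_Modified_Data.
	@command       = N'EXEC dbo.usp_TrackModifiedPages;'; --Assumed wrapper for the MERGE script above.

EXEC dbo.sp_add_jobschedule
	@job_name             = N'Track Modified Pages',
	@name                 = N'Every 15 seconds',
	@freq_type            = 4,  --Daily
	@freq_interval        = 1,
	@freq_subday_type     = 2,  --Unit: seconds
	@freq_subday_interval = 15;

EXEC dbo.sp_add_jobserver @job_name = N'Track Modified Pages';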


The following script takes a full backup once the accumulated data-page modifications exceed a certain threshold (1GB in this example):

DECLARE @Size FLOAT;

SELECT @Size = [Modified DATA MB]
FROM DB_Modified_Data
WHERE Database_id = DB_ID('SmartBackup');

IF (@Size >= 1024.0)
BEGIN
	BACKUP DATABASE SmartBackup TO DISK = 'NUL'; --'NUL' discards the backup; change the destination as per your business requirements.
	--Reset the counter only after the backup has run.
	UPDATE DB_Modified_Data
	SET [Modified DATA MB] = 0, LastUpdate = SYSDATETIME()
	WHERE Database_id = DB_ID('SmartBackup');
END;
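Before settling on thresholds, it helps to see what the tracker has accumulated per database. A small illustrative query joining the tracking table to sys.databases:

--Show accumulated modifications per database, largest first.
SELECT d.name AS [Database],
       m.[Modified DATA MB],
       m.LastUpdate
FROM DB_Modified_Data AS m
JOIN sys.databases AS d
	ON d.database_id = m.Database_id
ORDER BY m.[Modified DATA MB] DESC;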



As mentioned earlier, RTO and RPO do not depend on the backup schedule alone. Performing smart backups achieves the following objectives:

  1. Takes a log backup only when the used log space exceeds 200MB.
  2. Reduces I/O and CPU consumption (for both log and data backups).
  3. Reduces the storage space used (for both log and data backups).
  4. Reduces the number of log backup sets, which improves restore performance.

