I was recently commissioned to complete an archiving project for one of our customers. The customer did not have an in-house SQL Server DBA to develop the solution, and needed it automated with little or no administrative intervention. The solution is hosted on a SQL Server 2012 Enterprise instance.
The solution required one month of data to be stored in the live application database, and one week of data in the archive database. The aim was to manage the storage requirements of the application database servers. The application had only been live for a few months and had already grown by approximately 60GB.
The partitioning of the live database focused on the tables that had incurred the largest growth in data since the launch of the application. The datetime column in each target table was used to define the partition function. The partition function was defined to run up to December 2019, giving 72 partitions; this is comfortably under the 15,000-partition limit in SQL Server 2012.
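As a rough illustration of this approach, a monthly partition function on a datetime column might look like the following. The object and column names here are hypothetical, and only the first and last boundary values are shown; the real function would list one boundary per month up to December 2019.

```sql
-- Hypothetical names: pfMonthly / psMonthly are not the actual objects used in the project.
-- RANGE RIGHT means each boundary value falls into the partition to its right,
-- so a boundary of '2014-02-01' starts the February 2014 partition.
CREATE PARTITION FUNCTION pfMonthly (datetime)
AS RANGE RIGHT FOR VALUES
(
    '2014-01-01', '2014-02-01', '2014-03-01'
    -- ... one boundary per month, through ...
    , '2019-12-01'
);

-- Map every partition to the same filegroup for simplicity;
-- a production design might spread partitions across filegroups.
CREATE PARTITION SCHEME psMonthly
AS PARTITION pfMonthly ALL TO ([PRIMARY]);
```

The target table (and its aligned indexes) would then be created on `psMonthly`, keyed on the datetime column, so each month of data lands in its own partition.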
The automated process that archives data from the live target tables to the archive tables, and the archive table clean-up process, ran once a month. Depending on the time of the month, there could be up to two months of data in the live database. The archiving and clean-up processes ran under two SQL Agent jobs. The archive tables used page-level compression, to further manage the storage requirements of the server.
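A sketch of what the monthly archive step could look like, assuming hypothetical table names: the expired partition is switched out of the live table into an empty staging table (a metadata-only operation, so it is near-instant), its rows are copied into the page-compressed archive table, and the staging table is truncated. Note that `ALTER TABLE ... SWITCH` cannot cross databases, which is why the copy into the archive database is a separate `INSERT`.

```sql
-- Hypothetical names: dbo.AppEvents, dbo.AppEvents_Staging and the Archive
-- database are illustrative, not the customer's actual objects.

-- 1. Switch the oldest populated partition out of the live table.
--    The staging table must be empty, on the same filegroup, and structurally identical.
ALTER TABLE dbo.AppEvents
    SWITCH PARTITION 2 TO dbo.AppEvents_Staging;

-- 2. Copy the switched-out rows into the archive database.
INSERT INTO Archive.dbo.AppEvents_Archive
SELECT * FROM dbo.AppEvents_Staging;

-- 3. Empty the staging table ready for next month's run.
TRUNCATE TABLE dbo.AppEvents_Staging;
```

The archive table itself would carry page-level compression, set at creation time or via a rebuild:

```sql
ALTER TABLE Archive.dbo.AppEvents_Archive
    REBUILD WITH (DATA_COMPRESSION = PAGE);
```

Each of these steps would run as a step in the monthly SQL Agent job, with the week-based clean-up of the archive table handled by the second job.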
The business benefits of this solution included:
The next blog in this series will discuss the technical workflow of designing and implementing partitioning and compression.