25 Nov 2015 | SQL Server | SQL Server Data Tools
This post describes the high-level steps I went through to get the SQL Server related components ready for deployment automation on a project I worked on recently.
The project uses a SQL Server Data Tools (SSDT) project to maintain the database schema in source control. Its output, a Data-tier Application Component Package (DACPAC), is deployed to the appropriate target environment via a WebDeploy package. Because the solution was designed with an Entity Framework (EF) database-first approach, code-first migrations were not a viable upgrade strategy.
Here are the steps I followed to bring the production environment up to date:
Next, ensure that every time the SSDT project is built, a post-build event generates a differential delta script between the baseline and the latest DACPAC. To simplify this, I wrapped the following command in a PowerShell script (a sketch of the wrapper is shown after the note below):
&"C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\sqlpackage" /a:Script /sf:$SourceDacpac /tf:$TargetDacpac /op:$OutputDeltaFile /tdn:$DBName /p:IncludeTransactionalScripts=True /p:IncludeCompositeObjects=True /p:ScriptDatabaseOptions=False /p:BlockOnPossibleDataLoss=True /v:TenantSchemaName=dbo
Note that one of the parameters (/p:IncludeTransactionalScripts=True) ensures that the script is generated as a transaction.
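To make this easier to call from a post-build event, the command above can be wrapped in a small PowerShell script along the lines of the sketch below. The script name, parameter handling and error check are my own assumptions; only the SqlPackage arguments come from the command above.

    # New-DeltaScript.ps1 - a minimal sketch of the wrapper (script name and error
    # handling are assumptions); it simply forwards its parameters to SqlPackage.
    param(
        [Parameter(Mandatory = $true)] [string] $SourceDacpac,    # latest DACPAC produced by the SSDT build
        [Parameter(Mandatory = $true)] [string] $TargetDacpac,    # baseline DACPAC of the target environment
        [Parameter(Mandatory = $true)] [string] $OutputDeltaFile, # where to write the generated delta script
        [Parameter(Mandatory = $true)] [string] $DBName           # target database name
    )

    $sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe"

    & $sqlPackage /a:Script `
        /sf:$SourceDacpac `
        /tf:$TargetDacpac `
        /op:$OutputDeltaFile `
        /tdn:$DBName `
        /p:IncludeTransactionalScripts=True `
        /p:IncludeCompositeObjects=True `
        /p:ScriptDatabaseOptions=False `
        /p:BlockOnPossibleDataLoss=True `
        /v:TenantSchemaName=dbo

    # Fail the build if SqlPackage reports an error
    if ($LASTEXITCODE -ne 0) { throw "SqlPackage exited with code $LASTEXITCODE" }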
(Optional) Perform any post-processing on the generated delta script. In my specific use case I had to tweak the script to work within a multi-tenant scenario.
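My changes were specific to that project, but purely as a hypothetical illustration, a post-processing step could comment out the :setvar line that the generated script contains for the TenantSchemaName variable, so that the value can be supplied per tenant at deployment time instead:

    # Hypothetical post-processing sketch: comment out the generated :setvar line for
    # TenantSchemaName so the value can be passed per tenant when the script is executed.
    $deltaFile = "C:\Build\Delta.sql"   # assumed path of the generated delta script

    (Get-Content $deltaFile) |
        ForEach-Object { $_ -replace '^(:setvar\s+TenantSchemaName\b.*)$', '-- $1' } |
        Set-Content $deltaFile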
Finally, execute the generated delta script against the target database using SqlCmd or a custom tool such as https://github.com/rusanu/DbUtilSqlCmd
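As a minimal sketch of the SqlCmd route, the delta script could be run as shown below; the server name, database name and script path are placeholder values, and the tenant schema is passed through the same SQLCMD variable used earlier:

    # Minimal sketch of executing the delta script with sqlcmd; server, database
    # and script path are placeholder values.
    $server    = "PRODSQL01"           # assumed server name
    $database  = "MyDatabase"          # assumed database name
    $deltaFile = "C:\Build\Delta.sql"  # assumed path of the generated delta script

    # -b makes sqlcmd exit with a non-zero code on the first SQL error
    & sqlcmd -S $server -d $database -i $deltaFile -b -v "TenantSchemaName=dbo"

    if ($LASTEXITCODE -ne 0) { throw "sqlcmd exited with code $LASTEXITCODE" }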