Controlling a dacpac deployment

I have been thinking quite a lot recently (ok, not that much, but my thinking has changed) about how to deploy dacpacs. In particular I am talking about how to control the deployment: which tool is used to deploy the dacpac, what connection details are passed in, and which of the hundreds of parameters are passed in and how. Having configured and deployed databases using dacpacs in a number of different environments, my approaches have been:

tSQLt Visual Studio Test Adapter

What is this? It lets you run tSQLt tests easily from Visual Studio. Visual Studio has a built-in framework for finding and executing tests, so if you have tSQLt tests in an SSDT project, for example - although the adapter only requires that the .sql files are in source control and does not require SSDT - you can easily see and execute your tests via the Visual Studio Test Explorer window.

DacFxed - Powershell Nugetized DacFx wrapper

Deploying a dacpac from powershell should be pretty easy: there is a .net api which you can use to open a dacpac, compare it to a database, and either create a script or make the database match the dacpac. There are a couple of problems with this approach, though. Problem 1 - Is the DacFx installed? The first problem is whether the DacFx is installed; the chances are that if Visual Studio or SQL Server has been installed then it will be there.
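As an alternative to calling the DacFx .NET API directly, the same publish can be driven through the SqlPackage command line. A minimal sketch, assuming Python as the scripting host and placeholder server/database names - the `/Action`, `/SourceFile`, `/TargetServerName`, `/TargetDatabaseName` and `/p:` switches are standard SqlPackage options, but the exact set of properties you pass will depend on your environment:

```python
import subprocess


def build_publish_args(sqlpackage_path, dacpac, server, database, properties=None):
    """Build a SqlPackage publish command line.

    `properties` maps deploy property names to values and becomes the
    /p:Name=Value switches; paths and names here are illustrative only.
    """
    args = [
        sqlpackage_path,
        "/Action:Publish",
        f"/SourceFile:{dacpac}",
        f"/TargetServerName:{server}",
        f"/TargetDatabaseName:{database}",
    ]
    for name, value in (properties or {}).items():
        args.append(f"/p:{name}={value}")
    return args


def deploy(args):
    # Runs SqlPackage and raises CalledProcessError if the deployment fails.
    subprocess.run(args, check=True)
```

Building the argument list separately from running it makes the command easy to log for later inspection, which ties in nicely with the auditing concerns discussed below.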

SQLCover v 0.2 - Bug fixes and Azure V12 Support

I have released a new version of SQLCover, a code coverage tool for T-SQL (it lets you identify where you need to focus when writing tests). This release includes a few minor fixes but also adds support for SQL Azure, so if you run your tests in a v12 database or higher you can now get an idea of code coverage there too. If you are interested in using this but don’t know where to start, there is a powershell script in the download (https://the.

Where do you put your business logic? In stored procedures?

There seem to be two schools of thought, and I think this mainly comes down to who and where your developers are. The first is that a stored procedure or a function is a great place to put all the business logic that an application needs. The second is that no logic should be in code deployed to SQL Server at all. You can see these two extremes if you compare the difference between what stack overflow does (.

How do you audit automatically generated sql deployment scripts?

I recently saw an email where someone stated that the reason they manually create deployment scripts is the restrictions placed on them by auditing requirements and the need to demonstrate exactly what was deployed and when. When automating deployments it really makes sense to use a tool to generate the deployment script, so you can concentrate on writing productive code instead of deployment scripts.
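One way to satisfy both the auditors and the automation is to have the pipeline archive each generated script alongside a hash and timestamp. This is a hypothetical sketch of the idea, not a specific tool's feature - the file names and log format are invented for illustration:

```python
import hashlib
import json
import time
from pathlib import Path


def record_deployment(script_path, audit_log):
    """Append a tamper-evident record of a generated deployment script.

    Stores the script path, its SHA-256 hash, and a UTC timestamp as one
    JSON line, so you can later prove exactly which script ran and when.
    """
    script_bytes = Path(script_path).read_bytes()
    entry = {
        "script": str(script_path),
        "sha256": hashlib.sha256(script_bytes).hexdigest(),
        "deployed_at_utc": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }
    with open(audit_log, "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

Because the script itself is kept and hashed, an auditor can re-hash the archived file at any point and confirm it is byte-for-byte what was deployed.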

What is code coverage for?

Code coverage gives you an indication of how well tested a particular area is. It is not a measure of code quality, and having a statement covered by a test does not mean that the code is correct. Using code coverage you can see how risky changes to a particular area of code are. If you have a high level of code coverage you can be fairly confident in changes; if you have low code coverage in an area, then you will need to be careful when you make changes.

SQLCover Code Coverage for SQL Server T-SQL

Open source code coverage tool for T-SQL, SQL Server 2008+. What is code coverage? Code coverage is a way to see how many statements in your database code were executed when you ran your test(s). It is a way to see how well covered with tests a particular area is - the better covered with tests, the less likely you are to miss issues when you make changes in those areas.

Post Deploy Scripts In Composite Dacpacs not deploying

Composite projects in SSDT are a really useful way to logically split databases up while still being able to deploy them into a database as a whole. Even if you have a fairly simple database in a single SSDT project you should put your unit tests somewhere. What I like to do is configure my solution like this: Main.dbproj - actual production code and reference data; Main.UnitTests.dbproj - unit test code plus references to tSQLt etc. The unit tests have a “Same Database” reference pointing to the main project, so it is as if my tests were in the same project.

Where did january go?

I have been pretty busy over January and the first half of February and haven’t had time to write any blogs, though I did have an article published on Simple Talk: https://www.simple-talk.com/sql/sql-tools/connected-development-with-sql… A couple of exciting things did happen: firstly, I was allowed back into the Friends of Redgate programme - I like the Redgate guys and am pleased to be back! Secondly, I have been accepted into the Microsoft ALM Ranger programme, which means I can help deliver some tooling and documentation as part of a team instead of beavering away by myself.