I sometimes see this when deploying via sqlpackage.exe:

Analyzing deployment plan (Complete)
Updating database (Start)
Dropping DF_XXX...
Creating DF_XXX...
Update complete.
Updating database (Complete)
Successfully published database.

This happens every time I publish, even though there have not been any changes. Typically this is caused by SQL Server taking the create statement from SSDT and normalizing it: it does things like changing GETDATE to getdate and (0) to ((0)). To be honest I am not 100% sure what it will and won't change, but my fix when this happens is to script out the table in SSMS, copy and paste the constraint definition back into SSDT, and deploy again; the spurious drop and create should then stop.
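As a minimal sketch of the idea (the table and constraint names here are invented, not from the original post): if the SSDT project declares the defaults in the same normalized form that SSMS scripts out, the comparison no longer sees a difference.

```sql
-- Hypothetical table: declare the defaults exactly as SQL Server stores them
-- (lower-case getdate() and literals wrapped in double parentheses), so
-- sqlpackage stops dropping and recreating the constraints on every publish.
CREATE TABLE [dbo].[Orders]
(
    [OrderId]     INT      NOT NULL,
    [OrderDate]   DATETIME NOT NULL
        CONSTRAINT [DF_Orders_OrderDate] DEFAULT (getdate()),
    [IsProcessed] BIT      NOT NULL
        CONSTRAINT [DF_Orders_IsProcessed] DEFAULT ((0))
);
```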
Updated 2015-02-02: I have changed the details about passing multiple filters at the bottom; see the codeplex site for full details. tl;dr: download the filter from https://agilesqlclub.codeplex.com/releases/view/610727, put the dll into the same folder as sqlpackage.exe and add these command line parameters to your deployment: /p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor /p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreSchema(BLAH)" This will not deploy, drop, or alter anything in the BLAH schema. More info: I recently discovered deployment contributors, which let you write your own code that plugs into the dacpac deployment process.
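For context, a full publish command might look something like this; the dacpac path, server and database names are placeholders rather than anything from the original post:

```
sqlpackage.exe /Action:Publish ^
    /SourceFile:"C:\build\MyDatabase.dacpac" ^
    /TargetServerName:"MyServer" ^
    /TargetDatabaseName:"MyDatabase" ^
    /p:AdditionalDeploymentContributors=AgileSqlClub.DeploymentFilterContributor ^
    /p:AdditionalDeploymentContributorArguments="SqlPackageFilter=IgnoreSchema(BLAH)"
```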
What? Simply put, it is possible to include checks in the SSDT (SQL Server Data Tools) build process. You come up with a set of rules, or start with these (https://tsqlsmellsssdt.codeplex.com/), and when you check in your code, the code analysis runs as part of your normal build or CI build. The analysis verifies that the code is good and either lets the build continue or fails it if you have committed a crime against your rules.
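To give a flavour of the sort of thing a rule set can catch (the procedure below is invented, and exactly which patterns are flagged depends on the rules you choose), a "SELECT *" rule could fail the build on code like this:

```sql
-- Hypothetical procedure that a code analysis rule might flag at build time.
CREATE PROCEDURE [dbo].[GetAllOrders]
AS
BEGIN
    SELECT * FROM [dbo].[Orders]; -- a smell rule would ask for an explicit column list
END
```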
I thought it would be a good idea to gather together a list of links for DacFx, whether blogs, code or articles; if you have anything that you think should be included in this list, please send it and I will include it. The first thing to read is the "DacFx Public Model Tutorial". This is by Kevin Cunnane from the Microsoft SSDT team and is highly recommended: http://blogs.msdn.com/b/ssdt/archive/2013/12/23/dacfx-public-model-tutor… Then there are the DacFx samples on codeplex:
If you get the Visual Studio 2012 or 2013 web installer, it uses the BITS service to download the setup files. BITS downloads files but throttles the transfer so it doesn't overwhelm your bandwidth, so if you are also downloading lots of other things, like Windows updates, the Visual Studio downloads have to take their turn in the queue with everything else. If you want the download to go quicker, you can disable and stop the BITS service; the installer will fall back to a traditional HTTP download, which can be much faster (depending on what else is trying to use BITS).
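For example, from an elevated command prompt you can stop and disable the service while the installer runs, then put it back afterwards (the original start type on your machine may differ, so adjust the restore step accordingly):

```
net stop bits
sc config bits start= disabled

rem ...run the Visual Studio installer, then restore BITS...
sc config bits start= demand
net start bits
```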
When you already have a test framework set up, such as MSTest or NUnit, and possibly a CI build in place, and you want to add SQL Server tests, it is often wise to call those tests from the framework you already have. I do this at the moment using C#: I have a helper method that calls tSQLt.Run/RunClass/RunAll, passing in the name of the test(s) to run. This means that from within Visual Studio I can see and run all of my tSQLt tests alongside my existing NUnit / Moq tests.
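The helper itself is not shown in the post; under the hood it only needs to execute the standard tSQLt entry points, something along these lines (the test and class names are made up for illustration):

```sql
-- Run a single test, a whole test class, or every test in the database.
EXEC tSQLt.Run 'OrdersTests.[test GetAllOrders returns only open orders]';
EXEC tSQLt.Run 'OrdersTests';
EXEC tSQLt.RunAll;
```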
When unit testing code there is a really powerful technique that is called mocking, faking, or sometimes stubbing; it means creating objects that look like real objects but aren't. Why would you want something that looks like something but isn't? Unit testing means breaking your code into small chunks and testing that each small chunk does what it is meant to. Testing small chunks of code in isolation is difficult because they often call other chunks of code, so we pretend those other things are correct while testing the chunk in hand, and we validate those other small chunks with their own unit tests.
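In tSQLt (the framework mentioned above) this is done with faking procedures such as tSQLt.FakeTable. A quick sketch, with table and procedure names invented for illustration:

```sql
-- Replace the real table with an empty, constraint-free copy,
-- insert only the rows the test needs, then exercise the code under test.
EXEC tSQLt.FakeTable 'dbo.Orders';

INSERT INTO dbo.Orders (OrderId, OrderDate, IsProcessed)
VALUES (1, '2014-01-01', 0);

EXEC dbo.GetAllOrders;
```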
I recently had the honour of spending twenty minutes talking to Boris Hristov about testing with SQL Server for his Google Hangouts series; catch the video here. One of the things I mentioned was that having a suite of unit tests helps to document the code, and in this post I would like to demonstrate that. If we take the first stored procedure in the AdventureWorks2012 database to see how it is documented, looking at the procedure definition we get:
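To give a flavour of what "tests as documentation" means (this example is not from the post; the procedure and the expected behaviour are invented), a tSQLt test can state the behaviour in plain English in its name:

```sql
-- The test name itself documents the expected behaviour of the procedure.
CREATE PROCEDURE OrdersTests.[test GetAllOrders returns no rows when there are no orders]
AS
BEGIN
    EXEC tSQLt.FakeTable 'dbo.Orders';

    CREATE TABLE #Actual (OrderId INT, OrderDate DATETIME, IsProcessed BIT);
    INSERT INTO #Actual EXEC dbo.GetAllOrders;

    EXEC tSQLt.AssertEmptyTable '#Actual';
END
```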
I was recently asked an interesting question about collations in SQL Server: where do variables in a script get their collations from? I really wasn't too sure of the answer. I generally try to ensure that databases are in the company default collation, although I realise that this is not always possible. Given how important this is, I thought I would investigate a little. The first question I had was: what collation is the actual T-SQL script compiled with, the database's or the server's?
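One quick way to see for yourself (a sketch, not from the original post) is to ask SQL Server what collation it reports for a local variable and compare it with the server and database collations:

```sql
-- Compare the server and database collations with the collation
-- SQL Server reports for a local variable.
DECLARE @v NVARCHAR(50) = N'hello';

SELECT
    SERVERPROPERTY('Collation')                                 AS ServerCollation,
    DATABASEPROPERTYEX(DB_NAME(), 'Collation')                  AS DatabaseCollation,
    SQL_VARIANT_PROPERTY(CAST(@v AS sql_variant), 'Collation')  AS VariableCollation;
```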
I have worked as a DBA and also as a SQL Server developer, and in many roles I have had full access to production. It is OK though, I know what I am doing and I normally don't make mistakes (we'll come back to this!). To be clear, this isn't a moan at DBAs; I am one, and I can and do make changes myself. I am trying to improve processes that I have input into.