Why is SSDT so slow?

I see the same question again and again: “I am using SSDT but it takes a long time to publish changes, how do I make it faster?” or the ever-useful “SSDT is too slow”.

SSDT is slower than the alternatives at publishing changes because it does so many cool things

Anyone who has used SSDT knows that doing a publish can take time; how much time typically varies depending on a number of factors:

  • Is the database local or on the other side of the earth (high latency / possibly low bandwidth)?
  • How many objects are in your database?
  • How many objects have changed in your database?

These seem obvious: the slower the link, the longer it takes SSDT to get the data it needs to generate the list of required changes, and if you have more objects, or lots of work (changes) to make, then it will take longer.

What is not so obvious is what SSDT is doing and whether the time it takes to publish is worth it. If you can script out a procedure in SSMS, change a part of it and then deploy it with F5 in a matter of seconds, why wait longer in SSDT? If the publish time is a few seconds then fine, but what if it is 30 seconds, 1 minute, 5 minutes, 30 minutes? At what point does it become too long, so that you stop using SSDT and go back to SSMS?

Personally I think it is worth the time it takes to publish using SSDT because of the benefits you get, but I do use shortcuts to speed up my development, as waiting even 30 seconds every time you want to test your code is very distracting and really slows development down. That being said, there are reports in the SSDT forum of publishes taking 30 minutes - 30 minutes is definitely too long; if you are seeing that then I would really dig in, find out what is wrong and fix it, because that is unacceptable.

Why is it worth it?

When you use SSDT you get reference checking and the ability to run validation rules such as the T-SQL code smells, which means that every time you build you are validating the code. When you modify the T-SQL and deploy it directly to SQL Server you do not get that compile-time checking - that is bad. Consider the case where you drop a column that stored procedures rely on: you can easily do it in SSMS, but SSDT will not build and deploy if you do that.
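
As a minimal sketch of what that reference checking buys you (the table and procedure here are made up for illustration), imagine a project containing these two files. Delete the Email column from the table and the project stops building, because the procedure still references it - whereas SSMS would happily let you run the ALTER TABLE and leave the procedure to fail at runtime:

    -- Tables\dbo.Customer.sql
    CREATE TABLE dbo.Customer
    (
        CustomerId INT NOT NULL PRIMARY KEY,
        Email      NVARCHAR(256) NOT NULL
    );
    GO

    -- Stored Procedures\dbo.GetCustomerEmail.sql
    CREATE PROCEDURE dbo.GetCustomerEmail
        @CustomerId INT
    AS
    BEGIN
        -- If the Email column is removed from the table above, SSDT reports
        -- an unresolved reference here at build time and the publish never runs.
        SELECT Email FROM dbo.Customer WHERE CustomerId = @CustomerId;
    END;

The time the build spends resolving references like this is part of what you are paying for at publish time.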

When you publish your database using SSDT you are also exercising SSDT itself and proving that the upgrade scripts work every time. Doing this gives you confidence that the tooling works correctly, so you can do “the right thing” (TM - me 2015) and use continuous integration and move towards continuous deployment.

Remember I am only talking about publishing changes here; there are lots of other benefits to using SSDT aside from these, but these are the reasons I continue to use it even though it isn’t as fast as using SSMS.

What can I do if it is too slow?

If it is way too slow, like 30 minutes to publish, then raise an issue on Connect. I would also post in the SSDT forum and ask for help (linking to the Connect item) - the SSDT team are active in the forum and really good at helping people.

If it isn’t way too slow, just too slow (slow, too slow, fast and too fast are entirely dependent on you), then generally you can:

  • Look at composite projects
  • Investigate and fix warnings
  • Limit changes between publishes

Look at composite projects

Composite projects let you split a large database across several projects that reference each other as the same database, so each project you build and publish contains fewer objects. The SSDT team explain how to set them up here:

http://blogs.msdn.com/b/ssdt/archive/2012/06/26/composite-projects-and-s…

Investigate and fix warnings

Each warning takes time to find and produce; one warning won’t be noticed, but hundreds will. Fix them and be happy that you are a) speeding up the build time and b) cleaning up the warnings in your project - go on dude, you da man!
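
As an example of the kind of warning I mean (the object names are invented, and the exact warning code and severity depend on your project settings), a three-part name pointing at a database the project does not reference typically produces an unresolved reference warning on every single build; add a database reference, or remove the cross-database call, and that work disappears:

    -- Flagged on every build as an unresolved reference because nothing
    -- tells SSDT what [OtherDb] contains.
    CREATE PROCEDURE dbo.GetOrdersForCustomer
        @CustomerId INT
    AS
    BEGIN
        SELECT OrderId, OrderDate
        FROM OtherDb.dbo.Orders
        WHERE CustomerId = @CustomerId;
    END;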

Special note on performance for tools writers - if you are testing an app you are writing in Visual Studio that uses DacFx to parse a project or dacpac, be aware that the ScriptDom uses Antlr to parse the T-SQL and it throws a lot (and I mean a lot) of first-chance exceptions that are caught. Running the ScriptDom under a debugger causes a lot of back and forth between the debugger and the app, which makes it very slow - either detach the debugger or use a small dacpac, otherwise it is painful. Note: this paragraph is just for tools writers; a normal publish, although launched from Visual Studio, is not run as a debuggee, so you don’t get this issue.

Limit changes between publishes

If you have a thousand tables and procedures that have changed, it will take time; that should be fairly obvious. If you do have lots of changes then why did you leave it so long? Come on, release early, release often.

Bypass publish

What is this madness you speak of? Bypassing the publish process is a great way to get SSDT as fast as using SSMS. There are two general routes you can take; I do not recommend the first, but I do recommend the second. The first is to do your development in SSMS and, when you are finished, do a schema compare and write the changes back to your project. I don’t like it as I think it is unnecessary, and you won’t improve at SSDT if you don’t use it and learn how to use it. SSDT is a major change in the way SQL Server developers must approach their development, but it leads to greatness (ok, better software quality in the end, but greatness sounds better). The second is to use one of the Visual Studio add-ins that let you right-click a procedure in Solution Explorer and deploy it directly to your dev database without going through the publish. I have one (https://github.com/GoEddie/SSDTExtension) and Dave Ballantyne has one (http://dataidol.com/davebally/2014/03/01/ssdt-fast-deploy/) - I would use Dave’s as I have seen the code in mine, ha ha. If you go down this route, you should include a full publish and test run in your workflow before checking in.
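
To make it concrete, what those add-ins effectively do is run the script for the one object you changed straight against your dev database, something along these lines (a hand-written illustration reusing the example procedure from earlier, not what either add-in literally executes):

    -- Create a stub if the procedure does not exist yet, then push the
    -- current project definition straight to the dev database, skipping
    -- the full model comparison that a publish performs.
    IF OBJECT_ID(N'dbo.GetCustomerEmail', N'P') IS NULL
        EXEC (N'CREATE PROCEDURE dbo.GetCustomerEmail AS RETURN;');
    GO

    ALTER PROCEDURE dbo.GetCustomerEmail
        @CustomerId INT
    AS
    BEGIN
        SELECT Email FROM dbo.Customer WHERE CustomerId = @CustomerId;
    END;
    GO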

Final note

Think about why you are using SSDT and find ways to make it fast enough that you are happy with it. SSDT really is a great tool, and the more people who use it and benefit from it the better it will get!


Comments:

Anonymous

August 26, 2015 - 12:24

Going through all those things to see a 2-3 % increase in performance makes you reconsider if you want to use SSIS to begin with. SSDT 2013 is a disaster; it crashes for any project with over 50 packages upon clicking save.

Ed Elliott

August 26, 2015 - 12:48

Hi,

SSDT and SSDT-BI are separate products; this was just about SSDT.

Personally, if I had a project with over 50 SSIS packages I would want to be using MIST or at the very least BIML and never actually open the packages!

Anonymous

January 21, 2016 - 21:01

nice idea, poor implementation, microsoft as usual