If you write code for SQL Server then you might be interested in this: I have written a tSQLt TDD training course which has helped over 500 people learn both tSQLt and how to apply TDD practices to their SQL Server T-SQL development. You can join the course at https://courses.agilesql.club.

How can we merge multiple dacpacs into one

Reading Time: 4 minutes (not a minute longer)

TLDR;

You probably don’t actually want to merge dacpacs; you probably want to deploy each one separately and use /p:IncludeCompositeObjects=true.
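As a sketch of that approach (the server, database and file names here are placeholders), deploying two dacpacs that reference each other with sqlpackage might look like:

```
# Deploy each dacpac in turn to the same database; IncludeCompositeObjects
# tells sqlpackage to treat objects from "same database" references as part
# of the deployment rather than dropping or ignoring them.
sqlpackage /Action:Publish /SourceFile:First.dacpac \
    /TargetServerName:MyServer /TargetDatabaseName:MyDb \
    /p:IncludeCompositeObjects=true

sqlpackage /Action:Publish /SourceFile:Second.dacpac \
    /TargetServerName:MyServer /TargetDatabaseName:MyDb \
    /p:IncludeCompositeObjects=true
```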

If you really do want to merge dacpacs, you can start with this sample: https://github.com/GoEddie/DacpacMerge

Why?

I was reading the DacFx issues on the DACExtensions sample’s GitHub site: https://github.com/Microsoft/DACExtensions/issues. One of them, https://github.com/Microsoft/DACExtensions/issues/23, asked how to merge dacpacs. I have been asked this a few times myself, and I thought there was a sample, but I couldn’t find one, so I knocked a quick one up.

What?

If you have more than one dacpac and you want to merge them into one, the process is basically:

  • Create a new empty TSqlModel
  • Iterate all the dacpacs and pull out the AST of each object
  • Write each object into the new TSqlModel
  • Write the new massive model to a dacpac
  • Read in any supporting items like pre/post deploy scripts, the refactorlog, and references (that weren’t same-database references to one of the merged dacpacs)
  • Create new versions of each of these docs and write to the new merged dacpac

How?

Everything can be found here:

https://github.com/GoEddie/DacpacMerge/blob/master/src/MergeEm/Program.cs

Create a new empty TSqlModel

    _first = new TSqlModel(sources.First<string>());
    var options = _first.CopyModelOptions();

    _target = new TSqlModel(_first.Version, options);

We have a list of dacpacs to merge, and we take the model options from the first one - you might well want to merge the options from each dacpac as well. Once you have the options, create a new TSqlModel using the version pulled from the first dacpac.

Iterate all the dacpacs and pull out the AST of each object

    foreach(var obj in model.GetObjects(DacQueryScopes.UserDefined))
    {
        TSqlScript ast;
        if(obj.TryGetAst(out ast))
        {
            var name = obj.Name.ToString();
            var info = obj.GetSourceInformation();
            if(info != null)
            {
                name = info.SourceName;
            }

            _target.AddOrUpdateObjects(ast, name, new TSqlObjectOptions());    //WARNING throwing away ansi nulls and quoted identifiers!
        }
    }

We grab the AST if we can. We use TryGetAst because lots of things in a dacpac model can’t be scripted - for example, a primary key that is defined within a table rather than as a separate constraint. Rather than specify every object type we do want (procs, tables, TVFs etc.), I just say give me everything and see what we can get. This is probably something that might need to change if anyone struggles with missing objects.

Write each object into the new TSqlModel

Once we have the AST, we add it to the target model with the name of the script that was used to populate the original model, if we have it.

Write the new massive model to a dacpac

It might be massive, it might not be - your call busta.

    var metadata = new PackageMetadata();
    
    metadata.Name = "dacpac";

    DacPackageExtensions.BuildPackage(_targetPath, _target, metadata);

We set some basic metadata - this would be another good thing to pass in as a parameter. Then we use DacPackageExtensions.BuildPackage, which takes our model and writes it into a dacpac. At this point we should have a deployable dacpac, albeit without the juicy extras like pre/post deploy scripts.
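One way to sanity-check the merged dacpac at this point (the file, server and database names here are placeholders) is to ask sqlpackage to generate a deployment script from it rather than deploying straight away:

```
# Script action writes the T-SQL that would be run, without touching the target
sqlpackage /Action:Script /SourceFile:merged.dacpac \
    /TargetServerName:MyServer /TargetDatabaseName:MyDb \
    /OutputPath:merged_deploy.sql
```

If the merge dropped or mangled anything, it tends to show up here before it shows up in production.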

Read in any supporting items like pre/post deploy scripts, the refactorlog, and references (that weren’t same-database references to one of the merged dacpacs)

    using (var package = DacPackage.Load(source))
    {
        // not every dacpac has these scripts, so guard against null streams
        if (package.PreDeploymentScript != null)
        {
            pre += new StreamReader(package.PreDeploymentScript).ReadToEnd();
        }

        if (package.PostDeploymentScript != null)
        {
            post += new StreamReader(package.PostDeploymentScript).ReadToEnd();
        }
    }

Even though we have read the model from all the dacpacs, we still need to go back to each dacpac to get the pre/post deploy scripts. I haven’t added in the refactorlog.xml, but that might be needed as well.

Save the pre/post deploy scripts in the order that they are read.

Create new versions of each of these docs and write to the new merged dacpac


    private void AddScripts(string pre, string post, string dacpacPath)
    {
        using (var package = Package.Open(_targetPath, FileMode.Open, FileAccess.ReadWrite))
        {
            if (!string.IsNullOrEmpty(pre))
            {
                var part = package.CreatePart(new Uri("/predeploy.sql", UriKind.Relative), "text/plain");

                using (var stream = part.GetStream())
                {
                    // use the encoded byte count, not the string length -
                    // they differ for non-ASCII characters
                    var bytes = Encoding.UTF8.GetBytes(pre);
                    stream.Write(bytes, 0, bytes.Length);
                }
            }

            if (!string.IsNullOrEmpty(post))
            {
                var part = package.CreatePart(new Uri("/postdeploy.sql", UriKind.Relative), "text/plain");

                using (var stream = part.GetStream())
                {
                    var bytes = Encoding.UTF8.GetBytes(post);
                    stream.Write(bytes, 0, bytes.Length);
                }
            }
        }
    }

Writing supporting scripts into a dacpac doesn’t need the DacFx API; instead you use the System.IO.Packaging API - give it the details of the part you want (pre/post deploy scripts, refactorlog etc.) and all should be cushty.
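That works because a dacpac is just an OPC package, i.e. a zip file. If you want to check the parts made it in (the file name here is a placeholder), you can list them with any zip tool:

```
unzip -l merged.dacpac
# you should see model.xml, DacMetadata.xml, Origin.xml and [Content_Types].xml,
# plus predeploy.sql / postdeploy.sql if they were added
```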

Enjoy - test well and prosper :)