DACPAC generation

4 messages
DACPAC generation

enders
Hello,

Microsoft SQL Server has an option to deploy databases using a DACPAC.
A DACPAC is a zip file which contains some SQL files. It is a declarative approach to deploying a database: "make it look like this, and I don't care how you do it."
Anyway, it is pretty easy. More info can be found here: http://sqlblog.com/blogs/jamie_thomson/archive/2014/01/18/dacpac-braindump.aspx

The SQL files don't need any "if exists ... then drop" boilerplate. That is handled automatically.

Since it is a zip file you can include your own files like the model file.

And if you are generating, you can create a DacMetadata.xml file and add some nice stuff in it like a build number or so.
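To illustrate the two points above: a .dacpac is an ordinary zip archive, so extra files such as a DacMetadata.xml with a build number can be appended with standard tooling. Here is a minimal sketch using Python's zipfile module; the file name DacMetadata.xml and its XML layout are just examples, not a fixed spec, and official tooling may validate package contents, so test this against your own deployment pipeline.

```python
import zipfile

def add_metadata(dacpac_path: str, build_number: str) -> None:
    """Append a DacMetadata.xml with a build number to an existing package.

    A .dacpac is a plain zip archive, so zipfile in append mode works.
    The element names used here are illustrative, not an official schema.
    """
    xml = (
        '<?xml version="1.0" encoding="utf-8"?>\n'
        f"<DacMetadata><BuildNumber>{build_number}</BuildNumber></DacMetadata>\n"
    )
    with zipfile.ZipFile(dacpac_path, mode="a") as zf:
        zf.writestr("DacMetadata.xml", xml)

# Demo: build a stand-in package, stamp it, then read the metadata back.
with zipfile.ZipFile("demo.dacpac", mode="w") as zf:
    zf.writestr("model.sql", "CREATE TABLE t (id int);")

add_metadata("demo.dacpac", "1.0.42")

with zipfile.ZipFile("demo.dacpac") as zf:
    print(zf.read("DacMetadata.xml").decode())
```

The same append-mode trick works for bundling the model file alongside the SQL, as suggested above.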

What do you think ?

Constantijn

Re: DACPAC generation

Chris
Hi Constantijn,

I've read through the article, and it sounds like DACPACs would work well after you've generated the model inside SQL Server. I can see a benefit to them when you need to alter an existing object, like changing the length or data type of a column. These aren't handled well (at all?) in the AM tool.

However, I can see potential headaches if you don't create the DACPAC properly, and then having to maintain the code when the DACPAC spec changes (probably not that often, I know).

Hopefully someone else with more experience can post?

Cheers!
Chris

Re: DACPAC generation

rogge
In reply to this post by enders
Have you tried creating a VS SQL Server Database Project and importing the AM-generated SQL code? You should be able to import the objects, deploy the project, make some changes in the AM tool, generate the code, and re-import.

Re: DACPAC generation

roenbaeck
Administrator
In reply to this post by Chris
Just to clarify, data type changes are not handled by the tool. The new data type will be in the generated code, but the table will be left untouched since it already exists.

Let me explain the reasoning behind this behavior. Early on we decided that any "destructive" changes to the schema must require manual (human) intervention. An idealist approach would be to create a new attribute with the new data type and leave the old one in place. A pragmatist approach would be to rename the attribute that should get a new data type, run the script that will now create the attribute, then migrate the data from the old attribute. In a production system, data type changes should be few, so the extra manual labor should not be a big issue.
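The pragmatist approach above can be sketched as follows. This uses SQLite via Python purely so the example is self-contained and runnable; in SQL Server you would rename with sp_rename and let the generated AM script recreate the attribute. The table and column names here are made up for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A hypothetical attribute table whose value column should change
# from TEXT to INTEGER.
cur.execute("CREATE TABLE ST_NAM_Stage_Name (id INTEGER PRIMARY KEY, value TEXT)")
cur.execute("INSERT INTO ST_NAM_Stage_Name (id, value) VALUES (1, '42'), (2, '7')")

# Step 1: rename the old attribute out of the way
#         (in SQL Server: EXEC sp_rename ...).
cur.execute("ALTER TABLE ST_NAM_Stage_Name RENAME TO ST_NAM_Stage_Name_old")

# Step 2: run the generated script, which now recreates the attribute
#         with the new data type since it no longer exists.
cur.execute("CREATE TABLE ST_NAM_Stage_Name (id INTEGER PRIMARY KEY, value INTEGER)")

# Step 3: migrate the data from the old attribute, casting as needed,
#         then drop the renamed table.
cur.execute(
    "INSERT INTO ST_NAM_Stage_Name (id, value) "
    "SELECT id, CAST(value AS INTEGER) FROM ST_NAM_Stage_Name_old"
)
cur.execute("DROP TABLE ST_NAM_Stage_Name_old")

print(cur.execute("SELECT id, value FROM ST_NAM_Stage_Name ORDER BY id").fetchall())
# → [(1, 42), (2, 7)]
```

The key point is that the "destructive" step (dropping the old attribute) only happens after a human has verified the migration, which matches the manual-intervention policy described above.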

However, in the development phase you may need to change data types frequently. This is why we have included a stored procedure named GenerateDropScript, which drops all tables, views, functions, etc., in a database so that everything can be recreated from scratch. Of course, you could also just recreate the database, but not everyone has those privileges.

If you believe that handling data type changes is a big issue, let us know :)