I've read through the article and it sounds like DACPACs would work well after you've generated the model inside SQL Server. I can see a benefit to them when you need to alter an existing object, like changing the length or data type of a column. These aren't handled well (at all?) by the AM tool.
However, I can see potential headaches if you don't create the DACPAC properly, and then you'd have to maintain the code when the DACPAC spec changes (probably not that often, I know).
Hopefully someone else with more experience can post?
Have you tried creating a VS SQL Server Database Project and importing the AM-generated SQL code? You should be able to import the objects, deploy the project, make some changes in the AM, generate the code, and re-import.
Just to clarify, data type changes are not handled by the tool. The new data type will be in the generated code, but the table will be left untouched since it already exists.
Let me explain the reasoning behind this behavior. Early on we decided that any "destructive" changes to the schema must require manual (human) intervention. An idealist approach would be to create a new attribute with the new data type and leave the old one in place. A pragmatist approach would be to rename the attribute that should get a new data type, run the script that will now create the attribute, then migrate the data from the old attribute. In a production system, data type changes should be few, so the extra manual labor should not be a big issue.
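The pragmatist approach above can be sketched in T-SQL. This is a hedged, hand-written outline, not output of the tool; the table and column names (dbo.Customer, Name, Name_old) and the target type are hypothetical:

```sql
-- 1. Rename the existing column so the generated script treats it as missing.
EXEC sp_rename 'dbo.Customer.Name', 'Name_old', 'COLUMN';

-- 2. Re-run the generated schema script, which now creates dbo.Customer.Name
--    with the new data type.

-- 3. Migrate the data from the old column, converting as needed.
UPDATE dbo.Customer
SET Name = CAST(Name_old AS nvarchar(100));

-- 4. Once the migration is verified, drop the old column manually.
ALTER TABLE dbo.Customer DROP COLUMN Name_old;
```

Step 4 is exactly the kind of "destructive" change that, by design, stays in human hands.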
However, in the development phase you may need to change data types frequently. This is why we have included a stored procedure named GenerateDropScript, which drops all tables, views, functions, etc. in a database so that everything can be recreated from scratch. Of course, you could also just recreate the database, but not everyone has those privileges.
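A development-phase reset would then look something like this. This is a sketch; I'm assuming GenerateDropScript lives in the dbo schema and takes no required parameters, so check the actual signature in your installation:

```sql
-- Drop all generated tables, views, functions, etc. in the database
-- (assumption: procedure name/schema and parameterless call).
EXEC dbo.GenerateDropScript;

-- Then re-run the full AM-generated schema script to recreate
-- everything from scratch with the new data types.
```

Any data you want to keep across the reset has to be exported and reloaded yourself.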
If you believe that handling data type changes is a big issue, let us know :)