SQL Build Manager is an all-in-one database management tool to easily update your fleet SQL Server databases - from one to tens of thousands.

License: MIT License


SQL Build Manager

SQL Build Manager is a multi-faceted tool for managing the life-cycle of your databases. It provides a comprehensive set of command line options for managing anywhere from one to many thousands of databases.


Version 15.0 includes breaking changes for Kubernetes and ACI; be sure to review the change log.

Be sure to review the change log for the latest updates, enhancements, and bug fixes.


Key feature enhancement with Version 14.4+: Expanded use of Azure User Assigned Managed Identity

This update significantly reduces the need to save and manage secrets and connection strings. For full details on leveraging Managed Identity to connect to other Azure resources such as SQL Database, Blob Storage, Service Bus, Event Hub, Key Vault, and Azure Container Registry, see the Managed Identity documentation here.


Contents


Important Concepts

Below are some high-level concepts used by SQL Build Manager. You will see these used throughout the documents and how-tos, so it is important to understand what they mean:

"Build"

The action of updating your database or fleet of databases with SQL Build Manager. Your build is wrapped in an all-or-nothing transaction, meaning that if a script fails, your database will be rolled back to the state it was in prior to your build. The app also maintains a build history by adding a logging table to each database.
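The all-or-nothing behavior can be sketched conceptually. This is not SQL Build Manager's implementation (which targets SQL Server); it is a minimal illustration of the transaction semantics using Python's built-in SQLite module:

```python
import sqlite3

def run_build(conn, scripts):
    """Apply every script in one transaction; commit only if all succeed."""
    conn.execute("BEGIN")
    try:
        for script in scripts:
            conn.execute(script)
        conn.execute("COMMIT")
        return True
    except sqlite3.Error:
        conn.execute("ROLLBACK")  # the database returns to its pre-build state
        return False

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # take manual control of transactions
ok = run_build(conn, [
    "CREATE TABLE t (id INTEGER)",
    "INSERT INTO t VALUES (1)",
    "INSERT INTO no_such_table VALUES (1)",  # fails, so the whole build rolls back
])
# ok -> False, and table t does not exist afterwards
```

Because the failing third script triggers a rollback, even the CREATE TABLE from the first script is undone, leaving the database exactly as it was before the build.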

"Package"

The bundling of your scripts for updating your databases. In addition to managing the order of script execution, a package also manages metadata about the scripts, such as a description, the author, and a unique ID. The package also maintains hash values for each script and for the package as a whole to ensure integrity and allow tracking. Details on package contents and creation can be found here.

"Override" file

The list of servers/databases you want to update with a build. When a package is created, the scripts are assigned a default database named "client". The override file is a list of the SQL Server name and database targets that will replace "client" at build time. This file is in the format {sql server name}:client,{target database}, with one target per line. It is either used directly or used as a source to create Azure Service Bus messages that are leveraged in a build. See here for additional information.
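For example, an override file targeting three databases across two servers (the server and database names below are placeholders, not values from the documentation) might look like:

```
MyServer01:client,CustomerDb001
MyServer01:client,CustomerDb002
MyServer02.database.windows.net:client,CustomerDb003
```

At build time, each line causes the scripts' default "client" database to be replaced with the listed target on the listed server.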

Remote Build Execution

The ability to distribute the load across multiple compute nodes. There are four options with SQL Build Manager: Azure Batch, Azure Kubernetes Service, Azure Container Apps, and Azure Container Instance. Each of these requires some specific configuration, but they follow similar process steps. The concept of parallel remote builds is outlined here, with specifics available for each option.

"Settings" file

A configuration file that can be saved and re-used across multiple builds. It stores configuration for your remote environment, identities, container names, connection strings, etc. Any sensitive information is encrypted with AES-256 encryption using the value you provide with --settingsfilekey. Sensitive information can instead be stored in Azure Key Vault with the --keyvault parameter if desired.

"jobname"

The name of a build. This is used as the name or name prefix for all of the Azure services used in a remote build, including the blob storage container, running Docker containers, service bus topic, etc.


Key Features

  • Packaging of all of your update scripts and runtime metadata into a single .sbm (zip) file, or leverage data-tier application (DACPAC) deployment across your entire database fleet.
  • Massively parallel execution across thousands of databases, utilizing local threading or Azure Batch, Kubernetes, Container Apps, or Container Instance remote execution
  • Single transaction handling. If any one script fails, the entire package is rolled back, leaving the database unchanged.
  • Handle multiple database updates in one package - seamlessly update all your databases with local threading or massively parallel remote processing.
  • Full script run management. Control the script order, target database, and transaction handling
  • Trial mode - runs scripts against the database as a test, then rolls back to leave it in a pristine state.
  • Automatic logging and version tracking of scripts on a per-server/per-database level
  • Full SHA-1 hashing of individual scripts and complete .sbm package files to ensure the integrity of the scripts
  • Execution of a build package (see below) is recorded in the database for full tracking of update history, script validation, and potential rebuilding of packages
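The per-script and whole-package SHA-1 hashing described above can be illustrated with a short sketch. This is a conceptual illustration only, not the tool's actual implementation; the script names and contents are hypothetical:

```python
import hashlib

def sha1_hex(data: bytes) -> str:
    """Return the SHA-1 digest of a byte string as a 40-character hex string."""
    return hashlib.sha1(data).hexdigest()

# Hypothetical package contents
scripts = {
    "001_create_tables.sql": b"CREATE TABLE Demo (Id INT);",
    "002_seed_data.sql": b"INSERT INTO Demo VALUES (1);",
}

# Hash each script individually, so any edit to a script is detectable...
script_hashes = {name: sha1_hex(body) for name, body in scripts.items()}

# ...then fingerprint the package as a whole by hashing the per-script
# hashes in a deterministic (sorted) order.
package_hash = sha1_hex("".join(script_hashes[n] for n in sorted(script_hashes)).encode())
```

Comparing stored hashes against recomputed ones lets the tool detect whether any script, or the package as a whole, has been altered since packaging.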

Running Builds (command line)

There are six ways to run your database update builds, each with its own target use case:

Local

Leveraging the sbm build command, this runs the build on the current local machine. If you are targeting more than one database, execution is serial, updating one database at a time, and any transaction rollback will apply to all databases in the build.

Threaded

Using the sbm threaded run command allows for updating multiple databases in parallel, still executed from the local machine. Any transaction rollbacks occur per database, meaning that if 5 of 6 databases succeed, the build will be committed on the 5 that succeeded and rolled back only on the 6th.
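The per-database transaction semantics can be sketched as follows. This is a simplified simulation (Python threads over SQLite files), not the tool's implementation; the script lists are hypothetical:

```python
import os
import sqlite3
import tempfile
from concurrent.futures import ThreadPoolExecutor

# Hypothetical script lists; the second set fails on its final INSERT.
GOOD = ["CREATE TABLE t (id INTEGER)", "INSERT INTO t VALUES (1)"]
BAD = ["CREATE TABLE t (id INTEGER)", "INSERT INTO missing VALUES (1)"]

def apply_scripts(job):
    """Each database gets its own transaction; a failure rolls back only that database."""
    db_path, scripts = job
    conn = sqlite3.connect(db_path)
    conn.isolation_level = None  # manage the transaction explicitly
    try:
        conn.execute("BEGIN")
        for s in scripts:
            conn.execute(s)
        conn.execute("COMMIT")
        return "committed"
    except sqlite3.Error:
        conn.execute("ROLLBACK")  # only this database is rolled back
        return "rolled back"
    finally:
        conn.close()

tmp = tempfile.mkdtemp()
jobs = [(os.path.join(tmp, "db1.sqlite"), GOOD),
        (os.path.join(tmp, "db2.sqlite"), BAD)]
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(apply_scripts, jobs))
# results -> ["committed", "rolled back"]
```

The failure in the second database has no effect on the first: each target commits or rolls back independently, which is the key difference from the single-transaction Local mode.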

Batch

Using the sbm batch run command leverages Azure Batch to permit massively parallel updates across thousands of databases. To leverage Azure Batch, you will first need to set up your Batch account; the instructions for this can be found here. An excellent tool for viewing and monitoring your Azure Batch accounts and jobs is Batch Explorer: https://azure.github.io/BatchExplorer/

Azure Container Apps

Using the sbm containerapp run commands leverages Azure Container Apps to permit massively parallel updates across thousands of databases. Learn how to use Container Apps here.

Kubernetes

Using the sbm k8s run commands leverages Kubernetes to permit massively parallel updates across thousands of databases. To leverage Kubernetes, you will first need to set up a Kubernetes cluster; the instructions for this can be found here.

Azure Container Instance (ACI)

Using the sbm aci commands leverages Azure Container Instance to permit massively parallel updates across thousands of databases. Learn how to use ACI here.


Querying across databases (command line)

In addition to using SQL Build Manager to perform database updates, you can also run SELECT queries across all of your databases to collect data. For both threaded and batch execution, a consolidated results file is saved to the location of your choice.
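The fan-out-and-consolidate pattern can be sketched conceptually. This is an illustration with in-memory SQLite databases and hypothetical database names, not the tool's implementation:

```python
import sqlite3

def query_all(db_conns, sql):
    """Run the same SELECT against every database and consolidate the rows,
    tagging each row with its source database."""
    consolidated = []
    for name, conn in db_conns.items():
        for row in conn.execute(sql):
            consolidated.append((name, *row))
    return consolidated

# Two illustrative in-memory databases with different row counts
dbs = {}
for name, count in [("CustomerDb001", 2), ("CustomerDb002", 3)]:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER)")
    conn.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(count)])
    dbs[name] = conn

rows = query_all(dbs, "SELECT COUNT(*) FROM orders")
# rows -> [("CustomerDb001", 2), ("CustomerDb002", 3)]
```

Tagging each row with its source database is what makes the consolidated results file useful: you can tell which fleet member produced which data.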

Threaded

Using the sbm threaded query command allows for querying multiple databases in parallel, still executed from the local machine.

Batch, Kubernetes and ACI

Using the sbm batch query, sbm k8s query, or sbm aci query commands leverages their respective compute to permit massively parallel queries across thousands of databases.

sqlbuildmanager's Issues

New tables with foreign key constraints not generating the CREATE TABLE statements

We're attempting to add 3 tables to our schema:

CREATE TABLE [dbo].[CustomLayout] (
    [CustomLayoutID] INT            IDENTITY (1, 1) NOT NULL,
    [Name]           NVARCHAR (100) CONSTRAINT [DF_CustomLayout_Name] DEFAULT ('') NOT NULL,
    [OwnerUserID]    INT            CONSTRAINT [DF_CustomLayout_UserID] DEFAULT ((0)) NOT NULL,
    [IsShared]       BIT            CONSTRAINT [DF_CustomLayout_IsShared] DEFAULT ((0)) NOT NULL,
    PRIMARY KEY CLUSTERED ([CustomLayoutID] ASC)
);

CREATE TABLE [dbo].[CustomLayoutColumn] (
    [CustomLayoutColumnID] INT IDENTITY (1, 1) NOT NULL,
    [CustomLayoutID]       INT NOT NULL,
    [EntityID]             INT CONSTRAINT [DF_CustomLayoutColumn_EntityID] DEFAULT ((0)) NOT NULL,
    [TypeID]               INT CONSTRAINT [DF_CustomLayoutColumn_TypeID] DEFAULT ((0)) NOT NULL,
    [OrdinalPosition]      INT CONSTRAINT [DF_CustomLayoutColumn_OrdinalPosition] DEFAULT ((0)) NOT NULL,
    PRIMARY KEY CLUSTERED ([CustomLayoutColumnID] ASC),
    CONSTRAINT [FK_CustomLayoutColumn_ToCustomLayout] FOREIGN KEY ([CustomLayoutID]) REFERENCES [dbo].[CustomLayout] ([CustomLayoutID])
);

CREATE TABLE [dbo].[CustomLayoutUser] (
    [UserID]         INT      NOT NULL,
    [CustomLayoutID] INT      NOT NULL,
    [IsDefault]      BIT      CONSTRAINT [DF_CustomLayoutUser_IsDefault] DEFAULT ((0)) NOT NULL,
    [LastUsed]       DATETIME CONSTRAINT [DF_CustomLayoutUser_LastUsed] DEFAULT (((1)/(1))/(1900)) NOT NULL,
    CONSTRAINT [PK_CustomLayoutUser] PRIMARY KEY CLUSTERED ([UserID] ASC, [CustomLayoutID] ASC),
    CONSTRAINT [FK_CustomLayoutUser_ToCustomLayout] FOREIGN KEY ([CustomLayoutID]) REFERENCES [dbo].[CustomLayout] ([CustomLayoutID]),
    CONSTRAINT [FK_CustomLayoutUser_ToUserInfo] FOREIGN KEY ([UserID]) REFERENCES [dbo].[UserInfo] ([UserID])
);

None of the target databases have any of these tables present.
However, when the diff script is generated (at least via sbm batch run), only a single script is produced, containing the following ALTER statements; none of the expected CREATE statements precede them.

ALTER TABLE [dbo].[CustomLayoutColumn] WITH CHECK CHECK CONSTRAINT [FK_CustomLayoutColumn_ToCustomLayout];

ALTER TABLE [dbo].[CustomLayoutUser] WITH CHECK CHECK CONSTRAINT [FK_CustomLayoutUser_ToUserInfo];

ALTER TABLE [dbo].[CustomLayoutUser] WITH CHECK CHECK CONSTRAINT [FK_CustomLayoutUser_ToCustomLayout];

This yields the following error, resulting in failed batch jobs.

Cannot find the object "dbo.CustomLayoutColumn" because it does not exist or you do not have permissions.

These are the first foreign keys added to our schema, so we're assuming it's a problem with them.

Running the full dacpacs generated from our SQL project however does create the tables and constraints as expected, so we don't believe there is a syntax issue.

Batch commands appear to not honor settingsfile-defined AuthenticationType

We're testing a managed identity implementation, and have verified that ManagedIdentity is set as the AuthenticationType in our settingsfile, along with the expected values in the IdentityArgs section.

When the nodes attempt to run the batch command though, we see this in the Errors.log:

The --username and --password arguments are required when authentication type is set to Password or AzurePassword.;Returning error code: 1

Upon reviewing the command line on the batch job task, no --authtype arg is present, so it appears to presume SQL authentication, and we've purposefully put neither of those values in our key vault.

We worked around this in our test branch by shoehorning the following into BatchManager.CompileCommandLines():

if (cmdLine.AuthenticationArgs.AuthenticationType == AuthenticationType.ManagedIdentity)
{
    sb.Append(" --authtype ManagedIdentity ");
}

Batch runs called with --targetdacpac failing to find file on batch nodes

Experimenting with the --targetdacpac batch run command-line switch, I'm able to confirm that an .sbm file is getting properly generated on our client and pushed to the batch nodes (alongside the platinum dacpac file and .cfg targets file).

However, the batch job exits early with exit code -728 - A Target Dacpac file was specified but could not be located at 'C:\scrap2\npr\MyTargetDacPac.dacpac'

The "C:\scrap2..." path was the value supplied to the --targetdacpac argument, so I wouldn't expect that exact path to be mirrored on the batch node. However, I'm wondering if this dacpac file is even needed on the batch nodes at all, since the .sbm file was already created from it?

I hacked in this line to BatchManager.CompileCommandLines(), which otherwise allowed it to process successfully (against the same database that the targetdacpac was extracted from, at least):

threadCmdLine.DacPacArgs.TargetDacpac = null;

A logging note: though the batch job/task failed, the -728 error only appeared in the console log and stdout.txt files; the Errors.log and stderr.txt were empty.
