Framework for Monitoring Metadata Changes

It is important for a handshake model to exist between upstream and downstream systems. Normally the upstream and downstream Dev teams rely on ad hoc communication to track these changes. I have created a simple framework for this: rather than a pull system, where the downstream pulls the metadata changes from the source, it is a push system in which the upstream sends the changes across to the destination systems. Let us see how the framework works:
Step 1: Create a table that will store metadata changes, such as the addition or modification of tables and views, as follows:
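A minimal sketch of such a change-log table, assuming the name DB_Object_Changes and this column layout (the original definition is not shown):

CREATE TABLE dbo.DB_Object_Changes
(
    ChangeID     INT IDENTITY(1,1) PRIMARY KEY,
    DatabaseName SYSNAME      NOT NULL,
    SchemaName   SYSNAME      NOT NULL,
    ObjectName   SYSNAME      NOT NULL,
    ObjectType   VARCHAR(10)  NOT NULL,  -- 'TABLE' or 'VIEW'
    ChangeType   VARCHAR(10)  NOT NULL,  -- 'ADDED' or 'ALTERED'
    ChangeDate   DATETIME     NOT NULL DEFAULT (GETDATE()),
    IsProcessed  BIT          NOT NULL DEFAULT (0)  -- flipped once pushed downstream
);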

Step 2: The next step is to create the table that will track the column changes made to tables or views:
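A similar sketch for the column-level log, again with assumed names:

CREATE TABLE dbo.DB_Column_Changes
(
    ChangeID     INT IDENTITY(1,1) PRIMARY KEY,
    DatabaseName SYSNAME       NOT NULL,
    SchemaName   SYSNAME       NOT NULL,
    ObjectName   SYSNAME       NOT NULL,
    ColumnName   SYSNAME       NOT NULL,
    DataType     VARCHAR(128)  NOT NULL,  -- e.g. 'VARCHAR(50)'
    ChangeType   VARCHAR(10)   NOT NULL,  -- 'ADDED', 'ALTERED' or 'DROPPED'
    ChangeDate   DATETIME      NOT NULL DEFAULT (GETDATE()),
    IsProcessed  BIT           NOT NULL DEFAULT (0)
);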

Step 3: Create a table to store the list of downstream metadata tables (one for the table/view changes and one for the column changes) that these changes have to be pushed to from the upstream system:
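A routing table along these lines would support the later steps; the name DB_Downstream_Tables comes from the post, but the columns are assumptions:

CREATE TABLE dbo.DB_Downstream_Tables
(
    DownstreamID      INT IDENTITY(1,1) PRIMARY KEY,
    LinkedServerName  SYSNAME       NOT NULL,  -- linked server pointing at the downstream instance
    DestinationDB     SYSNAME       NOT NULL,  -- downstream metadata repository database
    ObjectChangeTable SYSNAME       NOT NULL,  -- destination table for table/view changes
    ColumnChangeTable SYSNAME       NOT NULL,  -- destination table for column changes
    OwnerEmail        VARCHAR(256)  NULL,      -- notified via sp_send_dbmail in Step 6
    IsActive          BIT           NOT NULL DEFAULT (1)
);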

Step 4: The next step is to write a stored procedure (SP) that creates the linked servers, provided the upstream Dev team has access to a metadata repository on the downstream systems. (The reason for creating a separate SP for this is that the "master.dbo.sp_addlinkedserver" system proc cannot be wrapped in a transaction.)
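A minimal sketch of DB_Add_Linked_Server (the name used in the agent job below), assuming the routing table from Step 3; security mappings and provider options are environment-specific and omitted:

CREATE PROCEDURE dbo.DB_Add_Linked_Server
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @Server SYSNAME;

    DECLARE server_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT DISTINCT LinkedServerName
        FROM dbo.DB_Downstream_Tables
        WHERE IsActive = 1;

    OPEN server_cur;
    FETCH NEXT FROM server_cur INTO @Server;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- add the linked server only if it does not already exist
        IF NOT EXISTS (SELECT 1 FROM sys.servers WHERE name = @Server)
            EXEC master.dbo.sp_addlinkedserver
                 @server     = @Server,
                 @srvproduct = N'SQL Server';

        FETCH NEXT FROM server_cur INTO @Server;
    END

    CLOSE server_cur;
    DEALLOCATE server_cur;
END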
Step 5: The next step is to create the downstream tables hosted in the destination metadata DB layer (the prior tables were all at the source level):
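These would be created on each downstream metadata repository; the sketch simply mirrors the assumed source tables, with a ReceivedDate added on the destination side:

CREATE TABLE dbo.DB_Object_Changes_Downstream
(
    ChangeID     INT          NOT NULL,
    DatabaseName SYSNAME      NOT NULL,
    SchemaName   SYSNAME      NOT NULL,
    ObjectName   SYSNAME      NOT NULL,
    ObjectType   VARCHAR(10)  NOT NULL,
    ChangeType   VARCHAR(10)  NOT NULL,
    ChangeDate   DATETIME     NOT NULL,
    ReceivedDate DATETIME     NOT NULL DEFAULT (GETDATE())
);

CREATE TABLE dbo.DB_Column_Changes_Downstream
(
    ChangeID     INT           NOT NULL,
    DatabaseName SYSNAME       NOT NULL,
    SchemaName   SYSNAME       NOT NULL,
    ObjectName   SYSNAME       NOT NULL,
    ColumnName   SYSNAME       NOT NULL,
    DataType     VARCHAR(128)  NOT NULL,
    ChangeType   VARCHAR(10)   NOT NULL,
    ChangeDate   DATETIME      NOT NULL,
    ReceivedDate DATETIME      NOT NULL DEFAULT (GETDATE())
);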
Step 6: Create the stored proc that populates the metadata changes into the destination tables from the upstream system. Note: the procedure can be modified to be set based rather than cursor based. Database Mail must be configured at the source level for the "sp_send_dbmail" call, and you can also pass a query parameter to "sp_send_dbmail" if required. All I am trying to create is some sort of notification for the downstream owners.
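A cursor-based sketch of DB_SP_Verify_Metadata_Changes (the name used in the agent job below), assuming the tables defined above; only the table/view changes are shown, and the column changes would follow the same pattern:

CREATE PROCEDURE dbo.DB_SP_Verify_Metadata_Changes
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @LinkedServer SYSNAME,
            @DestDB       SYSNAME,
            @DestTable    SYSNAME,
            @OwnerEmail   VARCHAR(256),
            @Sql          NVARCHAR(MAX);

    DECLARE dest_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT LinkedServerName, DestinationDB, ObjectChangeTable, OwnerEmail
        FROM dbo.DB_Downstream_Tables
        WHERE IsActive = 1;

    OPEN dest_cur;
    FETCH NEXT FROM dest_cur INTO @LinkedServer, @DestDB, @DestTable, @OwnerEmail;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- push the unprocessed table/view changes across the linked server
        SET @Sql = N'INSERT INTO ' + QUOTENAME(@LinkedServer) + N'.'
                 + QUOTENAME(@DestDB) + N'.dbo.' + QUOTENAME(@DestTable) + N'
                   (ChangeID, DatabaseName, SchemaName, ObjectName, ObjectType, ChangeType, ChangeDate)
                   SELECT ChangeID, DatabaseName, SchemaName, ObjectName, ObjectType, ChangeType, ChangeDate
                   FROM dbo.DB_Object_Changes
                   WHERE IsProcessed = 0;';
        EXEC sys.sp_executesql @Sql;

        -- notify the downstream owner (assumes a default Database Mail profile)
        IF @OwnerEmail IS NOT NULL
            EXEC msdb.dbo.sp_send_dbmail
                 @recipients = @OwnerEmail,
                 @subject    = 'Upstream metadata changes pushed',
                 @body       = 'New table/view metadata changes have been pushed to your metadata repository.';

        FETCH NEXT FROM dest_cur INTO @LinkedServer, @DestDB, @DestTable, @OwnerEmail;
    END

    CLOSE dest_cur;
    DEALLOCATE dest_cur;

    -- mark the source rows as processed once every destination has been updated
    UPDATE dbo.DB_Object_Changes
    SET IsProcessed = 1
    WHERE IsProcessed = 0;
END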
Populate DB_Downstream_Tables with the required information, as shown in the figure below.
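As an illustration only (all values hypothetical), a row in DB_Downstream_Tables could look like this:

INSERT INTO dbo.DB_Downstream_Tables
    (LinkedServerName, DestinationDB, ObjectChangeTable, ColumnChangeTable, OwnerEmail)
VALUES
    (N'DOWNSTREAM01', N'MetadataRepo', N'DB_Object_Changes_Downstream',
     N'DB_Column_Changes_Downstream', 'downstream.owner@example.com');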

The last step is to create a SQL Agent job at the source level to run the stored procs in the following order (a sketch of the job follows the list):
1. EXEC DB_Add_Linked_Server
2. EXEC DB_SP_Verify_Metadata_Changes
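A sketch of such a job using the msdb job procedures; the job name, the source database name MetadataSource, and the schedule (omitted here) are all assumptions:

USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'Push Metadata Changes';

-- Step 1: create the linked servers, then continue to the next step
EXEC dbo.sp_add_jobstep
     @job_name          = N'Push Metadata Changes',
     @step_name         = N'Create linked servers',
     @subsystem         = N'TSQL',
     @database_name     = N'MetadataSource',
     @command           = N'EXEC dbo.DB_Add_Linked_Server;',
     @on_success_action = 3;  -- go to the next step

-- Step 2: push the metadata changes and send the notifications
EXEC dbo.sp_add_jobstep
     @job_name      = N'Push Metadata Changes',
     @step_name     = N'Push metadata changes',
     @subsystem     = N'TSQL',
     @database_name = N'MetadataSource',
     @command       = N'EXEC dbo.DB_SP_Verify_Metadata_Changes;';

-- register the job on the local server
EXEC dbo.sp_add_jobserver @job_name = N'Push Metadata Changes';
GO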

Comments

Unknown said…
Looks like a good framework for testing through linked server objects.
Ishwar said…
A few quick things to note while using this framework:
1. Add indexes to the source tables.
2. Add NOLOCK hints to the source table queries in the SPs.
3. The SP needs to be modified for performance by changing the cursor to a while loop.
4. Instead of the agent job and the SP transferring the info to the destination tables, an SSIS package can be built to do the same.
