Comprehensive Workflow Integration Template
This template demonstrates advanced MFSQL Connector integration patterns for custom workflow procedures. It serves as a complete reference implementation, covering all major integration scenarios in a single, well-documented procedure.
Use Cases:
- Context menu actions triggered by workflow state changes
- Complex object processing with dependent relationships
- Batch operations with performance optimization
- Integration with external systems via email notifications
- Comprehensive error handling and audit logging
Key Features:
- Parameter validation with flexible defaults
- Temporary table processing for complex operations
- Valuelist item management and synchronization
- Object relationship handling
- Automated email notifications for error conditions
- Complete audit trail and debugging support
Business Scenarios Covered:
- Specific object actions (Context Menu Action Types 3 & 5)
- Bulk processing operations (Context Menu Action Types 1 & 4)
- Data validation and correction workflows
- Integration with third-party systems
- Automated reporting and notifications
Note
This template includes all common MFSQL integration patterns. For simpler implementations, remove unused sections. For production use, customize the business logic section (2) to match your specific requirements.
/*
=====================================================================================
Comprehensive Workflow Integration Template
=====================================================================================
Purpose: Advanced template demonstrating all major MFSQL Connector integration patterns
TEMPLATE DOCUMENTATION:
This procedure serves as a comprehensive template for creating custom context menu actions
in the MFSQL Connector framework. It demonstrates best practices for:
- Parameter handling with defaults and validation
- Structured logging and error handling
- M-Files integration patterns
- Security considerations for dynamic SQL
- Performance optimization techniques
- Complex business logic implementation
Parameters:
@ID INT - Context menu ID (Required for context menu integration)
Links to MFContextMenu table to track process status
@OutPut VARCHAR(1000) - OUTPUT message for user display (Optional)
Used to return feedback to the M-Files client
@ProcessBatch_ID INT - Batch logging identifier (Required by the default validation; remove that check to allow auto-generation)
Links all related log entries for this execution
@ObjectID INT - M-Files object ID (Required for object-specific actions)
The specific object this action is performed on
@ObjectType INT - M-Files object type (Optional - passed from context menu)
May be used for additional validation or logic
@ObjectVer INT - M-Files object version (Optional - passed from context menu)
Can be used for version-specific operations
@ClassID INT - M-Files class ID (Required for object-specific actions)
Determines which class table to operate on
Returns: 1 on success, -1 on error
Example Usage:
-- Single object workflow action
-- (@BatchID must reference an existing process batch, or remove the
-- @ProcessBatch_ID validation in section 1 to allow auto-generation)
DECLARE @Output VARCHAR(1000), @BatchID INT;
EXEC custom.ComprehensiveWorkflowTemplate
@ID = 1, @OutPut = @Output OUTPUT, @ProcessBatch_ID = @BatchID,
@ObjectID = 12345, @ObjectType = 0, @ObjectVer = 1, @ClassID = 78;
-- Bulk processing action
-- (remove the @ObjectID and @ClassID validations when running in bulk mode)
EXEC custom.ComprehensiveWorkflowTemplate
@ID = 2, @OutPut = @Output OUTPUT, @ProcessBatch_ID = @BatchID;
CUSTOMIZATION GUIDE:
1. Update procedure name and schema throughout
2. Modify parameter list for your specific requirements
3. Update parameter validation rules based on your use case
4. Replace business logic section (2) with your specific operations
5. Adjust logging messages to be meaningful for your process
6. Configure email notifications for your environment
7. Add any additional error handling for your specific operations
8. Test thoroughly with various scenarios and edge cases
=====================================================================================
*/
PRINT SPACE(5) + QUOTENAME(@@SERVERNAME) + '.' + QUOTENAME(DB_NAME()) + '.custom.ComprehensiveWorkflowTemplate';
GO
SET NOCOUNT ON;
-- ============================================================================
-- PROCEDURE EXISTENCE CHECK AND CREATION PATTERN
-- ============================================================================
-- This pattern allows for safe procedure deployment across environments:
-- 1. Check if procedure exists to determine create vs alter operation
-- 2. Create stub if new (required by SQL Server for ALTER to work)
-- 3. Use ALTER for actual implementation (supports all features)
-- Benefits: Works in all scenarios, supports proper dependency management
-- Template Usage: Use this pattern for all new stored procedures
IF EXISTS (
SELECT 1
FROM INFORMATION_SCHEMA.ROUTINES
WHERE ROUTINE_NAME = 'ComprehensiveWorkflowTemplate'
AND ROUTINE_TYPE = 'PROCEDURE'
AND ROUTINE_SCHEMA = 'custom'
)
BEGIN
PRINT SPACE(10) + '...Stored Procedure: update';
SET NOEXEC ON; -- Skip the CREATE, go straight to ALTER
END
ELSE
PRINT SPACE(10) + '...Stored Procedure: create';
GO
-- Stub creation (parsed but not executed if procedure exists)
-- This is required because ALTER PROCEDURE requires the procedure to exist
CREATE PROCEDURE custom.ComprehensiveWorkflowTemplate
AS
SELECT 'created, but not implemented yet.';
GO
-- Main procedure definition
SET NOEXEC OFF; -- Ensure ALTER is executed
GO
ALTER PROCEDURE custom.ComprehensiveWorkflowTemplate
-- ========================================================================
-- PARAMETER DEFINITIONS WITH FLEXIBLE DEFAULTS
-- ========================================================================
-- All parameters have NULL defaults for maximum flexibility
-- Required parameters are validated in the procedure body
-- OUTPUT parameters must be explicitly marked for return values
@ID INT = NULL, -- Context menu ID
@OutPut VARCHAR(1000) = NULL OUTPUT, -- User feedback message
@ProcessBatch_ID INT = NULL, -- Logging batch ID
@ObjectID INT = NULL, -- M-Files object ID
@ObjectType INT = NULL, -- M-Files object type
@ObjectVer INT = NULL, -- M-Files object version
@ClassID INT = NULL -- M-Files class ID
-- Add custom parameters here as needed for your specific implementation
AS
BEGIN
SET NOCOUNT ON; -- Suppress row count messages for better performance
-- ============================================================================
-- PARAMETER VALIDATION AND INITIALIZATION
-- ============================================================================
-- Purpose: Ensure required parameters are provided and initialize defaults
-- Reasoning: Fail fast with clear error messages rather than cryptic failures
-- Template Usage: Modify validation rules based on your specific requirements
-- Context Menu Integration Validation
-- Remove these validations if not using context menu integration
IF @ID IS NULL
THROW 50001, 'Parameter @ID is required for context menu integration', 1;
-- Batch Logging Validation
-- The spMFProcessBatch_Upsert procedure will auto-generate if NULL
-- Remove this validation if you want auto-generation behavior
IF @ProcessBatch_ID IS NULL
THROW 50002, 'Parameter @ProcessBatch_ID is required for audit logging', 1;
-- Object-Specific Operation Validation
-- Remove these validations for general operations not tied to specific objects
IF @ObjectID IS NULL
THROW 50003, 'Parameter @ObjectID is required for object-specific operations', 1;
IF @ClassID IS NULL
THROW 50004, 'Parameter @ClassID is required to determine target table', 1;
-- Initialize output parameter to prevent concatenation issues
IF @OutPut IS NULL
SET @OutPut = '';
-- ============================================================================
-- VARIABLE DECLARATIONS AND INITIALIZATION
-- ============================================================================
-- Purpose: Centralized declaration of all working variables
-- Reasoning: Improves readability, makes dependencies clear, enables debugging
-- Template Usage: Add variables specific to your business logic needs
DECLARE
-- Core procedure tracking variables
@ProcedureName NVARCHAR(128) = 'custom.ComprehensiveWorkflowTemplate',
@ProcedureStep NVARCHAR(128), -- Current operation for error tracking
@StartTime DATETIME, -- Timing for performance monitoring
-- M-Files integration variables
@MFClassTable NVARCHAR(128), -- Dynamic class table name
@Update_ID INT, -- MFSQL update operation ID
@objids NVARCHAR(100), -- Comma-separated object IDs for batch operations
-- Dynamic SQL variables (security-focused)
@SQLQuery NVARCHAR(MAX), -- Dynamic SQL statement
@Params NVARCHAR(MAX), -- Parameter definitions for sp_executesql
-- Business logic variables (customize as needed)
@ProcessingMode NVARCHAR(50), -- Single vs. Batch processing mode
@RecordsProcessed INT = 0, -- Counter for processed records
@ErrorCount INT = 0, -- Counter for errors encountered
-- Email notification variables
@EmailRequired BIT = 0, -- Flag to determine if email notification needed
@EmailBody NVARCHAR(MAX), -- HTML email body content
-- Error handling and logging variables
@LogStatus NVARCHAR(100), -- Status for batch logging
@LogTextDetail NVARCHAR(4000); -- Detailed log messages
BEGIN TRY
-- ====================================================================
-- 1. PROCESS INITIALIZATION AND SETUP
-- ====================================================================
-- Purpose: Set up logging, validate environment, prepare for operations
-- Reasoning: Establish audit trail before any data changes occur
-- Template Usage: Customize output messages for your specific process
SET @OutPut = 'Process Start Time: ' + CAST(GETDATE() AS VARCHAR(50));
SET @StartTime = GETUTCDATE();
-- Determine processing mode based on parameters
SET @ProcessingMode = CASE
WHEN @ObjectID IS NOT NULL THEN 'Single Object'
ELSE 'Bulk Processing'
END;
-- Get class table name from M-Files metadata when ClassID is provided
-- This determines which table contains the object data
IF @ClassID IS NOT NULL
BEGIN
SELECT @MFClassTable = TableName
FROM dbo.MFClass
WHERE MFID = @ClassID;
-- Validate that the class exists in the connector
IF @MFClassTable IS NULL
THROW 50005, 'Invalid ClassID - no matching table found in MFClass', 1;
END
ELSE
BEGIN
-- For bulk operations, set a default or handle as appropriate
SET @MFClassTable = 'General Processing'; -- Customize as needed
END
-- Set context menu as running to prevent concurrent executions
-- This is critical for user experience and data integrity
IF @ID IS NOT NULL
BEGIN
UPDATE dbo.MFContextMenu
SET IsProcessRunning = 1
WHERE ID = @ID;
END
-- Initialize process batch logging
-- This creates the parent log entry for this execution
EXEC dbo.spMFProcessBatch_Upsert
@ProcessBatch_ID = @ProcessBatch_ID,
@ProcessType = @ProcedureName,
@LogType = N'Message',
@LogText = @OutPut,
@LogStatus = N'Started',
@debug = 0;
-- Log the initialization step
SET @OutPut = 'Initialized ' + @ProcessingMode + ' for ' + ISNULL(@MFClassTable, 'Unknown Table');
SET @ProcedureStep = 'Initialize Process';
EXEC dbo.spMFProcessBatchDetail_Insert
@ProcessBatch_ID = @ProcessBatch_ID,
@LogType = N'Message',
@LogText = @OutPut,
@LogStatus = N'In Progress',
@StartTime = @StartTime,
@MFTableName = @MFClassTable,
@LogProcedureName = @ProcedureName,
@LogProcedureStep = @ProcedureStep,
@debug = 0;
-- ====================================================================
-- 2. MAIN BUSINESS LOGIC OPERATIONS
-- ====================================================================
-- Purpose: Core business logic implementation with comprehensive examples
-- Reasoning: This section contains all the actual work - customize for your needs
-- Template Usage: Replace with your specific business logic requirements
-- ====================================================================
-- 2(a). DATA REFRESH AND SYNCHRONIZATION
-- ====================================================================
-- Purpose: Ensure we have the latest object data before processing
-- Reasoning: M-Files data may have changed since last sync
-- Template Usage: Include for object-specific operations, customize for bulk operations
SET @ProcedureStep = 'Data Refresh and Synchronization';
SET @StartTime = GETUTCDATE();
IF @ObjectID IS NOT NULL AND @MFClassTable IS NOT NULL
BEGIN
-- Single object refresh using spMFUpdateTable
SET @objids = CAST(@ObjectID AS NVARCHAR(100));
EXEC dbo.spMFUpdateTable
@MFTableName = @MFClassTable,
@UpdateMethod = 1, -- Get from M-Files
@ObjIDs = @objids,
@Update_IDOut = @Update_ID OUTPUT,
@ProcessBatch_ID = @ProcessBatch_ID,
@Debug = 0;
SET @OutPut = 'Refreshed object ' + CAST(@ObjectID AS NVARCHAR(10)) +
' from M-Files with Update_ID ' + CAST(@Update_ID AS NVARCHAR(10));
SET @RecordsProcessed = 1;
END
ELSE IF @MFClassTable IS NOT NULL
BEGIN
-- Bulk incremental refresh for large datasets
DECLARE @MFLastUpdateDate SMALLDATETIME;
EXEC dbo.spMFUpdateMFilesToMFSQL
@MFTableName = @MFClassTable,
@MFLastUpdateDate = @MFLastUpdateDate OUTPUT,
@UpdateTypeID = 1, -- Incremental Update
@Update_IDOut = @Update_ID OUTPUT,
@ProcessBatch_ID = @ProcessBatch_ID,
@debug = 0;
-- Get count of refreshed records
SELECT @RecordsProcessed = COUNT(*)
FROM dbo.MFUpdateHistory
WHERE Update_ID = @Update_ID;
SET @OutPut = 'Incremental refresh completed for ' + @MFClassTable +
' - ' + CAST(@RecordsProcessed AS NVARCHAR(10)) + ' records processed';
END
-- Log the refresh step
EXEC dbo.spMFProcessBatchDetail_Insert
@ProcessBatch_ID = @ProcessBatch_ID,
@LogType = N'Debug',
@LogText = @OutPut,
@LogStatus = N'In Progress',
@StartTime = @StartTime,
@MFTableName = @MFClassTable,
@LogProcedureName = @ProcedureName,
@LogProcedureStep = @ProcedureStep,
@debug = 0;
-- ====================================================================
-- 2(b). TEMPORARY TABLE PROCESSING FOR COMPLEX OPERATIONS
-- ====================================================================
-- Purpose: Handle complex processing scenarios with intermediate data storage
-- Reasoning: Temporary tables improve performance and simplify complex logic
-- Template Usage: Include for complex data transformations and validations
SET @ProcedureStep = 'Complex Data Processing';
SET @StartTime = GETUTCDATE();
-- Create temporary table for intermediate processing
IF OBJECT_ID('tempdb..#ProcessingResults') IS NOT NULL
DROP TABLE #ProcessingResults;
CREATE TABLE #ProcessingResults (
ObjID INT PRIMARY KEY,
Name_or_Title NVARCHAR(255),
Keywords NVARCHAR(4000),
RelatedObject_Ref NVARCHAR(255),
RelatedObject_ObjID INT,
LookupValue NVARCHAR(255),
ValuelistItem_MFID INT,
ProcessingStatus NVARCHAR(50) DEFAULT 'Pending',
ErrorMessage NVARCHAR(1000),
LastModified DATETIME DEFAULT GETUTCDATE()
);
-- Populate temporary table based on processing mode
IF @ObjectID IS NOT NULL
BEGIN
-- Single object processing
SET @SQLQuery = N'
INSERT INTO #ProcessingResults (ObjID, Name_or_Title, Keywords, LastModified)
SELECT ObjID, Name_or_Title, Keywords, GETUTCDATE()
FROM ' + QUOTENAME(@MFClassTable) + N'
WHERE ObjID = @ObjectIDParam';
SET @Params = N'@ObjectIDParam INT';
EXEC sys.sp_executesql @SQLQuery, @Params, @ObjectIDParam = @ObjectID;
END
ELSE
BEGIN
-- Bulk processing - customize criteria as needed
SET @SQLQuery = N'
INSERT INTO #ProcessingResults (ObjID, Name_or_Title, Keywords, LastModified)
SELECT ObjID, Name_or_Title, Keywords, GETUTCDATE()
FROM ' + QUOTENAME(@MFClassTable) + N'
WHERE Process_ID = 0 OR Process_ID IS NULL'; -- Process unprocessed records
EXEC sys.sp_executesql @SQLQuery;
END
-- Get count of records to process
SELECT @RecordsProcessed = COUNT(*) FROM #ProcessingResults;
SET @OutPut = 'Loaded ' + CAST(@RecordsProcessed AS NVARCHAR(10)) + ' records for processing';
EXEC dbo.spMFProcessBatchDetail_Insert
@ProcessBatch_ID = @ProcessBatch_ID,
@LogType = N'Debug',
@LogText = @OutPut,
@LogStatus = N'In Progress',
@StartTime = @StartTime,
@MFTableName = @MFClassTable,
@LogProcedureName = @ProcedureName,
@LogProcedureStep = @ProcedureStep,
@debug = 0;
-- ====================================================================
-- 2(c). VALUELIST ITEM MANAGEMENT AND SYNCHRONIZATION
-- ====================================================================
-- Purpose: Handle valuelist dependencies and create missing items
-- Reasoning: Ensures data integrity for lookup values
-- Template Usage: Include when working with valuelist properties
SET @ProcedureStep = 'Valuelist Management';
SET @StartTime = GETUTCDATE();
-- Example valuelist processing (customize for your valuelists)
DECLARE @StatusValuelist_MFID INT;
-- Get valuelist MFID for Status valuelist (example)
SELECT @StatusValuelist_MFID = MFID
FROM dbo.MFValuelist
WHERE alias = 'ProcessingStatus'; -- Replace with your valuelist alias
IF @StatusValuelist_MFID IS NOT NULL
BEGIN
-- Update existing valuelist item references
UPDATE pr
SET pr.ValuelistItem_MFID = vli.MFID
FROM #ProcessingResults pr
LEFT JOIN dbo.MFValuelistItem vli ON pr.ProcessingStatus = vli.NAME
INNER JOIN dbo.MFValuelist vl ON vli.Valuelist_ID = vl.ID
WHERE vl.MFID = @StatusValuelist_MFID;
-- Check for missing valuelist items
IF EXISTS(SELECT 1 FROM #ProcessingResults WHERE ValuelistItem_MFID IS NULL AND ProcessingStatus IS NOT NULL)
BEGIN
SET @OutPut = 'Creating missing valuelist items in M-Files';
-- Insert new valuelist items
INSERT INTO dbo.MFValuelistItem (Valuelist_ID, NAME, process_id)
SELECT DISTINCT vl.ID, pr.ProcessingStatus, 1
FROM #ProcessingResults pr
CROSS JOIN dbo.MFValuelist vl
LEFT JOIN dbo.MFValuelistItem vli ON pr.ProcessingStatus = vli.NAME AND vli.Valuelist_ID = vl.ID
WHERE vl.MFID = @StatusValuelist_MFID
AND vli.ID IS NULL
AND pr.ProcessingStatus IS NOT NULL;
-- Synchronize new items to M-Files
EXEC dbo.spMFSynchronizeValueListItemsToMfiles
@ProcessBatch_ID = @ProcessBatch_ID,
@Debug = 0;
-- Update references with new MFIDs
UPDATE pr
SET pr.ValuelistItem_MFID = vli.MFID
FROM #ProcessingResults pr
LEFT JOIN dbo.MFValuelistItem vli ON pr.ProcessingStatus = vli.NAME
INNER JOIN dbo.MFValuelist vl ON vli.Valuelist_ID = vl.ID
WHERE vl.MFID = @StatusValuelist_MFID;
END
END
EXEC dbo.spMFProcessBatchDetail_Insert
@ProcessBatch_ID = @ProcessBatch_ID,
@LogType = N'Debug',
@LogText = @OutPut,
@LogStatus = N'In Progress',
@StartTime = @StartTime,
@MFTableName = @MFClassTable,
@LogProcedureName = @ProcedureName,
@LogProcedureStep = @ProcedureStep,
@debug = 0;
-- ====================================================================
-- 2(d). BUSINESS RULE IMPLEMENTATION AND OBJECT UPDATES
-- ====================================================================
-- Purpose: Apply specific business rules and update object properties
-- Reasoning: This is where your custom business logic is implemented
-- Template Usage: Replace with your specific business requirements
SET @ProcedureStep = 'Business Rule Implementation';
SET @StartTime = GETUTCDATE();
-- Example business rule: Update keywords based on processing status
UPDATE pr
SET pr.Keywords = ISNULL(pr.Keywords, '') + ' [Processed: ' + FORMAT(GETDATE(), 'yyyy-MM-dd HH:mm') + ']', -- ISNULL prevents NULL keywords from blanking the concatenation
pr.ProcessingStatus = 'Completed',
pr.LastModified = GETUTCDATE()
FROM #ProcessingResults pr
WHERE pr.ProcessingStatus = 'Pending';
-- Apply updates to the actual class table
SET @SQLQuery = N'
UPDATE ct
SET ct.Keywords = pr.Keywords,
ct.Process_ID = 1 -- Mark for M-Files update
FROM ' + QUOTENAME(@MFClassTable) + N' ct
INNER JOIN #ProcessingResults pr ON ct.ObjID = pr.ObjID
WHERE pr.ProcessingStatus = ''Completed''';
EXEC sys.sp_executesql @SQLQuery;
-- Get count of updated records
SET @RecordsProcessed = @@ROWCOUNT;
SET @OutPut = 'Applied business rules to ' + CAST(@RecordsProcessed AS NVARCHAR(10)) + ' records';
EXEC dbo.spMFProcessBatchDetail_Insert
@ProcessBatch_ID = @ProcessBatch_ID,
@LogType = N'Debug',
@LogText = @OutPut,
@LogStatus = N'In Progress',
@StartTime = @StartTime,
@MFTableName = @MFClassTable,
@LogProcedureName = @ProcedureName,
@LogProcedureStep = @ProcedureStep,
@debug = 0;
-- ====================================================================
-- 2(e). SYNCHRONIZATION BACK TO M-FILES
-- ====================================================================
-- Purpose: Push all changes back to the M-Files vault
-- Reasoning: Ensure M-Files reflects all SQL-side changes
-- Template Usage: Always include when modifying object properties
SET @ProcedureStep = 'Synchronize to M-Files';
SET @StartTime = GETUTCDATE();
-- Check if there are records to synchronize
SET @SQLQuery = N'SELECT @CountParam = COUNT(*) FROM ' + QUOTENAME(@MFClassTable) + N' WHERE Process_ID = 1';
SET @Params = N'@CountParam INT OUTPUT';
EXEC sys.sp_executesql @SQLQuery, @Params, @CountParam = @RecordsProcessed OUTPUT;
IF @RecordsProcessed > 0
BEGIN
-- Use batch update for better performance with large datasets
EXEC dbo.spMFUpdateTableinBatches
@MFTableName = @MFClassTable,
@UpdateMethod = 0, -- Send to M-Files
@WithStats = 1, -- Include statistics
@ProcessBatch_ID = @ProcessBatch_ID,
@Debug = 0;
SET @OutPut = 'Synchronized ' + CAST(@RecordsProcessed AS NVARCHAR(10)) + ' records to M-Files';
END
ELSE
BEGIN
SET @OutPut = 'No records required synchronization to M-Files';
END
EXEC dbo.spMFProcessBatchDetail_Insert
@ProcessBatch_ID = @ProcessBatch_ID,
@LogType = N'Debug',
@LogText = @OutPut,
@LogStatus = N'In Progress',
@StartTime = @StartTime,
@MFTableName = @MFClassTable,
@LogProcedureName = @ProcedureName,
@LogProcedureStep = @ProcedureStep,
@debug = 0;
-- ====================================================================
-- 2(f). DATA VALIDATION AND ERROR REPORTING
-- ====================================================================
-- Purpose: Validate processing results and generate notifications
-- Reasoning: Ensure data quality and provide feedback on issues
-- Template Usage: Customize validation rules for your business requirements
SET @ProcedureStep = 'Data Validation and Reporting';
SET @StartTime = GETUTCDATE();
-- Check for validation errors
SELECT @ErrorCount = COUNT(*)
FROM #ProcessingResults
WHERE ProcessingStatus = 'Error' OR ErrorMessage IS NOT NULL;
-- Generate email notification if errors found
IF @ErrorCount > 0
BEGIN
SET @EmailRequired = 1;
-- Create HTML table of errors for email
EXEC dbo.spMFConvertTableToHtml
@SqlQuery = N'SELECT TOP 20 ObjID, Name_or_Title, ErrorMessage, LastModified
FROM #ProcessingResults
WHERE ProcessingStatus = ''Error'' OR ErrorMessage IS NOT NULL
ORDER BY LastModified DESC',
@TableBody = @EmailBody OUTPUT,
@ProcessBatch_ID = @ProcessBatch_ID,
@Debug = 0;
-- Send notification email to administrators
EXEC dbo.spMFSendHTMLBodyEmail
@Body = @EmailBody,
@MessageTitle = N'MFSQL Workflow Processing Errors Detected',
@FromEmail = N'mfsql@yourcompany.com', -- Configure for your environment
@ToEmail = N'admin@yourcompany.com', -- Configure for your environment
@CCEmail = N'support@yourcompany.com', -- Configure for your environment
@ProcessBatch_ID = @ProcessBatch_ID,
@Debug = 0;
SET @OutPut = 'Validation completed: ' + CAST(@ErrorCount AS NVARCHAR(10)) + ' errors found and reported';
END
ELSE
BEGIN
SET @OutPut = 'Validation completed: No errors detected';
END
EXEC dbo.spMFProcessBatchDetail_Insert
@ProcessBatch_ID = @ProcessBatch_ID,
@LogType = N'Message',
@LogText = @OutPut,
@LogStatus = N'In Progress',
@StartTime = @StartTime,
@MFTableName = @MFClassTable,
@LogProcedureName = @ProcedureName,
@LogProcedureStep = @ProcedureStep,
@debug = 0;
-- ====================================================================
-- 3. PROCESS COMPLETION AND CLEANUP
-- ====================================================================
-- Purpose: Clean up resources, finalize logging, return success status
-- Reasoning: Proper cleanup ensures system remains in consistent state
-- Template Usage: Always include cleanup code - critical for production systems
SET @ProcedureStep = 'Process Completion';
SET @StartTime = GETUTCDATE();
-- Reset context menu status to allow future executions
IF @ID IS NOT NULL
BEGIN
UPDATE dbo.MFContextMenu
SET IsProcessRunning = 0
WHERE ID = @ID;
END
-- Clean up temporary tables
IF OBJECT_ID('tempdb..#ProcessingResults') IS NOT NULL
DROP TABLE #ProcessingResults;
-- Generate final success message with comprehensive summary
SET @OutPut = @ProcessingMode + ' completed successfully. ' +
'Records processed: ' + CAST(@RecordsProcessed AS NVARCHAR(10)) + '. ' +
'Errors: ' + CAST(@ErrorCount AS NVARCHAR(10)) + '. ' +
'Process End Time: ' + CAST(GETDATE() AS VARCHAR(50));
-- Log successful completion
EXEC dbo.spMFProcessBatch_Upsert
@ProcessBatch_ID = @ProcessBatch_ID,
@ProcessType = @ProcedureName,
@LogType = N'Message',
@LogText = @OutPut,
@LogStatus = N'Completed',
@debug = 0;
-- Final detail log entry
EXEC dbo.spMFProcessBatchDetail_Insert
@ProcessBatch_ID = @ProcessBatch_ID,
@LogType = N'Message',
@LogText = @OutPut,
@LogStatus = N'Success',
@StartTime = @StartTime,
@MFTableName = @MFClassTable,
@LogProcedureName = @ProcedureName,
@LogProcedureStep = @ProcedureStep,
@debug = 0;
RETURN 1; -- Explicit success return code
END TRY
BEGIN CATCH
-- ====================================================================
-- COMPREHENSIVE ERROR HANDLING
-- ====================================================================
-- Purpose: Gracefully handle any errors that occur during processing
-- Reasoning: Proper error handling prevents system corruption and provides debugging info
-- Template Usage: This pattern should be used in all production procedures
-- Capture timing and error details immediately
SET @StartTime = GETUTCDATE();
SET @LogStatus = 'Failed w/SQL Error';
SET @LogTextDetail = 'Error in ' + ISNULL(@ProcedureStep, 'Unknown Step') +
': ' + ERROR_MESSAGE();
SET @OutPut = 'Error: ' + ERROR_MESSAGE();
SET @ProcedureStep = ISNULL(@ProcedureStep, 'Error Handler');
-- CRITICAL: Reset context menu status on error
IF @ID IS NOT NULL
BEGIN
UPDATE dbo.MFContextMenu
SET IsProcessRunning = 0
WHERE ID = @ID;
END
-- Clean up temporary tables on error
IF OBJECT_ID('tempdb..#ProcessingResults') IS NOT NULL
DROP TABLE #ProcessingResults;
-- Log detailed error information
INSERT INTO dbo.MFLog (
SPName, ErrorNumber, ErrorMessage, ErrorProcedure,
ErrorState, ErrorSeverity, ErrorLine, ProcedureStep
)
VALUES (
@ProcedureName, ERROR_NUMBER(), ERROR_MESSAGE(), ERROR_PROCEDURE(),
ERROR_STATE(), ERROR_SEVERITY(), ERROR_LINE(), @ProcedureStep
);
-- Log error in process batch system
EXEC dbo.spMFProcessBatch_Upsert
@ProcessBatch_ID = @ProcessBatch_ID,
@ProcessType = @ProcedureName,
@LogType = N'Error',
@LogText = @LogTextDetail,
@LogStatus = @LogStatus,
@debug = 0;
-- Log error details for step-level tracking
EXEC dbo.spMFProcessBatchDetail_Insert
@ProcessBatch_ID = @ProcessBatch_ID,
@LogType = N'Error',
@LogText = @LogTextDetail,
@LogStatus = @LogStatus,
@StartTime = @StartTime,
@LogProcedureName = @ProcedureName,
@LogProcedureStep = @ProcedureStep,
@debug = 0;
RETURN -1; -- Explicit error return code
END CATCH
END
/*
=====================================================================================
IMPLEMENTATION CHECKLIST FOR PRODUCTION USE
=====================================================================================
BEFORE DEPLOYMENT:
□ Update procedure name throughout the code
□ Modify parameter list for your specific requirements
□ Update parameter validation rules based on your use case
□ Replace business logic sections with your specific operations
□ Configure email settings for your environment
□ Adjust logging messages to be meaningful for your process
□ Update valuelist aliases to match your vault structure
□ Replace table names with your actual class tables
□ Test error handling with various failure scenarios
□ Update header documentation with your specific details
SECURITY VERIFICATION:
□ All dynamic SQL uses QUOTENAME() for table names
□ All user inputs are parameterized through sp_executesql
□ No direct string concatenation of user inputs in SQL
□ Parameter validation prevents injection attacks
□ Email addresses are configured for your environment
PERFORMANCE OPTIMIZATION:
□ Indexes exist for all query patterns used
□ Batch processing is used for large datasets
□ Temporary tables are properly sized and indexed
□ Update operations use appropriate batch sizes
□ Logging frequency is optimized for performance
OPERATIONAL READINESS:
□ Test with various object types and states
□ Verify context menu integration works correctly
□ Ensure error messages are user-friendly
□ Document any special setup or configuration requirements
□ Create monitoring and alerting for the process
□ Train support staff on troubleshooting procedures
MONITORING AND MAINTENANCE:
□ Set up automated monitoring for process failures
□ Configure regular review of process batch logs
□ Establish performance baseline metrics
□ Plan for regular maintenance and updates
□ Document escalation procedures for critical failures
=====================================================================================
*/
Related Procedures and Functions:
This template demonstrates integration with key MFSQL Connector procedures; a minimal end-to-end sketch follows the list.
- spMFUpdateTable: Core synchronization procedure
- spMFUpdateTableinBatches: Batch processing optimization
- spMFProcessBatch_Upsert: Process logging management
- spMFProcessBatchDetail_Insert: Detailed audit logging
- spMFSynchronizeValueListItemsToMfiles: Valuelist synchronization
- spMFConvertTableToHtml: HTML report generation
- spMFSendHTMLBodyEmail: Email notification system
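The sketch below shows, under the simplest assumptions, how the core procedures chain together for a single object: pull the latest version from M-Files, change a property in SQL, and push the flagged record back. The class table name YourClassTable and object ID 12345 are placeholders, and the process batch is left NULL for brevity; supply an existing batch ID when you want the calls logged.

-- Minimal refresh-modify-push cycle; 'YourClassTable' and ObjID 12345 are placeholders
DECLARE @Update_ID INT,
@ProcessBatch_ID INT = NULL; -- optionally supply an existing process batch for audit logging

-- 1. Pull the latest version of the object from M-Files into the class table
EXEC dbo.spMFUpdateTable
@MFTableName = 'YourClassTable',
@UpdateMethod = 1, -- Get from M-Files
@ObjIDs = N'12345',
@Update_IDOut = @Update_ID OUTPUT,
@ProcessBatch_ID = @ProcessBatch_ID,
@Debug = 0;

-- 2. Change a property in SQL and flag the record for synchronization
UPDATE dbo.YourClassTable
SET Keywords = 'Updated from SQL',
Process_ID = 1 -- Mark for M-Files update
WHERE ObjID = 12345;

-- 3. Push all flagged records back to M-Files
EXEC dbo.spMFUpdateTable
@MFTableName = 'YourClassTable',
@UpdateMethod = 0, -- Send to M-Files
@Update_IDOut = @Update_ID OUTPUT,
@ProcessBatch_ID = @ProcessBatch_ID,
@Debug = 0;

For larger result sets, spMFUpdateTableinBatches (as used in section 2(e) of the template) is the better fit for the push step.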
Configuration Requirements:
Before using this template in production:
- Email Configuration: Update email addresses and SMTP settings (a test call is sketched after this list)
- Valuelist Aliases: Replace the example aliases with your vault’s valuelists
- Table Names: Update class table references to match your implementation
- Business Logic: Replace the example business rules with your requirements
- Security Settings: Verify all dynamic SQL follows security best practices
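One way to confirm the email settings before go-live is to send a test message with the same call shape used in section 2(f). The addresses below are placeholders for your environment, and passing NULL for the process batch is an assumption that holds only when no audit trail is needed; pass an existing batch ID if you want the test logged.

DECLARE @TestBody NVARCHAR(MAX) = N'<p>MFSQL Connector email configuration test</p>';
EXEC dbo.spMFSendHTMLBodyEmail
@Body = @TestBody,
@MessageTitle = N'MFSQL Configuration Test',
@FromEmail = N'mfsql@yourcompany.com', -- placeholder: your sending address
@ToEmail = N'admin@yourcompany.com', -- placeholder: administrator address
@CCEmail = N'support@yourcompany.com', -- placeholder: optional copy address
@ProcessBatch_ID = NULL, -- assumption: NULL is acceptable when no batch logging is required
@Debug = 0;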
Performance Considerations:
- Use batch processing for datasets larger than 1000 records
- Consider the indexing strategy for temporary table operations (see the example below)
- Monitor the process batch logs for performance bottlenecks
- Implement appropriate transaction isolation levels for your use case
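As an illustration of the indexing point: the intermediate table in section 2 is filtered repeatedly on ProcessingStatus, so for larger batches a supporting index such as the one below can help. This is an optional tuning sketch to add after the CREATE TABLE statement, not part of the template itself.

-- Optional supporting index for #ProcessingResults when processing large batches
CREATE NONCLUSTERED INDEX IX_ProcessingResults_Status
ON #ProcessingResults (ProcessingStatus)
INCLUDE (ObjID, ErrorMessage);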
Error Handling Strategy:
This template implements a comprehensive four-layer error handling approach; a condensed, runnable skeleton follows the list.
1. Parameter Validation: Fail fast with clear messages
2. Business Logic Validation: Detect and handle data quality issues
3. System Error Handling: Comprehensive SQL error capture and logging
4. Automated Notifications: Email alerts for critical failures requiring attention
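Condensed to a standalone batch, layers 1 and 3 look like the sketch below. The procedure name written to MFLog is a placeholder, and the full template above adds the context menu reset, temporary table cleanup, and process batch logging on top of this skeleton.

DECLARE @ObjectID INT = NULL, -- NULL deliberately trips the validation so the CATCH path runs
@ProcedureStep NVARCHAR(128) = N'Parameter Validation';
BEGIN TRY
-- Layer 1: parameter validation - fail fast with a clear message
IF @ObjectID IS NULL
THROW 50003, 'Parameter @ObjectID is required for object-specific operations', 1;
-- Layer 2: business logic and data quality validation would follow here
SET @ProcedureStep = N'Business Rule Implementation';
END TRY
BEGIN CATCH
-- Layer 3: capture the SQL error with full context for troubleshooting
INSERT INTO dbo.MFLog (SPName, ErrorNumber, ErrorMessage, ErrorProcedure, ErrorState, ErrorSeverity, ErrorLine, ProcedureStep)
VALUES ('custom.YourProcedureName', -- placeholder procedure name
ERROR_NUMBER(), ERROR_MESSAGE(), ERROR_PROCEDURE(), ERROR_STATE(), ERROR_SEVERITY(), ERROR_LINE(), @ProcedureStep);
-- Layer 4: notify administrators of the failure (see spMFSendHTMLBodyEmail in section 2(f) of the template)
END CATCH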