timchapman/sqlserver-to-postgresql
SQL Server to PostgreSQL Migration Tool

Automated migration tool for converting SQL Server databases to PostgreSQL using a hybrid of rule-based and AI-assisted conversion. The tool connects directly to a SQL Server instance to extract the live database schema, or alternatively works from a DACPAC file, then converts schema and code objects to PostgreSQL-compatible SQL.

📖 For the complete workflow, see MIGRATION_WORKFLOW.md

✨ Key Features

  • 🔌 Direct SQL Server Connection: Connects to live SQL Server instances using SMO to extract the database schema
  • 📦 DACPAC Support: Alternative extraction from .dacpac or .bacpac files
  • 🤖 Hybrid Migration: Rule-based schema migration (fast, deterministic) + AI-powered code conversion (3-stage pipeline)
  • 📊 Organized Outputs: 10 numbered schema deployment files + individual code objects for version control
  • ⚡ Parallel Processing: Multi-threaded execution for faster migrations
  • 🧩 Extension Detection: Automatically identifies required PostgreSQL extensions (pgcrypto, uuid-ossp, ltree, postgis, etc.)
  • ✅ Production Ready: Proper dependency ordering, UTF-8 encoding, PostgreSQL best practices

📋 Prerequisites

  • PowerShell 7+ (for automation scripts)
  • Python 3.8+ (for AI pipeline)
  • Azure OpenAI access (for code migration)
  • SQL Server (for direct extraction) OR .dacpac file
  • VS Code + GitHub Copilot (optional, for interactive mode)

🚀 Quick Start

1️⃣ Configure Environment

Copy .env.example to .env and configure:

# Azure OpenAI Settings (required for code migration)
DRAFT_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
DRAFT_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_DRAFT=gpt-4o

# SQL Server Connection (for direct extraction)
SQL_INSTANCE=localhost
SQL_SERVER_DATABASE=AdventureWorks2016
SQL_SERVER_USERNAME=sa
SQL_SERVER_PASSWORD=YourPassword
SQL_SERVER_USE_WINDOWS_AUTH=false  # Set to true for Windows Auth

2️⃣ Extract SQL Server Objects

Option A: Direct SQL Server Connection (Recommended)

Connects to a live SQL Server instance and extracts all database objects:

.\scripts\extract_sqlserver_objects.ps1 `
    -Server "localhost" `
    -Database "AdventureWorks2016" `
    -OutputDir "Migrations\AdventureWorks2016\Input\AdventureWorks2016"

Or use credentials from .env:

.\scripts\extract_database.ps1

Option B: From DACPAC File

Extract from a pre-exported .dacpac file:

.\scripts\extract_dacpac_objects.ps1 -Package "path\to\database.dacpac"

⚠️ Important: Let the extraction run to completion before inspecting the output; interrupting it mid-run can leave the extracted files incomplete.

Output Structure:

Migrations/DatabaseName/Input/DatabaseName/
├── Tables/
│   ├── Tables/           # Table definitions
│   ├── Constraints/      # Constraints (PK, FK, Check, Default)
│   └── Indexes/          # Index definitions
└── Programmability/
    ├── Views/            # View definitions
    ├── Functions/        # User-defined functions
    ├── StoredProcedures/ # Stored procedures
    └── Triggers/         # Triggers

3️⃣ Migrate Schema Objects (Automated)

Converts tables, constraints, indexes, and sequences using the hybrid rule-based + AI approach:

.\scripts\migrate_schema_objects.ps1 `
    -InputDir "Migrations\AdventureWorks2016\Input\AdventureWorks2016" `
    -OutputDir "Migrations\AdventureWorks2016\Output\schema" `
    -MaxParallel 4

Output: 10 numbered deployment files in proper dependency order:

  • 01_extensions.sql - Required PostgreSQL extensions
  • 02_schemas.sql - Schema definitions
  • 03_sequences.sql - Identity sequences
  • 04_tables.sql - Table definitions
  • 05_primary_keys.sql - Primary key constraints
  • 06_unique_constraints.sql - Unique constraints
  • 07_check_constraints.sql - Check constraints
  • 08_default_constraints.sql - Default values
  • 09_foreign_keys.sql - Foreign key relationships
  • 10_indexes.sql - Indexes

4️⃣ Migrate Code Objects (AI-Powered)

Converts views, functions, stored procedures, and triggers using the 3-stage AI pipeline:

.\scripts\migrate_code_objects.ps1 `
    -InputDir "Migrations\AdventureWorks2016\Input\AdventureWorks2016" `
    -OutputDir "Migrations\AdventureWorks2016\Output\AdventureWorks2016" `
    -MaxParallel 4

⚠️ Important: Start the command and let it run uninterrupted; review the results and log files after it finishes.

AI Pipeline Stages:

  1. Draft - Initial T-SQL → PostgreSQL conversion
  2. Refine - Improves accuracy by comparing the draft against the original T-SQL
  3. Verify - Validates correctness and syntax

Output:

  • 00_code_extensions.sql - Extensions required by code objects
  • 11_views.sql - All views (consolidated)
  • 12_functions.sql - All functions (consolidated)
  • 13_procedures.sql - All stored procedures (consolidated)
  • 14_triggers.sql - All triggers (consolidated)
  • Programmability/Views/*.sql - Individual view files for version control
  • Programmability/Functions/*.sql - Individual function files
  • Programmability/StoredProcedures/*.sql - Individual procedure files
  • Programmability/Triggers/*.sql - Individual trigger files
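As an illustration of what one of the individual files might contain (the object, schema, and file names here are hypothetical, not actual tool output), a simple scalar T-SQL function typically comes out as a PostgreSQL function:

```sql
-- Hypothetical Programmability/Functions/fn_order_age_days.sql
-- T-SQL original body: RETURN DATEDIFF(day, @OrderDate, GETDATE())
CREATE OR REPLACE FUNCTION dbo.fn_order_age_days(order_date date)
RETURNS integer
LANGUAGE sql
AS $$
    -- DATEDIFF(day, x, GETDATE()) becomes a plain date subtraction
    SELECT CURRENT_DATE - order_date;
$$;
```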

5️⃣ Deploy to PostgreSQL

Deploy the generated files in order:

# 1. Deploy schema (order matters!)
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/01_extensions.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/02_schemas.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/03_sequences.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/04_tables.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/05_primary_keys.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/06_unique_constraints.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/07_check_constraints.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/08_default_constraints.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/09_foreign_keys.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/10_indexes.sql

# 2. Deploy code objects
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/00_code_extensions.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/11_views.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/12_functions.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/13_procedures.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/14_triggers.sql

Or use a simple loop (the zero-padded numeric prefixes keep the glob expansion in the correct deployment order):

# Deploy all schema files
for f in Migrations/AdventureWorks2016/Output/schema/*.sql; do
  psql -d your_database -f "$f"
done

# Deploy all code files
for f in Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/*.sql; do
  psql -d your_database -f "$f"
done

6️⃣ Validate (Optional)

Check for any conversion issues:

.\scripts\validate_migrated_objects.ps1 `
    -OutputDir "Migrations\AdventureWorks2016\Output\AdventureWorks2016"

Report saved to: Output/AdventureWorks2016/code_validation.json

πŸ—οΈ Architecture

Migration Workflow

SQL Server Instance → Extract → Schema Migration → Code Migration → PostgreSQL
       ↓                 ↓            ↓                  ↓              ↓
  [Live Database]    [Input/]    [01-10.sql]        [11-14.sql]   [Deploy]
   or [DACPAC]

Schema Migration (Hybrid Approach)

  • 90% rule-based for predictable, fast conversion
  • 10% AI assistance for complex edge cases
  • Handles: data types, constraints, indexes, sequences, user-defined types
  • Deterministic and repeatable

Code Migration (AI-Powered)

  • 3-stage pipeline (Draft → Refine → Verify)
  • Uses Azure OpenAI for intelligent conversion
  • Handles complex T-SQL logic patterns

Intelligent Pattern Handling:

  • MERGE statements → INSERT ... ON CONFLICT
  • Cursors → FOR loops or set-based operations
  • SQL Server functions → PostgreSQL equivalents (GETDATE() → CURRENT_TIMESTAMP)
  • Window functions → PostgreSQL window function syntax
  • Temp tables (#temp) → TEMPORARY TABLE
  • Table variables → CTEs or temp tables
  • TRY/CATCH → BEGIN ... EXCEPTION
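As a concrete illustration of the first pattern (table and column names are hypothetical), a T-SQL upsert written with MERGE maps roughly to:

```sql
-- T-SQL original (SQL Server):
--   MERGE dbo.Inventory AS t
--   USING (VALUES (@Id, @Qty)) AS s (Id, Qty) ON t.Id = s.Id
--   WHEN MATCHED THEN UPDATE SET Qty = s.Qty
--   WHEN NOT MATCHED THEN INSERT (Id, Qty) VALUES (s.Id, s.Qty);

-- PostgreSQL equivalent (requires a unique constraint on id):
INSERT INTO inventory (id, qty)
VALUES ($1, $2)
ON CONFLICT (id)
DO UPDATE SET qty = EXCLUDED.qty;
```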

PostgreSQL Extension Detection

Automatically detects and generates extension requirements:

| Extension | Purpose                 | SQL Server Equivalent              |
|-----------|-------------------------|------------------------------------|
| uuid-ossp | UUID generation         | NEWID(), NEWSEQUENTIALID()         |
| pgcrypto  | Cryptographic functions | HASHBYTES(), encryption functions  |
| ltree     | Hierarchical data       | hierarchyid                        |
| postgis   | Spatial types           | geometry, geography                |
| pg_trgm   | Text similarity         | Full-text search functions         |
| tablefunc | Crosstab/pivot          | PIVOT, UNPIVOT                     |
| hstore    | Key-value storage       | Property bags, JSON                |
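For reference, the statements a generated 01_extensions.sql (or 00_code_extensions.sql) would contain take this form; the exact set depends on what the detector finds in your database:

```sql
-- Idempotent extension setup; safe to re-run against an existing database.
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";  -- NEWID()/NEWSEQUENTIALID() replacements
CREATE EXTENSION IF NOT EXISTS pgcrypto;     -- HASHBYTES() and encryption replacements
```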

πŸ“ Directory Structure

Migrations/
└── DatabaseName/
    ├── Input/                          # Extracted SQL Server DDL
    │   └── DatabaseName/
    │       ├── Tables/
    │       │   ├── Tables/             # Table definitions
    │       │   ├── Constraints/        # Constraints (PK, FK, etc.)
    │       │   └── Indexes/            # Index definitions
    │       └── Programmability/
    │           ├── Views/              # View definitions
    │           ├── Functions/          # User-defined functions
    │           ├── StoredProcedures/   # Stored procedures
    │           └── Triggers/           # Trigger definitions
    └── Output/
        ├── schema/                     # Schema migration output
        │   ├── 01_extensions.sql
        │   ├── 02_schemas.sql
        │   ├── 03_sequences.sql
        │   ├── 04_tables.sql
        │   ├── 05_primary_keys.sql
        │   ├── 06_unique_constraints.sql
        │   ├── 07_check_constraints.sql
        │   ├── 08_default_constraints.sql
        │   ├── 09_foreign_keys.sql
        │   └── 10_indexes.sql
        └── DatabaseName/               # Code migration output
            └── Programmability/
                ├── 00_code_extensions.sql
                ├── 11_views.sql
                ├── 12_functions.sql
                ├── 13_procedures.sql
                ├── 14_triggers.sql
                ├── Views/              # Individual files
                ├── Functions/          # Individual files
                ├── StoredProcedures/   # Individual files
                └── Triggers/           # Individual files

βš™οΈ Configuration

Azure OpenAI Setup

Configure different models for each AI stage:

# Draft Stage (initial conversion) - uses GPT-4o
DRAFT_PROVIDER=azure
DRAFT_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
DRAFT_AZURE_OPENAI_API_VERSION=2025-01-01-preview
DRAFT_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_DRAFT=gpt-4o

# Refine Stage (accuracy improvement) - uses GPT-4o
REFINE_PROVIDER=azure
REFINE_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
REFINE_AZURE_OPENAI_API_VERSION=2025-01-01-preview
REFINE_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_REFINE=gpt-4o

# Verify Stage (validation) - can use cheaper model
VERIFY_PROVIDER=azure
VERIFY_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
VERIFY_AZURE_OPENAI_API_VERSION=2025-01-01-preview
VERIFY_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_VERIFY=gpt-4o-mini

Entra ID (Azure AD) Authentication

Leave *_AZURE_OPENAI_KEY blank to use Azure AD authentication:

DRAFT_AZURE_OPENAI_KEY=
# Requires: az login, managed identity, or other DefaultAzureCredential method

SQL Server Connection Options

Windows Authentication:

SQL_INSTANCE=localhost
SQL_SERVER_DATABASE=AdventureWorks2016
SQL_SERVER_USE_WINDOWS_AUTH=true

SQL Server Authentication:

SQL_INSTANCE=localhost
SQL_SERVER_DATABASE=AdventureWorks2016
SQL_SERVER_USERNAME=sa
SQL_SERVER_PASSWORD=YourPassword
SQL_SERVER_USE_WINDOWS_AUTH=false

Connection String (Alternative):

SQL_SERVER_CONNECTION_STRING=Server=localhost;Database=AdventureWorks2016;Integrated Security=True;TrustServerCertificate=True

🎯 Common Scenarios

Scenario 1: Full Database Migration

# 1. Extract from SQL Server
.\scripts\extract_sqlserver_objects.ps1 -Server "localhost" -Database "MyDB"

# 2. Migrate schema
.\scripts\migrate_schema_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\schema"

# 3. Migrate code
.\scripts\migrate_code_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\MyDB"

Scenario 2: Schema-Only Migration

.\scripts\extract_sqlserver_objects.ps1 -Server "localhost" -Database "MyDB"
.\scripts\migrate_schema_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\schema"

Scenario 3: Code-Only Migration

.\scripts\extract_sqlserver_objects.ps1 -Server "localhost" -Database "MyDB"
.\scripts\migrate_code_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\MyDB"

Scenario 4: Using Stored Credentials

# Configure .env file with connection details
# Then run without parameters:
.\scripts\extract_database.ps1

🔧 Troubleshooting

SQL Server Connection Issues

Problem: "Unable to connect to SQL Server"

Solutions:

  • Verify SQL Server is running: sqlcmd -S localhost -Q "SELECT @@VERSION"
  • Check firewall settings (default port 1433)
  • Ensure SQL Server authentication is enabled (mixed mode)
  • For Windows Auth, ensure your Windows user has access
  • Test connection with SSMS first
  • Check connection string format

Common Error Messages:

Login failed for user 'sa'
→ Check username/password in .env

Named Pipes Provider: Could not open a connection
→ Verify SQL Server service is running

A network-related or instance-specific error
→ Check server name and firewall

Azure OpenAI Errors

Problem: "API key invalid" or "Deployment not found"

Solutions:

  • Verify endpoint URL in .env (should end with .openai.azure.com)
  • Check API key is correct and not expired
  • Ensure deployment name exactly matches your Azure OpenAI resource
  • Verify API version is supported (2025-01-01-preview recommended)
  • Check Azure subscription has available quota

Check your configuration:

# Test Azure OpenAI connectivity (chat completions is a POST with a JSON body)
curl -X POST "YOUR_ENDPOINT/openai/deployments/YOUR_DEPLOYMENT/chat/completions?api-version=2025-01-01-preview" \
  -H "api-key: YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"ping"}]}'

Script Execution Issues

Problem: Script hangs or produces no output

Solutions:

  • Let scripts run to completion before inspecting results; interrupting the terminal can abort in-flight work
  • Check the output directory for log files
  • Increase the -MaxParallel parameter if throughput is too low
  • Decrease -MaxParallel if you are hitting API throttling

Migration Quality Issues

Problem: Converted code has errors

Solutions:

  • Review validation report from validate_migrated_objects.ps1
  • Check Programmability/ individual files for specific issues
  • Verify required extensions are installed in PostgreSQL
  • Test converted SQL in PostgreSQL manually
  • Check for SQL Server-specific features that need manual conversion

⚠️ Known Limitations

This tool focuses on schema and code migration only. The following are intentionally out of scope:

| Feature               | Status           | Alternative                                        |
|-----------------------|------------------|----------------------------------------------------|
| Data migration        | ❌ Out of scope  | Use bcp/COPY exports, ETL tools, or custom scripts |
| CLR assemblies        | ❌ Not supported | Rewrite in PL/pgSQL or an external service         |
| SQL Server Agent jobs | ❌ Not supported | Use the pg_cron extension or an external scheduler |
| Service Broker        | ❌ Not supported | Use message queues (RabbitMQ, Kafka)               |
| Linked servers        | ❌ Not supported | Use Foreign Data Wrappers (FDW)                    |
| Full-text search      | ⚠️ Limited       | Use PostgreSQL FTS or Elasticsearch                |
| Replication           | ❌ Not supported | Configure PostgreSQL replication separately        |
| SSRS/SSIS/SSAS        | ❌ Not supported | Requires separate BI tool migration                |
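As an example of the Agent-job alternative, a nightly job can be re-created with pg_cron roughly as follows (the job name, schedule, and procedure are illustrative, and pg_cron must be installed and preloaded on the server):

```sql
CREATE EXTENSION IF NOT EXISTS pg_cron;
-- Run a maintenance procedure every night at 02:00 server time.
SELECT cron.schedule('nightly-maintenance', '0 2 * * *',
                     $$CALL maintenance.run_nightly()$$);
```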

🤝 Contributing

Contributions are welcome! To contribute:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Test thoroughly
  5. Commit with clear messages (git commit -m 'Add amazing feature')
  6. Push to your branch (git push origin feature/amazing-feature)
  7. Open a Pull Request

Areas for contribution:

  • Additional SQL Server → PostgreSQL conversion patterns
  • Support for more data types and functions
  • Performance improvements
  • Documentation enhancements
  • Bug fixes

πŸ“ License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Support

For issues, questions, or feature requests, please open an issue on the GitHub repository.

πŸ™ Acknowledgments

  • Built with Azure OpenAI
  • Uses SQL Server SMO for extraction
  • Inspired by real-world SQL Server → PostgreSQL migration challenges
