ERP Data Migration Best Practices: A Complete Planning Guide [2026]

Data migration is the highest-risk and most time-consuming phase of an ERP project, typically consuming 40–60% of the total timeline. Success depends on early data auditing, disciplined cleansing, detailed mapping rules, multiple test migration runs, and a well-executed cutover strategy with comprehensive validation checkpoints.

Last updated: March 15, 2026
Quick Reference
  • Data migration share of timeline: 40–60%
  • Data audit & cleaning: 4–8 weeks
  • Typical data to migrate: 3–5 years of history
  • Typical issues found: 20–40% of records have quality issues
  • Test migration runs: 2–3 before production cutover
  • Critical validations: GL balance tie, transaction counts, master file completeness
  • Cutover downtime (typical): 4–8 hours
  • Parallel run duration: 2–8 weeks
  • Most common issue: duplicate masters (customers, vendors, items)
  • Cost to fix issues post-go-live: 3–5x the cost of fixing during the project

Why Data Migration Is the #1 Risk Factor

If your ERP migration fails, it is usually a data problem, not a technology problem. Failed migrations are rarely caused by the ERP platform itself; they are caused by:

  • Unexpected data quality issues discovered too late to remediate
  • Incorrect mapping rules that create GL imbalances or transaction errors
  • Incomplete data extraction from the source system
  • Failed validation checks that go unnoticed until post-go-live
  • Master data duplicates that break integrations or create reporting errors

The stakes are high: data errors post-go-live are expensive to fix ($20K-$50K in rework) and erode user confidence. Getting data right during the project is the highest ROI activity you can do.

Pre-Migration Data Assessment: The Audit Phase

What to Assess

Before building migration templates or writing transformation rules, conduct a thorough data audit:

1. Data Volume and Scope

  • Total record count by entity (customers, vendors, items, GL accounts, transactions)
  • Historical data span (oldest transaction date; how far back to keep)
  • Annual transaction growth rate (to predict future volume)
  • Database size and table fragmentation

2. Data Quality Issues

  • Duplicates: Count duplicate customer/vendor records (same name, different IDs), duplicate items, duplicate GL accounts
  • Obsolete records: Inactive customers (no transactions in 2+ years), discontinued items still marked active, closed GL accounts with residual balances
  • Orphaned records: Sales orders with no customer, purchase orders with no vendor, inventory transactions with no item master
  • Missing required fields: Customers without tax ID, vendors without payment terms, items without unit of measure
  • Invalid data: GL account balances that don't reconcile, negative inventory quantities, dates out of sequence

3. Data Structure Complexity

  • GL account structure (number of segments, hierarchies, budget vs. actual accounts)
  • Multi-entity consolidation logic
  • Complex customer/vendor hierarchies or parent-subsidiary relationships
  • Custom data fields (are there extended tables or custom columns that need mapping?)
  • Integration points (what external systems feed data into your ERP?)

4. Data Integration Dependencies

  • List all systems that integrate with your ERP (payroll, HR, e-commerce, CRM, accounting, banking)
  • Identify master data that must stay synchronized (e.g., customer in ERP + CRM)
  • Identify transactional data that flows in/out (e.g., payroll to GL, orders from e-commerce)

How to Conduct the Audit

Step 1: SQL queries to profile data (Week 1)

  • Run summary queries to count records by type, find duplicates, identify data gaps
  • Example: SELECT CustomerName, COUNT(*) FROM Customers GROUP BY CustomerName HAVING COUNT(*) > 1 (to find duplicate names)
  • Generate exception reports (e.g., customers with missing tax IDs)
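
The profiling step above can be sketched end to end. This is a minimal illustration using Python's built-in sqlite3 module against an in-memory sample; the Customers table and its columns are hypothetical stand-ins for your source schema:

```python
import sqlite3

# Build a tiny sample source table (illustrative names, not a real schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customers (CustomerID INTEGER, CustomerName TEXT)")
conn.executemany(
    "INSERT INTO Customers VALUES (?, ?)",
    [(1, "Acme Corp"), (2, "Acme Corp"), (3, "Globex"), (4, "Initech")],
)

# Same duplicate-name pattern as the query above: group by name,
# keep only groups with more than one record.
duplicates = conn.execute(
    """
    SELECT CustomerName, COUNT(*) AS occurrences
    FROM Customers
    GROUP BY CustomerName
    HAVING COUNT(*) > 1
    """
).fetchall()

print(duplicates)  # -> [('Acme Corp', 2)]
```

The same GROUP BY / HAVING pattern works for vendor names, item numbers, and GL accounts; only the table and column names change.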

Step 2: Review findings with business stakeholders (Week 1-2)

  • Present data profile to finance, operations, IT stakeholders
  • Ask: Which obsolete records should we retire? Which duplicates should we consolidate?
  • Identify master data that must be cleaned before migration

Step 3: Plan data remediation (Week 2-3)

  • Create remediation plan with owners and timelines
  • Priority 1 (required for migration): duplicates, orphaned records, invalid GL account balances
  • Priority 2 (nice to have): standardize formatting, fill missing optional fields

Step 4: Execute remediation (Week 3-6)

  • Consolidate duplicate customers/vendors into single master
  • Retire obsolete records or archive to separate table
  • Correct GL account hierarchies and balances
  • Validate fixes; repeat audits to confirm remediation

Data Mapping: From Source to Target

Mapping Strategy

Data mapping is the translation rule that defines how source system data becomes target system data. Each field in the target system needs a mapping rule.

Types of mappings:

  • Direct 1:1 mapping: Source field maps directly to target field (e.g., Customer Name → Customer Name). Simple, low-risk.
  • Calculated/transformed mapping: Source data is transformed via calculation or logic (e.g., Customer Type in source must be translated to a dimension code in target). Medium complexity.
  • Consolidated mapping: Multiple source records consolidate into one target record (e.g., three duplicate customer records → one customer in target). Higher complexity.
  • Lookup/reference mapping: Source field is looked up against a reference table to find corresponding target value (e.g., source GL account "1000" is looked up against account mapping table to find target account "1-1000"). Common; needs reference tables built upfront.
  • Conditional mapping: Mapping rule depends on data conditions (e.g., if source customer type = "Distributor", then target dimension = "DISTxxx"; if type = "Retail", then dimension = "RETxxx"). Complex; high error risk.
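
As a rough sketch of the lookup and conditional mapping types, assuming the dimension codes from the examples above (DIST, RETAIL, MISC are illustrative, not a real API):

```python
# Hypothetical lookup/reference mapping table: source customer type
# resolves to a target dimension code.
CUSTOMER_TYPE_TO_DIMENSION = {
    "Distributor": "DIST",
    "Retail": "RETAIL",
}

def map_customer_dimension(source_type: str) -> str:
    # Conditional mapping: known types resolve via the reference table;
    # anything unmapped falls through to a documented catch-all code.
    return CUSTOMER_TYPE_TO_DIMENSION.get(source_type, "MISC")

print(map_customer_dimension("Distributor"))  # -> DIST
print(map_customer_dimension("Wholesale"))    # -> MISC
```

The key discipline is that the fallback branch is an explicit, reviewed rule, not an accident; unmapped values should also be logged so the reference table can be completed before cutover.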

Building the Mapping Specification

Create a detailed mapping specification document before writing migration code:

Each row records the source field, target field, mapping type, transformation logic, owner, and status:

  • CUSTOMERS.CustomerID → Customer.No (Direct 1:1). No transformation; use source ID as-is. Owner: Finance. Status: Complete.
  • CUSTOMERS.CustomerName → Customer.Name (Direct 1:1). Trim whitespace; validate length < 100 chars. Owner: Finance. Status: Complete.
  • CUSTOMERS.CustType → Customer.Dimension1 (Customer Group) (Lookup). Ref table: GP Type → D365 Dimension (Distributor → DIST, Retail → RETAIL, Other → MISC). Owner: Finance. Status: Pending ref table.
  • GL.AccountNo + GL.DeptNo → GL.No (+ dimension) (Consolidated). Concatenate source GL account + dept; map via account mapping table to target GL account; store dept in dimension. Owner: Accounting. Status: In progress.

Critical Mapping Areas

1. Chart of Accounts (GL)

The most critical mapping. Errors here cascade to all financial reporting and consolidated statements.

  • Source: Review every GP GL account; determine if it should map to BC target GL account or be consolidated
  • Mapping rules: Create account mapping table (GP account → BC account). Include GL account hierarchy and dimension assignments.
  • Opening balances: For each mapped GL account, define the opening balance as of migration date (sum of all activity YTD + opening balance from prior year)
  • Validation: Run control totals before and after mapping; GL trial balance must match
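
The control-total validation can be as simple as comparing debit and credit totals before and after mapping. A minimal sketch with a hypothetical account mapping table and sample rows:

```python
# Hypothetical source-to-target account mapping table.
ACCOUNT_MAP = {"1000": "1-1000", "4000": "4-1000"}

# Sample source GL rows: (source account, debit, credit).
source_rows = [
    ("1000", 500.0, 0.0),
    ("4000", 0.0, 500.0),
]

def control_totals(rows):
    """Sum of debits and sum of credits across a set of GL rows."""
    debits = sum(r[1] for r in rows)
    credits = sum(r[2] for r in rows)
    return debits, credits

# Apply the mapping, then verify totals are unchanged and still balance.
mapped_rows = [(ACCOUNT_MAP[acct], dr, cr) for acct, dr, cr in source_rows]

assert control_totals(source_rows) == control_totals(mapped_rows)
debits, credits = control_totals(mapped_rows)
assert debits == credits  # trial balance must tie; abort the load if not
```

In a real migration the same check runs over the full extract, and a failure stops the run rather than raising an assertion.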

2. Customer and Vendor Masters

  • Deduplication: Consolidate duplicate records during mapping (choose primary record, retire duplicates)
  • Customer groups / segmentation: Map GP customer type or class to BC dimension (Customer Group)
  • Contact information: Standardize phone/email formats; decide which contact info to keep vs. retire
  • Payment terms, shipping methods, price groups: Map or reassign during migration
  • Credit limits, payment history: Carry forward or reset?

3. Item/Inventory Masters

  • Deduplication: Consolidate duplicate item records
  • Item categories: Map GP item classes to BC item categories and subcategories
  • Valuation methods: If source uses LIFO and target uses FIFO, define conversion approach
  • Unit of measure: Standardize (every item must have base UOM; alternate UOMs are optional)
  • Standard costs: Carry forward, recalculate, or reset to zero?
  • Opening balances: Define inventory qty by location as of migration date

4. Open Transactions (AR, AP, Inventory)

  • AR invoices: Migrate open invoices with customer, amount, due date, aging info
  • AP invoices: Migrate open purchase invoices
  • Payments received/paid: Decide how far back to migrate; most organizations migrate last 12-24 months of detail, archive older transactions
  • On-hold transactions: Preserve hold status or clear holds during migration?

Data Cleansing Strategy

The 80/20 Rule

Don't try to achieve 100% perfect data. The cost of perfection is disproportionate. Most migration projects aim for:

  • 80% cleanliness in data provided by the source system (remove obvious duplicates and obsolete records)
  • 95% completeness for required fields (master data must have all mandatory fields)
  • 100% GL reconciliation (GL accounts must balance perfectly; this is non-negotiable)

Cleansing Workflow

Phase 1: Master Data Cleansing (Weeks 1-4)

Step 1: Identify and consolidate duplicates

  • Run fuzzy match logic to find similar names (e.g., "Acme Corp", "ACME CORP", "Acme Corpor")
  • For each duplicate set, designate primary record and merge others
  • Reassign transactions from duplicate records to primary record
  • Retire (soft-delete or archive) duplicate records
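
The fuzzy-match step can be prototyped with the standard library before investing in a dedicated tool. This sketch uses difflib.SequenceMatcher; the 0.85 threshold is an illustrative starting point, and every candidate pair should still be reviewed by a person:

```python
from difflib import SequenceMatcher
from itertools import combinations

# Sample master names, including the near-duplicates from the example above.
names = ["Acme Corp", "ACME CORP", "Acme Corpor", "Globex Inc"]

def similarity(a: str, b: str) -> float:
    # Case-insensitive similarity ratio in [0.0, 1.0].
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Flag every pair of names above the threshold for manual review.
candidate_duplicates = [
    (a, b) for a, b in combinations(names, 2) if similarity(a, b) >= 0.85
]
print(candidate_duplicates)
```

Pairwise comparison is O(n²), which is fine for a few thousand masters; for very large files, block by the first few characters or by postal code before comparing.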

Step 2: Retire obsolete records

  • Identify customers with no transactions in 2+ years; mark as inactive
  • Identify discontinued items; mark as discontinued
  • Move to archive or simply mark inactive (don't delete)

Step 3: Standardize and validate key fields

  • Email addresses: Validate format; remove invalid entries
  • Phone numbers: Standardize format; remove invalid entries
  • Tax IDs: Validate format (if country-specific validation available)
  • Addresses: Validate postal code format; fill missing state/country

Phase 2: Transactional Data Preparation (Weeks 3-6)

Step 1: Determine cutoff date and scope

  • Decide: migrate 3 years of history, 5 years, or all history?
  • Older transactions (beyond 3-5 years) are often archived separately for compliance
  • Define opening date: all transactions before this date are summarized into opening balances; transactions on or after are detailed records

Step 2: Extract and validate

  • Extract all transactions for the selected date range
  • Validate that source GL is balanced (trial balance)
  • Validate that detailed transactions sum to GL balances
  • Validate AR aging (customer balance = sum of invoice balances)
  • Validate AP aging (vendor balance = sum of invoice balances)
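
The AR aging check (customer balance equals the sum of that customer's open invoice balances) can be expressed as a small exception report. A sketch with sample data; in practice the inputs would come from your source extracts, and the same logic applies to AP aging by vendor:

```python
from collections import defaultdict

# Sample master balances and open invoices: (customer, invoice no, amount).
customer_balances = {"C001": 1500.0, "C002": 250.0}
open_invoices = [
    ("C001", "INV-100", 1000.0),
    ("C001", "INV-101", 500.0),
    ("C002", "INV-102", 250.0),
]

def ar_aging_exceptions(balances, invoices, tolerance=0.005):
    """Return customers whose master balance does not tie to invoice detail."""
    totals = defaultdict(float)
    for customer, _invoice_no, amount in invoices:
        totals[customer] += amount
    return [
        c
        for c in sorted(set(balances) | set(totals))
        if abs(balances.get(c, 0.0) - totals.get(c, 0.0)) > tolerance
    ]

print(ar_aging_exceptions(customer_balances, open_invoices))  # -> []
```

An empty exception list means the aging ties; any entries go back to remediation before the next run.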

Step 3: Identify and remediate exceptions

  • If GL is not balanced, investigate and fix source issues
  • If AR aging doesn't reconcile, adjust or clear exceptions
  • If transactions reference deleted masters, decide: delete transaction or reassign to retained master?

Migration Testing Strategy

The Three Migration Runs

All successful migrations perform at least three runs: a practice run, a UAT run, and a production run.

Run 1: Practice Migration (Week 6-7)

  • Environment: Development or test environment
  • Data scope: Full data set or large subset (to test at volume)
  • Objective: Test mapping logic, identify data exceptions, refine transformation rules
  • Validation: GL trial balance, transaction counts, master data completeness
  • Outcome: List of issues, fixes, and refinements for next run
  • Typical issues found: 30-50 data exceptions, 5-10 mapping refinements, 2-5 script fixes

Run 2: UAT Migration (Week 9-10)

  • Environment: Test/UAT environment that users can access
  • Data scope: Full production data (most recent snapshot as of 1-2 weeks before UAT cutover)
  • Objective: User validation that migrated data looks correct and complete
  • Validation: Power user review of customer balances, vendor balances, item quantities, GL accounts
  • Outcome: User sign-off that data is acceptable, or list of issues to remediate
  • Typical issues found: 5-15 business logic questions ("Why is this customer balance different?"), 2-5 actual data errors

Run 3: Production Migration (Week 15-16, Cutover Week)

  • Environment: Production target environment
  • Data scope: Most recent snapshot, typically 1-2 days before go-live
  • Objective: Final data load before users access system
  • Validation: Critical control totals (GL trial balance, AR aging total, AP aging total, inventory quantities)
  • Outcome: Data signed off by CFO/Finance Manager; system ready for user access
  • Typical issues found: 0-3 (usually edge cases or data that entered system between previous extract and production cutover)

Validation Checklist for Each Run

  • GL Trial Balance (total debits = total credits). Owner: Accounting Manager. Target: 100% match to source system. If it fails: investigate GL mappings; do not proceed until balanced.
  • GL Reconciliation (detail transactions reconcile to GL balances). Owner: Accounting Manager. Target: 100% match. If it fails: review transaction mappings; identify missing/duplicate transactions.
  • Master Record Count (customers, vendors, items). Owner: Operations Manager. Target: within 1–2% of expected (accounts for consolidations). If it fails: verify consolidation logic is correct.
  • Transaction Count (invoices, POs, receipts). Owner: Finance Manager. Target: 100% or documented exclusions. If it fails: identify missing transactions; verify age cutoffs are applied correctly.
  • AR Aging (customer balance = sum of open invoices). Owner: AR Supervisor. Target: 100% match by customer. If it fails: investigate mismatches; correct invoice mappings or customer assignments.
  • AP Aging (vendor balance = sum of open invoices). Owner: AP Supervisor. Target: 100% match by vendor. If it fails: investigate mismatches; correct invoice mappings or vendor assignments.
  • Inventory Quantity (item qty on hand by location). Owner: Inventory Manager. Target: within 0.5% of expected (accounts for rounding). If it fails: physical count or cycle count to validate; adjust if needed.
  • Fixed Assets (asset cost basis and accumulated depreciation). Owner: Accounting Manager. Target: 100% match in aggregate. If it fails: validate depreciation calculation logic; investigate variances.
  • Date Range (oldest transaction, newest transaction). Owner: Project Manager. Target: match source system expectations. If it fails: verify age cutoff logic and opening date are applied correctly.
  • Required Field Completeness (no null values in required fields). Owner: IT/Data Manager. Target: 100%. If it fails: remediate missing values or default them appropriately.
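
The last check, required-field completeness, lends itself to a simple rule-driven sweep. A sketch with hypothetical entity and field names; adapt the rules dictionary to your own migration templates:

```python
# Hypothetical required-field rules per entity.
REQUIRED_FIELDS = {"Customer": ["No", "Name", "PaymentTerms"]}

def missing_required(entity, records):
    """Return (record index, field name) pairs where a required field is empty."""
    gaps = []
    for i, record in enumerate(records):
        for field in REQUIRED_FIELDS[entity]:
            if not record.get(field):
                gaps.append((i, field))
    return gaps

# Sample extract rows: the second customer is missing payment terms.
customers = [
    {"No": "C001", "Name": "Acme Corp", "PaymentTerms": "NET30"},
    {"No": "C002", "Name": "Globex", "PaymentTerms": ""},
]
print(missing_required("Customer", customers))  # -> [(1, 'PaymentTerms')]
```

Running this per entity after each migration run turns the "100% completeness" target into a concrete pass/fail report.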

The Cutover Process: From Source to Target

Cutover Timeline (Typical Day-1 Process)

Day 0 (Friday evening, before cutover)

  • Final backup of source ERP system
  • Notify users that system will be down starting at 5 PM
  • Begin final data extract from production source system

Day 1 (Cutover Day, Saturday or Sunday)

  • 08:00 AM: Complete final data extract from source system
  • 08:30 AM: Disable all users in source system (prevent new transactions)
  • 09:00 AM: Begin data transformation (map source data to target using final extraction)
  • 10:00 AM: Load transformed data into production target environment (BC/F&O)
  • 10:30 AM: Run validation queries (GL balance, transaction counts, master record counts)
  • 11:00 AM: Investigate any validation failures; fix and retry (or rollback if critical issue)
  • 12:00 PM: Sign-off: CFO or Finance Manager confirms critical validations pass
  • 12:30 PM: Enable users in target system; activate business processes in D365
  • 01:00 PM: First users access target system; monitor for issues
  • 01:00–05:00 PM: Support team on-site or virtual; troubleshoot user issues
  • 05:00 PM onward: Continue monitoring; escalate critical issues

Rollback Planning

When would you rollback?

  • GL is not balanced in target system (indicates data corruption)
  • Critical data is missing (e.g., all customer records failed to load)
  • System is unstable or crashing immediately post-load
  • Rare in practice: Most issues discovered in Run 2 (UAT); rollbacks are uncommon if project executed well

Rollback process:

  • Revert users to source system (restore backup from Day 0)
  • Analyze failure; root cause analysis
  • Remediate issue (typically a data mapping or transformation problem)
  • Re-execute cutover 1-2 days later with corrected data/process

Cost of rollback: Significant. A rollback typically costs $15K-$40K in additional consulting labor, extended downtime, user confusion, and reputational damage. This is why thorough testing (Run 2) is critical—it prevents rollbacks.

Parallel Run Strategy (Post-Go-Live Validation)

What Is a Parallel Run?

A parallel run means running both the old system (source ERP) and new system (target ERP) simultaneously for a period (typically 2-8 weeks post-go-live). Users perform transactions in the new system; finance team reconciles new system transactions back to old system to ensure nothing was lost or corrupted during migration.

Typical Parallel Run Schedule

  • Weeks 1–2: All business transactions in the new system only. Daily validation: compare opening balances, YTD GL activity, and AR/AP aging in the new vs. old system; investigate variances. Owner: Finance team.
  • Weeks 3–4: Continue in the new system. Monthly close simulation: run a practice month-end close in the new system and reconcile it to the old system's most recent month-end close. Owner: Finance team.
  • Weeks 5–8 (optional): If everything is clean, end the parallel run. If issues are detected, extend the parallel run or run a second accounting cycle in parallel. Once users are confident and no significant issues remain, formally retire the old system. Owner: CFO/Finance Manager.

Critical Validations During Parallel Run

  • Daily GL activity: post-migration GL should match yesterday's GL (no unexplained changes)
  • Daily AR aging: customer balances in new system should match old system as of cutover date, adjusted for post-cutover transactions
  • Daily AP aging: vendor balances same logic
  • Weekly bank reconciliation: post-migration bank account balances should tie to bank (both systems should reconcile)
  • Month-end close: at month end, close both systems and compare: GL balances, AR/AP aging, profit & loss, balance sheet
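
The daily GL comparison boils down to diffing per-account balances between the two systems. A minimal sketch with sample balances; a one-cent tolerance is an illustrative choice for absorbing rounding noise:

```python
# Sample per-account balances from each system (illustrative data).
old_system_gl = {"1-1000": 12000.0, "4-1000": -8000.0}
new_system_gl = {"1-1000": 12000.0, "4-1000": -8000.0}

def gl_variances(old, new, tolerance=0.01):
    """Return {account: (old balance, new balance)} for unexplained variances."""
    accounts = set(old) | set(new)
    return {
        a: (old.get(a, 0.0), new.get(a, 0.0))
        for a in accounts
        if abs(old.get(a, 0.0) - new.get(a, 0.0)) > tolerance
    }

print(gl_variances(old_system_gl, new_system_gl))  # -> {} on a clean day
```

An empty result is a clean day; any entries become the finance team's investigation list, adjusted for known post-cutover activity.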

Data Migration Tools and Approaches

Microsoft Data Migration Toolkit (DMT)

  • Free Excel-based templates for common entities (GL, customers, vendors, items)
  • Limited flexibility; works for simple, straightforward migrations
  • Common approach: use for master data (customers, vendors, items); use custom SQL scripts for transactional data

SQL-Based ETL Scripts

  • Custom SQL scripts to extract from source, transform, and load to target
  • High flexibility; works for complex transformations
  • Requires SQL expertise and careful quality control
  • Common approach for larger migrations with complex data structures

Third-Party Migration Tools

  • Tools like Scribe, Kingfisher, Celigo specialize in ERP data migration
  • Pre-built adapters for common source/target combinations (GP → BC, NAV → BC, AX → F&O)
  • Faster setup; reduced custom development
  • Cost: $10K-$50K depending on tool and complexity

Partner-Built Migration Accelerators

  • Many implementation partners have proprietary migration accelerators for common paths
  • Microsoft partners often have IP around GP → BC migrations
  • Can significantly reduce migration timeline and cost

Data Migration Checklist

  • ☐ Conduct data audit: identify duplicates, quality issues, obsolete records
  • ☐ Create data remediation plan with timelines and owners
  • ☐ Build GL account mapping table; validate opening balances
  • ☐ Build customer/vendor mapping and deduplication logic
  • ☐ Build item master mapping and consolidation rules
  • ☐ Define data cutoff date and historical scope (how many years to migrate?)
  • ☐ Create migration mapping specification document with all field-level rules
  • ☐ Build migration scripts (SQL, Excel, or third-party tool)
  • ☐ Test scripts in development environment with sample data
  • ☐ Execute Practice Migration (Run 1): full data set, identify exceptions
  • ☐ Fix exceptions and refine scripts based on Run 1 findings
  • ☐ Execute UAT Migration (Run 2): full data set, power user validation
  • ☐ Document UAT results; obtain user sign-off on data quality
  • ☐ Execute Production Migration (Run 3): final cutover with pre-cutover data extract
  • ☐ Validate GL trial balance, transaction counts, master record counts
  • ☐ Obtain CFO sign-off on data; enable users
  • ☐ Execute parallel run (2-8 weeks): daily validations, weekly reconciliations, month-end close simulation
  • ☐ Document parallel run reconciliation; formally retire source system


Data Migration Effort by Source System and Scope

Each scenario below lists the complexity or risk factor and its impact on timeline and cost:

  • Small organization (25 users), clean data, 3 years of history. Low complexity: 4–6 weeks data migration; $5K–$10K effort.
  • Mid-market (200 users), moderate data quality issues, 5 years of history. Medium complexity: 6–10 weeks data migration; $15K–$30K effort.
  • Large organization (1,000+ users), complex GL, heavy customization, 10 years of history. High complexity: 12–16 weeks data migration; $50K–$100K+ effort.
  • Data with 30%+ duplicate/obsolete records requiring consolidation. High risk: add 3–5 weeks for remediation; add $10K–$20K effort.
  • Multi-entity, multi-currency environment with intercompany transactions. Medium-high complexity: add 2–4 weeks for GL restructuring; add $5K–$15K effort.
  • Integration dependencies (payroll, CRM, external systems). Medium risk: add 2–3 weeks for integration testing; add $5K–$15K effort.

Frequently Asked Questions

How many years of historical data should we migrate?

Typical approach: migrate 3–5 years of transactional detail, archive older transactions separately. Older transactions are rarely needed for day-to-day operations but are important for audit trail and historical reporting. Migrating 10+ years of detail increases migration complexity and cost significantly. Most organizations migrate 5 years of GL detail, 3–5 years of AR/AP invoices, 2–3 years of inventory transactions.

What should we do if we discover data quality issues during migration?

Stop and remediate. Don't proceed with migration if critical issues are discovered (e.g., GL out of balance, missing required fields). Data problems discovered during migration are cheaper to fix than post-go-live. This is why multiple test runs are essential—they flush out issues before you are locked into a cutover date. Issues in Run 1 are easy to fix; issues in Run 2 delay UAT; issues in Run 3 cause rollback (expensive).

How long should the parallel run last?

Ideally 2–4 weeks. Extended parallel runs (beyond 8 weeks) are expensive and create user confusion (which system is the "real" one?). The goal is to build user confidence in the new system quickly. If the parallel run extends beyond 4 weeks, escalate; it usually indicates unresolved issues. Some organizations do 2–3 accounting cycles in parallel (if cycles are monthly, that's 2–3 months); this is acceptable if needed for quarterly close validation.

What if the GL doesn't balance in the target system after migration?

This is a critical issue; do not proceed. The GL imbalance indicates a data corruption or mapping error. Investigate: (1) were all GL transactions migrated? (2) are debits and credits properly tagged in the mapping? (3) are subsidiary ledgers (AR, AP, Inventory) properly posting to GL control accounts? Root cause and fix, then re-run migration. This is where thorough testing prevents go-live disasters.

How do we handle differences in currency or decimal precision between systems?

Ensure both systems are configured for the same currency and decimal precision before migration. If the source system uses 4 decimal places and the target uses 2, you will have rounding differences. Decide upfront: truncate, round, or adjust? Document the rule and apply it consistently. Test with sample transactions to validate that the rounding rule works as expected.
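
For example, one documented rule might be "round half-up from 4 decimals to 2". A sketch using Python's decimal module; the specific rounding mode here is an illustrative policy choice, not a recommendation:

```python
from decimal import Decimal, ROUND_HALF_UP

def to_target_precision(amount: str) -> Decimal:
    """Apply the documented rule: round half-up to 2 decimal places."""
    return Decimal(amount).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

print(to_target_precision("19.9950"))  # -> 20.00
print(to_target_precision("19.9949"))  # -> 19.99
```

Using Decimal (rather than binary floats) keeps the rule exact and reproducible, which matters when validators re-run the same conversion on both sides of the migration.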

Should we consolidate duplicate customers and vendors before or after migration?

Before, during the migration project. Consolidating post-migration creates customer/vendor records with fragmented history (some transactions in one record, some in another) and breaks reporting. Consolidate before migration: choose the primary record, move all transactions to the primary, retire the duplicate. This requires data remediation upfront (outside the critical path) but results in clean data in the target system.

How do we map a segmented GL account structure to dimensions?

Source systems often use hierarchical GL account numbers (e.g., "1-1100-001-01" = Revenue by Department by Product by Region). Target systems often use dimensions instead. Create a mapping table that translates the source GL segment structure to a target GL account plus dimension values. This requires analysis of your GL structure but usually results in a simpler, more analytical GL in the target.
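
A sketch of such a translation, reading the four segments of the example number as natural account, department, product, and region (an assumption based on the gloss above; the target account code is invented):

```python
# Hypothetical mapping from source natural-account segment to target account.
NATURAL_ACCOUNT_MAP = {"1": "40000"}

def split_segmented_account(source_no: str) -> dict:
    """Translate a segmented source GL number into account + dimensions."""
    account, dept, product, region = source_no.split("-")
    return {
        "target_account": NATURAL_ACCOUNT_MAP[account],
        "dimensions": {"DEPARTMENT": dept, "PRODUCT": product, "REGION": region},
    }

print(split_segmented_account("1-1100-001-01"))
# -> {'target_account': '40000',
#     'dimensions': {'DEPARTMENT': '1100', 'PRODUCT': '001', 'REGION': '01'}}
```

The real mapping table would cover every natural account and validate that each segment value exists as a dimension value in the target before loading.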

What is the most common data migration mistake?

Not auditing data before starting mapping. Teams often jump to mapping and scripting without understanding what's in the source system. This results in surprise discoveries (massive duplicate records, corrupt data, missing fields) late in the project. Spend Weeks 1–2 of the project auditing data; it will save you 2–4 weeks of rework downstream.

How do we validate that the migration was accurate?

Use the three-run approach (Practice, UAT, Production) and the validation checklist. Critical controls: (1) GL trial balance must be 100% accurate, (2) transaction counts must match the source system, (3) AR/AP aging must reconcile by customer/vendor, (4) inventory quantities must match counts, (5) all required fields must be populated. Build reports or queries that validate each of these. Have power users review sample records (customers, vendors, invoices) to validate that the data makes sense.

Should we keep the source system running after go-live?

Yes, for at least 6–12 months. Keep the source system accessible (read-only) for historical reference, audit trail, and "what did we do in the old system?" lookups. Formal retirement happens once users are confident in the target system and all audit/compliance requirements are met. The total cost of running the source system in read-only mode is minimal (~$5K/year for hosting/support).
