Data
Related Goals and Activities by Phase
Even a sound ERP implementation will fail if it relies on inaccurate or incomplete data; "garbage in, garbage out" applies in full. The Data Pillar therefore demands a disciplined approach spanning governance, quality, migration, and integration.
Explore & Commit
Goals
- Assess the high-level state of current data.
- Identify major data-related business risks.
- Evaluate the partner's data migration methodology.
- Develop the high-level data migration strategy.
- Establish the initial Data Governance framework.
Activities
Data considerations start with a high-level assessment and strategy. The aim is to understand the scope of the data challenges to guide planning and budgeting; detailed cleansing will come later.
The team surveys the legacy data landscape, identifying the key data domains (Customer, Product, Vendor), their systems of record, and the major quality issues in each.
Data migration is a risky activity. The implementation partner’s methods, tools, and experience with complex data conversions must be thoroughly examined. The phase ends with defining the overall migration approach and setting up basic data governance, which includes forming a committee and designating data owners.
Plan & Activate
Goals
- Finalize the detailed data migration plan.
- Conduct deep data profiling of legacy systems.
- Define data standards and cleansing rules.
- Develop the data cleansing strategy.
Activities
The high-level strategy turns into a detailed, actionable plan, incorporating specific tasks, timelines, and plans for several mock conversion cycles.
Data profiling tools analyze legacy data sources deeply. This accurately identifies and quantifies data quality issues, like duplicates or missing fields.
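The kind of check a profiling pass performs can be sketched in a few lines. This is a minimal illustration, not any specific tool's logic; the record layout and field names are invented for the example.

```python
from collections import Counter

# Hypothetical legacy customer extract; field names are illustrative only.
legacy_customers = [
    {"id": "C001", "name": "Acme Corp", "tax_id": "12-345"},
    {"id": "C002", "name": "Acme Corp", "tax_id": ""},       # duplicate name, missing tax_id
    {"id": "C003", "name": "Globex",    "tax_id": "98-765"},
]

def profile(records, key_field, required_fields):
    """Quantify duplicates on key_field and missing values per required field."""
    counts = Counter(r[key_field] for r in records)
    duplicates = {k: n for k, n in counts.items() if n > 1}
    missing = {f: sum(1 for r in records if not r.get(f)) for f in required_fields}
    return {"duplicates": duplicates, "missing": missing}

report = profile(legacy_customers, "name", ["tax_id"])
print(report)  # {'duplicates': {'Acme Corp': 2}, 'missing': {'tax_id': 1}}
```

Quantifying issues like this ("2 duplicate customer names, 1 missing tax ID") is what turns vague concerns about data quality into a scoped cleansing effort.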
The Data Governance Committee begins its core work: setting the official data standards and quality rules for each master data object. Based on the profiling results, a specific strategy is created for cleansing the legacy data, and responsibilities are assigned to business users.
Design & Build
Goals
- Execute legacy data cleansing activities.
- Develop data transformation and mapping logic.
- Perform initial mock data migrations.
- Validate migration results and refine the process.
Activities
This phase involves significant data preparation work. Business users and data stewards actively cleanse legacy data following the established rules, whether in legacy systems or staging areas.
At the same time, the technical team creates detailed mappings and transformation logic needed to transfer data to the new ERP structure.
The first mock data migrations are carried out. Cleansed data is extracted, transformed, and loaded into a test environment. This serves as an important early test of the entire process. Data stewards check the results, spotting errors in the data or the migration itself, which helps improve the process before the next iteration.
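The mapping-and-mock-load cycle described above can be sketched as follows. This is a simplified illustration under invented assumptions: the field mapping, the `NET30` default rule, and the "reject rows with no key" validation are all hypothetical, standing in for the project's real transformation logic.

```python
# Illustrative field mapping from a legacy schema to the new ERP schema.
FIELD_MAP = {"cust_no": "customer_id", "cust_name": "name", "terms": "payment_terms"}

def transform(legacy_row):
    """Rename fields per the mapping and apply a simple default rule."""
    row = {new: legacy_row.get(old, "") for old, new in FIELD_MAP.items()}
    row["payment_terms"] = row["payment_terms"].upper() or "NET30"  # hypothetical rule
    return row

def mock_load(legacy_rows):
    """Transform all rows; separate out rows failing basic validation."""
    loaded, errors = [], []
    for r in legacy_rows:
        t = transform(r)
        (loaded if t["customer_id"] else errors).append(t)
    return loaded, errors

extract = [{"cust_no": "100", "cust_name": "Acme", "terms": "net60"},
           {"cust_no": "",    "cust_name": "Bad Row", "terms": ""}]
loaded, errors = mock_load(extract)
print(len(loaded), len(errors))  # 1 1
```

The error list is exactly what data stewards review after a mock run: each rejected row points either to a data defect to cleanse or a gap in the transformation rules to fix before the next cycle.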
Validate
Goals
- Execute multiple, validated mock data migrations.
- Perform rigorous data reconciliation and validation.
- Test business processes using migrated data.
- Achieve business sign-off on data quality.
Activities
The data migration process is practiced and refined through multiple mock cycles. The final mock cycle acts as a full rehearsal for the production load.
Each cycle uses an increasingly cleaner dataset, and validation becomes more thorough. Detailed data reconciliation occurs, comparing key financial and operational totals between the legacy system and the ERP test environment.
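The reconciliation step can be illustrated with a small sketch: compare key legacy totals against the ERP test environment and report any difference outside a tolerance. The account names and amounts are invented for the example.

```python
def reconcile(legacy_totals, erp_totals, tolerance=0.01):
    """Compare key financial/operational totals; return discrepancies only."""
    diffs = {}
    for key in legacy_totals.keys() | erp_totals.keys():
        a, b = legacy_totals.get(key, 0.0), erp_totals.get(key, 0.0)
        if abs(a - b) > tolerance:
            diffs[key] = {"legacy": a, "erp": b, "delta": round(b - a, 2)}
    return diffs

# Hypothetical totals pulled from each system after a mock load.
legacy = {"ar_balance": 125000.00, "inventory_value": 48210.55}
erp    = {"ar_balance": 125000.00, "inventory_value": 48010.55}
diffs = reconcile(legacy, erp)
print(diffs)  # {'inventory_value': {'legacy': 48210.55, 'erp': 48010.55, 'delta': -200.0}}
```

An empty result is the goal of each successive mock cycle; every non-empty entry must be explained and resolved before sign-off.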
Crucially, data from the final mock migration is used for User Acceptance Testing (UAT). This checks that business processes function correctly with real, migrated data. The phase ends with formal approval from Data Owners, confirming that the data is accurate and ready for go-live.
Launch
Goals
- Execute the final production data migration.
- Perform critical post-load technical validation.
- Conduct final business data reconciliation.
Activities
This phase involves the actual live production data cutover. The process follows a carefully planned sequence. Transactions in the legacy system are halted, and the final data extract is taken.
Proven migration scripts are executed to transform and load the final, approved dataset into the production ERP environment. Immediately after loading, the core data team conducts a critical validation to ensure a successful technical load.
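One common form of that post-load technical check is a record-count comparison per data object. The sketch below is illustrative only; the object names and counts are hypothetical.

```python
def validate_load(expected_counts, actual_counts):
    """Compare expected vs loaded record counts; return (expected, actual) mismatches."""
    return {
        obj: (expected_counts[obj], actual_counts.get(obj, 0))
        for obj in expected_counts
        if actual_counts.get(obj, 0) != expected_counts[obj]
    }

# Counts from the final extract vs counts queried from the production ERP.
expected = {"customers": 12500, "vendors": 830, "open_orders": 410}
actual   = {"customers": 12500, "vendors": 830, "open_orders": 408}
mismatches = validate_load(expected, actual)
print(mismatches)  # {'open_orders': (410, 408)}
```

A non-empty mismatch report at this stage stops the cutover clock: the missing records must be traced and resolved before business reconciliation begins.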
Key business users then urgently reconcile critical data, like opening financial balances and inventory quantities. Business sign-off on this reconciliation is necessary before the system is opened to users.
Stabilize
Goals
- Operationalize data governance processes.
- Maintain data quality through stewardship.
- Monitor key data quality metrics.
- Address any post-go-live data issues.
Activities
The focus shifts from migration to maintaining data quality. The new data governance processes must be followed to prevent data quality from declining.
Permanent processes for creating and managing master data (like new customer setup workflows) are put in place. The Data Governance Committee starts monitoring key data quality metrics to ensure quality remains high.
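A typical metric of this kind is field completeness. The sketch below shows one way such a metric and its alert threshold might look; the field, records, and 95% target are invented for illustration.

```python
def completeness(records, field):
    """Percentage of records with a non-empty value in the given field."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field))
    return round(100.0 * filled / len(records), 1)

# Hypothetical customer master snapshot.
customers = [{"email": "a@x.com"}, {"email": ""}, {"email": "c@x.com"}, {"email": "d@x.com"}]
score = completeness(customers, "email")
print(score)  # 75.0

THRESHOLD = 95.0  # illustrative governance target
if score < THRESHOLD:
    print(f"ALERT: email completeness {score}% is below target {THRESHOLD}%")
```

Tracking a handful of such scores per master data object gives the committee an early-warning signal before quality degradation becomes visible in business processes.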
The support team addresses any data-related issues reported by users, making corrections through a controlled process. Communications reinforce the ongoing responsibility of data stewards for their respective areas.
Grow
Goals
- Implement continuous data quality improvement.
- Define and execute data archiving policies.
- Enforce data retention and compliance requirements.
- Govern data across new modules and systems.
Activities
Data management evolves into a continuous business function. The organization aims to actively improve data quality over time, treating it as an ongoing program rather than a one-time task. Data quality tools are used to regularly monitor data and spot new issues.
The full data lifecycle is managed. A strategy for archiving historical transactional data is implemented to maintain system performance.
Formal data retention policies are established and enforced to meet legal and regulatory requirements. As new functionalities or systems are introduced, the data governance framework must be expanded to include these new data objects.
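The core of an archiving policy is a retention cutoff applied to transaction dates. Here is a minimal sketch of that selection step, under invented assumptions (a 7-year retention period and a simple posted-date field):

```python
from datetime import date

def select_for_archive(transactions, retention_years, today):
    """Split transactions into keep/archive sets based on a retention cutoff date."""
    cutoff = date(today.year - retention_years, today.month, today.day)
    archive = [t for t in transactions if t["posted"] < cutoff]
    keep = [t for t in transactions if t["posted"] >= cutoff]
    return keep, archive

# Hypothetical transaction headers.
txns = [{"doc": "INV-1", "posted": date(2015, 3, 1)},
        {"doc": "INV-2", "posted": date(2023, 6, 15)}]
keep, archive = select_for_archive(txns, retention_years=7, today=date(2024, 1, 10))
print([t["doc"] for t in archive])  # ['INV-1']
```

In practice the cutoff varies by document type and jurisdiction, and archived records move to cheaper, queryable storage rather than being deleted; the split logic above is only the first step.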