
Migrating From Power BI to D23.io: A Step-by-Step Playbook

12-week Power BI to D23.io migration playbook: data remapping, DAX translation, dashboard rebuilds, training. Realistic effort estimates per workstream.

The PADISO Team · 2026-05-10


Table of Contents

  1. Why Organisations Migrate From Power BI
  2. The 12-Week Migration Framework
  3. Pre-Migration Assessment and Planning
  4. Data Source Remapping and Architecture
  5. DAX-to-SQL Translation Strategy
  6. Dashboard and Report Rebuilding
  7. User Acceptance Testing and Validation
  8. Training and Change Management
  9. Post-Migration Optimisation and Decommissioning
  10. Common Pitfalls and How to Avoid Them
  11. Next Steps and Ongoing Support

Why Organisations Migrate From Power BI

Power BI has been the go-to business intelligence platform for thousands of organisations worldwide. Yet as data ecosystems mature, companies often find themselves constrained by Power BI’s architecture, licensing model, or integration capabilities. Whether you’re facing scaling challenges, cost pressures, or the need for more sophisticated data orchestration, migrating to a modern platform like D23.io represents a significant operational decision.

D23.io (built on Apache Superset) offers organisations a lightweight, open-source alternative with superior customisation, lower total cost of ownership, and tighter integration with contemporary data stacks. Unlike Power BI’s monolithic approach, D23.io excels in environments where you need granular control over semantic layers, direct SQL authoring, and seamless connections to data warehouses and lakehouses.

The decision to migrate, however, is not trivial. Your existing Power BI estate likely contains hundreds of reports, dozens of data models, complex DAX calculations, and embedded user workflows. A poorly executed migration can stall analytics adoption, damage stakeholder trust, and waste months of engineering effort.

This playbook addresses that risk. Over the next 12 weeks, you’ll systematically build a modern, resilient analytics platform on D23.io and decommission your Power BI environment, guided by clear effort estimates, realistic timelines, and proven patterns from production migrations.


The 12-Week Migration Framework

The migration from Power BI to D23.io follows a structured four-phase approach, each mapped to specific weeks and workstreams. This framework has been validated across seed-stage startups and mid-market operators modernising their analytics stacks.

Phase 1: Discovery and Planning (Weeks 1–2)

Effort estimate: 120–160 hours (2–3 full-time engineers + 1 business analyst)

During weeks 1 and 2, your team conducts a comprehensive audit of your existing Power BI environment. This phase is non-negotiable; skipping or rushing it is the single most common cause of migration delays and cost overruns.

You’ll inventory all Power BI workspaces, datasets, reports, and dashboards. Document the owner, last refresh date, user count, and business criticality of each artefact. Create a dependency map showing which reports feed into which business processes. Identify all data sources—databases, APIs, cloud data warehouses, flat files—and test connectivity from your target D23.io environment.

Simultaneously, audit your Power BI data models. Export the data model structure for each dataset, including table relationships, column data types, and any row-level security (RLS) rules. Extract all DAX calculations and measures; these will require translation to SQL or D23.io-native expressions.
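
One way to pull those definitions out in bulk is to query the dataset’s metadata over its XMLA endpoint with a tool such as DAX Studio. A minimal sketch, assuming XMLA access is available (the DMV below comes from the tabular metadata schema):

-- Dump every measure name and its DAX expression into the translation backlog.
SELECT [Name], [Expression]
FROM $SYSTEM.TMSCHEMA_MEASURES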

Conduct stakeholder interviews with report owners, business users, and IT governance teams. Understand which reports are business-critical (daily or hourly refresh cycles), which are reference reports (monthly or quarterly), and which are orphaned (no active users). This prioritisation directly shapes your migration sequence.

Document your current Power BI licensing model, refresh schedules, gateway configurations, and any custom visuals or third-party integrations. These details inform your D23.io architecture design.

Phase 2: Design and Proof of Concept (Weeks 3–5)

Effort estimate: 200–280 hours (2–3 engineers + 1 solutions architect + 1 data engineer)

With discovery complete, weeks 3–5 focus on designing your target D23.io architecture and validating it through a proof of concept (PoC).

Select 2–3 high-impact, moderately complex Power BI reports to serve as your PoC cohort. These should represent different data sources, calculation patterns, and user workflows—but not your most mission-critical dashboards. Success with the PoC builds confidence and uncovers technical risks before full-scale migration.

Design your D23.io semantic layer. Unlike Power BI’s embedded data models, D23.io typically sources data directly from your warehouse or lakehouse via SQL. You’ll define the schema, table relationships, and calculated columns in your data warehouse, then expose them through D23.io’s native dataset definitions.

Architect your data refresh strategy. Power BI uses scheduled refreshes managed through the Power BI Service; D23.io typically integrates with your existing data pipeline orchestration (Airflow, dbt, Fivetran, Stitch). Map your current refresh cadences and dependencies to your data platform’s native scheduler.

Design user authentication and authorisation. If you’re currently using Power BI’s RLS or Azure AD integration, translate those rules into D23.io’s role-based access controls (RBAC) and database-level row filtering. Document any custom security requirements.
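
As an illustration of database-level row filtering, here is a minimal sketch assuming a PostgreSQL warehouse and a hypothetical regional sales table (the table, policy, and session-setting names are placeholders):

-- Enable row-level security and restrict each session to its own region.
ALTER TABLE analytics.sales ENABLE ROW LEVEL SECURITY;

CREATE POLICY sales_region_policy ON analytics.sales
    USING (region = current_setting('app.current_region'));

D23.io then connects through a role subject to this policy, or applies the equivalent restriction in its own RBAC configuration.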

Execute the PoC: rebuild your 2–3 selected reports in D23.io. Translate the DAX calculations to SQL. Validate data accuracy against the source Power BI reports. Test refresh performance, query latency, and user experience. Document any gaps, workarounds, or architectural adjustments needed.

Conduct a PoC retrospective with your engineering team and key stakeholders. Identify blockers, refine your architecture, and update your effort estimates for the full migration.

Phase 3: Full Migration Execution (Weeks 6–10)

Effort estimate: 600–900 hours (3–4 engineers + 1 data engineer + 1 QA analyst)

Weeks 6–10 are the migration’s operational core. Your team executes the bulk of the report rebuilding, data validation, and user acceptance testing.

Prioritise report migration in tiers: Tier 1 (business-critical, daily+ refresh), Tier 2 (important, weekly refresh), Tier 3 (reference, monthly or ad hoc). Migrate Tier 1 reports first to validate your process and build stakeholder confidence.

For each report cohort, execute the following workflow:

  1. Data source validation: Confirm all source systems are accessible from D23.io. Test connectivity, query performance, and data freshness.
  2. DAX translation: Convert Power BI measures and calculated columns to SQL or D23.io expressions. Validate calculated values against the original Power BI results.
  3. Dashboard rebuild: Recreate the report layout, visuals, filters, and interactions in D23.io. Ensure visual fidelity and user experience parity.
  4. Data validation: Compare row counts, aggregations, and key metrics between the original Power BI report and the D23.io rebuild (a quick check is sketched after this list). Flag and resolve discrepancies.
  5. Performance optimisation: Tune SQL queries, add indexes, and optimise refresh schedules to meet or exceed Power BI’s performance baseline.
  6. Documentation: Capture the report’s business logic, data lineage, and refresh dependencies in your knowledge base.
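
For the data validation step, one quick check is to load an export of the Power BI report’s underlying table into a staging schema and diff the headline numbers; analytics.fct_orders and staging.pbi_export_orders below are assumed names:

-- Compare row counts and a key aggregate between the table feeding D23.io
-- and a snapshot exported from the original Power BI report.
SELECT
    (SELECT COUNT(*)    FROM analytics.fct_orders)       AS d23_row_count,
    (SELECT COUNT(*)    FROM staging.pbi_export_orders)  AS powerbi_row_count,
    (SELECT SUM(amount) FROM analytics.fct_orders)       AS d23_total_amount,
    (SELECT SUM(amount) FROM staging.pbi_export_orders)  AS powerbi_total_amount;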

Parallel to report migration, configure your D23.io infrastructure for production:

  • Set up a dedicated D23.io instance (or cluster, if scaling to 50+ concurrent users).
  • Implement single sign-on (SSO) via OAuth2, SAML, or LDAP to match your current Power BI authentication.
  • Configure data source connections, refresh schedules, and monitoring.
  • Set up backup and disaster recovery procedures.
  • Implement audit logging and compliance controls (if pursuing SOC 2 or ISO 27001 certification).

Parallel execution of report migration and infrastructure setup accelerates the timeline but requires clear task separation and daily standups to manage dependencies.

Phase 4: User Acceptance Testing, Training, and Cutover (Weeks 11–12)

Effort estimate: 160–240 hours (1–2 engineers + 1 trainer + QA analyst)

Weeks 11–12 focus on validating the migrated environment with business users, training them on D23.io’s interface and workflows, and executing a controlled cutover from Power BI.

Conduct user acceptance testing (UAT) with representatives from each business function. Provide them with access to the D23.io environment and a structured test plan covering report accuracy, refresh timeliness, filter functionality, and ease of use. Document feedback and prioritise bug fixes.

Deliver training to all active Power BI users. Cover D23.io’s interface, how to navigate dashboards, how to create ad hoc queries, how to export data, and how to request new reports. Training can be delivered via live webinars, recorded videos, and self-paced documentation.

Execute the cutover: disable Power BI refresh schedules, archive Power BI workspaces, and redirect users to D23.io. Maintain a Power BI read-only archive for 30–60 days in case users need to reference historical reports.

Monitor D23.io performance closely during the first 2 weeks post-cutover. Be ready to roll back or adjust configurations if issues arise. Assign a support team to triage user questions and resolve technical issues.


Pre-Migration Assessment and Planning

A rigorous assessment phase sets the foundation for a successful migration. Too many organisations underestimate this phase, leading to surprises and delays later.

Inventory Your Power BI Estate

Begin by cataloguing every Power BI artefact in your organisation. Use the Power BI migration overview from Microsoft Learn as a reference for the pre-migration steps and governance considerations.

Create a spreadsheet with the following columns for each report:

  • Report name and workspace
  • Owner and stakeholders
  • Data sources (database, API, file share, etc.)
  • Refresh frequency (hourly, daily, weekly, monthly, ad hoc)
  • User count (active monthly users)
  • Business criticality (critical, important, reference)
  • Complexity rating (simple, moderate, complex)
  • Custom visuals or third-party integrations
  • Row-level security (RLS) rules
  • Estimated migration effort (hours)

For data models, document:

  • Dataset name and size
  • Source tables and relationships
  • Calculated columns and measures (extract the DAX formulas)
  • Refresh mode (import, DirectQuery, dual)
  • Data refresh schedule
  • Gateway requirements

This inventory becomes your migration roadmap. Reports rated as “complex” or “critical” require more design time and testing; reports rated as “reference” may be candidates for decommissioning rather than migration.

Assess Data Source Accessibility

Power BI often abstracts the underlying data sources behind a semantic layer. In D23.io, you’ll typically query your data warehouse or lakehouse directly via SQL. Confirm that all your current data sources are accessible from your D23.io environment.

Test connectivity to:

  • Data warehouses (Snowflake, BigQuery, Redshift, Azure Synapse)
  • Data lakes (Delta Lake, Iceberg, Parquet on cloud storage)
  • Operational databases (PostgreSQL, MySQL, SQL Server)
  • APIs and SaaS platforms (Salesforce, Stripe, HubSpot, etc.)
  • Flat files (CSV, Excel stored in cloud storage)

If a data source is currently inaccessible from D23.io (e.g., an on-premises database behind a firewall), plan one of the following approaches:

  1. Replicate the data to a cloud data warehouse accessible from D23.io.
  2. Deploy a D23.io instance on-premises or in your private network.
  3. Use a data integration tool (Fivetran, Stitch, Airbyte) to pipe data from the source to a cloud warehouse.

Document the effort and timeline for each accessibility gap.

Review Licensing and Cost Implications

Power BI’s per-user licensing model ($10–$20 per user per month) often becomes expensive as your user base scales. D23.io, being open-source, has zero per-user licensing costs; you pay only for infrastructure (cloud compute, storage, and bandwidth).

Calculate your current Power BI licensing spend. If you have 100 users on Power BI Premium at $5,000/month, migrating to D23.io on a modest cloud instance ($500–$2,000/month) yields significant savings.

However, factor in the migration effort cost. If your team bills at $150/hour and the migration requires 1,500 hours of effort, that’s $225,000 in labour. At $4,000/month in Power BI savings, you’ll break even in 56 months—unless you value the operational benefits (faster deployment, tighter data warehouse integration, customisation flexibility) as additional ROI.

For most mid-market organisations, the migration pays for itself within 18–24 months when accounting for both licensing savings and operational efficiency gains.


Data Source Remapping and Architecture

One of the biggest architectural shifts in moving from Power BI to D23.io is how data flows into your analytics platform. Power BI often uses a centralised import model; D23.io typically queries your data warehouse directly.

Designing Your Semantic Layer

In Power BI, the semantic layer is embedded within the Power BI dataset—a proprietary format that contains tables, relationships, and DAX calculations. When you migrate to D23.io, you’ll shift this responsibility to your data warehouse.

Design your warehouse schema to support analytics:

  • Fact tables: Immutable, grain-specific tables storing transactional or event-level data (orders, pageviews, support tickets).
  • Dimension tables: Slowly-changing dimensions (SCD Type 1 or 2) for entities like customers, products, and dates.
  • Aggregated tables: Pre-aggregated summaries (daily revenue by product, weekly active users) to accelerate dashboard queries.
  • Staging tables: Intermediate tables used during ETL; typically hidden from end users.

Use a tool like dbt (data build tool) to manage your warehouse schema as code. dbt integrates seamlessly with D23.io and enables you to version-control your data models, document lineage, and automate testing.
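
As a sketch, a pre-aggregated revenue table might be expressed as a dbt model along these lines (the model and source names are hypothetical):

-- models/marts/fct_daily_revenue.sql: daily revenue by product, materialised
-- as a table so D23.io dashboards avoid scanning raw order lines.
{{ config(materialized='table') }}

select
    order_date,
    product_id,
    sum(amount)              as revenue,
    count(distinct order_id) as order_count
from {{ ref('stg_orders') }}
group by order_date, product_id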

For each Power BI dataset, map its tables and relationships to your warehouse schema. If a Power BI dataset uses an import model, you may need to replicate that data to your warehouse if it’s not already there. If it uses DirectQuery, you’re already querying the source system directly—validate that D23.io can access the same source.

Mapping Data Sources to D23.io

D23.io (built on Apache Superset) connects to data sources via SQLAlchemy, a Python SQL toolkit that supports 50+ databases and data warehouses. Common connectors include:

  • Snowflake, BigQuery, Redshift, Azure Synapse (cloud data warehouses)
  • PostgreSQL, MySQL, SQL Server, Oracle (relational databases)
  • Elasticsearch, MongoDB (NoSQL databases)
  • Presto, Trino (distributed query engines)
  • Spark SQL (distributed processing)

For each Power BI data source, identify the equivalent D23.io connector. If you’re currently using a Power BI gateway to query an on-premises database, consider whether to:

  1. Replicate the data to a cloud warehouse (preferred for latency and resilience).
  2. Deploy D23.io in your private network or on-premises.
  3. Use a data integration tool to sync the data to the cloud.

Option 1 is typically the most operationally sound; it decouples your analytics platform from your operational systems and enables you to apply data transformations, quality checks, and aggregations before exposing data to analysts.

Refresh Strategy and Data Freshness

In Power BI, refresh schedules are managed through the Power BI Service. You can schedule up to 48 refreshes per day for Premium capacity, or 8 per day for Pro licenses.

In D23.io, refreshes are typically orchestrated by your data warehouse’s native scheduler or an external orchestration tool:

  • Data warehouse native: Snowflake’s Tasks, BigQuery’s Scheduled Queries, Redshift’s Scheduled Queries.
  • External orchestrators: Apache Airflow, Prefect, Dagster, dbt Cloud.
  • Change Data Capture (CDC): Tools like Fivetran or Stitch can push incremental updates in near-real-time.

Map your current Power BI refresh cadences to your target refresh strategy. If you currently refresh a dataset hourly, ensure your data warehouse pipeline can deliver fresh data hourly. If a report is refreshed weekly, align the D23.io query to use the latest weekly snapshot.

Consider implementing incremental refreshes for large fact tables. Instead of reloading all data, append only new or changed rows. This reduces compute costs and shortens refresh windows.
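
A minimal incremental-load sketch, assuming a Snowflake- or BigQuery-style MERGE, an updated_at change-tracking column, and placeholder table names; in practice this runs on your warehouse scheduler or an Airflow job:

-- Upsert only rows changed since the last load (assumes the target has
-- already been seeded by an initial full load).
MERGE INTO analytics.fct_orders AS target
USING (
    SELECT order_id, amount, status, updated_at
    FROM raw.orders
    WHERE updated_at > (SELECT MAX(updated_at) FROM analytics.fct_orders)
) AS source
ON target.order_id = source.order_id
WHEN MATCHED THEN UPDATE SET
    amount     = source.amount,
    status     = source.status,
    updated_at = source.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, amount, status, updated_at)
    VALUES (source.order_id, source.amount, source.status, source.updated_at);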

For mission-critical dashboards, implement monitoring and alerting. If a refresh fails or data becomes stale, alert the analytics team immediately so they can investigate and remediate.
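
A freshness check can be a scheduled query that compares the latest load timestamp with a staleness threshold; the table, column, and two-hour threshold below are illustrative:

-- Flag the fact table as stale if nothing has landed in the last two hours.
SELECT
    MAX(loaded_at) AS last_load,
    CASE
        WHEN MAX(loaded_at) < CURRENT_TIMESTAMP - INTERVAL '2 hours' THEN 'STALE'
        ELSE 'FRESH'
    END AS freshness_status
FROM analytics.fct_orders;

Route the result to your alerting channel so the analytics team hears about stale data before users do.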


DAX-to-SQL Translation Strategy

Power BI’s Data Analysis Expressions (DAX) language is a formula language optimised for tabular data models. D23.io doesn’t natively support DAX; instead, you’ll translate DAX formulas to SQL or use D23.io’s expression language.

Understanding DAX Measures and Calculated Columns

DAX has two primary constructs:

  • Measures: Aggregations computed on demand (e.g., SUM(Sales[Amount]), DISTINCTCOUNT(Customers[ID])). Measures are evaluated in the context of the current filter selection.
  • Calculated columns: Values computed row-by-row and stored in the data model (e.g., [Profit] = [Revenue] - [Cost]). Calculated columns increase model size but enable faster measure computation.

When migrating to D23.io, you have three options:

  1. Translate to SQL: Rewrite the DAX logic as SQL views or computed columns in your data warehouse. This is the most performant approach but requires SQL expertise.
  2. Use D23.io expressions: D23.io supports a limited expression language for calculated fields and filters. Simple DAX formulas can often be translated directly.
  3. Compute in the application: For complex calculations not easily expressed in SQL, compute them in D23.io’s Python backend or in the dashboard layer.

Common DAX Patterns and SQL Equivalents

Here are common DAX patterns and their SQL translations:

SUM with filters:

DAX: CALCULATE(SUM(Sales[Amount]), Sales[Region] = "North")
SQL: SELECT SUM(Amount) FROM Sales WHERE Region = 'North'

DISTINCTCOUNT:

DAX: DISTINCTCOUNT(Customers[CustomerID])
SQL: SELECT COUNT(DISTINCT CustomerID) FROM Customers

Year-to-date (YTD):

DAX: CALCULATE(SUM(Sales[Amount]), DATESYTD(Dates[Date]))
SQL: SELECT SUM(Amount) FROM Sales WHERE Date >= DATE_TRUNC('year', CURRENT_DATE) AND Date <= CURRENT_DATE

Previous period comparison:

DAX: CALCULATE(SUM(Sales[Amount]), DATEADD(Dates[Date], -1, MONTH))
SQL: SELECT SUM(Amount) FROM Sales WHERE Date >= DATE_TRUNC('month', CURRENT_DATE - INTERVAL '1 month') AND Date < DATE_TRUNC('month', CURRENT_DATE)

Rank:

DAX: RANKX(ALL(Products[ProductID]), CALCULATE(SUM(Sales[Amount])))
SQL: SELECT Product, Amount, RANK() OVER (ORDER BY Amount DESC) AS sales_rank FROM (SELECT Product, SUM(Amount) AS Amount FROM Sales GROUP BY Product) AS product_totals

For each DAX measure in your Power BI datasets, create a SQL equivalent in your data warehouse. Test the SQL output against the Power BI result to ensure numerical accuracy.
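
As a concrete sketch, a simple margin measure translated into a warehouse view might look like the following (the view, table, and column names are assumptions, not your actual model):

-- Exposes the translated measures to D23.io as an ordinary queryable view.
CREATE OR REPLACE VIEW analytics.vw_sales_summary AS
SELECT
    region,
    SUM(revenue)                                          AS total_revenue,
    SUM(revenue) - SUM(cost)                              AS gross_margin,
    (SUM(revenue) - SUM(cost)) / NULLIF(SUM(revenue), 0)  AS gross_margin_pct
FROM analytics.fct_sales
GROUP BY region;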

Handling Complex DAX Logic

Some DAX patterns are difficult to translate to SQL. Examples include:

  • Time intelligence functions (PARALLELPERIOD, SAMEPERIODLASTYEAR) that rely on calendar hierarchies.
  • Complex nested CALCULATE contexts with multiple filter conditions.
  • Iterative functions (SUMX, PRODUCTX) that require row-by-row computation.

For these cases, consider:

  1. Simplifying the logic: Can you achieve the same business outcome with a simpler calculation?
  2. Pre-computing in the warehouse: Use window functions or CTEs to pre-compute the result.
  3. Accepting a slight change in UX: If a complex measure isn’t critical, you may deprioritise its migration.

Document any measures that require manual workarounds or approximations. Communicate these trade-offs to stakeholders during UAT.

Validation and Testing

Create a test matrix comparing Power BI results to D23.io results for each migrated measure:

Measure | Power BI Result | D23.io Result | Match? | Notes
Total Revenue | $1,234,567 | $1,234,567 | ✓ | Exact match
Avg Order Value | $156.78 | $156.78 | ✓ | Exact match
Customer Count | 5,432 | 5,432 | ✓ | Exact match

Test across multiple filter combinations (by region, by product, by date range) to ensure the translated measures behave correctly in different contexts.
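
One practical way to run these checks is to compute the migrated measure at the same grain as the Power BI visual and diff the output against an export of that visual; the table and column names below are illustrative:

-- Compute the measure by region and month so the result can be compared
-- row-for-row against the corresponding Power BI export.
SELECT
    region,
    DATE_TRUNC('month', order_date) AS order_month,
    SUM(amount)                     AS total_revenue,
    COUNT(DISTINCT customer_id)     AS customer_count
FROM analytics.fct_orders
GROUP BY region, DATE_TRUNC('month', order_date)
ORDER BY region, order_month;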


Dashboard and Report Rebuilding

With your data sources mapped and calculations translated, you’re ready to rebuild your reports and dashboards in D23.io.

Prioritisation and Sequencing

Rebuild reports in priority order:

  1. Tier 1 (business-critical): Daily or hourly refresh, 50+ active users, directly informs business decisions. Examples: executive dashboard, sales pipeline, revenue tracking.
  2. Tier 2 (important): Weekly refresh, 10–50 active users, informs operational decisions. Examples: marketing campaign performance, customer support metrics.
  3. Tier 3 (reference): Monthly or ad hoc refresh, <10 active users, reference or exploratory. Examples: historical trend analysis, ad hoc data exploration.

Migrate Tier 1 reports first. Success with high-impact reports builds stakeholder confidence and allows you to refine your process before tackling lower-priority reports.

Visual Fidelity and UX Parity

When rebuilding a report, aim for visual and functional parity with the original Power BI report. This minimises user disruption and ensures stakeholders can trust the new environment.

D23.io’s visualisation library includes:

  • Charts: Line, bar, scatter, area, pie, sunburst, treemap, heatmap.
  • Tables: Sortable, filterable tables with conditional formatting.
  • Maps: Geospatial visualisations (requires geospatial data).
  • Gauges and KPIs: Single-value displays with thresholds.
  • Custom visuals: Python-based visualisations for advanced use cases.

For most Power BI visuals, you’ll find a D23.io equivalent. If a Power BI custom visual doesn’t have a D23.io equivalent, consider:

  1. Using a standard D23.io visual that conveys the same insight.
  2. Developing a custom Python-based visual (requires engineering effort).
  3. Accepting that the visualisation will change slightly (document this for users).

Filters, Interactions, and Drill-Through

Power BI dashboards often include interactive filters, cross-filtering between visuals, and drill-through actions. D23.io supports similar interactivity:

  • Filters: Dropdown, text, date range filters that constrain dashboard data.
  • Cross-filtering: Clicking a bar chart filters other visuals on the dashboard.
  • Drill-through: Clicking a metric opens a detailed report (achieved via dashboard linking or URL parameters).

Map each interactive element from your Power BI report to D23.io. Test that filters work correctly and that cross-filtering behaves as expected.

Performance Optimisation

D23.io’s performance depends on the underlying SQL queries. Slow dashboards often indicate slow queries, not a D23.io limitation.

Optimise query performance by:

  1. Adding indexes: Index columns used in WHERE, JOIN, and GROUP BY clauses.
  2. Using aggregated tables: Pre-aggregate high-cardinality dimensions (e.g., daily revenue by product) to avoid scanning billions of rows.
  3. Partitioning large tables: Partition fact tables by date to enable partition pruning.
  4. Caching query results: D23.io can cache query results; configure cache TTL based on data freshness requirements.
  5. Limiting data scope: Use date filters to restrict queries to recent data (e.g., last 12 months) rather than all-time.

Benchmark query performance against Power BI’s baseline. If D23.io is significantly slower, investigate the root cause (missing indexes, inefficient joins, unpartitioned tables) and optimise.
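
As a sketch of techniques 1 and 2 above, assuming a warehouse that supports explicit indexes (PostgreSQL, for example) and placeholder table names:

-- Pre-aggregate daily revenue so dashboards hit a small summary table,
-- then index the column most dashboards filter on.
CREATE TABLE analytics.agg_daily_revenue AS
SELECT
    order_date,
    product_id,
    SUM(amount) AS revenue
FROM analytics.fct_orders
GROUP BY order_date, product_id;

CREATE INDEX idx_agg_daily_revenue_date
    ON analytics.agg_daily_revenue (order_date);

Columnar warehouses such as Snowflake and BigQuery skip explicit indexes; there, clustering keys and partitioning play the equivalent role.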


User Acceptance Testing and Validation

Before cutover, conduct thorough user acceptance testing (UAT) to validate that the D23.io environment meets business requirements.

UAT Planning and Test Cases

Develop a test plan covering:

  • Data accuracy: Row counts, aggregations, and key metrics match Power BI.
  • Functionality: Filters, cross-filtering, drill-through, and exports work as expected.
  • Performance: Dashboard load times and query latency are acceptable (<10 seconds for typical queries).
  • Usability: Users can navigate the dashboard, find reports, and export data without training.
  • Compliance: Data access controls, audit logging, and security controls function correctly.

For each migrated report, create specific test cases:

Test Case | Steps | Expected Result | Actual Result | Pass/Fail
Revenue by Region | 1. Open dashboard 2. Select Q1 2024 | Revenue totals match Power BI | [User result] | ✓/✗
Customer Filter | 1. Open dashboard 2. Filter by customer “Acme Corp” | Only Acme data displayed | [User result] | ✓/✗

Assign test cases to business users representing different departments (sales, finance, operations). Provide them with access to a UAT environment and a 1-week testing window.

Defect Triage and Resolution

As users test, they’ll report issues. Categorise defects by severity:

  • Critical: Data is incorrect, dashboard doesn’t load, or core functionality is broken. Resolve before cutover.
  • High: Feature doesn’t work as expected but workarounds exist. Resolve before cutover or document workaround.
  • Medium: Minor UI issue or non-critical feature missing. Can be resolved post-cutover.
  • Low: Nice-to-have enhancement. Defer to post-cutover roadmap.

For each defect, document the steps to reproduce, the expected vs. actual behaviour, and the root cause. Assign to an engineer for resolution.

Re-test resolved defects before cutover to ensure the fix doesn’t introduce new issues.

Sign-Off and Cutover Readiness

Obtain formal sign-off from business stakeholders and IT leadership before proceeding to cutover. Sign-off indicates that the D23.io environment is ready for production use and that stakeholders accept any remaining known issues or trade-offs.

Create a cutover checklist:

  • All Tier 1 and Tier 2 reports migrated and tested.
  • Data accuracy validated across all reports.
  • Performance benchmarks met.
  • User training completed.
  • Support team briefed and on-call.
  • Power BI read-only archive created.
  • Rollback plan documented.
  • Stakeholder sign-off obtained.

Training and Change Management

A successful migration isn’t just about technology—it’s about people. Effective training and change management ensure users adopt D23.io quickly and confidently.

Training Content and Delivery

Develop training materials covering:

  • Interface tour: How to navigate D23.io, access dashboards, and find reports.
  • Filtering and exploration: How to use filters, cross-filtering, and drill-through.
  • Exporting and sharing: How to export data to CSV or Excel, share dashboards with colleagues.
  • Creating ad hoc queries: For power users, how to write custom SQL queries and create new dashboards.
  • Troubleshooting: Common issues and how to resolve them; who to contact for support.

Deliver training via:

  1. Live webinars: 30–60 minute sessions covering the basics. Record for asynchronous viewing.
  2. Self-paced videos: 5–10 minute videos demonstrating specific tasks (e.g., “How to filter a dashboard”).
  3. Documentation: Written guides with screenshots for reference.
  4. Office hours: 1-on-1 or small-group sessions for users with questions.

Schedule training 1–2 weeks before cutover, while users still have access to Power BI as a reference.

Change Communication

Communicate the migration timeline and benefits to all stakeholders:

  • Week 1: Announce the migration, explain why (cost savings, modernisation, new capabilities), and outline the timeline.
  • Week 4: Share the PoC results and highlight early wins.
  • Week 8: Announce the cutover date and begin training.
  • Week 11: Remind users of the cutover date and direct them to training materials.
  • Week 12: Execute cutover and provide post-cutover support.

Address concerns proactively. If users worry that D23.io won’t have a feature they rely on, demonstrate the equivalent functionality or explain the workaround.

Post-Cutover Support

In the first 2 weeks post-cutover, users will have questions and issues. Assign a support team to:

  • Triage user issues: Collect, categorise, and prioritise reported problems.
  • Provide training: Answer questions and guide users through tasks.
  • Monitor performance: Watch for slow dashboards, failed refreshes, or data quality issues.
  • Escalate technical issues: Pass complex issues to the engineering team for investigation.

Maintain a Power BI read-only archive for 30–60 days. If users need to reference a historical report or verify data, they can access the archive without requiring a full Power BI license.


Post-Migration Optimisation and Decommissioning

Once D23.io is live and users are comfortable, shift focus to optimisation and decommissioning.

Performance Tuning

Monitor dashboard performance in production. Identify slow queries using D23.io’s query logs and database query plans. Optimise by:

  • Adding indexes: Work with your data warehouse team to index heavily-queried columns.
  • Rewriting queries: Simplify complex queries or break them into multiple smaller queries.
  • Using caching: Cache frequently-accessed queries with appropriate TTL.
  • Aggregating data: Pre-compute aggregations for common queries.

Target <5 second load times for typical dashboards and <10 seconds for complex multi-table queries.

Governance and Maintenance

Establish governance practices for your D23.io environment:

  • Naming conventions: Standardise dataset, dashboard, and field naming.
  • Documentation: Maintain a data dictionary documenting each field, its source, and its calculation.
  • Access controls: Regularly audit user access; remove access for users who’ve left the organisation or changed roles.
  • Refresh monitoring: Alert if a refresh fails or data becomes stale.
  • Backup and recovery: Test backups regularly to ensure you can recover from data loss or corruption.

For guidance on implementing governance practices, refer to Power BI migration best practices, which, while Power BI-focused, cover many principles applicable to D23.io.

Decommissioning Power BI

Once you’ve validated that all reports have been successfully migrated and users are comfortable with D23.io, decommission Power BI:

  1. Disable refreshes: Stop all Power BI dataset refreshes to prevent stale data.
  2. Archive workspaces: Move Power BI workspaces to an “archived” state so users can’t modify them.
  3. Redirect traffic: Update bookmarks, links, and documentation to point to D23.io.
  4. Maintain a read-only archive: Keep Power BI in read-only mode for 60–90 days for historical reference.
  5. Cancel licenses: Once the archive period ends, cancel Power BI licenses and reclaim the cost savings.

Document the decommissioning process and obtain sign-off from IT and business stakeholders before proceeding.


Common Pitfalls and How to Avoid Them

Based on production migrations, here are the most common pitfalls and strategies to avoid them.

Pitfall 1: Underestimating DAX Complexity

Problem: Teams assume DAX measures will translate trivially to SQL, only to discover complex time-intelligence or context-dependent calculations that are difficult to replicate.

Solution: During the discovery phase, audit all DAX measures and classify them by complexity. Allocate extra time for complex measures. Consider involving a data engineer or analytics engineer who specialises in SQL optimisation.

Pitfall 2: Ignoring Data Freshness Requirements

Problem: Teams migrate reports without understanding the data freshness requirements. A report that was refreshed hourly in Power BI is now refreshed daily in D23.io, leading to stale data and user frustration.

Solution: During the pre-migration assessment, document refresh cadences for every report. Ensure your D23.io refresh strategy meets or exceeds these requirements. For mission-critical reports, consider implementing incremental refreshes or near-real-time CDC.

Pitfall 3: Skipping User Acceptance Testing

Problem: Teams rush to cutover without thorough UAT, only to discover data discrepancies or missing functionality post-cutover. This erodes user trust and requires emergency hotfixes.

Solution: Allocate at least 2 weeks for UAT. Involve business users representing different departments. Create detailed test cases and track defects. Resolve critical defects before cutover; document workarounds for non-critical issues.

Pitfall 4: Inadequate Change Management

Problem: Users aren’t trained on D23.io and resist the new platform. They complain that “it’s not like Power BI” and request rollback.

Solution: Invest in change communication and training. Explain the benefits of the migration. Provide comprehensive training materials and office hours. Acknowledge that change takes time; be patient and supportive.

Pitfall 5: Migrating Reports That Nobody Uses

Problem: Teams spend effort migrating low-value reports that have no active users, wasting time and resources.

Solution: During the discovery phase, identify orphaned reports with no active users. Consider decommissioning these reports rather than migrating them. Confirm with stakeholders that it’s safe to retire these reports.

Pitfall 6: Overlooking Data Quality Issues

Problem: The migration reveals data quality issues (duplicates, nulls, inconsistencies) that weren’t apparent in Power BI. Users lose confidence in the new platform.

Solution: Conduct data quality checks during the discovery phase. Identify and remediate issues before migration. Implement data validation rules in your warehouse to catch issues early.
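
Typical checks are straightforward to express in the warehouse itself; a minimal sketch with placeholder table and column names:

-- Duplicate business keys in the orders fact table.
SELECT order_id, COUNT(*) AS duplicate_rows
FROM analytics.fct_orders
GROUP BY order_id
HAVING COUNT(*) > 1;

-- Rows missing a customer reference.
SELECT COUNT(*) AS null_customer_rows
FROM analytics.fct_orders
WHERE customer_id IS NULL;

dbt’s built-in unique and not_null tests can run the same checks automatically on every refresh.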

For additional guidance, consult 10 migration mistakes that can derail your Power BI transition, which outlines common pitfalls and remediation strategies.


Leveraging PADISO for Your Migration

Migrating from Power BI to D23.io is a significant undertaking. Many organisations benefit from partnering with experienced teams who’ve executed similar migrations at scale.

At PADISO, we’ve guided dozens of organisations through analytics platform migrations, data warehouse modernisations, and AI-driven automation projects. Our approach is outcome-led: we focus on shipping results—faster time-to-insight, lower total cost of ownership, and operational resilience.

We offer several services relevant to your migration:

CTO as a Service: If you lack in-house engineering leadership, our fractional CTOs provide strategic guidance on architecture, technology selection, and execution. We’ve helped seed-stage startups and mid-market operators navigate complex technical decisions.

Platform Design & Engineering: Our platform engineering team designs and builds production-grade analytics infrastructure. We handle data warehouse schema design, ETL orchestration, semantic layer development, and D23.io configuration—so your team can focus on business logic.

AI & Agents Automation: Beyond analytics, we help organisations automate workflows using agentic AI. If your migration uncovers opportunities to automate report generation, anomaly detection, or data quality checks, we can design and implement those automations.

Security Audit (SOC 2 / ISO 27001): If compliance is a concern, we help organisations achieve SOC 2 Type II or ISO 27001 certification. We’ve worked with Vanta to streamline compliance workflows, reducing audit overhead and accelerating certification timelines.

Our engagement model is flexible. Some organisations engage us for the full 12-week migration; others engage us for specific phases (e.g., PoC design, DAX translation, infrastructure setup). We’ve also partnered with private equity firms on post-acquisition analytics modernisation, helping portfolio companies consolidate disparate BI platforms and unlock data-driven value creation.

If you’re considering a Power BI to D23.io migration and want expert guidance, reach out to PADISO. We’ll assess your environment, outline a realistic roadmap, and help you execute a migration that delivers both immediate wins and long-term operational benefits.


Next Steps and Ongoing Support

You now have a comprehensive playbook for migrating from Power BI to D23.io. Here’s how to proceed:

Immediate Actions (Week 1)

  1. Assemble your team: Identify the engineers, data analysts, and business stakeholders who’ll lead the migration.
  2. Schedule discovery workshops: Block time with report owners and IT leadership to inventory your Power BI estate.
  3. Evaluate D23.io: If you haven’t already, set up a trial D23.io instance and explore its capabilities.
  4. Estimate effort: Use the effort estimates in this playbook to forecast timeline and cost. Adjust based on your specific environment.

Weeks 2–4: Planning and PoC

  1. Complete the discovery phase: Finish inventorying Power BI reports, data sources, and DAX measures.
  2. Design your target architecture: Define your data warehouse schema, refresh strategy, and D23.io configuration.
  3. Execute a PoC: Migrate 2–3 representative reports and validate your approach.
  4. Refine your roadmap: Based on PoC learnings, update your effort estimates and timeline.

Weeks 5–12: Execution and Cutover

  1. Execute migration in tiers: Migrate Tier 1, then Tier 2, then Tier 3 reports.
  2. Conduct UAT: Involve business users in testing and validation.
  3. Train users: Deliver training materials and office hours.
  4. Execute cutover: Disable Power BI, redirect users to D23.io, and monitor closely.
  5. Optimise and decommission: Tune performance, establish governance, and retire Power BI.

Ongoing: Governance and Optimisation

  1. Monitor performance: Track dashboard load times and query latency. Optimise as needed.
  2. Manage access: Regularly audit user permissions and remove access for users who’ve left.
  3. Maintain documentation: Keep your data dictionary and architecture documentation up-to-date.
  4. Plan enhancements: Identify opportunities to expand D23.io’s capabilities (e.g., new reports, embedded analytics, agentic AI for anomaly detection).

For ongoing support, consider engaging PADISO’s AI Advisory Services Sydney or AI Agency Methodology Sydney to ensure your analytics platform evolves with your business needs.

You can also explore how organisations are leveraging AI Automation for Financial Services, AI Automation for Marketing, and AI Automation for Retail to extend your analytics platform with intelligent automation capabilities.

Resources and Further Reading

For additional guidance on Power BI migrations, consult the Microsoft Learn migration overview and the best-practices resources referenced earlier in this guide.

For D23.io-specific guidance, consult the official D23.io documentation and explore PADISO’s $50K D23.io consulting engagement, which outlines a real-world project including architecture design, semantic layer development, dashboard rebuilding, and training—all delivered in 6 weeks.


Conclusion

Migrating from Power BI to D23.io is a significant undertaking, but with a clear roadmap, realistic effort estimates, and disciplined execution, it’s entirely achievable. The 12-week playbook outlined in this guide has been validated across multiple organisations and consistently delivers results: faster deployment cycles, lower per-user costs, tighter data warehouse integration, and greater customisation flexibility.

The key to success is treating the migration as a structured programme, not a quick lift-and-shift. Invest time in discovery and planning. Validate your approach with a PoC. Execute in tiers, prioritising high-impact reports. Conduct rigorous UAT. Train users thoroughly. And maintain disciplined governance post-cutover.

If you’re ready to begin your migration, start with the discovery phase this week. Inventory your Power BI estate, engage stakeholders, and estimate effort. By week 4, you should have a clear roadmap and a validated PoC. By week 12, you’ll be live on D23.io with users actively adopting the new platform.

For expert guidance on any phase of the migration—from architecture design to PoC execution to post-cutover optimisation—PADISO is here to help. We’ve guided organisations from seed stage through Series B and beyond, and we know what it takes to execute complex technical transformations at pace. Reach out, and let’s ship your analytics transformation together.