Migrating From On-Prem SQL Server Reporting to D23.io's Cloud Stack
Complete guide to migrating SQL Server reporting to D23.io's cloud Superset stack. Architecture, security, data residency, and step-by-step migration strategy for Australian enterprises.
Table of Contents
- Why Migrate SQL Server Reporting to the Cloud?
- Understanding D23.io’s Cloud Superset Architecture
- Pre-Migration Assessment and Planning
- Data Residency and Security Patterns for AU Enterprises
- Step-by-Step Migration Strategy
- Semantic Layer and BI Layer Configuration
- Testing, Validation, and Cutover
- Post-Migration Optimisation and Training
- Common Pitfalls and How to Avoid Them
- Real-World Example: The $50K D23.io Engagement
- Advanced Topics: Integration with Modern Data Stacks
- Compliance and Audit Readiness
- Next Steps and Getting Started
Why Migrate SQL Server Reporting to the Cloud?
SQL Server Reporting Services (SSRS) has been the backbone of enterprise reporting for two decades. But it comes with a cost: infrastructure maintenance, licensing complexity, on-premises server sprawl, and the operational burden of managing patches, backups, and security updates. For Australian enterprises, these challenges are compounded by data residency requirements, compliance mandates, and the rising cost of on-premises infrastructure.
D23.io’s cloud-native Superset stack solves these problems. It’s a modern, open-source analytics platform deployed on cloud infrastructure that eliminates infrastructure management, scales elastically, and integrates seamlessly with contemporary data stacks. Unlike legacy SSRS, Superset was built for cloud-first organisations—it natively supports role-based access control (RBAC), multi-tenancy, and the kind of semantic layer architecture that modern analytics demands.
The business case is compelling: organisations typically reduce reporting infrastructure costs by 30–50%, ship new dashboards 4–6 weeks faster, and cut infrastructure management overhead by 60%. For mid-market and enterprise teams, this frees engineering capacity to focus on product and strategy rather than infrastructure plumbing.
But migration isn’t trivial. SSRS reports are tightly coupled to SQL Server, encryption keys are environment-specific, and user permissions are often baked into Windows authentication. A poorly planned migration can strand reports, lose audit trails, or violate data residency requirements.
This guide walks you through a production-grade migration strategy—the same approach we’ve used to move 50+ enterprise clients from on-premises SSRS to cloud Superset deployments. We’ll cover architecture decisions, security patterns, step-by-step execution, and the operational practices that ensure a smooth cutover with zero report downtime.
Understanding D23.io’s Cloud Superset Architecture
What Makes D23.io Different
D23.io is a managed Superset platform purpose-built for Australian enterprises. Unlike self-managed Superset deployments (which push operational burden to your team), D23.io handles infrastructure, scaling, security patching, and backup orchestration. It’s the equivalent of moving from self-managed SQL Server to Azure SQL Database—you get the power of the platform without the ops tax.
D23.io’s architecture is built on Kubernetes, deployed across Australian data centres to satisfy data residency requirements, and integrated with enterprise identity providers (Okta and Microsoft Entra ID, formerly Azure AD). The platform natively supports:
- Multi-tenancy: Isolate reporting environments by business unit, customer, or geography
- Semantic layer: Define business logic once, reuse across all dashboards (no more formula duplication)
- RBAC and row-level security (RLS): Granular permission control without database views
- Native integrations: Direct connectors to Snowflake, Databricks, PostgreSQL, MySQL, and 50+ data sources
- Audit logging: Complete lineage tracking for compliance (SOC 2, ISO 27001)
For Australian enterprises, the critical differentiator is data residency. D23.io deployments can be configured to store all data, metadata, and logs within Australian data centres (typically Sydney or Melbourne), meeting IRAP, PSPF, and ASD Essential Eight requirements without additional compliance engineering.
Core Components
A D23.io Superset deployment consists of four layers:
1. Data Layer: Your source databases (SQL Server, Snowflake, Databricks, PostgreSQL). D23.io connects via JDBC/ODBC drivers or native connectors. For SQL Server, we typically migrate reporting workloads to Azure SQL Database or leave them on-premises with encrypted network tunnels.
2. Semantic Layer: A metadata repository that defines dimensions, measures, and business logic. This is where “Revenue” gets defined once, so all dashboards use the same calculation. The semantic layer sits in D23.io’s managed PostgreSQL instance (data-residency-compliant).
3. BI Layer: Superset’s dashboard engine. This is where analysts build interactive reports, drill-downs, and alerts. Superset caches query results (typically in Redis), reducing load times for repeated queries from seconds to milliseconds.
4. Access Layer: Identity and access management. D23.io integrates with your corporate directory (Okta, Entra ID) and enforces RBAC at the dashboard, dataset, and row level.
Why Superset Over Tableau or Power BI?
Tableau and Power BI are excellent platforms, but they’re not always the right fit for Australian enterprises migrating from SSRS. Here’s why Superset wins for this use case:
- Cost: Superset is open-source. No per-user licensing. A 500-person organisation pays the same as a 5-person team. For mid-market companies, this translates to 40–60% lower TCO than Tableau or Power BI.
- Data residency: Superset can be deployed entirely within Australian data centres. Tableau Cloud and Power BI’s SaaS offerings typically route some metadata through overseas infrastructure, which may conflict with IRAP or PSPF requirements for government and defence contractors.
- Semantic layer: Superset’s dataset and metric layer lets you define business logic once and reuse it across every dashboard. In Tableau, equivalent logic tends to be duplicated as custom SQL or calculated fields per workbook, and sharing definitions across Power BI data models takes extra work.
- Integration: Superset integrates seamlessly with modern data stacks (Snowflake, Databricks, dbt, Airflow). If your organisation is already on these platforms, Superset is the natural choice.
That said, both are capable platforms; the right choice depends on your data stack, compliance requirements, and budget.
Pre-Migration Assessment and Planning
Step 1: Inventory Your SSRS Estate
Before you migrate, you need to understand what you’re migrating. This sounds obvious, but most organisations have 200+ SSRS reports they’ve forgotten about. Start by running this SQL query on your SSRS database:
SELECT
c.Name,
c.Path,
c.Type,
c.CreatedDate,
c.ModifiedDate,
DATALENGTH(c.Content) AS SizeBytes
FROM Catalog c
WHERE c.Type = 2 -- 2 = Report
ORDER BY c.ModifiedDate DESC;
This gives you a complete inventory: report name, path, creation date, and last modification date. Look for:
- Orphaned reports: Reports not modified in 2+ years. These are candidates for decommissioning.
- High-traffic reports: Reports accessed daily or weekly. These are your migration priorities.
- Complex reports: Reports with embedded SQL, custom code, or cascading parameters. These require manual translation to Superset.
- Performance-critical reports: Reports that currently run slowly or time out. These are opportunities to optimise with Superset’s caching layer.
Typically, 30–40% of reports can be decommissioned or consolidated. This reduces migration scope by a third and simplifies your post-migration BI stack.
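The triage above can be sketched in a few lines of Python. This is illustrative only: the field names (`ModifiedDate`, and `WeeklyHits` derived from SSRS’s ExecutionLog views) are assumptions about how you’ve exported the inventory, and the thresholds are the ones suggested above.

```python
from datetime import datetime, timedelta

def triage_reports(reports, today, stale_after_days=730):
    """Bucket an SSRS report inventory into migration tiers.

    `reports` is a list of dicts with 'Name', 'ModifiedDate' (datetime),
    and 'WeeklyHits' (illustrative; derive from the ExecutionLog views).
    """
    tiers = {"decommission": [], "high_priority": [], "standard": []}
    for r in reports:
        if today - r["ModifiedDate"] > timedelta(days=stale_after_days):
            tiers["decommission"].append(r["Name"])   # orphaned: 2+ years untouched
        elif r["WeeklyHits"] >= 5:
            tiers["high_priority"].append(r["Name"])  # accessed ~daily: migrate first
        else:
            tiers["standard"].append(r["Name"])
    return tiers

inventory = [
    {"Name": "Daily Sales", "ModifiedDate": datetime(2025, 1, 10), "WeeklyHits": 35},
    {"Name": "Legacy KPI",  "ModifiedDate": datetime(2020, 3, 1),  "WeeklyHits": 0},
    {"Name": "Ops Summary", "ModifiedDate": datetime(2024, 6, 2),  "WeeklyHits": 2},
]
print(triage_reports(inventory, today=datetime(2025, 6, 1)))
```

Feed it your full inventory export and you have a defensible decommissioning list to circulate before migration starts.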
Step 2: Assess Data Sources and Connectivity
SSRS reports connect to various data sources: SQL Server, Oracle, SAP, Salesforce, REST APIs. You need to understand:
- Which data sources are used: Run a query on the SSRS database to extract data source configurations.
- Network connectivity: Can D23.io reach your data sources? For on-premises SQL Server, you’ll need a secure tunnel (VPN, ExpressRoute, or private link).
- Authentication: How does SSRS authenticate to your data sources? (Service account, Windows authentication, API key?) You’ll need to replicate this in D23.io.
- Query complexity: Are reports using stored procedures, views, or raw SQL? Stored procedures need to be converted to views or SQL queries for Superset.
For Australian enterprises, data residency is critical. If your SQL Server is hosted in Sydney and your reports are accessed locally, you’ll want D23.io to access it via a private network link (not over the public internet). This ensures data never leaves Australia.
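One way to make the residency audit systematic is to scan each data source’s connection string for known Australian-hosted endpoints. The sketch below is an illustration, not a complete check: the host patterns and sample connection strings are assumptions, and a real audit should confirm regions with your cloud provider.

```python
import re

# Illustrative allowlist: substrings that indicate Australian-hosted endpoints.
AU_HOST_PATTERNS = ["australiaeast", "australiasoutheast", "ap-southeast-2", ".corp.local"]

def flag_offshore_sources(connection_strings):
    """Return data sources whose server host doesn't match a known AU pattern."""
    offshore = []
    for name, conn in connection_strings.items():
        match = re.search(r"(?:Data Source|Server)=([^;]+)", conn, re.IGNORECASE)
        host = match.group(1).lower() if match else ""
        if not any(pattern in host for pattern in AU_HOST_PATTERNS):
            offshore.append(name)
    return offshore

sources = {
    "Finance DW":  "Server=findw.australiaeast.database.windows.net;Database=dw;",
    "CRM Extract": "Server=crm.eastus.database.windows.net;Database=crm;",
}
print(flag_offshore_sources(sources))  # ['CRM Extract']
```

Anything the scan flags needs either relocation to an Australian region or an explicit compliance sign-off before migration.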
Step 3: Define Success Metrics
Before you start migrating, define what “success” looks like:
- Report availability: 99.9% uptime (vs. 95% on-premises)
- Query performance: 50% reduction in dashboard load time (via caching)
- User adoption: 90% of SSRS users actively using Superset within 30 days
- Cost reduction: 40% reduction in reporting infrastructure costs
- Time-to-value: New dashboards shipped in 2 weeks (vs. 6–8 weeks with SSRS)
These metrics keep the project focused and give you clear cutover criteria.
Data Residency and Security Patterns for AU Enterprises
Australian Data Residency Requirements
Australian enterprises—especially government, defence, and financial services—must comply with strict data residency rules. The key frameworks are:
- IRAP (Information Security Registered Assessors Program): An ASD-endorsed assessment program for systems handling Australian Government data. Workloads assessed under IRAP are generally expected to keep data and infrastructure within Australia.
- PSPF (Protective Security Policy Framework): Applies to Australian public sector agencies. Requires data to be stored and processed within Australia.
- ASD Essential Eight: Australian Signals Directorate’s mandatory security controls for Australian Government agencies.
- Privacy Act 1988 (Cth): Australian privacy law. APP 8 restricts disclosing personal information outside Australia unless conditions such as informed consent or equivalent overseas protections are met.
D23.io is deployed within Australian data centres (Sydney, Melbourne, Brisbane). All data, metadata, logs, and backups remain in Australia. This satisfies IRAP, PSPF, and Privacy Act requirements without additional compliance engineering.
Security Architecture for D23.io
When migrating SSRS to D23.io, your security posture actually improves. Here’s why:
1. Encryption in Transit: All connections between your data sources and D23.io are encrypted with TLS 1.3. For on-premises SQL Server, you can use Azure ExpressRoute (private network link) or a site-to-site VPN to eliminate internet exposure.
2. Encryption at Rest: D23.io stores all metadata (dashboard definitions, user permissions, audit logs) in encrypted PostgreSQL. Cached query results are encrypted at rest with AES-256. Sensitive data is never logged or cached unless explicitly configured.
3. Identity and Access Management: D23.io integrates with your corporate directory (Okta, Entra ID, SAML). Users authenticate once (single sign-on) and permissions are enforced at the dashboard and row level. No more managing local SSRS user accounts.
4. Audit Logging: Every action in D23.io is logged: who accessed which dashboard, which queries ran, which data was exported. This audit trail is essential for SOC 2 and ISO 27001 compliance, and D23.io’s audit logging integrates with Vanta for continuous compliance monitoring.
5. Network Isolation: D23.io can be deployed in a VPC with no public internet access. All connections from your on-premises SQL Server go through a private link (ExpressRoute, PrivateLink, or VPN). This eliminates the attack surface of internet-exposed SSRS servers.
Data Residency Patterns
For Australian enterprises, we typically implement one of three patterns:
Pattern 1: Fully Cloud (Recommended)
Migrate your reporting database to Azure SQL Database (Australia region) or Amazon RDS (Sydney region). D23.io connects directly to the cloud database via a private link. All data stays in Australia. This is the simplest pattern and offers the best performance.
Pattern 2: Hybrid (On-Prem + Cloud)
Keep your operational SQL Server on-premises. D23.io accesses it via a VPN or ExpressRoute tunnel. This works well if you have strict on-premises requirements, but adds network latency and complexity. Ensure the tunnel is encrypted and monitored.
Pattern 3: Data Warehouse + Analytics
Migrate reporting data to a cloud data warehouse (Snowflake, Databricks, or BigQuery Australia region). D23.io queries the warehouse. This is ideal if you’re already running a modern data stack and want to separate analytics from operational workloads.
For most Australian enterprises, Pattern 1 (fully cloud) is the fastest and most secure. It eliminates on-premises infrastructure, simplifies compliance, and improves query performance.
Step-by-Step Migration Strategy
Phase 1: Foundation (Weeks 1–2)
Week 1: Infrastructure Setup
- Provision D23.io instance in Australian data centre (Sydney or Melbourne)
- Configure identity provider integration (Okta, Entra ID, SAML)
- Set up network connectivity to your data sources (VPN, ExpressRoute, or private link)
- Create encryption keys for sensitive data
- Enable audit logging and configure log retention (minimum 90 days for compliance)
Week 2: Data Source Configuration
- Register your SQL Server (or cloud database) as a data source in D23.io
- Test connectivity and validate query performance
- Create database views for complex SSRS queries (stored procedures must be converted to views)
- Set up caching policies (e.g., cache dashboard queries for 5 minutes)
- Document all data sources, authentication methods, and refresh schedules
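On D23.io these caching policies are managed for you, but for reference, in Superset itself they’re a short configuration block. The sketch below assumes a self-managed Superset with a Redis backend; the key names follow Superset’s Flask-Caching-based configuration, and the Redis URL is a placeholder.

```python
# superset_config.py -- illustrative cache policy for a self-managed Superset.
# On a managed platform, equivalent settings are exposed through the console.
CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",
    "CACHE_DEFAULT_TIMEOUT": 300,          # cache for 5 minutes, per the policy above
    "CACHE_KEY_PREFIX": "superset_",
    "CACHE_REDIS_URL": "redis://localhost:6379/0",  # placeholder endpoint
}

# Chart data (query results) can use its own cache with the same backend.
DATA_CACHE_CONFIG = {**CACHE_CONFIG, "CACHE_KEY_PREFIX": "superset_data_"}
```

Short timeouts (minutes, not hours) are a sensible default during migration, when you’re still validating numbers against SSRS.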
Phase 2: Semantic Layer (Weeks 3–4)
This is where you build the “single source of truth” for your metrics. Instead of defining “Revenue” in 50 different SSRS reports, you define it once in the semantic layer and reuse it everywhere.
Week 3: Dimension and Measure Definition
- Audit all SSRS reports and extract common dimensions (Date, Customer, Product, Region) and measures (Revenue, Cost, Margin)
- Create dimensions in D23.io’s semantic layer (typically using dbt or SQL views)
- Define measures with consistent business logic (e.g., Revenue = SUM(order_amount) WHERE order_status = 'completed')
- Document the semantic layer with business definitions (“Revenue = all completed orders, excluding refunds and cancellations”)
Week 4: Validation and Reconciliation
- Run sample queries in both SSRS and D23.io. Numbers must match exactly.
- Compare SSRS report totals with D23.io dashboard totals. Investigate discrepancies.
- Validate date calculations, currency conversions, and any custom business logic
- Get sign-off from business stakeholders (Finance, Sales, Operations) that numbers are correct
This phase typically takes 2–3 weeks for mid-market organisations. It’s the most critical phase—if your semantic layer is wrong, all downstream dashboards will be wrong.
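Reconciliation is easy to script. The sketch below compares metric totals exported from both systems and flags anything outside a 0.01% relative tolerance; the metric names and export format are illustrative.

```python
def reconcile(ssrs_totals, superset_totals, tolerance=0.0001):
    """Compare metric totals from both systems; return metrics outside tolerance.

    tolerance=0.0001 is a relative tolerance of 0.01%.
    """
    discrepancies = {}
    for metric, ssrs_value in ssrs_totals.items():
        new_value = superset_totals.get(metric)
        if new_value is None:
            discrepancies[metric] = "missing in Superset"
        elif abs(new_value - ssrs_value) > abs(ssrs_value) * tolerance:
            discrepancies[metric] = f"SSRS={ssrs_value} vs Superset={new_value}"
    return discrepancies

ssrs = {"revenue": 1_204_300.55, "order_count": 48_221}
superset = {"revenue": 1_204_300.55, "order_count": 48_198}
print(reconcile(ssrs, superset))  # flags order_count, which drifted ~0.05%
```

Run this against every report you migrate, and keep the outputs: they double as evidence for stakeholder sign-off.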
Phase 3: Dashboard Migration (Weeks 5–8)
Now you migrate SSRS reports to Superset dashboards. This is where you get creative—Superset dashboards are more interactive and visually rich than SSRS reports.
Week 5: High-Priority Dashboards
Start with your 5–10 most-used reports. These are the ones your executives check daily. For each report:
- Extract the SQL query from the SSRS report (usually embedded in the .rdl report definition)
- Translate it to a Superset dataset (a reusable SQL query with parameters)
- Create a Superset dashboard with the same visualisations (table, chart, gauge, etc.)
- Add interactivity: filters, drill-downs, date pickers
- Test with 2–3 key users. Gather feedback.
Week 6–7: Medium and Low-Priority Reports
Repeat the process for the remaining 30–50 reports. By now, your team has a rhythm and can migrate 5–10 reports per week.
Week 8: Decommissioning
For the 30–40% of reports you identified as orphaned or redundant, formally decommission them. Document which dashboard replaced them (if any) and notify users.
Phase 4: User Acceptance Testing (Weeks 9–10)
- Grant 20–30 power users access to D23.io
- Ask them to use Superset dashboards in parallel with SSRS for 2 weeks
- Collect feedback: missing reports, incorrect numbers, performance issues
- Fix critical issues. Document minor issues for post-cutover improvement
- Get sign-off from business stakeholders
Phase 5: Cutover (Week 11)
- Friday afternoon: Disable SSRS for all users except a small support team
- Weekend: Run final validation. Compare SSRS and D23.io numbers one more time.
- Monday morning: Enable D23.io for all users. Have a support team on standby for 48 hours.
- Week 2: Monitor performance, gather feedback, fix issues. Decommission SSRS infrastructure.
Semantic Layer and BI Layer Configuration
Building Your Semantic Layer
The semantic layer is the foundation of your analytics stack. It’s where you define what “Revenue,” “Customer,” and “Margin” mean. Once defined, every dashboard uses these definitions, eliminating inconsistency.
In D23.io, the semantic layer is built using:
- Datasets: Reusable SQL queries that define a business entity (e.g., “Orders”, “Customers”, “Products”)
- Columns: Dimensions (categorical) and measures (numeric) within each dataset
- Filters: Business rules (e.g., “exclude test orders”, “only completed transactions”)
- Relationships: Links between datasets (Orders → Customers, Orders → Products)
For example, an “Orders” dataset might look like:
Dataset: Orders
Dimensions:
- Order Date (DATE)
- Customer Name (STRING)
- Product Category (STRING)
- Region (STRING)
Measures:
- Revenue (SUM of order_amount)
- Order Count (COUNT)
- Average Order Value (Revenue / Order Count)
Filters:
- Exclude test orders (order_source != 'test')
- Exclude refunds (order_status = 'completed')
Relationships:
- Orders → Customers (on customer_id)
- Orders → Products (on product_id)
Once you’ve defined this, every dashboard can use “Revenue” and it will calculate consistently across the entire organisation.
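To see how these definitions behave, here’s an illustrative Python model that applies the dataset’s filters and computes its three measures over raw order rows. In practice the semantic layer generates equivalent SQL; this sketch just makes the logic readable, and the field names are assumptions.

```python
def orders_measures(rows):
    """Compute the Orders dataset's measures with its filters applied.

    Mirrors the definitions above: exclude test orders, keep only
    completed orders. Field names are illustrative.
    """
    kept = [r for r in rows
            if r["order_source"] != "test" and r["order_status"] == "completed"]
    revenue = sum(r["order_amount"] for r in kept)
    order_count = len(kept)
    return {
        "revenue": revenue,
        "order_count": order_count,
        "avg_order_value": revenue / order_count if order_count else 0.0,
    }

rows = [
    {"order_amount": 100.0, "order_status": "completed", "order_source": "web"},
    {"order_amount": 250.0, "order_status": "completed", "order_source": "web"},
    {"order_amount": 999.0, "order_status": "completed", "order_source": "test"},
    {"order_amount": 75.0,  "order_status": "refunded",  "order_source": "web"},
]
print(orders_measures(rows))  # revenue 350.0, count 2, AOV 175.0
```

Note how the test order and the refund never touch any measure: that consistency is exactly what the semantic layer buys you.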
Designing Your BI Layer
The BI layer is where analysts build dashboards and reports. In Superset, a dashboard is a collection of charts (tables, bar charts, line charts, maps, etc.) connected by filters.
When designing dashboards, follow these principles:
1. One metric per chart: Each chart should answer a single question. “What’s our revenue by region?” is one chart. “What’s our revenue by region and product?” is a different chart.
2. Progressive disclosure: Start with a summary view (total revenue). Let users drill down to details (revenue by region, then by customer).
3. Consistent colour schemes: Use the same colour for “Revenue” across all dashboards. This builds intuition.
4. Responsive design: Dashboards should work on desktop, tablet, and mobile. Superset handles this automatically.
5. Performance budgets: Each chart should load in <2 seconds. If a chart takes >5 seconds, optimise the underlying query or enable caching.
Role-Based Access Control (RBAC)
Superset’s RBAC system lets you control who sees what. For example:
- Sales team: Can see Revenue, Orders, and Customers dashboards. Cannot see Cost or Margin.
- Finance team: Can see all dashboards, including Cost and Margin.
- Executives: Can see summary dashboards with YoY trends. Cannot see transaction-level details.
- Analysts: Can see all dashboards and create new ones.
You define roles in D23.io and assign permissions at the dashboard and dataset level. For sensitive data (e.g., customer PII), you can enable row-level security (RLS) to ensure users only see data relevant to their region or business unit.
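The role examples above boil down to a mapping from roles to permitted dashboards. In Superset you configure this through the UI rather than code, but an illustrative sketch of the access check looks like this (the role and dashboard names are hypothetical):

```python
# Hypothetical role-to-dashboard grants, mirroring the examples above.
ROLE_DASHBOARDS = {
    "sales":     {"Revenue", "Orders", "Customers"},
    "finance":   {"Revenue", "Orders", "Customers", "Cost", "Margin"},
    "executive": {"Executive Summary"},
}

def can_view(user_roles, dashboard):
    """A user can view a dashboard if any of their roles grants it."""
    return any(dashboard in ROLE_DASHBOARDS.get(role, set())
               for role in user_roles)

print(can_view(["sales"], "Margin"))             # False: sales can't see Margin
print(can_view(["sales", "finance"], "Margin"))  # True: finance grants it
```

Keeping grants at the role level (rather than per user) is what makes the permission model auditable: a handful of role definitions replaces hundreds of individual SSRS entries.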
Testing, Validation, and Cutover
Pre-Cutover Testing Checklist
Before you shut down SSRS, complete this checklist:
Data Validation
- All SSRS report totals match D23.io dashboard totals (within 0.01%)
- Date calculations are identical (YTD, MTD, rolling 12-month, etc.)
- Filters work correctly (date range, customer, product, region)
- Drill-down paths are intuitive and correct
- Exports (CSV, Excel, PDF) are formatted correctly
Performance Validation
- Dashboard load time <3 seconds (vs. SSRS >10 seconds)
- Filters apply instantly (<1 second)
- Caching is working (repeated queries are <100ms)
- No timeouts or “query too complex” errors
User Acceptance
- 20+ power users have tested D23.io for 2 weeks
- Feedback has been collected and critical issues fixed
- Users are comfortable with the new interface
- Training materials are complete
Security and Compliance
- User permissions are correctly configured (RBAC)
- Audit logging is enabled and working
- Encryption is enabled for sensitive data
- Data residency requirements are met (all data in Australia)
- Compliance checklist is complete (SOC 2, ISO 27001, IRAP, PSPF)
Infrastructure
- Backups are automated and tested
- Disaster recovery plan is documented
- Monitoring and alerting are configured
- Support runbooks are written
Cutover Plan
The cutover is the moment you switch from SSRS to D23.io. Here’s a safe approach:
Friday 4 PM: Send a message to all users: “SSRS will be offline for maintenance this weekend. Please use D23.io for all reporting starting Monday.”
Friday 5 PM: Disable SSRS for all users except a small support team. Keep a senior analyst and a DBA on call.
Friday 5 PM–Sunday 11 PM: Run validation:
- Compare SSRS and D23.io numbers for 50+ reports
- Investigate any discrepancies
- Fix bugs
- Run performance tests
Monday 8 AM: Enable D23.io for all users. Have a support team on standby for 48 hours.
Monday–Friday: Monitor usage, gather feedback, fix bugs. Most issues will surface in the first 24 hours.
Following Monday: Decommission SSRS infrastructure. Cancel SQL Server licenses. Celebrate.
Common Issues and Remediation
Issue 1: “Numbers don’t match between SSRS and D23.io”
Cause: Rounding errors, timezone differences, or different join logic.
Fix: Compare the underlying SQL queries. SSRS might be using a view with different business logic than your Superset query. Align the queries.
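Timezone differences are a common culprit and easy to demonstrate. A late-evening UTC order lands in different date buckets depending on the reporting timezone, which shifts month-end rows between periods:

```python
from datetime import datetime, timezone, timedelta

# An order placed late evening UTC on 30 June...
order_ts = datetime(2025, 6, 30, 22, 30, tzinfo=timezone.utc)

# ...falls on 30 June for a report that buckets by UTC date,
utc_date = order_ts.date()

# ...but on 1 July when bucketed in Sydney time (UTC+10, ignoring DST here).
sydney = timezone(timedelta(hours=10))
sydney_date = order_ts.astimezone(sydney).date()

print(utc_date, sydney_date)  # 2025-06-30 2025-07-01
```

If SSRS buckets by server-local time and Superset by UTC (or vice versa), monthly totals will disagree by exactly the orders that straddle midnight; pin both systems to one timezone before chasing other causes.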
Issue 2: “Dashboards are slow”
Cause: Complex queries, missing indexes, or caching not configured.
Fix: Profile the slow queries. Add indexes to your database. Enable caching for frequently accessed dashboards. Consider moving to a data warehouse (Snowflake, Databricks) for better performance.
Issue 3: “Users can’t find reports”
Cause: Poor dashboard naming or organisation.
Fix: Organise dashboards into folders (e.g., “Sales”, “Finance”, “Operations”). Use consistent naming. Add descriptions to each dashboard.
Issue 4: “Permissions are wrong”
Cause: RBAC not configured correctly or users not assigned to roles.
Fix: Audit user permissions. Ensure each user is assigned to the correct role. Test permissions with a test user account.
Post-Migration Optimisation and Training
User Training and Adoption
Migrating to D23.io isn’t just a technical project—it’s a change management project. Users are comfortable with SSRS. Superset is different. You need to train them.
Week 1 (Pre-Cutover)
- Send a 5-minute video: “Welcome to D23.io. Here’s how to find your dashboard.”
- Hold optional office hours for questions
- Create a quick-start guide (1 page, with screenshots)
Week 2 (Cutover Week)
- Hold live training sessions: “How to filter, drill-down, and export data in Superset”
- Pair power users with analysts for 1-on-1 training
- Have a Slack channel for questions
Week 3–4 (Post-Cutover)
- Host advanced training: “Creating your own dashboards”
- Share best practices: “How to design dashboards that answer business questions”
- Gather feedback and iterate
Optimisation Roadmap
After cutover, your D23.io deployment will evolve. Here’s a typical 6-month roadmap:
Month 1: Stabilisation. Fix bugs, gather feedback, optimise slow queries.
Month 2: Advanced features. Enable row-level security (RLS) for sensitive data. Add alerts (“notify me if revenue drops >10%”). Integrate with Slack.
Month 3: Self-service analytics. Train analysts to create their own dashboards. Enable scheduled exports (“email me a CSV every Monday”).
Month 4–6: Modernisation. Migrate reporting data to a cloud data warehouse (Snowflake, Databricks). Integrate with dbt for data transformation. Add real-time dashboards (streaming data from Kafka, Kinesis).
Measuring Success
Track these metrics to measure the success of your migration:
Technical Metrics
- Dashboard load time (target: <3 seconds)
- Query performance (target: 50% improvement vs. SSRS)
- Uptime (target: 99.9%)
- Cost (target: 40% reduction vs. SSRS)
Business Metrics
- User adoption (target: 90% of users actively using D23.io within 30 days)
- Dashboard creation rate (target: 5+ new dashboards per month)
- Time-to-insight (target: 2 weeks from “I need a report” to “report is live”)
- User satisfaction (target: 8/10 NPS)
Review these metrics monthly. Share them with stakeholders. Celebrate wins (“We reduced query time by 60%!”). Address issues (“3 dashboards are underutilised—let’s consolidate them”).
Common Pitfalls and How to Avoid Them
Pitfall 1: Underestimating Semantic Layer Complexity
Problem: Teams assume they can build the semantic layer in a week. They can’t. Complex business logic (revenue recognition, customer lifetime value, attribution models) takes time to define and validate.
Solution: Allocate 3–4 weeks for semantic layer development. Involve finance, sales, and operations teams. Get written sign-off on every metric definition. Use the $50K D23.io consulting engagement as a benchmark—6 weeks is a realistic timeline for a mid-market organisation.
Pitfall 2: Not Planning for Data Residency
Problem: Teams assume D23.io can access any data source. For Australian enterprises, data residency is non-negotiable. If your SQL Server is in the US and D23.io is in Australia, data is flowing internationally—potentially violating Privacy Act or IRAP requirements.
Solution: Audit your data sources early. Ensure they’re in Australia or accessible via a private link. For sensitive data, migrate to a cloud database in an Australian region (Azure SQL Database, RDS Sydney, Snowflake AU region).
Pitfall 3: Ignoring Performance Until Cutover
Problem: Teams build dashboards without thinking about performance. After cutover, they’re shocked when dashboards take 30 seconds to load.
Solution: Profile queries during UAT. If a query takes >5 seconds, optimise it before cutover. Add indexes, simplify joins, or enable caching. Use D23.io’s query profiling tools to identify bottlenecks.
Pitfall 4: Insufficient User Training
Problem: Teams assume users will figure out Superset. They won’t. After cutover, support tickets flood in: “Where’s my report?”, “How do I export to Excel?”, “Why are the numbers different?”
Solution: Invest in training. Create video tutorials. Hold live training sessions. Pair power users with analysts. Have a Slack channel for questions. The first 2 weeks after cutover are critical—good support prevents mass user defection.
Pitfall 5: Over-Engineering RBAC
Problem: Teams try to replicate SSRS permissions exactly. SSRS has 200+ users with 50+ custom permission groups. Replicating this is a nightmare.
Solution: Simplify RBAC during migration. Consolidate permission groups. Use role-based access (Sales, Finance, Operations) instead of per-user permissions. You can always add complexity later.
Pitfall 6: Not Having a Rollback Plan
Problem: Teams shut down SSRS immediately after cutover. If something goes wrong (data mismatch, performance issues), they have no fallback.
Solution: Keep SSRS running for 2 weeks after cutover. If critical issues emerge, you can roll back. After 2 weeks, decommission SSRS.
Real-World Example: The $50K D23.io Engagement
To illustrate how this works in practice, let’s walk through a real engagement. A mid-market financial services company (500 employees, $100M revenue) migrated from SSRS to D23.io. Here’s what they did:
Engagement Overview
- Duration: 6 weeks
- Cost: $50,000 (fixed-fee)
- Team: 2 PADISO engineers, 3 client analysts, 1 client DBA
- Scope: 80 SSRS reports → 25 Superset dashboards, semantic layer, training
Week 1–2: Foundation
- Provisioned D23.io in Sydney data centre
- Configured Okta integration for single sign-on
- Set up VPN tunnel to on-premises SQL Server
- Inventoried 80 SSRS reports; identified 30 for decommissioning
Week 3–4: Semantic Layer
- Defined 8 core datasets: Customers, Accounts, Transactions, Products, Employees, Branches, Campaigns, Costs
- Aligned on metric definitions: Revenue (completed transactions only), Cost (including overhead allocation), Margin (Revenue - Cost)
- Validated numbers against SSRS reports; fixed 3 discrepancies (rounding, timezone, join logic)
Week 5–6: Dashboard Migration
- Migrated 25 high-priority reports to Superset dashboards
- Added interactivity: date filters, customer drill-downs, region filters
- Configured caching: dashboard queries cached for 1 hour
- Trained 30 power users; held 4 training sessions
Cutover
- Friday 5 PM: Disabled SSRS
- Weekend: Validated numbers, fixed 2 minor bugs
- Monday 8 AM: Enabled D23.io for all users
- Week 2: Monitored usage, fixed support tickets (mostly “how do I export?”)
Results
- Dashboard load time: 15 seconds (SSRS) → 2 seconds (D23.io) = 87% improvement
- Infrastructure cost: $25K/year (SSRS) → $12K/year (D23.io) = 52% reduction
- Time-to-dashboard: 6 weeks (SSRS) → 2 weeks (D23.io) = 3x faster
- User adoption: 95% of users actively using D23.io within 30 days
- Cost per dashboard: $2,000 (including infrastructure, training, support)
This engagement is documented in detail in the $50K D23.io consulting engagement breakdown, which includes architecture diagrams, semantic layer definitions, and training materials.
Advanced Topics: Integration with Modern Data Stacks
Integrating with dbt for Data Transformation
If you’re already using dbt (data build tool) for data transformation, D23.io integrates seamlessly. Instead of defining business logic in the semantic layer, you define it in dbt models, then expose those models in D23.io.
For example:
-- models/marts/fct_orders.sql (dbt)
SELECT
order_id,
customer_id,
order_date,
order_amount,
order_status,
CASE
WHEN order_status = 'completed' THEN order_amount
ELSE 0
END AS revenue,
CASE
WHEN order_status = 'completed' THEN 1
ELSE 0
END AS order_count
FROM {{ ref('stg_orders') }}
WHERE order_source != 'test'
Then in D23.io, you simply expose fct_orders as a dataset. The business logic is already in dbt.
This pattern is ideal if you’re migrating to a modern data stack (Snowflake, Databricks, BigQuery) alongside D23.io.
Real-Time Dashboards with Streaming Data
Superset supports real-time dashboards via Apache Druid. If you’re streaming data from Kafka or Kinesis, you can configure D23.io to ingest that data and display it in real-time dashboards.
For example, an e-commerce company might stream order events to Kafka, ingest them into Druid, and display live order volume, revenue, and customer counts on a dashboard.
This is advanced, but increasingly common for organisations building real-time analytics.
Row-Level Security (RLS) for Multi-Tenant Deployments
If you’re a SaaS company or have multi-tenant reporting requirements, D23.io’s RLS feature is powerful. You can configure dashboards so that each customer only sees their own data.
For example:
-- Dataset: Customer Orders
-- RLS rule: customer_id = {{ current_user_id() }}
Now, when a customer logs in, they only see orders matching their ID. Superset appends this predicate to every query against the dataset, so the filter cannot be bypassed from the dashboard layer—no data leakage.
Compliance and Audit Readiness
SOC 2 and ISO 27001 Compliance
If you’re pursuing SOC 2 or ISO 27001 certification, D23.io’s audit logging is essential. Every action is logged:
- Who accessed which dashboard
- Which queries ran
- Which data was exported
- When permissions changed
- When dashboards were modified
For organisations managing SOC 2 or ISO 27001 compliance in Vanta, D23.io’s audit logs integrate directly for continuous compliance monitoring.
IRAP and PSPF Compliance for Government Contractors
D23.io is designed for government and defence contractors. All infrastructure is in Australian data centres, audit logging is comprehensive, and data residency is guaranteed.
For IRAP-certified deployments, we typically implement:
- Encryption at rest (AES-256) and in transit (TLS 1.3)
- Multi-factor authentication (MFA) for all users
- Network isolation (no public internet access)
- Comprehensive audit logging (90+ days retention)
- Regular security assessments and penetration testing
Next Steps and Getting Started
If You’re Ready to Migrate
- Schedule a discovery call: Discuss your SSRS estate, data sources, and compliance requirements. PADISO offers a free 30-minute discovery call to assess your migration scope and timeline.
- Get a migration estimate: Based on your report inventory and complexity, we’ll provide a fixed-fee estimate. Most mid-market migrations cost $40K–$80K and take 8–12 weeks.
- Start with a proof-of-concept: Migrate 5–10 high-priority reports to D23.io. Validate numbers, test performance, gather user feedback. This typically takes 2–3 weeks and costs $10K–$15K.
- Plan your full migration: Once the POC is successful, plan the full migration. We recommend a phased approach: foundation (weeks 1–2), semantic layer (weeks 3–4), dashboard migration (weeks 5–8), UAT (weeks 9–10), cutover (week 11).
Key Resources
For technical guidance on SSRS migration, refer to Microsoft’s official documentation on migrating SSRS native mode installations and comprehensive guides on moving SQL Reporting Services to another server. For cloud database migration strategies, Azure Database Migration Service documentation provides authoritative guidance.
To understand the broader context of analytics platform selection, Gartner’s Magic Quadrant research and Forrester’s analysis of cloud database solutions are valuable references. For industry trends, Computer Weekly’s coverage of SQL Server migration to the cloud provides timely case studies.
For technical deep-dives, SQLShack’s SSRS tutorial and Red Gate’s collection of SSRS articles are excellent references.
Why Partner With PADISO
Migrating from SSRS to cloud analytics is complex. You need partners who understand:
- SSRS internals: How to extract reports, migrate encryption keys, and validate data
- D23.io architecture: How to design semantic layers, configure caching, and optimise performance
- Australian compliance: Data residency, IRAP, PSPF, Privacy Act, ASD Essential Eight
- Change management: How to train users, manage cutover risk, and drive adoption
PADISO has migrated 50+ organisations from SSRS to cloud analytics. We’ve built semantic layers for financial services, healthcare, government, and e-commerce companies. We understand the pitfalls and know how to avoid them.
Our AI & Agents Automation and Platform Design & Engineering services can also help you modernise your data stack beyond reporting—integrating real-time data pipelines, streaming analytics, and AI-driven insights.
For organisations pursuing compliance, our Security Audit (SOC 2 / ISO 27001) services ensure your D23.io deployment meets audit requirements from day one.
Contact PADISO
Ready to migrate? Let’s talk. PADISO offers:
- Free discovery call: 30 minutes to assess your SSRS estate and migration scope
- Fixed-fee migration engagements: $40K–$80K for mid-market organisations
- Fractional CTO support: Ongoing guidance as you modernise your analytics stack
- Training and change management: Ensure user adoption and long-term success
Visit PADISO’s website to learn more about our venture studio and co-build services, AI strategy and readiness programs, and platform engineering expertise.
For Sydney-based teams, we also offer on-site workshops and training. For distributed teams, we work remotely with asynchronous documentation and regular syncs.
Summary
Migrating from on-premises SQL Server Reporting to D23.io’s cloud Superset stack is a strategic investment. You’ll reduce infrastructure costs by 30–50%, ship dashboards weeks faster, and gain the flexibility and compliance-readiness that modern analytics demands.
The key to success is planning. Spend time on:
- Semantic layer design: Get your metrics right. This is the foundation of everything.
- Data residency: Ensure all data stays in Australia. This satisfies compliance and performance requirements.
- User training: Invest in change management. Technology is only half the battle.
- Testing and validation: Validate numbers, performance, and user acceptance before cutover.
- Phased migration: Don’t try to migrate everything at once. Start with high-priority reports, learn, iterate.
With a 12-week timeline, a dedicated team, and the right partner, you’ll have a modern analytics platform that scales with your business and meets your compliance requirements.
Ready to start? Contact PADISO for a free discovery call. Let’s build your analytics future together.