In this post I’m going to share a step-by-step guide on how I created a sales dashboard for the WideWorldImporters sample database using Microsoft Fabric and Power BI Desktop.
You can check out this post for a guide on the ETL process to get the datasets from SQL Server into Microsoft Fabric.
PS: Check out the GitHub repository for this project here.
Business Demand Overview
Business Problem:
The sales team at WideWorldImporters currently lacks a comprehensive and interactive dashboard that provides insights into various aspects of sales performance. This results in delayed decision-making and missed opportunities to optimize sales strategies.
Objective:
Develop a dynamic Sales Dashboard that visualizes key sales data from multiple dimensions (e.g., customer, product, region, and time), to enable the sales team to monitor performance, identify trends, and take data-driven actions. The dashboard should be user-friendly, accessible, and updated in real-time.
Stakeholders:
- Sales Manager: Needs to monitor overall sales performance and identify areas for improvement.
- Sales Representatives: Require detailed insights into their respective territories and customer segments.
- Marketing Team: Interested in understanding customer behavior and product popularity.
- Logistics Team: Needs data on delivery performance to optimize operations.
- Executives: Require high-level overviews for strategic decision-making.
User Stories
User Story 1: Sales Performance Overview
As a Sales Manager, I want to view the Total Sales, Total Profit, and Profit Margin metrics, so that I can quickly assess the overall financial performance of the company.
User Story 2: Regional Sales Analysis
As a Regional Sales Director, I want to compare sales performance across different cities and regions, so that I can allocate resources more effectively and target underperforming areas.
Acceptance Criteria:
The dashboard should provide a map or chart showing sales by region and city.
Data should be drillable to see sales at the city level.
Users should be able to compare different regions side-by-side.
User Story 3: Sales Trends Analysis
As a Business Analyst, I want to analyze trends in Total Sales and Total Profit over time (monthly, quarterly, yearly), so that I can identify periods of strong or weak sales performance and understand seasonal impacts.
User Story 4: Top-Selling Products
As a Marketing Analyst, I want to identify the top 10 best-selling products over the past year, so that I can focus our marketing efforts on promoting these products.
Acceptance Criteria:
The dashboard should list the top 10 products by sales revenue.
Each product should be clickable to view more detailed information (e.g., sales by region or customer segment).
The list should update dynamically based on the selected time period.
User Story 5: Delivery Performance
As a Logistics Manager, I want to track average delivery times for sales orders, so that I can identify and address any bottlenecks in the delivery process.
Acceptance Criteria:
The dashboard should display the average delivery time per region, product, and time period.
Users should be able to view trends in delivery performance over time.
Alerts should be set up for delivery times that exceed a certain threshold.
Data Gathering and Preparation
The WWI sample database has been restored in SQL Server and imported into Microsoft Fabric, which I documented in the ETL Guide. Some preliminary exploration of the datasets was also done using Spark SQL, which I documented here.
The datasets were structured such that the original tables are stored in a warehouse, and a shortcut was created in a lakehouse (in the same Fabric workspace) pointing to those tables. This is because I wanted the tables to stay in a warehouse so that I could potentially run full transactional DDL and DML queries on them, but I also wanted the ability to work with the tables in a lakehouse notebook.
The dataset was ready for analysis; I just needed one more view/table to add.
Creating a new FactSales view/table
Since the fact_sales table doesn’t link directly to the payment methods table (I found this out in my previous analysis of the tables), I wanted to create a view that adds the payment method key to each sales transaction (i.e. each row) in the sales table. The payment method key can be obtained from the transactions table, which I will not be bringing into my semantic model for the sales report.
These were the steps I took:
Wrote a cross-database query that creates a view in my lakehouse using both the sales and transactions tables from the warehouse (a view can be created in a lakehouse from the SQL analytics endpoint).
Wanted to add the new ‘FactSales’ view as part of the model for the report, but received a warning that having a view in my semantic model might have performance implications.
So I had to create a ‘FactSales’ table instead. And since you cannot create a table from a lakehouse’s SQL endpoint, I could either create it from a notebook using Spark or from a warehouse using T-SQL.
- To create the new ‘FactSales’ table from a notebook (tables created in a notebook are stored as Delta tables in the lakehouse), I would have had to enable column mapping, because the column names from the source tables (the warehouse tables that the shortcuts point to) have spaces in them, and Delta tables by default do not allow special characters, including spaces, in column names.
But enabling column mapping would make the table unusable in the workspace’s warehouse, the T-SQL endpoint, and semantic models, because none of them support column mapping.
- So I ended up creating the new FactSales table in the warehouse (under a new schema) and created a shortcut to it from the lakehouse (just like the other tables in the lakehouse).
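As a rough sketch, the warehouse query looked something like the T-SQL below. The schema, table, and column names here are illustrative assumptions rather than the exact WWI identifiers, and the join key (the WWI invoice ID shared by a sale and its customer transaction) is one plausible way to line the two tables up:

```sql
-- Hedged sketch: build a FactSales table in the warehouse (under a new
-- schema) by joining the sales fact to the customer transactions table,
-- picking up each sale's payment method key. Names are illustrative.
CREATE TABLE model.FactSales
AS
SELECT
    s.*,                              -- all original sales columns
    t.[Payment Method Key]            -- added from the transactions table
FROM  wwi.fact_sale AS s
LEFT JOIN wwi.fact_transaction AS t
    ON s.[WWI Invoice ID] = t.[WWI Invoice ID];
```

Fabric warehouses support CREATE TABLE AS SELECT (CTAS), which is what makes the table-in-the-warehouse route straightforward compared to the notebook route.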
Creating and preparing a semantic model
I created a new custom Direct Lake semantic model (from the lakehouse) using the new FactSales table and the other dimension tables needed for the report.
Created some measures for the semantic model right there in Fabric.
Changed the data category of the city column to ‘City’.
Sorted the ‘month’ column in the date dimension table by the ‘month number’ column so that it gets ordered correctly on visuals.
Connected Power BI Desktop to the Direct Lake semantic model created earlier.
Created some measures within Power BI Desktop as well.
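The core measures were along these lines (a hedged sketch in DAX; the exact column names in the FactSales table may differ from these):

```dax
-- Assumed column names: FactSales[Total Including Tax], FactSales[Profit]
Total Sales = SUM ( FactSales[Total Including Tax] )

Total Profit = SUM ( FactSales[Profit] )

-- DIVIDE handles the divide-by-zero case gracefully
Profit Margin = DIVIDE ( [Total Profit], [Total Sales] )
```

Defining these as model measures (rather than report-level ones) means every report built on the semantic model can reuse them.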
Creating a report in Power BI Desktop and documentation on how to use it for stakeholders
Below are the visuals created for the different user stories in the project, including notes on how to use them, and other necessary descriptions.
KPI Card for User Story 1 (for Sales Manager):
- Created a dynamic KPI card that displays the Total Sales, Total Profit, and Profit Margin metrics for the Sales Manager
- It can be filtered by year, month, customer, product, and sales territory
- The indicator color of the card visual changes to red for any filter context where the profit margin drops below 30%
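One common way to drive that red indicator is a color measure wired into the card’s conditional formatting (fx) option. This is a sketch, assuming a [Profit Margin] measure exists on the model and using arbitrary hex colors:

```dax
-- Hypothetical sketch: returns a hex color string that the card's
-- conditional formatting uses for the indicator/callout color.
KPI Color =
IF ( [Profit Margin] < 0.30, "#D64550", "#107C10" )
```

Because the measure is evaluated in the current filter context, the card turns red automatically for any slicer combination where the margin falls below the 30% threshold.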
Map Visual for User Story 2 (for Regional Sales Director):
- Created a map visual that shows sales across different cities
- It can be drilled up and down between the city and country levels
- The tooltip for this visual also shows profit for the location hovered over
Area Chart for User story 3 (Business Analyst):
- Created an area chart showing sales and profit trend by month
- Can be drilled up or down between month and year
- Can be filtered by customer, product, and sales territory
Bar Chart for User story 4 (Marketing Analyst):
- Created a Bar Chart that shows the Top 10 best selling products
- The tooltip for the visual also shows the total quantity sold for a product as well as its unit price
- Can be filtered by time, region, and every other slicer on the report
Table Visual for User Story 5 (for Logistics Manager):
- Created a table that shows the average delivery days for each product compared to its expected delivery days
- Can be filtered by region and time
- Created a Data Activator alert on the visual that sends an email once the average delivery days for any product exceeds one day
The Final dashboard
END!
Don’t forget to connect with me on LinkedIn.
Want to support my work? Click here.
Thanks for reading, and I’ll catch you in the next one.