Tuesday, February 17, 2026

How to return the Last Refresh Date of SQL Tables in Power BI

How to use Power Query to return the Last Refresh Date of SQL Tables in Power BI
Scenario:
We have a set of tables/views sourced from Azure Synapse Analytics (SQL), each containing a column named _modified_timestamp. Using this column, we need to generate a summary dataset in Power BI that returns the last refresh date for each table/view.

Src_Tables = {"dim_Country", "dim_Product", "dim_Segment"}

We can achieve this scenario using the following Power Query:

let
    // Define Data Source Parameters
    _server = "MyServer-synapse-server.database.windows.net",
    _database = "db_MyAnalytics",
    _schema = "dbo",

    // Define Source Connection
    Source = Sql.Database(_server, _database),

    // List of tables for which we need to check last refresh Date
    Src_Tables = {"dim_Country", "dim_Product", "dim_Segment"},
 
    // Function to run native query for each table
    GetLastRefreshDate = (TableName as text) =>
        let
            Query = "SELECT MAX(_modified_timestamp) AS Last_Refresh_Date FROM " & _schema & "." & TableName,
            Query_Result = Value.NativeQuery(Source, Query),
            WithServer = Table.AddColumn(Query_Result, "Server_Name", each _server),
            WithDb = Table.AddColumn(WithServer, "Database_Name", each _database),
            WithSchema = Table.AddColumn(WithDb, "Schema_Name", each _schema),
            WithTable = Table.AddColumn(WithSchema, "Table_Name", each TableName),
            Result = Table.ReorderColumns(WithTable, {"Server_Name","Database_Name","Schema_Name","Table_Name","Last_Refresh_Date"})
        in
            Result,

    // Apply function to get the last refresh date for each table
    All_Results = List.Transform(Src_Tables, each GetLastRefreshDate(_)),

    // Combine each table result into one table
    Final_Result = Table.Combine(All_Results)
in
    Final_Result

Output:


Note: 
Use parameters to define source connection details such as Server, Database, and Schema. Avoid hard‑coding these values directly within the query definition.
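For example, the hard‑coded values at the top of the query above can be replaced with Power BI query parameters. A minimal sketch, assuming parameters named p_Server, p_Database, and p_Schema have been created via Manage Parameters (these names are illustrative, not from the query above):

```
let
    // Connection details come from query parameters instead of literals
    _server   = p_Server,
    _database = p_Database,
    _schema   = p_Schema,

    Source = Sql.Database(_server, _database)
in
    Source
```

With this approach, pointing the report at a different server or database only requires changing the parameter values, not editing the query itself.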


--------------------------------------------------------------------------------------------------------
Thanks
--------------------------------------------------------------------------------------------------------

Thursday, February 12, 2026

How to Identify the duplicate rows in a table using Power Query

How to Identify the duplicate Values in a Column using Power Query
Scenario:
Suppose we have a dataset (ds_Sample) as below, with duplicate values in the Prod_ID column and some duplicate rows:
let
    Source = #table (
        {"Trans_ID", "Prod_ID", "Prod_Name"},
        {
            {12345, 123, "Laptop"},
            {12346, 124, "Radio"},
            {12347, 125, "Keyboard"},
            {12348, 126, "Television"},
            {12349, 127, "Printer"},
            {12350, 128, "Scanner"},
            {12351, 129, "Camera"},
            {12352, 130, "Tripod"},
            {12345, 123, "Laptop"},     // duplicate row
            {12348, 126, "Television"}, // duplicate row
            {12351, 129, "Camera"},     // duplicate row
            {12356, 127, "Printer"},
            {12357, 124, "Radio"},
            {12358, 126, "Television"}
        }
    ),
    ChangeType = Table.TransformColumnTypes(Source, {
        {"Trans_ID", Int64.Type},
        {"Prod_ID", Int64.Type},
        {"Prod_Name", type text}
    })
in
    ChangeType

Now we can identify and flag the duplicate values and rows using the methods below.

Method-1: Using the Text.Combine method with a Composite Key based on specific Columns:

let
    /* Source dataset */
    Source = ds_Sample,

    /* Columns to include in Composite Key */
    SelectedCols = {"Trans_ID","Prod_ID"},

    /* Composite Key from selected columns */
    Def_CompositeKey = Table.AddColumn(Source, "Composite_Key", 
        each Text.Combine(
            List.Transform(
                Record.ToList(Record.SelectFields(_, SelectedCols)), 
                each Text.From(_)
                ), 
                "|"
            )
        ),

    /* Flag duplicate rows using Composite Key */
    Flag_DupRow = Table.AddColumn(Def_CompositeKey, "IsDuplicate_Row", 
        each if List.Count(List.Select(Def_CompositeKey[Composite_Key], (x) => x = [Composite_Key])) > 1 
        then "Yes" else "No"
        ),

    /* Flag duplicate products by Prod_ID */
    Flag_DupProd = Table.AddColumn(Flag_DupRow, "IsDuplicate_Prod", 
        each if List.Count(List.Select(Source[Prod_ID], (x) => x = [Prod_ID])) > 1 
        then "Yes" else "No"
        )
in
    Flag_DupProd

Result:
------------------------------------------------
Method-2: Using the Text.ToBinary method and a Binary-encoded Composite Key based on specific Columns:

let
    // Source dataset
    Source = ds_Sample,

    // Columns to include in Composite_Hash_Key
    SelectedCols = {"Trans_ID","Prod_ID"},

    // Generate Composite_Hash_Key from selected columns only
    Def_CompositeKey = Table.AddColumn(Source, "Composite_Hash_Key", 
        each Binary.ToText(
            Text.ToBinary(
                Text.Combine(
                    List.Transform(
                        Record.ToList(Record.SelectFields(_, SelectedCols)), 
                        each Text.From(_)
                        ), 
                        "|"
                    )
                ),
                BinaryEncoding.Base64
            )
        ),

    // Flag duplicate rows using Composite_Hash_Key
    Flag_DupRow = Table.AddColumn(Def_CompositeKey, "IsDuplicate_Row", 
        each if List.Count(List.Select(Def_CompositeKey[Composite_Hash_Key], (x) => x = [Composite_Hash_Key])) > 1 
        then "Yes" else "No"
        ),

    // Flag duplicate products by Prod_ID
    Flag_DupProd = Table.AddColumn(Flag_DupRow, "IsDuplicate_Prod", 
        each if List.Count(List.Select(Source[Prod_ID], (x) => x = [Prod_ID])) > 1 
        then "Yes" else "No"
        )
in
    Flag_DupProd

Result:
------------------------------------------------
Method-3: Using the Table.Group method to identify the Duplicate values based on Prod_ID:

let
    Source = ds_Sample,
    // Count how many times each Prod_ID appears
    GroupByProd = Table.Group(Source, {"Prod_ID"}, {{"Prod_Count", each Table.RowCount(_), Int64.Type}}),
    // Join back to original details table
    MergeToDetails = Table.NestedJoin(Source, {"Prod_ID"}, GroupByProd, {"Prod_ID"}, "ProdCount"),
    ExpandTable = Table.ExpandTableColumn(MergeToDetails, "ProdCount", {"Prod_Count"}),
    // Add a column to flag duplicate values
    FlagDup_Prod = Table.AddColumn(ExpandTable, "IsDuplicate_Prod", each if [Prod_Count] > 1 then "Yes" else "No")
in
    FlagDup_Prod

Result:
------------------------------------------------
Method-4: Using the Table.Group method to return the number of Duplicate rows and products:

let
    Source = ds_Sample,
    // 1. Count duplicate rows
    DistinctRows = Table.Distinct(Source),
    DuplicateRowCount = Table.RowCount(Source) - Table.RowCount(DistinctRows),

    // 2. Count duplicate products (Prod_ID)
    GroupByProd = Table.Group(Source, {"Prod_ID"}, {{"Count", each Table.RowCount(_), Int64.Type}}),
    DuplicateProd = Table.RowCount(Table.SelectRows(GroupByProd, each [Count] > 1)),

    // 3. Return summary of results
    Summary = #table(
        {"Metric", "Value"},
        {
            {"Duplicate Rows", DuplicateRowCount},
            {"Duplicate Products", DuplicateProd}
        }
    )
in
    Summary

Result:

--------------------------------------------------------------------------------------------------------
Thanks
--------------------------------------------------------------------------------------------------------

Saturday, November 22, 2025

How to Use Dynamic Query Parameters to Switch Source Query Connections in Power BI

How to Switch Source Query Connection Strings in Power BI with Dynamic Query Parameters
Scenario:
Managing multiple data or query environments such as Dev, Test, and Prod often requires maintaining separate connection strings in Power BI. 
Rather than hard‑coding server, database, and schema names, you can centralize these details in an environment table and control them dynamically through a parameter (e.g. p_Query_Environ with values like Dev, Test, and Prod). 
With this setup, switching between environments becomes as simple as changing the parameter value, while Power Query automatically selects the correct server, database, and schema. This provides a clean, scalable solution for environment management that minimizes manual effort and ensures consistency across deployments.

We can achieve the above scenario as discussed below.

Step 1: Define an Environment Variables Table
To enable dynamic source environment switching in Power BI, we need to create a dedicated Environment Variables Table like the one below. This table acts as a central repository for all connection details across your environments (Dev, Test, and Prod).

Info_Query_Environment:

let
    /* Define Data Environment variables table with respective connection details */
    _Env_Table = 
        #table(
            {"Source_Env", "Server_Name", "DB_Name", "Schema_Name"},
            {
                {"Dev",  "Dev_Server01",  "Dev_DB",  "Dev_Schema"},
                {"Test", "Test_Server01", "Test_DB", "Test_Schema"},
                {"Prod", "Prod_Server01", "Prod_DB", "Prod_Schema"}
            }
        ),

    _DataType = Table.TransformColumnTypes(
        _Env_Table,
        {
            {"Source_Env", type text}, {"Server_Name", type text},
            {"DB_Name", type text}, {"Schema_Name", type text}
        }
    )
in
    _DataType


Step 2: Define a Query Parameter for Dynamic Environment Selection
We need to define a list parameter named p_Query_Environ that controls which environment (Dev, Test, or Prod) Power BI should use when running queries. 
This parameter acts as a switch, allowing you to easily toggle between environments without editing the query code.
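Under the hood, a list parameter created through Manage Parameters is stored as an M expression along the lines of the sketch below (the exact metadata Power BI generates may vary slightly):

```
// p_Query_Environ: a text parameter restricted to a list of allowed values
"Dev" meta [
    IsParameterQuery = true,
    Type = "Text",
    List = {"Dev", "Test", "Prod"},
    DefaultValue = "Dev",
    IsParameterQueryRequired = true
]
```

The current value ("Dev" here) is what other queries see when they reference p_Query_Environ; the List metadata is what drives the dropdown of choices in the Power BI interface.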


Step 3: Pass Connection Strings Dynamically to Queries
Instead of hard‑coding server, database, and schema names, we will filter the Environment Variables Table using the parameter (p_Query_Environ) and feed those values directly into the query connection. 
This makes our queries flexible and environment‑aware, ensuring that Power BI automatically connects to the right source based on the environment we select.

Example: dim_Country

let
    /* 
    Filter the Info_Query_Environment table based on Parameter Value (Eg. Dev, Test, Prod)
    */
    SelectedEnv = 
        Table.SelectRows(Info_Query_Environment, each [Source_Env] = p_Query_Environ){0},

    /* Assign the values to Environment Variables */
    _ServerName   = SelectedEnv[Server_Name],
    _DatabaseName = SelectedEnv[DB_Name],
    _SchemaName   = SelectedEnv[Schema_Name],

    /* SQL Query with dynamic Environment Variables */
    vSrc_Qry = "Select Distinct [Country_ID], [Country_Name], [Currency_Code]
         From " & _SchemaName & ".dim_Country",

    /* Run the SQL Query */
    Source = Sql.Database(_ServerName, _DatabaseName, [Query = vSrc_Qry])
in
    Source

Final Step (Optional): Save the Report as a Power BI Template
We can save the report as a Power BI Template (.pbit), which captures the report structure, queries, and parameters, but leaves out the actual data. This makes it ideal for scenarios where different users need to run the same report against different environments (Dev, Test, Prod).


Provide a description for the Report Template while saving.


When a user opens the Power BI Template, it automatically prompts to choose a value for the p_Query_Environ parameter. The options (e.g., Dev, Test, Prod) come directly from the parameter definition that we created earlier.
Based on the selection, Power Query filters the environment table and connects to the correct server, database, and schema for that environment. 
Once the environment is chosen, the user simply clicks Load to run the report, and Power BI retrieves data from the selected environment seamlessly.


Note:
You can refine this method to fit your own scenario. Adjust the setup as needed for your environment.

--------------------------------------------------------------------------------------------------------
Thanks
--------------------------------------------------------------------------------------------------------
