Creating and maintaining up-to-date documentation for your Power BI reports can be a time-consuming task. However, with the help of Power Automate and DAX queries, you can simplify this process significantly. This blog will walk you through a streamlined approach to automate your Power BI documentation, capturing essential details like tables, columns, measures, and relationships. By setting up this automated workflow, you’ll not only save time but also ensure that your documentation stays accurate, organized, and ready whenever you or your team need it.
Now let's get started with Power BI documentation!
Overall view:
In this solution we will use the new INFO DAX functions to extract metadata and information from the Power BI semantic model. Tables, columns, measures, and the relationships between tables are the data we want to document, and after extracting them we will save the results to an Excel file on SharePoint. Using Power Automate, we will automate all these steps to run on our preferred schedule.
INFO DAX functions - DAX Queries
Let's first review the DAX queries we want to use in this solution. As mentioned before, we want to extract information about Tables, Columns, Measures, and relationships between tables from a Power BI semantic model, so we need 4 DAX queries.
Note: I’ve designed the DAX queries to be as simple as possible, focusing only on the essential information needed for documentation. However, you’re free to expand these queries to capture a broader range of details if needed. To get started, open your Power BI file and run these queries in the DAX Query view. Make sure the output includes all the data required for your documentation to ensure completeness.
Tables:
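The original query isn't shown here, so the following is a minimal sketch built on the INFO.VIEW.TABLES function, which returns readable names rather than internal IDs (the selected columns are assumptions; confirm what's available in DAX query view for your model):

EVALUATE
SELECTCOLUMNS (
    INFO.VIEW.TABLES (),
    "TableName", [Name],
    "Description", [Description],
    "IsHidden", [IsHidden]
)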
Columns:
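Similarly, a sketch for columns, assuming INFO.VIEW.COLUMNS returns [Table], [Name], [DataType], and [IsHidden] in your version:

EVALUATE
SELECTCOLUMNS (
    INFO.VIEW.COLUMNS (),
    "Table", [Table],
    "ColumnName", [Name],
    "DataType", [DataType],
    "IsHidden", [IsHidden]
)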
Relationships:
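For relationships, a sketch using INFO.VIEW.RELATIONSHIPS, whose [Relationship] column already renders each relationship as readable text (for example 'FactSales'[DateKey] *-1 'DimDate'[DateKey]):

EVALUATE
SELECTCOLUMNS (
    INFO.VIEW.RELATIONSHIPS (),
    "Relationship", [Relationship],
    "IsActive", [IsActive],
    "CrossFilteringBehavior", [CrossFilteringBehavior]
)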
Measures:
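And for measures, a sketch capturing each measure's home table, name, and DAX expression:

EVALUATE
SELECTCOLUMNS (
    INFO.VIEW.MEASURES (),
    "Table", [Table],
    "MeasureName", [Name],
    "Expression", [Expression],
    "Description", [Description]
)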
Output sample:

Now that we have the data we are looking for, we can start developing the Power Automate flow to create the Excel file, run the DAX queries to extract the data, and save the output into that Excel file.
Developing Power Automate Flow:
To create this flow in Power Automate, let's first use a manual trigger so we can test the flow; once it's ready, we'll change it to a Recurrence trigger to run the flow automatically at regular, customized time intervals. Click Create a new flow, choose Instant cloud flow to begin, and then follow these steps:
Step 1: Set Variable for Workspace and Semantic Model
In this flow, we're documenting the semantic model of a specific report (dataset), so we need to set the workspace ID and semantic model ID to run DAX queries against that model. Start by adding 2 Initialize variable actions as the first actions in the flow and set them as below:

In the value section, add your report's workspace ID and semantic model ID (you can find them in the URL of the report or semantic model).
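For reference, both IDs appear in the URL in this pattern (the angle-bracket placeholders stand for the GUIDs):

https://app.powerbi.com/groups/<workspace-id>/datasets/<semantic-model-id>/details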
Step 2: Create a new Excel file
To store the information extracted from Power BI, a new Excel file needs to be created.
To create this new Excel file:
First, create a new blank Excel file in SharePoint. Next, build a temporary flow and use the "Get file content using path" action to retrieve the file's content. This step only needs to be done once: after you have captured the file content as JSON, you can reuse it to create new files, and the temporary flow can be deleted.

Once you run this flow, the output is as below:

Copy all the data in the file content section, then move to the main flow. Add a Compose action to the flow and paste the file content into it.

Now add the Create file action from the SharePoint group to the flow. Include the current time in the File Name so that each run of the flow creates a unique file, and use the Compose action's output as the File Content, as below:
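As an illustration, a File Name expression along these lines produces a unique, timestamped name (the prefix is just an example):

concat('PBI_Documentation_', utcNow('yyyy-MM-dd_HHmmss'), '.xlsx')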

When this action runs, a new Excel file with a unique name is created in the SharePoint folder you defined. However, the file is still not fully ready: we have 4 queries, and the output of each needs to go into a different sheet of that Excel file. We need to automatically add 4 sheets to the Excel file and set up a table in each sheet with column names matching the output of the INFO DAX queries. We can do this easily with Microsoft Office Scripts in Excel.
Step 4: Set up an Office Script in Excel to create tables automatically
Open an Excel file (any Excel file) in your web browser (Excel Online), navigate to the Automate tab, and click New Script. Copy and paste the script code into the window that opens, give the script a name like "CreateTables", and save and test it.
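If you prefer to write the script yourself rather than use the downloadable one, a minimal sketch of a "CreateTables"-style script could look like the following; the sheet names and column headers are assumptions that match the sample queries above (plus the Workspace and Dataset columns added later in the Select step), so align them with your own query output:

function main(workbook: ExcelScript.Workbook) {
  // Sheet name -> header columns for the table on that sheet
  const sheets: { [name: string]: string[] } = {
    Tables: ["Workspace", "Dataset", "TableName", "Description", "IsHidden"],
    Columns: ["Workspace", "Dataset", "Table", "ColumnName", "DataType", "IsHidden"],
    Relationships: ["Workspace", "Dataset", "Relationship", "IsActive", "CrossFilteringBehavior"],
    Measures: ["Workspace", "Dataset", "Table", "MeasureName", "Expression", "Description"],
  };

  for (const name in sheets) {
    const headers = sheets[name];
    // Create the sheet, write the header row, and convert it into a named table
    const sheet = workbook.addWorksheet(name);
    const headerRange = sheet.getRangeByIndexes(0, 0, 1, headers.length);
    headerRange.setValues([headers]);
    const table = workbook.addTable(headerRange, true);
    table.setName(name + "Table");
  }
}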
Output:

Step 4: Add Actions to Run DAX Queries
An Excel file with 4 tables and the required columns has now been created by the previous steps. Next, we can add 4 "Run a query against a dataset" actions in parallel to the flow to run those 4 DAX queries.
Use the Workspace and Dataset variables as the input for the first 2 fields, and copy and paste the DAX queries provided above into the Query Text section of each action, as below:

Note: the full image of this flow, with all details, is available at the end of this article.
Step 5: Parse JSON (DAX Queries output) and select the required data
This step is required for 2 reasons:
To convert the data into a format that the Office Script can use to add it automatically to the Excel tables
To add the Workspace and Dataset/Semantic Model details to the output; this is useful when you apply this solution to more than one report (for example, you can use a loop to document several reports at once)
To parse the DAX query outputs, add a Parse JSON action after each "Run a query against a dataset" action and add First table rows from the previous action to the Content field. Then run the flow once, copy the output of each DAX query action, and use it with "Generate from sample" to build the schema for each Parse JSON action.
Then add a Select action after each Parse JSON action and configure it as in the image below:

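As a rough illustration, the Map of the Select action in the Tables branch could look like this in code view; the left-hand keys must match the Excel table's column headers, the bracketed keys mirror how the "Run a query against a dataset" action names DAX result columns, and WorkspaceID/DatasetID are whatever you named the 2 variables in Step 1:

{
  "Workspace": "@{variables('WorkspaceID')}",
  "Dataset": "@{variables('DatasetID')}",
  "TableName": "@{item()?['[TableName]']}",
  "Description": "@{item()?['[Description]']}",
  "IsHidden": "@{item()?['[IsHidden]']}"
}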
Final Step (6): Add data into Excel Tables using Office Script
The outputs of our 4 DAX queries are now in the proper format to be saved into the Excel file this flow created earlier. We need to run the second Office Script, "AddDatatotheTable", to write the JSON data into the Excel file's sheets.
Download the scripts:
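If you want to write this one yourself as well, a minimal sketch of an "AddDatatotheTable"-style script could look like the following; the tableName and data parameter names are assumptions, with data being the Select action's output passed in as a JSON string:

function main(workbook: ExcelScript.Workbook, tableName: string, data: string) {
  // Each object in the array is one row, keyed by column header
  const rows: { [column: string]: string | number | boolean }[] = JSON.parse(data);
  const table = workbook.getTable(tableName);
  // Order each row's values to match the table's header columns
  const headers = table.getHeaderRowRange().getValues()[0] as string[];
  const values = rows.map(row => headers.map(h => row[h] ?? ""));
  if (values.length > 0) {
    table.addRows(-1, values);
  }
}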
Add a Run script action from the Excel Online (Business) group after the Select action in each branch and configure them as below:

Because only one script can run against the workbook at a time, chain the Run script actions so that each one starts only after the previous one completes. The output will look like this:

Now the file is ready and updated.
Change the flow's trigger from the manual trigger to the Recurrence (schedule) trigger and set it to run regularly (every day or every hour).
What's NEXT:
You now have a foundational flow that automatically captures essential report data and saves it into an Excel file. To take this further, you can enhance the flow by adding loops to scan your entire tenant, identify all workspaces and reports, and execute DAX queries on each report one by one. This enhancement allows you to document all reports across your tenant automatically, even on an hourly schedule.
Overall Flow Image:
