My Practice Tips When Using The Power BI Advanced Editor


We’re going to talk about some of my best practice tips for using the Advanced Editor inside Power BI. You may watch the full video of this tutorial at the bottom of this blog.

Surprisingly, most people don’t know about the Advanced Editor. That’s because it’s hidden away inside the Query Editor, which a lot of users don’t utilize as much as they should.

If you don’t know what it is, you need to have a better understanding of how it operates in Power BI.

The Advanced Editor records all of the different transformations you make inside the Query Editor.

You don’t need to know how to write any of the code (a language called ‘M’), but you should have a good understanding of what it actually means and how it works behind the scenes.

You need to know this because there are many times, especially as you make more and more transformations inside the Query Editor, when you’ll need to complete some fixes within the Advanced Editor.

The first thing to do is go to the Query Editor by selecting the Edit Queries icon.

As you start doing more work in Power BI, you will need to make some small adjustments in what’s called the Advanced Editor.

This is where lines of code get written when you make transformations on your data sets. I’m currently on the Dates table looking at the source where the data is coming from, which comes from Dates Query.

The key thing to note here is that this table was created from a source, and the source in this case is just a parameter query. It is followed by a single step called Renamed Columns.

Let’s go and make some changes and create a Short Month column. I will duplicate the column, and then split the column by number of characters.

I’m just going to split it at three characters, choosing Once, as far left as possible.

I will then rename the column to Short Month. You will be able to see the steps I applied on the right side.
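The generated code isn’t reproduced in this extract, but a hedged sketch of the kind of M the Advanced Editor records for these steps might look like this (the query name DatesQuery and the Month column name are assumptions, not taken from the original model):

```m
let
    // The source is the parameter query mentioned above (name assumed)
    Source = DatesQuery,
    // Duplicate the Month column so the original stays intact
    #"Duplicated Column" = Table.DuplicateColumn(Source, "Month", "Month - Copy"),
    // Split once, as far left as possible, at three characters
    #"Split Column by Position" = Table.SplitColumn(
        #"Duplicated Column", "Month - Copy",
        Splitter.SplitTextByPositions({0, 3}),
        {"Month - Copy.1", "Month - Copy.2"}),
    // Rename the first split column to Short Month
    #"Renamed Columns" = Table.RenameColumns(
        #"Split Column by Position", {{"Month - Copy.1", "Short Month"}})
in
    #"Renamed Columns"
```

Every applied step on the right-hand side maps to one line like these, which is why editing a name in the code updates the front end.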

The great thing is that you can make changes here that will be reflected in the table. For instance, say you don’t want to call it “Short Month” anymore, and you want to change it to “Short Mth.”

You can come inside the editor and make the changes here, and it makes the change for you on the front end. This is pretty handy from an auditing perspective, simply because when you get errors, this is the place to go. You can dive into here and figure out where the error occurred.

Say, for instance, the Short Mth column disappears from its place and then reappears at the end of the table. You might want to move it back and place it next to the full month, right?

You have to be aware of what happens when you do. Let’s say you realize at this point that you want to change the name of something, or do something as simple as putting a space between Week and Ending; the editor will warn you that it needs to insert a step.

Sometimes these error messages don’t tell you exactly what’s wrong or where to find the error. But in this case, it is saying that WeekEnding can’t be found. The reason for this is that the initial name was still listed in this particular row.

So you have to go and find the row with that particular column name. You can see here that “WeekEnding” hasn’t changed, which is causing the error. All we have to do is put a space between Week and Ending to fix it.

My main point is that having an understanding is more important than being able to write any code. That’s my main takeaway for you from this tutorial.

Advanced Data Transformations & Modeling



Power BI New Customers Retention Analysis Using Advanced DAX

Another term for this is attrition analysis, because we want to see how our customers are churning: how many of our customers are coming on board and buying our products, how many are coming back and buying some more, how many customers we are losing, and so forth.

In this customer analysis example, I start off going through customer churn and exploring how many customers are being lost after a certain time frame. I also dive into new customers and returning customers.

Analyzing your customer churn is a very key piece of analysis for an organization, especially if you’re a high-frequency selling business like an online retailer or a supermarket chain.

Obviously, if you get customers on board, you want to be selling them more and not losing them to competitors, for example.

It’s much easier to sell to an existing customer than it is to find new customers.

Existing customers are crucial to most businesses as it’s so much more profitable to continue to market to them as opposed to having to find new customers all the time.

In this first visualization here, we have what we would consider Lost Customers over time.

The points down to about 90 days are not as relevant, because in the very first days we’re actually considering everyone as “lost” at that moment.

Now let’s walk through the function to see what we are doing here.

In this formula, we are counting up the customers who have not purchased in the last 90 days, or whatever your chosen churn time frame is.

We are creating a virtual table on every single customer through this CustomersList variable.

We filter all customers for any given day. What we’re doing with ALL is looking at every single customer on each individual day.

And then for every single customer, we’re evaluating whether they have made a purchase in the last 90 days. If they have not, the expression evaluates to 0 and we count that customer as lost.
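The formula itself isn’t reproduced in this extract; a sketch of the pattern just described might look like this (the table, column, and window names, Sales, Dates, and Customer Name Index, are assumptions, and the 90-day window is hard-coded):

```dax
Lost Customers =
VAR CustomersList =
    -- every single customer, regardless of current filter context
    ALL ( Sales[Customer Name Index] )
RETURN
    COUNTROWS (
        FILTER (
            CustomersList,
            -- count purchases in the 90 days up to the date in context;
            -- zero purchases means the customer is considered lost
            CALCULATE (
                COUNTROWS ( Sales ),
                FILTER (
                    ALL ( Dates ),
                    Dates[Date] > MAX ( Dates[Date] ) - 90
                        && Dates[Date] <= MAX ( Dates[Date] )
                )
            ) = 0
        )
    )
```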

Now let’s look at our New Customers and see what it is evaluating to.

In this table, we see that it’s more on the earlier dates, January to July because we just started our business. People are generally new.

Then obviously, it flattens out towards the end because we just have our return customers there.

The New Customers formula applies similar logic. We count how many sales each customer has made before today.

And if they have not purchased anything before, which evaluates to 0, then they are evaluated as a New Customer.
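A sketch of that New Customers logic, under the same assumed names as above:

```dax
New Customers =
VAR CustomersList =
    -- customers who purchased on the date in context (names assumed)
    VALUES ( Sales[Customer Name Index] )
RETURN
    COUNTROWS (
        FILTER (
            CustomersList,
            -- any sales strictly before today? zero means brand new
            CALCULATE (
                COUNTROWS ( Sales ),
                FILTER ( ALL ( Dates ), Dates[Date] < MIN ( Dates[Date] ) )
            ) = 0
        )
    )
```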

Returning customers are those who had previously been evaluated as lost.

In other words, they hadn’t bought anything for 90 days and have now come back. Through time, we will calculate how many are actually returning.

This would be an amazing insight if you are running promotions or doing marketing, and you want to know how many of these lost customers you got back through your marketing activities.

In the Returning Customers formula, we’re only evaluating customers that actually purchased on any given day.

So here we are running some logic on each customer, evaluating if they made a sale in the last 90 days.

If they hadn’t purchased in the 90 days before that day, they are considered returning: the condition evaluates to true, and we count that customer on that particular day.
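A sketch of that Returning Customers pattern, again with assumed names:

```dax
Returning Customers =
VAR CustomersList =
    -- only customers who actually purchased on the day in context
    VALUES ( Sales[Customer Name Index] )
RETURN
    COUNTROWS (
        FILTER (
            CustomersList,
            -- no purchases in the 90 days before today means they were
            -- "lost" and have now come back
            CALCULATE (
                COUNTROWS ( Sales ),
                FILTER (
                    ALL ( Dates ),
                    Dates[Date] >= MIN ( Dates[Date] ) - 90
                        && Dates[Date] < MIN ( Dates[Date] )
                )
            ) = 0
        )
    )
```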

In the past, this sort of information would cost a lot of money to generate. But now, you can achieve these awesome insights through some clean and effective formulas, utilizing the DAX language.

Remember that it actually aligns with the data model. Everything is incorporated in there.

We can actually place some filters on this. For instance, if we want to dive into just one state, say Florida, or our top three states, it all evaluates dynamically.

If you can see the opportunities and potential with Power BI, then your mind can just exponentially expand with the possibilities of running analysis over your own data sets.

All the best and good luck with these techniques.


When To Use The TOPN Function In Power BI

This tutorial will review how to use TOPN as a virtual ranking function to generate interesting insights based on ranking logic in Power BI. You may watch the full video of this tutorial at the bottom of this blog.

Using TOPN as a virtual ranking function allows you to dynamically produce the top and bottom results for any measure.

This example will show how to work out the locations that generate the highest and lowest revenue per customer.

This table contains the Revenue for each customer.

You want to work out which locations have the highest and lowest Revenue.

Let’s say that a customer bought your products from a range of different locations.

Using Stephen Howard as an example, you can see that he bought products from six different cities:

You now want to know how much revenue was generated for each city. Then, you want to virtually rank the cities and calculate which of them belong to the top and bottom two.

This formula counts the number of locations a customer has purchased from:

The COUNTROWS function works out each unique place where a product has been bought.
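The formula isn’t visible in this extract; a minimal sketch of a COUNTROWS measure like the one described, with assumed table and column names:

```dax
Total Cities =
-- distinct cities present in the customer's sales rows (names assumed)
COUNTROWS ( VALUES ( Sales[City Index] ) )
```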

However, this formula can’t give you the ranking results you need.

This is where the TOPN function comes in. It allows you to have a virtual ranking inside a formula.

To calculate the top two cities with the highest revenue for each customer, you need to use this formula:

The CALCULATE function computes the Total Revenue using a different context for the top two cities.
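The formula graphic isn’t reproduced here; a sketch of the pattern described, with assumed measure and column names, might look like this:

```dax
Top 2 City Sales =
CALCULATE (
    [Total Revenue],
    -- virtual table of the two best cities for the customer in context;
    -- Sales[City Index] restricts TOPN to cities that customer bought from
    TOPN ( 2, VALUES ( Sales[City Index] ), [Total Revenue], DESC )
)
```

Swapping DESC for ASC gives the bottom-two equivalent discussed further down.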

Let’s now focus on the TOPN statement in the formula:

The first parameter of this TOPN statement is the number of top-ranked results the virtual calculation should return. Thus, 2 is used to get the top two cities.

If it’s 4, it will return the top four cities.

You need to make sure that you’re only iterating through the places that a customer has purchased from. This is why Index is used rather than an element in the model.

Using Index ensures that you’re only counting the regions your customers have purchased from, and not all the regions in your model.

If you were to put the VALUES function along with the actual Name of the City, you’d get the overall Total Revenue from the top two cities — not from each individual customer.

The TOPN function creates a brand new context for each result in the table.

It’s creating a virtual table containing only the top two locations a customer has purchased from.

This is the formula used to calculate the bottom two cities:

It’s exactly the same formula as the first, but you need to change DESC to ASC.

Here’s how to check if this formula is correct:

If you bring the cities with purchases to the table, you can see that the number of iterations matches the result of the Total Cities.

For example, Aaron Bradley bought from four different locations. So, there are four iterations showing in the second table.

You can see the four different purchase amounts the customer has for each location.

If you calculate and compare the figures of the two tables, you will see that they both match. All of Aaron Bradley’s amounts add up to 173,128.00, which is the Total Revenue. The top two cities have a Revenue of 124,191.20, and the bottom two 48,936.80.

This tutorial discussed how to use TOPN as a virtual ranking function to create effective ranking visuals in Power BI.

You can wrap the TOPN function in COUNTROWS, SUMX, or AVERAGE to create more valuable insights in your reports. It’s a very flexible and reusable tool to use.

All the best,


Power BI Planning Using The Priorities Matrix In Analyst Hub

This tutorial will discuss the Priorities Matrix, which is a Power BI planning tool found in the Analyst Hub in Enterprise DNA. Planning tools in Power BI are very useful for businesses and organizations because they help in streamlining the succeeding processes in Power BI implementation.

When planning out your strategies in Power BI, you need to think about what insights to show in a specific workspace. The more different insights you list ahead of time, the better direction it gives you on how much effort goes into a report. This also gives the developers within your organization a much clearer idea of what they should be showing in their reports.

The Analyst Hub in Enterprise DNA is a feature that allows analysts to create and share interactive dashboards, reports, and visualizations with stakeholders. It enables the users to create and publish reports to a central location, where they can be accessed and viewed by others in the organization.

The Priorities Matrix in the Analyst Hub is a simple board that allows you to create a list of ideas around insights. You can then group these ideas according to user value and effort needed to create the insight.

As an example, let’s create a priorities matrix for financial reporting. Note that teams can also work on a matrix together; it doesn’t have to be created by a single person.

To start, you just need to list things out. From a finance perspective, some of the insights could be cash management, historical sales, financial reporting, global markets, commodity markets, financial metrics, year-on-year KPIs, budgeting details, and forecasting.

Scroll down and set whether the matrix is a private or team document.

You can view your matrix in the Documents tab in the Analyst Hub.

Then, choose the project folder where you want to place the matrix.

You can view your projects by going to the Project tab.

Moreover, you can also choose which team members from your organization can have access to each priority matrix. You can do this by going to the Teams tab.

This Analyst Hub tool helps centralize the Center of Excellence initiatives within the organization. Different teams can coordinate with each other in order to ensure proper communication and standard documentation.

This is another example of a priorities matrix for marketing insights.

Creating a priorities matrix might feel like extra work. But if you spend time organizing insights for each report, you’ll get clearer priorities on what you should do during the report development.

By gathering all the information together, you create a benchmark that others can adhere to throughout the deployment. This will bring more coordination and continuity within the organization.

Enterprise DNA is a website that provides training and resources for Power BI users, including courses and tutorials on using Analyst Hub. One of the tools in the Analyst Hub is the Priorities Matrix. The Priorities Matrix is a feature that allows users to prioritize and focus on the most important insights in a Power BI report.

By using this tool, you’re able to streamline and prioritize insights based on user value and effort. So make sure to utilize this tool every time you create a Power BI report.

All the best,

Sam McKay

Analyzing Average Results Per Month Using DAX In Power BI

In this tutorial, I go through an average results analysis technique. I dive into some types of procurement & purchasing analysis that you can do with Power BI. You may watch the full video of this tutorial at the bottom of this blog.

Specifically, we’re going to look at the average monthly purchases a business may make.

I go through which DAX formula combination you need to be able to calculate averages in a monthly context. I also show you how you can combine these formulas or functions with the data model. By doing this effectively, you can quickly extract information across a range of different dimensions.

Whether you want to analyze your monthly purchases by division, by region, or according to each manager, all of this is possible with Power BI in an effective and efficient way.

You don’t need to write different formulas to do it.

What you can do is leverage the data model that you’ve built for a procurement type scenario. You can bring in different elements or different dimensions from these various tables that you have in your data model and overlay them with the formulas that you have created.

Additionally, you can branch out even further from that. For instance, you can analyze the difference in purchasing over time for any of these different dimensions in your data. You can include many time comparisons and time intelligence type techniques in the analysis as well.

So in this example, we look at things on an average basis. Instead of looking at every individual purchase, we want to see how the average purchases in our departments are going. I’ll show you how you can use a combination of techniques to actually work this out using Power BI.

First of all, I’ve created a measure to get the total amount of invoicing that we’re doing. Here’s the formula:
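The measure itself isn’t shown in this extract; a minimal sketch, assuming a Purchases table with an Invoiced Amount column:

```dax
Invoicing Totals =
-- simple total of every invoice line (table and column names assumed)
SUM ( Purchases[Invoiced Amount] )
```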

So now we work out averages through time. We’ll look through time and see over what time frame we are making purchases. In this case, the purchases span the full range of our Date table.

We simply drag our Date table into our axis and we can see our Invoicing Totals by Date.

Now we want to work out how much we are invoicing per month on a department basis. To see that insight, we grab the Invoicing Totals then drag in our Department into the axis.

It’s also easy to overlay a slicer from our Date table, where we can easily change it and look at certain time frames, which will impact both the context and the results.

But say for instance, within this date context, we want to see on average in the months here, how much we are actually spending per department. To do this, it just takes an understanding of how to use the AVERAGEX function and what virtual table we want to place inside that function.

So we go create a new measure and call it Average Monthly Purchases. We’ll use AVERAGEX, then inside we put a virtual table of every single month. We’ll find our Month and Year dimension inside our Date table and that is going to do the iterating for us.

AVERAGEX is an iterating function, so we need to give it a table to iterate through. In this case, we’re giving it a virtual table of every month and year. That’s what creates the average, because it only divides by the number of month-and-year values inside this context.
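A sketch of that measure, assuming the Date table has a Month & Year column and the invoicing measure from earlier:

```dax
Average Monthly Purchases =
AVERAGEX (
    -- virtual table of every Month & Year visible in the current context
    VALUES ( Dates[Month & Year] ),
    [Invoicing Totals]
)
```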

Then, we add our invoicing measure and we’ll see that now we have an average. We format it and change the color to make it stand out.

It’s good to see the average purchasing, but it would also be a great thing to see and to compare average purchasing over a previous time.

To do this, we simply just expand on what we have here. We start from our simple measures, and then branch out to more DAX techniques to get more insights.

We’ll call our formula Average Monthly Purchases LY and use the CALCULATE function over our Average Monthly Purchases measure. Inside, we’ll have DATEADD over our Dates column, with -1 as the number of intervals and YEAR as the interval type.
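A sketch of that time-shifted measure, under the same assumed Date table:

```dax
Average Monthly Purchases LY =
CALCULATE (
    [Average Monthly Purchases],
    -- shift the date context back exactly one year
    DATEADD ( Dates[Date], -1, YEAR )
)
```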

Based on this context, we can now compare our Average Monthly Purchases over the time frame we have selected versus exactly that same time frame from the previous year.

I have run through all of the steps indicated above in this tutorial. If you can relate to this scenario, I’m very confident you will learn a lot about how Power BI can be used to optimize decisions around these sorts of purchasing scenarios.

And what’s great about this in Power BI is that once you learn the technique, you can easily apply it to many different data scenarios.

Time comparisons don’t change no matter what data scenario you’re working on and very quickly you can get some pretty cool insights. You can even branch out if you want to go further with your Power BI average analysis.

All the best!


Power BI Virtual Tables

Power BI virtual tables are my personal favorite DAX topic. They’re the key to unlocking the full power of DAX. Virtual tables are the only type of table within Power BI that is fully dynamic, and there are problems that can only be solved by applying virtual table techniques within your measures. You can watch the full video of this tutorial at the bottom of this blog.

In this tutorial, I’m going to share my top 5 tips and tricks that I’ve accumulated over the years that have really helped me understand and debug what’s happening within Power BI virtual tables.

The example I’m going to walk through today is from the Enterprise DNA forum, and it comes from a member named Dave C, who works in industrial safety. Dave had a series of safety scores and he wanted to normalize those so that the top score was 10, and then dynamically come up with the Nth in that list.

Initially, we thought of doing this through a simple RANKX measure, but we later realized that a lot of his normalized values have ties. For example, if you want the seventh item on the list, there’s not going to be a number seven in a RANKX. There’s no easy way to pull that out of a filter condition. So we decided on a TOPN-based measure so that it would always count down the nth number.

This is akin to when you’re pulling the seventh card from a deck, you count out seven cards, and then you flip over that stack of seven, and the card on the bottom is the one you want. We’re going to do the equivalent of that in a TOPN measure.
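Dave’s actual measure isn’t shown in this extract, but a sketch of that card-counting pattern might look like this (the Safety table, Index column, Normalized Score measure, and Nth Item Slider Value measure are all assumed names):

```dax
Nth Highest Score =
VAR N = [Nth Item Slider Value]
VAR vEvalTable =
    -- original scores with a normalized value added per row
    ADDCOLUMNS ( VALUES ( Safety[Index] ), "@Normalized", [Normalized Score] )
VAR vTableTopN =
    -- "count out N cards from the top of the deck"
    TOPN ( N, vEvalTable, [@Normalized], DESC )
VAR vTableNthItem =
    -- "flip the stack over": the bottom of those N is the Nth item (ties kept)
    TOPN ( 1, vTableTopN, [@Normalized], ASC )
VAR Result = MAXX ( vTableNthItem, [@Normalized] )
VAR Final =
    IF ( COUNTROWS ( vEvalTable ) < N, "Insufficient data", Result )
RETURN
    Final
```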

You can use DAX Studio or the Tabular Editor. In this example, I’m using the Tabular Editor 3 (TE3). It’s technically possible to do this using the Modeling – New Table tab, but that’s going to create physical tables within your data model. You’re going to have to manually flip between that and the editor and it’s just a slow and difficult way to do it.

When you see the dynamic way in which it can be done through an external tool, you’ll see the benefit.

So within the TE3, we create a new DAX query. We can take our initial measure and copy this over to our DAX query.

If you remember, DAX queries always start with EVALUATE. We’re going to get an error initially because DAX queries return tables, whereas this was a measure whose last two variables are scalars. What we can do here is change the RETURN value, which is my next tip.

You can debug virtual tables in much the same way as you do with measures – piece by piece, by changing the return value. Let’s start with the first virtual table, the vEvalTable. We simply replace the RETURN value (Final) with our first variable (VAR). And you can see that the error goes away because the DAX query is now getting a table.
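In a TE3 or DAX Studio query window, that debugging pattern looks something like this (variable names follow the tutorial; the table expressions are assumed placeholders):

```dax
EVALUATE
VAR vEvalTable =
    ADDCOLUMNS ( VALUES ( Safety[Index] ), "@Normalized", [Normalized Score] )
VAR vTableTopN =
    TOPN ( 7, vEvalTable, [@Normalized], DESC )
RETURN
    -- swap in any earlier VAR (vEvalTable, vTableTopN, ...) to inspect that step
    vEvalTable
```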

In the vEvalTable, we’re taking the original data, which are the safety scores, and we’re normalizing those and adding that Normalized Value column to the virtual table. We’ve got the Index, the Value of the Region, and the Normalized value. We can sort these values up or down and filter the values as well.

This is giving us exactly what we’d expected. It returns 50 rows, which is the full data set. That’s all going well, so let’s go down and explore the next table, which is the vTableTopN. In this table, we’re taking TOPN using the nth item slider value. In this example, we have that seventh value of the virtual table above (vEvalTable), and we’re taking that TOPN based on the normalized value in descending order.

So, when we change our RETURN value to that, it falls over and we don’t get anything. Let’s take a look at why, because this is a really important concept for debugging and understanding virtual tables.

If we look at the formula, we have the Nth Item Slider Value as the main suspect here. Back to Power BI, we can see that this is basically just harvesting the number seven. Note that sliders exist within the context of a page.

And so, in this case, when we’re looking at debugging that table out of context, that selected value has no context around it. It doesn’t have anything in terms of being able to pull that number. We were getting a TOPN, but we don’t know what the N is in TOPN because that selected value is returning a blank.

How do we handle that? Let’s look at the selected value measure. Most of the time, we pay attention only to the first parameter of SELECTEDVALUE, but there’s a second parameter, which is an alternate result. This brings us to my third tip.

What happened here is that it has been pulling the blank as the alternate. But what we want to do (for debugging purposes) is that we want to put a real value in here. So, we put the number 7 and save that.
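A sketch of the slider-harvesting measure with a real alternate value (the 'Nth Item' table name is assumed):

```dax
Nth Item Slider Value =
-- the second argument is the alternate result, returned when the slider
-- provides no context (e.g. inside an external-tool DAX query)
SELECTEDVALUE ( 'Nth Item'[Nth Item], 7 )
```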

Now we have some values. It’s returning seven rows, which is exactly what it should because of that TOPN value of seven.

Let’s continue down the line to the next virtual table, which is the vTableNthItem. We’ve got that stack of seven cards, and this table is basically flipping it over. We were in descending order in the previous table, and now we’re in ascending order.

If we take and copy this down to the RETURN section, we get the results. It’s interesting that it’s not returning one row. It’s returning three rows because these three are tied. That’s exactly the reason why we use TOPN rather than RANKX, in this case.

Now let’s go to Result. If we copy the VAR Result down to the RETURN section, this gets to my fourth tip.

Within the formula, we’re taking the max (MAXX) of that vTableNthItem and we’re returning the normalized value. This could be MAX, it could be MIN, it could be AVERAGE. It’s just some aggregator that’s returning that one value in that table. And so, if we copy this down, it’s going to give us an error because this is now a scalar.

But this is my fourth tip, which is in the context of debugging. What you can do is just add the curly brackets. By doing so, it turns that scalar into a table.
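For example, in a DAX query this curly-bracket trick looks like the following (the value 42 stands in for whatever scalar the MAXX step produced):

```dax
EVALUATE
VAR Result = 42  -- stand-in for the scalar result
RETURN
    -- curly brackets build a one-row, one-column table from the scalar,
    -- so the DAX query can return it
    { Result }
```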

And then, what we’ve got here is just a final error check: if it turns out that the evaluation table is smaller than the number of rows requested, it will return insufficient data. But we know in this case that our data set is big enough. However, we can test that by typing in Final. Again, because that’s a scalar, we also need the curly brackets, and we get the same value here.

We’ve delved in and debugged this virtual table, and we’ve used the alternate value in the SELECTEDVALUE to keep it from falling over out of context. Now I just want to show you one additional tip that I found really useful.

In the context of doing your debugging, you’re going to want to see in Power BI what that table looks like. The general rule is that a measure can only return a scalar, not a table. But, there’s one cheat that I’m going to show you that allows it to quasi return a table.

Let’s take a look at this measure, which is Visualized Virtual Table, and we’ve got here all the virtual tables that we had initially. For example, we want to display, let’s say on the front report page, the vTableTopN.

You can use this CONCATENATEX function. You can actually take that virtual table name (vTableTopN) and take the values in that table and concatenate them. You can create something that basically looks like a virtual table.
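A sketch of that CONCATENATEX trick, under the same assumed names as the earlier sketches:

```dax
Visualized Virtual Table =
VAR vTableTopN =
    TOPN (
        7,
        ADDCOLUMNS ( VALUES ( Safety[Index] ), "@Normalized", [Normalized Score] ),
        [@Normalized], DESC
    )
RETURN
    -- one text line per row, so a card visual can display the "table"
    CONCATENATEX (
        vTableTopN,
        Safety[Index] & "  |  " & FORMAT ( [@Normalized], "0.00" ),
        UNICHAR ( 10 ),
        [@Normalized], DESC
    )
```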

If we take this measure, go back to Power BI, and drop it into a card visual, we can see the result. Normally a table gives an error, but through CONCATENATEX, we’ve turned that table into a scalar. You can see that it’s fairly primitive, but it’s returning exactly what we expect, and doing so in a dynamic way.

It is a way to push a table into a measure and show that in your report. It’s a really helpful debugging trick. It will provide a good format in a card value that you can use in a report.

Hopefully, this tutorial gives you some food for thought in terms of working with a Power BI virtual table. These are some additional tips and tricks for understanding what’s going on within your virtual tables. I hope you found that helpful.

Visit our website for more Power BI tutorials and check out the links below for more related content.

All the best!

