Miguel Escobar Published February 21, 2021

Power BI REST API Connector


Since the inception of Power BI Custom Connectors, I’ve been drawn to coming up with new ones that could help my life and the lives of my customers. One of the most helpful custom connectors that I’ve created so far is the Power BI REST API Custom Connector (url).

However, in order to use it, people had to go through a really complex set of steps: register their own Azure AD application and modify the custom connector before they could actually launch it in Power BI Desktop.

A big change occurred not so long ago: a new authentication kind was added to the Power Query SDK called AAD (Azure Active Directory), which is specific to resources that use AAD. This is similar to how you might use the ADAL library when developing a solution in .NET. I’m no programmer / developer, so I have little to no experience working with ADAL or MSAL, but this implementation of the authentication kind goes beyond just a simple library, as it also provides an embedded first-party application created by Microsoft for anyone to use.

What does this mean? It means that it’s much easier to use the connector now as it doesn’t require anything else other than the connector file itself.

In short, with this new feature for the custom connector, all you need to do to connect to the Power BI REST API from within Power BI is to:

  1. Download the .mez file from the GitHub repo
  2. Save the file in your custom connectors folder ([Documents]\Power BI Desktop\Custom Connectors). If the folder doesn’t exist, create it and place the file in there.
  3. In Power BI Desktop, select File > Options and settings > Options > Security. Under Data Extensions, select (Not Recommended) Allow any extension to load without validation or warning. Select OK, and then restart Power BI Desktop.
  4. Open Power BI Desktop and find the connector listed inside the Get Data window

Features found in the connector

With this connector you are able to connect to almost every single GET endpoint of the REST API. Most importantly, if you don’t find what you need in the long list of available tables, there’s a GetData function that lets you insert any URL of the Power BI REST API and query the data that you need.
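To illustrate, here is a hedged sketch of querying an arbitrary endpoint; `PowerBIRESTAPI.Navigation` and the `GetData` key are based on the connector’s navigation table and may differ between versions:

```powerquery
// Sketch only: invoke the connector's GetData function with any GET
// endpoint of the Power BI REST API. The navigation keys below are
// assumptions and may differ in your version of the connector.
let
    Source = PowerBIRESTAPI.Navigation(),
    Functions = Source{[Key = "Functions"]}[Data],
    GetData = Functions{[Key = "GetData"]}[Data],
    // Pass any Power BI REST API URL to retrieve its data
    Groups = GetData("https://api.powerbi.com/v1.0/myorg/groups")
in
    Groups
```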

One of the coolest features that I was able to introduce was a way to get the Event Activity Log data inside of Power BI through a function. Not only that, but the function itself can be used for Incremental Refresh, meaning that we could absolutely compile all of the event activity log data into a single dataset and analyze everything and anything with ease. You can read more about how to use that function from this article (url) that I wrote a few years ago.
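As a minimal sketch of wiring that function up for incremental refresh (assuming `RangeStart`/`RangeEnd` datetime parameters are defined in the model, as Power BI’s incremental refresh requires; the navigation keys are assumptions):

```powerquery
// Sketch: feed the incremental-refresh window into the Event Activity
// Log function. RangeStart and RangeEnd must exist as datetime
// parameters in the model; the navigation keys are assumptions.
let
    Source = PowerBIRESTAPI.Navigation(),
    Functions = Source{[Key = "Functions"]}[Data],
    EventActivityLog = Functions{[Key = "EventActivityLog"]}[Data],
    Events = EventActivityLog(Date.From(RangeStart), Date.From(RangeEnd))
in
    Events
```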

How to schedule a refresh in the Power BI Service

Since this is a Power BI Custom Connector, it follows the same set of rules as other custom connectors. You can read Microsoft’s official documentation here (url) for the step-by-step on how to configure the scheduled refresh of a dataset sourced from a custom connector using a gateway.


I’ve tested this connector for the past few years in small (dozens of workspaces and users) to big environments (hundreds of workspaces and users), but I’ve never attempted to use this custom connector in a huge environment with thousands of workspaces and users under the same tenant. In those cases, a more granular and programmatic approach might be necessary.

The primary reason for this is that the REST API is not especially fast at returning data. On top of that, the connector has been purposefully built so that query folding is not enabled and no schema is defined for most tables, in an effort to make a connector that simply always shows you all the data received directly from the REST API in the easiest-to-manage way. I’ve chosen usability over performance. In theory, this will impact tenants which have ‘larger than life’ numbers of workspaces and users.
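Because no schema is defined for most tables, columns arrive untyped; if you need types, you can apply them yourself after loading. A minimal sketch follows, where the “Reports” key and the column names are illustrative assumptions:

```powerquery
// Sketch: apply data types manually, since the connector deliberately
// ships most tables without a defined schema. The "Reports" key and
// the column names here are illustrative assumptions.
let
    Source = PowerBIRESTAPI.Navigation(),
    Reports = Source{[Key = "Reports"]}[Data],
    Typed = Table.TransformColumnTypes(Reports, {{"id", type text}, {"name", type text}})
in
    Typed
```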

If you’re getting serious about managing your Power BI tenant, then this is a must have tool for you. Give it a try and if you find any issues please report them on the GitHub repo!

56 Comments
jerry deng

This is awesome and thanks for your contribution. This is the most convenient way for me as admin to extract and analyze tenant data. I have a separate workflow to store the daily event activity log and will continue using it, but your connector allows me to quickly get the other metadata (report/dataset catalog for example) to do a holistic tenant review.

jerry deng

Hi Miguel, does this connector work for the service refresh as well? I tried with enterprise gateway following the documentation below but could not find this as a connector option when adding new data source for the gateway.

jerry deng

Sorry please disregard my earlier question. I am able to proceed by moving the connector file to a folder location that is accessible by the gateway service account. Still having an error now with “gateway cannot be reached”, but that’s probably something specific to my scenario. Thanks again!

Antonio Palomas

Thank you very much for your contribution. This is very good and will help a lot in the administration of the Power BI Service. Congratulations on the idea and initiative to build the connector.

Ricardo Bunge

Hi Miguel — this sounds fantastic! I grabbed the .mez file, placed it in the Documents\Power BI Desktop\Custom Connectors folder, changed the security setting in PBI, but I still cannot get your connector to appear in the Get Data list. Any ideas?

Ricardo Bunge

This is actually the first one I’ve tried. I’ll have a look at that link you sent. Thanks!


Hi Miguel!

It would be very nice to add support for Dataflows (Get Dataflow Transactions), so we could track refresh statuses! I’ll leave the documentation here: https://docs.microsoft.com/en-us/rest/api/power-bi/dataflows/getdataflowtransactions

Nice job!

Rui Caio

First of all, I would like to thank you for this. This is the easiest way to consume the API.
I’m facing only a small problem using the “DatasetDatasourcesAsAdmin”
“Error returned: OLE DB or ODBC error: [Expression.Error] The field ‘datasourceid’ of the record wasn’t found”
Could you help me with that?
Everything else is working well.
Thank you for this!


Awesome stuff! I am having a problem though – I’m trying to pull multiple months of the “event activity log” with a filter right after “Invoked Function…”, where “Operation = GenerateEmbedToken.” This will run for a short time frame passed in the date function, for example if I pull Jan 1 to Feb 1st. But it slows to a crawl if I Select Jan 1st to today for example. Rather, it won’t even load. The only troubleshooting I’ve been able to identify is that my task manager on my laptop says memory is at 99.9% when I’m trying to pull those dates. Is this function very memory intensive? My machine has 16 gb ram, so I figured that would be enough.


Hi Miguel! This is an awesome connector and just what I need. I’ve built a report in Power BI Desktop using the connector and am currently trying to configure it on our gateway. Using the Microsoft documentation on setting the custom connector up (the link in your article), I’ve dropped a copy of the latest connector file in a folder and the server where the gateway is installed. I’ve configured the folder location on the Connectors tab of the gateway app and provided the service account full permissions on the folder. I’ve also verified that I’ve used the correct service account set up with the gateway. Sadly I still can’t get the gateway to recognise the connector. Have you encountered this issue with any clients or have any other ideas why the gateway won’t pick the connector up? Thanks


It appears the solution was to keep the folder name for the custom connectors short and use a different service account instead of the default PBIEgwService account.

varun devaraj

I am implementing the activity pull using your connector. However, it’s taking way too long to pull the data when I am trying to pull the last 90 days. Do you know of any method that could help me to improve my performance?

I am using the connector with dynamic parameters for Start and End dates

Power Query for StartDate:

Source = Date.From(Date.AddDays(DateTime.LocalNow(), -90))

Power Query for EndDate:

Source = Date.From(DateTime.LocalNow())

Event Activity Function:

(StartDate as date, EndDate as date) =>
let
    Source = PowerBIRESTAPI.Navigation(),
    Functions = Source{[Key = "Functions"]}[Data],
    EventActivityLog = Functions{[Key = "EventActivityLog"]}[Data],
    #"Invoked FunctionEventActivityLog1" = EventActivityLog(StartDate, EndDate)
in
    #"Invoked FunctionEventActivityLog1"

Invoked Function:

let
    Source = EventActivity(StartDate, EndDate),
    #"Filtered Rows" = Table.SelectRows(Source, each [UserId] <> ""),
    #"Filtered Rows1" = Table.SelectRows(#"Filtered Rows", each [ActivityId] <> "")
in
    #"Filtered Rows1"

varun devaraj

The issue I am having is it’s taking way too long to load, tried to load data for 6 hours then failed. Was trying to figure out if there is any option to improve the performance of data pull.


Hi Miguel!

I’ve been using your connector for a bit now and it mostly works great and is super easy to use. I’ve just run into a bit of a problem with the Audit Log endpoint, though. My scheduled refresh is now failing every day, saying “The column ‘(Insert Column Here)’ of the table wasn’t found”. The specific column doesn’t matter, as it will pick the next one if I remove it from the dataset. It errors when I give a column a data type. The connector works fine when refreshed on Desktop; it only fails in the Service, both with scheduled refresh and on demand. I’m lost for where to go from here, let me know if you need more information.



Oh that makes sense! I think I could create an empty table with columns that I want, then append the results of the Audit Log onto that table and do a Remove Other Columns. It works for me as I know what columns that I want, but still doesn’t work for all possible combinations like you were talking about.


Hi – I just wanted to say thank you for this connector – really appreciate you publishing this.

Ricardo Pedro

Hi Miguel, great tool.
We have a project with Power BI reports embedded in a Dynamics 365 portal (iframe). Is there a way to track the usage and adoption of those embedded reports?

Any tips or guidance would be great.


Awesome. I am wondering if you can create a custom connector in Power Automate that can do the same please?

Chuck Gould


Thank you and others for creating and sharing the Power BI REST API connector. This is awesome!

2 issues I am having.

a) The GetData and GetRawData functions import as tables, not functions.
b) The Event Activity Log function

I would like to pass today’s date as the end date and today’s date minus 30 as the start date, without managing parameters, so that when I refresh it will get the last 30 days of data.

Do you have any suggestions on how to approach passing dates into the function?

Mickael Girardy

Thank you very much for this.
I have an issue: I’m not able to open the .sln file to add some functions. I have Visual Studio 2017 with SSDT.


Hi Miguel. Very useful tool, thank you for this.

We have some embedded reports which clog up the user activity data, since they have the operation “GenerateEmbedToken”. It makes the API call slow, since there are a lot of these rows. I am not interested in reporting on these, but only on the operation “ViewReport”.

Microsoft uses this example as how to filter these out in the API call:
GET https://api.powerbi.com/v1.0/myorg/admin/activityevents?startDateTime='2019-08-13T07:55:00.000Z'&endDateTime='2019-08-13T08:55:00.000Z'&$filter=Activity eq 'ViewReport'

Can I add this filter in the API call using your custom connector? I am not sure where i would state this. This would be my M code after using the ‘Event Activity Log function. I tried adding the filter after the dates, but without luck unfortunately.

  Source = #"Event Activity Log"(#date(2022, 1, 1), #date(2022, 1, 1))


Hi Miguel.

Thank you for taking the time to get back to me.

I have taken a look at the code and tried to modify it to fit my need. By looking at the highlighted code in your link, I would assume that the only change I need to make is this (picture, green highlight). However, it still returns all activities.

Did I miss another spot where I should specify this?

Have a good one.

Filter activity.png

Hello Miguel,

Thanks a million for creating and sharing this connector! It is awesome and makes it super simple to work with the Power BI API.

Just one rookie question: when I open Dataflow Transactions, it always returns only the last 10 records for any dataflow, despite my being able to see more history in the web front-end. I tried checking the web but without any luck.

Would you have some idea how to get over this / what might be the cause? (nature of API / settings of my PBI Desktop / admin settings of my company for API…)

Thanks again!



Thank you very much for this contribution.

I had been looking for something like this for a long time. Until now I had to solve it using PowerShell, but this allows a much simpler integration.


Great work with the connector. I have a question about the GetData function: it’s asking for a URL. Which URL do I need to provide?

Thank you,


Thanks a lot for this awesome connector. I have been struggling to get this data and set this up in an automated way and I came across this connector and it really simplifies things.

I am trying to setup incremental refresh for the event log, but not sure how exactly to do it. As we are using REST APIs there is no query folding here so I don’t understand how incremental refresh would work here. Would appreciate some pointers to help set this up.


Hi Miguel,

Many thanks for the excellent Connector. I’m hoping you can assist please…

I am getting an error in the last row of the Dataflow Tables (namely Tables 7,8,9 & 10)…

“Expression.Error: The field ‘name’ of the record wasn’t found.

…which prohibits me from loading into the model.

In Power Query Editor I can see all the records in these tables but with the final row being ‘error’ across all columns. The ‘Remove error’ and ‘Replace error’ functions won’t resolve this issue.

Can you advise?

Many thanks,





Hi Miguel,

I am getting the below error when trying to refresh the dataset using this API. Any suggestions?

Data source error: Unable to refresh the model (id=15154) because it references an unsupported data source.
Cluster URI: WABI-AUSTRALIA-SOUTHEAST-B-PRIMARY-redirect.analysis.windows.net
Time: 2022-07-21 22:47:01Z

Mei Wah

Thank you very much! This connector is very helpful to me. I had encountered problems calling the Power BI REST API from Postman because I didn’t know which parameters to use. With this connector, I don’t need to call it from Postman.

I’m on the next step of configuring a schedule refresh. Hopefully it works!


Hey, are there possibly any plans to have this awesome connector certified, so it’s available in dataflows?


Well, I am not seeing the connector in Get Data even though I copied it into the newly created Custom Connectors folder. Am I doing something wrong? What kind of admin rights do I actually need in Power BI for this?