Edit 5th May 2019: Unfortunately this method will not work in the Power BI service!
Edit 23rd August 2021: Pedro del Melo Cabral Sousa has found a workaround that makes this method work in the service. I haven’t had the chance to test it out, but you might want to give it a try: “if you migrate the necessary queries to power bi dataflows and then create one big query in your report dataset with all the necessary references to the power bi dataflows together with the POST request, it will work in Power BI Service”
In my last 2 posts I described a way to automatically validate attachments from incoming e-mails. Microsoft Flow would watch for incoming e-mails that match certain criteria and move their attachments to a dedicated folder. It would then trigger a refresh of a Power BI dataset that has been designed to check those attachments for errors. Data-driven alerts in Power BI would indicate whether there are errors and trigger a Flow that sends an e-mail back to the sender, informing them that their attachments didn’t meet the agreed criteria.
In this article I will now explain how not just a trigger about the existence of a faulty attachment, but also the corresponding data itself, can be passed back to Flow. To do this, I write a query that exports data from Power BI to Flow. But watch out: this is not suitable for very big tables; I already experienced timeouts with tables of 300k rows. Also, you need a paid Power Automate license for the connector (Pricing | Microsoft Power Automate).
Send data from Power BI to Microsoft Flow
As Chris Webb described in this article, Power Query can create POST requests to a webservice, passing values to it in the body of the call. This makes it possible to export data from Power BI. With Flow, it is very easy to set up such a webservice: just create a Flow whose trigger is “When a HTTP request is received” (no further inputs in that step) and add a “Compose” action with the Body element in it. Then save the Flow and copy the generated URL:
Power BI
In Power BI, you perform the following steps (a consolidated query sketch follows the list):
- Transform your error table into a list of records (as that is the format of a JSON array that Flow likes):
- row 3: Table.ToRecords(<YourTableName>)
- Transform that into a JSON binary that the Web.Contents function can digest:
- row 4: Json.FromValue(<YourListOfRecordsFromAbove>)
- Make the web call to the copied URL (use Anonymous credentials):
- row 5-7: Web.Contents(<PasteTheURLFromFlowHere>, [Content=<JsonFromAbove>])
- Parse the result and load this query to the data model (this is very important, because otherwise the web call would not be made when the dataset is refreshed!):
- row 8: Lines.FromBinary(<WebCallFromAbove>)
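Put together, the export query looks roughly like the sketch below. The query name “ErrorTable”, the step names and the URL are placeholders of my own choosing; replace them with your own error table and the HTTP POST URL copied from Flow:
let
    // the table whose rows shall be sent back to Flow (placeholder name)
    Source = ErrorTable,
    // 1) turn the table into a list of records (the JSON-array format Flow expects)
    ToArray = Table.ToRecords(Source),
    // 2) convert that list into a JSON binary that Web.Contents can send
    ToJsonBinary = Json.FromValue(ToArray),
    // 3) POST it to the URL copied from the "When a HTTP request is received" trigger
    //    (use Anonymous credentials; the URL here is a placeholder)
    WebCall = Web.Contents(
        "https://prod-xx.westeurope.logic.azure.com/workflows/.../invoke",
        [Content = ToJsonBinary]),
    // 4) parse the response and load this query to the data model,
    //    otherwise the web call would not be executed on refresh
    LinesFromBinary = Lines.FromBinary(WebCall)
in
    LinesFromBinary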
Back in Flow
Modify the “Compose” step to parse out the table from the JSON like this:
The next step is to create a CSV table:
And export it to OneDrive:
I’ve passed the FolderPath and FileName dynamically from Power BI here, but that’s not necessary. Just make sure that the folder is different from the one that contains the original attachments. A trigger will be set on this folder that sends the e-mail back to the sender, so saving the original attachment must not already fire this return e-mail.
Now this sub-flow is complete. In the next steps you will adjust the main flow (as described in the previous post) to attach the generated file to the e-mail.
Adjust main Flow
The task is to attach the stored CSV file to the e-mail that will be sent out to the sender of the attachment. There are a couple of new steps (red) and an adjustment in the “Send email” step:
As the query that calls the webservice will probably be executed twice, I’ve added a wait step. The wait time has to fit the table length, so you might want to test this for your specific file behaviour. Then the content of the saved file has to be fetched and added as the attachment:
The last 2 new steps in the flow are similar to the previous ones and simply delete the files from the “return-attachment” folder. Of course, if you want to keep them, you can instead implement a step that moves the file to another folder instead of deleting it.
Outlook
As soon as Power Query is fully integrated in Flow, these tasks won’t need any Power BI involvement any more: you will be able to do the validations directly in the editor there. Although, given the current lack of support for parsing CSV files in Flow (and the terrible performance of the current workaround), I’m wondering whether it wouldn’t be better if the Power Query connector in Flow saved its results to CSV files instead of pushing them back to Flow (a bit like the new dataflows do now). This would allow for mass data transformations and forwarding of the results without congesting the Flow service.
What are your thoughts on this?
Enjoy & stay queryious 😉
Wow. This is an ingenious solution!
Thank you, Fred!
I got this error for the data source credentials when published to the Power BI service:
Failed to update data source credentials: Web.Contents failed to get contents from ‘https://prod-56.westeurope.logic.azure.com/workflows/346eea9b6717462695ca367562e49f1a/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=e6iSqgGlmOJGOUcEGeuslyUvoAXgzrtL9lyOpnuWPv4’ (400): Bad Request
Activity ID: eee3fbcd-d6cb-4639-a701-8ef3108ffad4
Request ID: 9fd6a5f4-1056-3ae2-d083-abaa425e8f84
Status code: 400
Time: Wed May 08 2019 00:35:47 GMT+0500 (Pakistan Standard Time)
Version: 13.0.9264.166
Cluster URI: https://wabi-west-europe-redirect.analysis.windows.net
Unfortunately I cannot say what’s wrong with the flow.
Thank you for this wonderful post. It was indeed a breeze to set up and I was able to run a few flows after the scheduled refreshes. But suddenly it started throwing bad request 404 errors. Nothing has changed in the configs though. I have raised it with the PBI community, hoping to find a solution.
Anything else you can think of? Happy to provide more details. Thanks in advance!!
Sasi.
Hi Sasi,
there might have been some changes in the service that prevent this from working now, unfortunately.
I’m not using this in production, so I cannot tell from my own experience.
/Imke
Actually it is a data source credentials error. Everything works fine in Power BI Desktop, but when I upload it to the Power BI service and try to refresh, I get this error. (I used Anonymous credentials for the HTTP request.)
I’m very sorry, I just recognized that I didn’t test this in the service. Unlike the data-driven alert, which allows signing in to SharePoint, there is no such option for this method.
That was a bad oversight on my side. I will have to check if it’s possible by using a custom connector for it.
Very sorry for wasting your time,
Imke
Don’t be sorry, but considering the whole scenario, it was supposed to work in the Power BI service. Do you manually refresh the dataset to check whether the email attachment has errors or not?
Do you have any other solution in mind? Please discuss so that we can overcome this issue.
Haven’t come up with a good idea yet.
I would be very interested in knowing if you think of a solution to getting this to work in the Power BI service. I’m in need of an automated solution that allows me to export a Power BI dataset to csv or Excel.
Hi Riley,
using Python or R should work in the service: https://www.thebiccountant.com/2018/12/29/export-data-from-power-bi-to-csv-using-python/
/Imke
Hi Imke,
Thank you for the suggestion, and I got that working in the desktop and in the service. Very cool and useful stuff!
I also have a need to export from Power BI to an Excel file (xlsx). I’m struggling to adapt the export to csv concept to fit the need to export to Excel. Any idea what that script or syntax would look like in the Power Query editor?
As Chris Webb described in this article, Power Query can create POST requests to a webservice, thereby passing values in the body of the call to the webservice.
…I do not understand the very first step… I have got this far:
let
    Source = #"Tracker – SAP",
    ToArray = Table.ToRecords(Source),
    ToJsonBinary = Json.FromValue(ToArray)
in
    ToJsonBinary
but how do I go further?
shree
You have to add the steps “WebCall” and “LinesFromBinary” as described in the article.
Could you please specify the steps for the WebCall method to call Power BI datasets, as I am unable to work out the steps for it?
You’re missing steps 3) and 4) from the listing in the article above. Those are the last 2 steps in this query:
let
    Source = #"1_SourceTable",
    ToArray = Table.ToRecords(Source),
    ToJsonBinary = Json.FromValue(ToArray),
    WebCall = Web.Contents(
        "https://prod-123.westeurope.logic.azure.com:443/workflows/….xxxxx….geQ",
        [Content = ToJsonBinary]),
    LinesFromBinary = Lines.FromBinary(WebCall)
in
    LinesFromBinary
Where "https://prod-123.westeurope.logic.azure.com:443/workflows/….xxxxx….geQ" needs to be replaced by the URL from Flow (“When a HTTP request is received” -> “HTTP POST URL”).
/Imke
I am entering the code below in “When a HTTP request is received” in Flow to generate the URL:
let
Source = #"Tracker – SAP",
ToArray = Table.ToRecords(Source),
ToJsonBinary = Json.FromValue(ToArray),
[Content = ToJsonBinary]),
LinesFromBinary = Lines.FromBinary(WebCall)
in
LinesFromBinary
but I am getting an error message in the first line saying “expecting JSON object”.
My question is how to generate this “HTTP POST URL” in Flow. These are my steps:
1st step: create an Instant flow
2nd step: select “When a HTTP request is received”
3rd step: enter the above-mentioned code to get the URL
Kindly correct me: how do I generate this “HTTP POST URL” in Flow?
Sorry, but I can’t help you any further here.
/Imke
Sorry… I am asking this again out of curiosity.
Hello,
Actually my simple question is: how are you connecting to Power BI using the HTTP POST request? Is it that you are doing a schema export or calling a Power BI webservice? This URL requires proper code to call, and your code is reported as invalid JSON code. Please assist.
It works in the Power BI service. Just mark “Skip test connection” when adding it as a data source to the Gateway Clusters.
Awesome! Thanks heaps, Marko!
For no-gateway sources, check this out:
https://medium.com/@martijnlentink/use-missing-skip-test-connection-in-scheduled-refresh-in-power-bi-5b3fa9a30daf
Hi there,
I’ve managed to replicate the same process as above, but set it to create a file in a SharePoint folder.
The only problem I’m coming across is that even though I have a daily scheduled refresh configured at specified hours against the dataset, the flow doesn’t trigger. Why is that?
This has been working for me except for one thing that I’m still struggling with. Does anyone know how I would return the table back to the query from the function? Right now, after I call the function I’m left with no results (“this list is empty”).
Hi Riley,
because M is a (partially) lazy language, the last step of the export query has to be the result returned by the web call that performs the export.
Otherwise the export would not be executed.
But that should not be a problem. Just split the export query at the step you want to work with further (a minimal sketch follows the list):
– check the step that follows the one you want to continue with,
– right-click it and select “Extract Previous”,
– enter a name for the new “staging” query,
– select that staging query -> right-click -> Reference.
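Just as an illustration, a minimal sketch of what the split could look like. The query names (“Staging_Errors”, “Export_to_Flow”), the source table “ErrorTable” and the [HasError] filter are made up for this example; each let…in block goes into its own query in the editor:
// Query "Staging_Errors" – the shared transformation steps you want to reuse
let
    Source = ErrorTable,
    OnlyErrors = Table.SelectRows(Source, each [HasError] = true)
in
    OnlyErrors

// Query "Export_to_Flow" – references the staging query and ends with the web call,
// so the POST is actually executed when the dataset is refreshed
let
    Source = Staging_Errors,
    ToArray = Table.ToRecords(Source),
    ToJsonBinary = Json.FromValue(ToArray),
    WebCall = Web.Contents("https://prod-xx.westeurope.logic.azure.com/...", [Content = ToJsonBinary]),
    LinesFromBinary = Lines.FromBinary(WebCall)
in
    LinesFromBinary

// Any query that needs the table for further work simply references "Staging_Errors" again
let
    Source = Staging_Errors
in
    Source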
Ahh. That could work, but I was kind of hoping to do it in a few fewer steps. I’m building dataflows to populate Azure AD groups, and this would take me from 60+ queries to 120+ queries. Thanks for the suggestion though; I’ll see how it affects the overall performance of each dataflow.
Hope you can help. I can add the HTTP request, Compose and create CSV file steps, and the query in Power BI Desktop works, but as soon as I add Create file I get the error DataSource.Error: Web.Contents (400): Bad Request.
What can I do to fix this?
Sorry, no idea.
You can try to use Fiddler and analyse the web traffic. Just check what Power Query is sending and compare that to the requests arriving at your endpoint.
Hi Imke,
I used Fiddler and got the error:
{"error":{"code":"InvalidRequestContent","message":"The input body for trigger 'manual' of type 'Request' must be of type JSON, but was of type 'application/octet-stream'."}}
In the instructions above it says that you need to add outputs('Compose')['BinaryContent'] to the Create CSV step, but I’m not allowed to add ['BinaryContent'].
Might this be causing the error?
Hi Walter,
I can’t imagine where the error lies here, unfortunately.
/Imke
Hi,
Thank you for this amazing post!
I had the same error as Walter and made a change to the Web.Contents step to specify the content type:
WebCall = Web.Contents(
    UrlFlow,
    [
        Headers = [#"Content-Type" = "application/json"],
        Content = ToJsonBinary
    ]),
I also had to change the flow in Power Automate: replace the expression 'Json(decodeBase64(triggerBody()['$content']))' with simply 'Body'.
And this worked for me! But I could not understand why… I am very new to all the concepts in this post… any idea?
Great Lionel, thanks for sharing your code!
/Imke
I fixed it
https://community.powerbi.com/t5/Power-Query/DataSource-Error-Web-Contents-failed-to-get-contents-from-400/m-p/1364487/highlight/true#M43011
Awesome! Thanks for sharing.
Hi,
Maybe my question is silly, but I couldn’t understand where you wrote the code. In Power BI? Or on which platform?
Thanks.
Hi Beyza,
the code has to be written in the query editor in Power BI (Transform data).
/Imke
Somehow I get an error on the Web.Contents(, [Content=]) step.
Instead of posting, it returns DataFormat.Error: The input couldn’t be recognized as a valid Excel document.
Details:
Binary
I can’t understand why this error occurs at this point or for what reason. Any ideas?
How do I do the bit in Power BI? Is this in advanced mode?
My data will be refreshed every hour; will this method still work?
I have copied the M query to my Power BI Desktop, but I got an error message: “Formula.Firewall: Query ‘export’ (step ‘LinesFromBinary’) references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.”
Below is my M query code. Did I miss something?
let
    Source = opportunity,
    ToArray = Table.ToRecords(Source),
    ToJsonBinary = Json.FromValue(ToArray),
    WebCall = Web.Contents("xxxxxxx", [Content = ToJsonBinary]),
    LinesFromBinary = Lines.FromBinary(WebCall)
in
    LinesFromBinary
Hi Jean,
you have to disable the privacy settings:
File -> Options and settings -> Options -> Privacy (under CURRENT FILE) -> Ignore the Privacy Levels…:
/Imke
Do you mean I should tick “Ignore the Privacy Levels…”?
This time I get a different error message:
DataSource.Error: Web.Contents failed to get contents from ‘https://xxxxxxx’ (400): Bad Request
Details:
DataSourceKind=Web
DataSourcePath=https://prod-86.westeurope.logic.azure.com/workflows/cd86ab8a272d4cd39d5652ff6c69c016/triggers/manual/paths/invoke
Url=https://xxxxxx
This could be an indicator that the flow is not covered by a (premium) license.
Do I need a premium Flow license? I have a free Azure account already, will that cover the website?
Thanks!
Jean
Hi Jean,
yes, this connector has become premium in the meantime.
You need one of these licenses for it: https://powerautomate.microsoft.com/en-us/pricing/
Azure account doesn’t cover that.
/Imke
Thank you so much for the help, Imke! I purchased a premium license and the Flow works now! I almost had tears in my eyes when the csv file opened in front of me.
Hi Imke. I have one last question. According to the export query in Power BI, an HTTP request will be made to the URL listed in Flow, which means my table in Power BI will be converted into JSON and posted to a dedicated URL. Is that safe? Will other people be able to see the data posted to that URL?
Sorry, I have another problem with the Power BI file containing the web call query. In my Power BI Desktop version I chose the Anonymous connection type for the web call URL, and the flow works with the Desktop version.
But after I publish the pbix file to the workspace, the gateway stops working due to a connection issue.
Below is the error message from the Power BI service. What kind of connection should I choose for this API URL?
GATEWAY CLUSTERS
Unable to connect: We encountered an error while trying to connect to . Details: “We could not register this data source for any gateway instances within this cluster. Please find more details below about specific errors for each gateway instance.”
Hello Jean,
as stated in the first sentence, this method will not work in the service, unfortunately.
You can try the workaround that is mentioned in the second sentence, but I haven’t tried it out yet.
A relatively new alternative, which I haven’t blogged about yet, is to use the new DAX query endpoint in the Power BI REST API: https://powerbi.microsoft.com/en-us/blog/announcing-the-public-preview-of-power-bi-rest-api-support-for-dax-queries/
You can either create a custom connector in Power Automate to use it (using your new license) or use Azure Key Vault to obtain a token as described here: https://www.datalineo.com/post/using-power-automate-to-run-dax-queries-against-your-power-bi-dataset
To my knowledge, this doesn’t create a dedicated endpoint, so it is potentially safer. But the API itself doesn’t encrypt the result JSON in the body: https://docs.microsoft.com/en-us/rest/api/power-bi/datasets/execute-queries (see also the requirements and restrictions for this method in that link).
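To give a rough idea of the shape of such a call, here is a minimal sketch against the executeQueries endpoint, written as a Power Query web call. The dataset ID, the DAX query and the bearer token are placeholders, and obtaining the token (e.g. via Azure Key Vault or a custom connector in Power Automate) is not shown here:
let
    // placeholders – replace with your own dataset ID, token and DAX query
    DatasetId = "00000000-0000-0000-0000-000000000000",
    AccessToken = "<bearer token, e.g. retrieved from Azure Key Vault>",
    // body shape as documented for the executeQueries endpoint
    Body = Json.FromValue([queries = {[query = "EVALUATE 'YourTable'"]}]),
    Response = Web.Contents(
        "https://api.powerbi.com/v1.0/myorg/datasets/" & DatasetId & "/executeQueries",
        [
            Headers = [
                #"Authorization" = "Bearer " & AccessToken,
                #"Content-Type" = "application/json"
            ],
            Content = Body
        ]),
    Result = Json.Document(Response)
in
    Result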
Will post link to blogpost here once finished.
Cheers, Imke
Hello,
My flow does not work. I followed all the instructions above. I put a button in Power BI Desktop into which I pasted the URL.
However, when I click on that button, the webpage opens but this message shows up:
{"error":{"code":"TriggerRequestMethodNotValid","message":"The HTTP method for this request is not valid: expected 'Post' and actual 'GET'."}}
Yet my HTTP request trigger is set to “POST”. I tried with GET, but it does not work either.
Any idea how to solve this issue?