I have several views on a Dataverse table with an image column in my model-driven app. When I export a view (or a filtered view) using the built-in Export to Excel feature on the command bar, the resulting file does not include the image column as embedded images. I know there is no direct way to do this, but I don't know what approaches or steps are available to accomplish it. Can anyone advise on an efficient method of exporting Dataverse table views and filtered views to Excel with the images included? I am OK with using Power Automate for this, but how do I tell Power Automate which view or filtered view to use dynamically? How do I pass the dynamic filtered view from the model-driven app to the Power Automate flow?
@ChrisPiasecki @rampprakash @parvezghumra @dpoggemann
Thanks & Regards,
Ramesh Mukka
@ChrisPiasecki Thank you Chris, I'll consider the env variable.
Sorry for the delay in responding, @RameshMukka. Yes, the method names are case-sensitive there. Glad you were able to get it working.
Just be aware that with the approach you're taking, calling an HTTP-triggered flow directly, the trigger URL will change between environments. You might want to consider storing that trigger URL in an environment variable and retrieving its value at runtime in your JavaScript (as far as I know there is currently no programmatic way to retrieve the trigger URL directly, so the environment variable is a workaround).
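As a sketch of that workaround: environment variables live in the `environmentvariabledefinition` and `environmentvariablevalue` Dataverse tables, so the value can be read with `Xrm.WebApi`. The schema name `new_ExportFlowUrl` below is a made-up example, and the query shape is my assumption of how this is usually done:

```javascript
// Build the OData query that fetches an environment variable definition
// together with its current value (pure function, shown separately so the
// query shape is easy to see and test).
function buildEnvVarQuery(schemaName) {
    return "?$select=defaultvalue,schemaname" +
        "&$filter=schemaname eq '" + schemaName + "'" +
        "&$expand=environmentvariabledefinition_environmentvariablevalue($select=value)";
}

// Retrieve the variable's current value, falling back to its default value.
// Assumes it runs inside a model-driven app where Xrm.WebApi is available.
function getEnvironmentVariableValue(schemaName) {
    return Xrm.WebApi
        .retrieveMultipleRecords("environmentvariabledefinition", buildEnvVarQuery(schemaName))
        .then(function (result) {
            if (result.entities.length === 0) { return null; }
            var def = result.entities[0];
            var values = def.environmentvariabledefinition_environmentvariablevalue;
            return (values && values.length > 0) ? values[0].value : def.defaultvalue;
        });
}
```

Your button handler could then do something like `getEnvironmentVariableValue("new_ExportFlowUrl").then(url => fetch(url, requestOptions))` instead of hard-coding the URL.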
---
Please click Accept as Solution if my post answered your question. This will help others find solutions to similar questions. If you like my post and/or find it helpful, please consider giving it a Thumbs Up.
Oops, typo on my part: I wrote getFetchXML, but the method is getFetchXml. Note the casing.
function ExportWithLogos(gridContext) {
    // Get the current grid's FetchXml, including any user-applied filters
    var fetchXMLStr = gridContext.getFetchXml();
    console.log(fetchXMLStr);

    // JSON.stringify escapes any quotes inside the FetchXml, so no manual
    // escaping (e.g. a regex replace) is needed here
    var raw = JSON.stringify({ "FetchXmlQuery": fetchXMLStr });

    var myHeaders = new Headers();
    myHeaders.append("Content-Type", "application/json");

    var requestOptions = {
        method: "POST",
        headers: myHeaders,
        body: raw
    };

    // HTTP-triggered Power Automate flow (the URL is environment-specific)
    fetch("https://prod-05.westeurope.logic.azure.com:443/workflows/222c31d7faf044f6922ab3b6c0c1e456/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=NUoPnj247bDSLPRAMvFAraxoWJNaD2No3kQGGuGCa0c", requestOptions)
        .then(response => response.text())
        .then(result => console.log(result))
        .catch(error => console.log("error", error));
}
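For anyone wiring up the flow side: the "When an HTTP request is received" trigger needs a request body JSON schema that matches the payload being posted. For a body of `{"FetchXmlQuery": "..."}`, a minimal schema would be:

```json
{
    "type": "object",
    "properties": {
        "FetchXmlQuery": {
            "type": "string"
        }
    }
}
```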
@ChrisPiasecki Thank you for the high-level solution. I was trying to get the main grid context and the associated FetchXml using the code below, but it doesn't work. I am passing "SelectedControl" from a new custom button on the main grid's command bar.
function ExportWithLogos(gridContext) {
    var grid = gridContext.getGrid();
    var fetchXMLStr = gridContext.getFetchXML();
    console.log(fetchXMLStr);

    var myHeaders = new Headers();
    myHeaders.append("Content-Type", "application/json");

    var requestOptions = {
        method: "POST",
        headers: myHeaders,
        body: fetchXMLStr
    };

    fetch("https://prod-05.westeurope.logic.azure.com:443/workflows/488c31d7faf044f6922ab3b6c0c1e456/triggers/manual/paths/invoke?api-version=2016-06-01", requestOptions)
        .then(response => response.text())
        .then(result => console.log(result))
        .catch(error => console.log("error", error));
}
Below is the error I get:
app.3e7136f1ee0fd09a7329c8ed6b14ab80.js:14 Uncaught (in promise) TypeError: gridContext.getFetchXML is not a function
at ExportWithLogos (tms_ExportWithLogos:3:35)
at y._executeFunctionInternal (app.3e7136f1ee0fd09a7329c8ed6b14ab80.js:14:1194896)
at y.execute (app.3e7136f1ee0fd09a7329c8ed6b14ab80.js:14:1193330)
at 12.7a4539ab78f9b42326a33d062838c305.js:4:40950
at i (app.3e7136f1ee0fd09a7329c8ed6b14ab80.js:14:99256)
at 12.7a4539ab78f9b42326a33d062838c305.js:4:40940
at Array.map (<anonymous>)
at 12.7a4539ab78f9b42326a33d062838c305.js:4:39917
Hi @RameshMukka,
At a high level this should be possible. Below is what would be required (note this requires code):

1. Add a custom button to the grid's command bar and pass it the grid context (e.g. SelectedControl).
2. In the button's JavaScript handler, retrieve the grid's FetchXml, which captures the current view and any filters the user has applied.
3. Post that FetchXml to an HTTP-triggered Power Automate flow.
4. In the flow, execute the FetchXml against Dataverse, retrieve each row's image column content, and generate the Excel file with the images embedded.
I would not recommend doing this against a large dataset due to the volume of API calls required. Limit it to 50-100 rows if possible. If you need a larger dataset, I would move the processing to an Azure Function or an Azure Data Factory pipeline.
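On that row limit: one lightweight guard (a hypothetical helper, not something from this thread) is to cap the row count before the flow executes the query, by setting the `count` attribute that FetchXml supports on its root `<fetch>` element:

```javascript
// Cap the number of rows a FetchXml query can return by setting the
// "count" attribute on the root <fetch> element (replacing an existing
// count if present, otherwise adding one).
function capFetchXmlRows(fetchXml, maxRows) {
    if (/\bcount="\d+"/.test(fetchXml)) {
        return fetchXml.replace(/\bcount="\d+"/, 'count="' + maxRows + '"');
    }
    return fetchXml.replace(/<fetch\b/, '<fetch count="' + maxRows + '"');
}
```

This could run either in the button's JavaScript before posting, or in the flow itself before the Dataverse query.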