
Hi
I have a JSON file containing about 300 URLs, and there is data to be extracted from each URL. The webpages all have the same layout (same domain, different articles). Is there a way to extract the data for every one of the JSON objects, i.e. run the extraction process 300 times and then write all the data to an Excel file? If this is possible, how can it be done? I'm pretty new to the tool, so please be patient. Thank you for your time.
Hi @BigMamba
You can use Desktop flows to read the JSON, visit each of the 300 URLs, and extract the required information from each webpage.
Power Automate Desktop can do screen scraping and capture information from the screen, like an RPA tool.
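If you ever want to script the same loop outside Power Automate, here is a minimal sketch in Python using only the standard library. The JSON shape (a list of objects with a `"url"` key) and the extracted field (the page `<title>`) are assumptions; you would adapt the parser to the actual page layout. It writes a CSV, which Excel opens directly (a library like openpyxl would be needed for a native .xlsx file):

```python
import csv
import json
import urllib.request
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the page's <title> tag.
    Stand-in for whatever fields the real articles expose."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def extract_title(html: str) -> str:
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()

def scrape_all(json_path: str, out_csv: str) -> None:
    # Assumes the JSON file looks like: [{"url": "https://..."}, ...]
    with open(json_path, encoding="utf-8") as f:
        items = json.load(f)

    rows = []
    for item in items:  # one extraction pass per URL (~300 times)
        with urllib.request.urlopen(item["url"]) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        rows.append({"url": item["url"], "title": extract_title(html)})

    # Write everything out once at the end; Excel opens CSV files directly.
    with open(out_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["url", "title"])
        writer.writeheader()
        writer.writerows(rows)
```

The same pattern maps one-to-one onto a Desktop flow: a loop action over the parsed JSON list, an extract-data action per page, and a single write-to-Excel action at the end.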
Just for your reference, the link to the Desktop flows forum, where you can get more help on this, is here.