
Hello,
I have a Dataverse dataflow that takes data from a CSV file and saves it into a Dataverse table. Every month I replace the CSV file with updated data; the format is always the same and the file size is similar, about 1.5 GB. The dataflow worked fine for 5 months, but a few weeks ago the refresh started failing with the error "Evaluation ran out of memory and can't continue". I tried splitting the CSV into 4 parts to reduce its size and running the dataflow with each smaller file, and that worked, but only once. Now it is failing again, even though the CSV file is only 380 MB and the dataflow has not been changed. I don't understand why this is happening.
Any suggestions? Could it be that the tenant's available memory is shrinking?
Hello, @Julen_Sanzol, check your Environment Database capacity.
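Besides capacity, it can also help to keep the CSV source query as lean as possible so the evaluation can stream the file instead of holding it all in memory. Here is a minimal Power Query M sketch of that pattern; the file path and column names are placeholders, not your actual ones:

```
let
    // Placeholder path: use whatever source your dataflow already points at
    Source = Csv.Document(
        File.Contents("C:\Data\monthly_export.csv"),
        [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.Csv]
    ),
    PromotedHeaders = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Declare column types explicitly (example column names only),
    // so the engine skips the automatic type-detection pass over the file
    ChangedTypes = Table.TransformColumnTypes(
        PromotedHeaders,
        {{"Id", Int64.Type}, {"Amount", type number}, {"Date", type date}}
    )
    // Avoid Table.Buffer anywhere in the query: it forces the whole file into memory at once
in
    ChangedTypes
```

The main points are the explicit column types and the absence of Table.Buffer; buffering or heavy transformations on a file this size can push the evaluation over its memory limit.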