I think that you're right, @asjones987: there's a lot about Power Automate (PAuto), specifically, as a product that is concerning. That goes for users who rely on it personally in their daily work, for power users and developers, and for people who offer it as part of their low/no-code development on a consultancy basis.
Before I go on, well done for sticking around ... don't be put off by anyone saying that you're in the wrong place, or asking the wrong question. This is not Stack Exchange or the like; the worst you can normally expect here is no responses! :sweat_smile: ;-)
Anyway ... yes ... A chief issue for me is that there is zero open structure to the handling of the system, and that does not mirror the *excellent* handling of its parent product, Logic Apps (LA), on Azure ... which PAuto is basically a version of.
I mean, as a flippant humour-intended remark, there is not even a dark mode.
As a more pertinent, serious remark ... they pushed this new UI on everyone LONG before it was PROD ready ... and ... there are still potentially back-breaking issues there.
Why do I say potentially? Well, it all comes back to how LA is handled (at least the GUI) ... and the openness of that. If you have problems with the LA GUI, there is a GitHub repository (not linking here, to prevent it getting a LOT of unneeded PAuto traffic, more on that in a second!) where you can scan the issues to see whether others are hitting the same problem.
With PAuto, we can only hope that MS has noticed an issue and, if they have, included it in the message center ... which you are lucky if you can even see. Plus, if you want to contact them about it? Better hope you're paying (which is fine, I'm not complaining there ... but ... ) or are the admin at your company, because if not ... it's going to be weeks before you even get a pointless question in response (that's not a judgement, just how support works) that you might miss ... and ... ugh ... it's not good.
The split between the two products is made worse because the PAuto GUI developers forked their GUI off from the LA development ... so ... even if we find a fix for an issue we have in PAuto over on LA's GitHub ... we have ZERO clue what has been done to it on the PAuto side. That is horrible, IMO.
---
One thing that I will say about the other communities (which run on better, non-power-platform-based forum software) is that those MS professionals you're seeing often tend to be very ... swift ... short (not rude) ... and ... quite often nonsensical in their responses. I'd put them as the equivalent of the chancers on here who just quickly post links to their blogs / money makers and leave nothing in the forum. Not necessarily wrong, or bad, as people or contributors ... but I'd hardly class them as useful as a proper response.
---
Anyway ... thought I'd just leave some general chatter (not answers) about the two parts of your post that I think you should spin out into separate question threads, those being file transfer and CSVs:
- Power Automate cannot move files larger than 20-40 MB through the “On-premises data gateway” (the exact limit depends on what you are doing). This is way too small these days. We know the Gateway itself supports large file transfers for Power BI and MS Fabric Dataflows Gen2, so the limitation seems to be somewhere else.
- Power Automate should have a cleaner process for reading and writing CSV files of various types. CSV is a common format for file transfers, and handling it in Power Automate is limited and challenging.
Before I do, though, if you do raise them as questions ( ... and PLEASE do 🙏🙂 ) just make sure that you paste pictorial examples of what you've done / tried thus far (obfuscated where necessary), and (where possible) include sample data and/or a paste of any non-sensitive actions in a code block, too. This really helps folks assist as quickly as possible without guessing. :)
Oh, and ... probably best that we do not dwell on these topics ... as they're DEFINITELY good question fodder ... but ... I'm hoping that the below gives you a nice start, at least. :-)
File Transfers Chat
With file transfers you may wish to look into 'chunking' and suchlike; there are quite a few ways to handle large files (or large numbers of files), but they're not *all* straightforward. Also, potentially look at using adjacent services that you already pay for that CAN take the files. For example, I may be misremembering, but there's a OneDrive or Google Drive action which ostensibly allows you to give it a link, and it will then go off and download the file directly. It ... just takes a bit of doing. Once you've got the file onto one of the major services, you can work from there.
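Just to make the 'chunking' idea concrete (a hedged sketch in ordinary Python, nothing to do with any specific PAuto action; the chunk size and the in-memory "transfer" are purely illustrative): the core trick is to split a payload that is too big to move in one go into fixed-size pieces, move each piece, and reassemble on the other side.

```python
# Minimal sketch of chunking: split a large payload into fixed-size
# pieces, "transfer" each piece, then reassemble on the other side.
# CHUNK_SIZE is a made-up figure for illustration only.

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB per chunk (hypothetical limit)

def split_into_chunks(data: bytes, size: int = CHUNK_SIZE):
    """Yield successive fixed-size chunks of the payload."""
    for offset in range(0, len(data), size):
        yield data[offset:offset + size]

def reassemble(chunks) -> bytes:
    """Join the received chunks back into the original payload."""
    return b"".join(chunks)

payload = b"x" * (10 * 1024 * 1024)   # pretend this is a 10 MB file
chunks = list(split_into_chunks(payload))
assert reassemble(chunks) == payload  # round-trip is lossless
```

Real services usually do the per-piece transfer with HTTP Range / resumable-upload requests, but the split-move-reassemble shape is the same.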
That is, of course ... keeping things as 'free' as possible ... there are more expensive solutions out there, including LA and Azure Data Factory, to name but two.
CSV Chat
On CSVs, this is a topic that has been covered ... a LOT ... on this forum. Unfortunately, some of the posts may not be accessible because their links don't work (they were transferred from the old forum) or ... well ... that search tool isn't great. ;-)
However, I feel you on that one, specifically. I literally developed my own CSV Function App (using open-source JavaScript, Python, or PHP code/libraries, I think, which I baked INTO the Function App to ensure ongoing functionality), and I am not a coder. I then put that in place with an LA for a client, and it did the CSV work ... in either direction. The data-operations CSV action is OK for making tables, but ... it's not 100% ... and (obviously) only goes one way.
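For what it's worth, the kind of CSV handling that is fiddly in PAuto is only a few lines in ordinary code, which is why a small Function App works so well as a wrapper. A hedged Python sketch (stdlib only; the column names are made up), showing both directions, i.e. parse CSV text into rows and write rows back out:

```python
import csv
import io

def read_csv(text: str) -> list[dict]:
    """Parse CSV text into one dict per row, correctly handling
    quoted fields and embedded commas (the cases that break naive
    string splits)."""
    return list(csv.DictReader(io.StringIO(text)))

def write_csv(rows: list[dict]) -> str:
    """The reverse direction: serialise dicts back to CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

sample = 'name,city\n"Smith, Jane",London\nBob,Paris\n'
rows = read_csv(sample)
assert rows[0]["name"] == "Smith, Jane"   # embedded comma survives
assert read_csv(write_csv(rows)) == rows  # lossless round trip
```

Wrapped in an HTTP-triggered Function App, something like this is what an LA (or PAuto) flow can call to do the CSV work in either direction.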
There are paid services, too.
A third option is putting the data in an Excel sheet, then using Excel actions to run a script or add-in that produces / manages your CSV data.
---
Either way ... again, hope that you raise some additional questions, and ... great thread!!