So, as an overview of what I am trying to do:

  1. User uploads a CSV file to Power Apps
  2. The Power App sends that file to a Power Automate Flow
  3. The Flow takes the file contents and sends them to a SQL stored procedure as an NVARCHAR(MAX) param
  4. ???

Power Automate Flow Screenshot

So I have all of that working, and the contents of the CSV come through fine as a complete text string parameter in the SQL stored procedure, with the proper newline formatting and everything from the file.
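For what it's worth, once the string is in the procedure it can be split with the built-in `STRING_SPLIT`. This is only a sketch: `@FileContents` stands in for the proc's parameter, the two-column layout is made up, and the third (ordinal) argument to `STRING_SPLIT` needs Azure SQL or SQL Server 2022 — without it, row/column order is not guaranteed.

```sql
-- Sketch: parse the NVARCHAR(MAX) CSV directly in T-SQL.
-- Strip CR so CRLF line endings split cleanly on LF.
DECLARE @csv NVARCHAR(MAX) = REPLACE(@FileContents, CHAR(13), '');

WITH Lines AS (
    SELECT value AS Line, ordinal AS LineNo
    FROM STRING_SPLIT(@csv, CHAR(10), 1)   -- 1 = enable_ordinal
    WHERE ordinal > 1                      -- skip the header row
      AND LTRIM(RTRIM(value)) <> ''
)
SELECT l.LineNo,
       MAX(CASE WHEN f.ordinal = 1 THEN f.value END) AS Col1,
       MAX(CASE WHEN f.ordinal = 2 THEN f.value END) AS Col2
FROM Lines AS l
CROSS APPLY STRING_SPLIT(l.Line, ',', 1) AS f
GROUP BY l.LineNo;
```

Note the naive comma split breaks on quoted fields that contain commas, so this only works for simple CSVs.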

Awkwardly, MS SQL has the BULK INSERT tool, which would work great if I had the file saved somewhere, but to my knowledge it has no way to process the data if it's just a string. I have seen answers where people set up five or so SQL functions and then run a proc that uses them to parse the string, but I'd really prefer a more official/built-in method if possible.
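One fully built-in route that avoids helper functions: if the Flow can reshape the CSV rows into a JSON array first (e.g. with a Select action), the procedure can parse it with `OPENJSON`. A sketch, with the table and column names as placeholders:

```sql
-- Sketch: the Flow passes a JSON array like
-- [{"Col1":"a","Col2":"b","Col3":1}, ...] instead of raw CSV.
CREATE OR ALTER PROCEDURE dbo.ImportCsvRows
    @json NVARCHAR(MAX)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.MyImportTable (Col1, Col2, Col3)
    SELECT Col1, Col2, Col3
    FROM OPENJSON(@json)
         WITH (
             Col1 NVARCHAR(100) '$.Col1',
             Col2 NVARCHAR(100) '$.Col2',
             Col3 INT           '$.Col3'
         );
END
```

`OPENJSON` is available on SQL Server 2016+ (compatibility level 130) and Azure SQL, so the same proc should run in both environments.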

I was thinking maybe I could take the string and, inside the stored procedure, save it to a temporary CSV file, then bulk import from that? My concern is versatility: I want this to work regardless of whether it's a local SQL Server or an Azure SQL server, and I'm not sure you can save a file locally on Azure SQL. As for saving it somewhere beforehand, like SharePoint or blob storage, I'm not sure BULK INSERT would have a way to authenticate properly, since I don't want the files to be publicly accessible.
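On the authentication concern: if the Flow dropped the file into a private blob container, BULK INSERT can read it via an external data source backed by a SAS credential, so the container never has to be public. A sketch, assuming the account, container, and table names are placeholders and that a database master key already exists:

```sql
-- Sketch: BULK INSERT from a *private* Azure blob container.
-- Requires SQL Server 2017+ or Azure SQL Database, and a
-- database master key (CREATE MASTER KEY ...) created beforehand.
CREATE DATABASE SCOPED CREDENTIAL BlobSasCred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = 'sv=...';  -- SAS token, without the leading '?'

CREATE EXTERNAL DATA SOURCE MyBlobStorage
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://myaccount.blob.core.windows.net/imports',
      CREDENTIAL = BlobSasCred);

BULK INSERT dbo.MyImportTable
FROM 'upload.csv'
WITH (DATA_SOURCE = 'MyBlobStorage',
      FORMAT = 'CSV',
      FIRSTROW = 2);
```

The trade-off is that this path does require blob storage as an extra moving piece, and `TYPE = BLOB_STORAGE` isn't available on older on-prem versions.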

Alternatively, I'm fine with processing the file in either the Power App or the Power Automate Flow stage, I'm just not sure that I can. Flows have some CSV actions, but they're all third-party and seem to require subscriptions and whatnot.

Maybe I could make an Azure Function to do it? I'd like to have as few moving pieces as possible (and fewer things to configure), but if that would be a more reliable way to read the file, and I'd need blob storage otherwise anyway, then maybe it should be an option.

Another idea I had, though I have no idea if it's realistic: maybe I could save the file contents to an environment variable in Power Platform, which I could then use with a Power Query action or something?

Thank you for your time!

  • Why use Power Automate for something that can already be done infinitely better with BULK INSERT, the bcp command-line tool, or SSIS? Why use a stored procedure at all? If you want a GUI-based tool, that's Azure Data Factory, not Power Automate. Power Automate is completely the wrong tool for the job. Commented Mar 20, 2024 at 14:32
  • Because it is an import that is triggered from a Power App and needs to run dynamically from there as part of end-user functionality. By all means, if there is a good way to do it directly from Power Apps, I'm all for that. Commented Mar 20, 2024 at 14:42
  • I am experiencing the same issue. I can't believe there isn't a straightforward way to do this and that Power Automate doesn't contain any CSV support OOB. The way I see it, we have two options: 1. Convert the CSV data into JSON and use a stored procedure to insert or update the data in a table using MERGE. 2. Forward the content to an Azure Function app in the first place (if possible). The CSV data will be much easier to manipulate and handle with PowerShell, Python, C#, etc. Commented Oct 8, 2024 at 20:22
