Is it possible to execute a BULK INSERT query in SQL Server using an in-memory buffer from Python (e.g. df.to_csv(buffer), where df is a pandas DataFrame and buffer is an in-memory buffer) instead of a file path:

bulk insert #temp
from 'file_path'    -- instead of a file path, use a buffer?
with (
    fieldterminator = ',',
    rowterminator = '\n'
);

The only way I can think of doing this is to have Python write a CSV file locally and then have SQL Server execute a BULK INSERT via the file path. However, is there a way to do this without storing the file locally, passing the data directly between Python and SQL Server?
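
For reference, a minimal sketch of that file-based workaround, assuming pyodbc; the connection string, path, and column layout are placeholders, and the path must be readable by the SQL Server service account (not just by the Python process):

import pandas as pd
import pyodbc

df = pd.DataFrame({"id": [1, 2, 3], "name": ["a", "b", "c"]})

# Write the DataFrame to a CSV at a location the SQL Server instance can read
# (a hypothetical UNC share here; BULK INSERT reads the file on the server, not the client).
csv_path = r"\\fileshare\staging\temp_data.csv"
df.to_csv(csv_path, index=False, header=False)

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.execute("create table #temp (id int, name varchar(50));")
cursor.execute(
    "bulk insert #temp "
    f"from '{csv_path}' "
    "with (fieldterminator = ',', rowterminator = '\\n');"
)
conn.commit()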

  • No, it has to be a file: learn.microsoft.com/en-us/sql/t-sql/statements/… Commented Jun 22, 2023 at 21:11
  • Send it as JSON (see the sketch after these comments), e.g. stackoverflow.com/questions/60745932/… Commented Jun 22, 2023 at 21:14
  • @DavidBrowne-Microsoft I think we will get the data faster by saving the CSV and running a bulk insert. Commented Jun 22, 2023 at 22:00
  • At some size, probably. But there's not a simple pure-Python way to do that. BULK INSERT requires the file to be visible to the SQL Server, so you would need to use BCP or similar. Commented Jun 22, 2023 at 23:39
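
As a rough illustration of the JSON route mentioned in the comments (a sketch only; the table and column names are made up, and OPENJSON requires database compatibility level 130 or higher), the DataFrame can be serialized in memory and shredded server-side without any file:

import pandas as pd
import pyodbc

df = pd.DataFrame({"id": [1, 2, 3], "name": ["a", "b", "c"]})
payload = df.to_json(orient="records")   # '[{"id":1,"name":"a"}, ...]'

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.execute("create table #temp (id int, name varchar(50));")
# Pass the JSON string as a parameter and let OPENJSON turn it into rows.
cursor.execute(
    """
    insert into #temp (id, name)
    select id, name
    from openjson(?) with (id int '$.id', name varchar(50) '$.name');
    """,
    payload,
)
conn.commit()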
