I'm trying to update an SQL table by way of an inner join between a pandas dataframe, which is calculated locally on my server, and a SQL table on a remote server, using pyodbc, but I can't seem to match the keys between the dataframe and the table.
My first approach was to create a simple query that updates the 3 columns I need, using an inner join between the key column in the dataframe and the key column in the SQL table. But alas, it didn't work, as I was greeted with a TypeError (shown below).
The query I used with pyodbc was:
cursor.execute(
    'UPDATE table1 SET table1.col1 = ' + df[col1] + ', ' +
    'table1.col2 = ' + df[col2] + ', ' +
    'table1.col3 = ' + df[col3] +
    ' FROM table1 ' +
    ' inner join ' + df[key_col] + ' on ' + df[key_col] + ' = table1.key_col'
)
Which returns the error:
TypeError: The first argument to execute must be a string or unicode query.
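I suspect this is because adding string literals to df[col1] broadcasts over the whole column, so what actually gets passed to execute is a pandas Series rather than a single string. A quick check (assuming the columns hold strings) seems to confirm it:

fragment = 'UPDATE table1 SET table1.col1 = ' + df[col1]
print(type(fragment))  # <class 'pandas.core.series.Series'>, not str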
My second approach was to use a loop and iterate over each row of the dataframe, running one update per row matched on the key column:
UPDATE table1
SET table1.col1 = df[col1],
    table1.col2 = df[col2],
    table1.col3 = df[col3]
FROM table1
WHERE table1.key_col = df[key_col]
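In Python the loop is roughly the following (col1, col2, col3 and key_col are the same column-name variables as above, and conn_str stands for my actual connection string):

import pyodbc

conn = pyodbc.connect(conn_str)   # conn_str: my actual connection string
cursor = conn.cursor()

sql = ('UPDATE table1 '
       'SET col1 = ?, col2 = ?, col3 = ? '
       'WHERE key_col = ?')

# one UPDATE round trip per dataframe row
for values in df[[col1, col2, col3, key_col]].values.tolist():
    cursor.execute(sql, values)

conn.commit()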
But alas, it takes up to an hour to match all the rows due to the size of the dataframe.
My expected result was that the three columns in table1 would be updated, but nothing actually gets updated.
My current workaround is to create a new table in SQL with the columns and key I need, and then run a second query that inner joins the two SQL tables (sketched below), but that is only a temporary solution.
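Roughly, the workaround looks like this (staging_table stands in for the extra table, which already exists on the server with the same key and columns; conn_str is my connection string):

import pyodbc

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.fast_executemany = True    # speeds up the bulk insert on recent pyodbc versions

# load the locally computed dataframe into the staging table in one batch
cursor.executemany(
    'INSERT INTO staging_table (key_col, col1, col2, col3) VALUES (?, ?, ?, ?)',
    df[[key_col, col1, col2, col3]].values.tolist()
)

# then update table1 from the staging table in a single set-based statement
cursor.execute('''
    UPDATE t
    SET t.col1 = s.col1,
        t.col2 = s.col2,
        t.col3 = s.col3
    FROM table1 AS t
    INNER JOIN staging_table AS s
        ON s.key_col = t.key_col
''')
conn.commit()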
Can this be done with pyodbc? I've looked in the documentation and couldn't find anything helpful.