I'm using pandas' read_sql() function to read multiple SQL tables into DataFrames. read_sql() only parses datetimes for columns explicitly listed in its parse_dates parameter; it does not infer datetimes automatically from varchar columns on the server. As a result, I get DataFrames in which every column has dtype object:
  col1                          col2
0    A  2017-02-04 10:41:00.0000000
1    B  2017-02-04 10:41:00.0000000
2    C  2017-02-04 10:41:00.0000000
3    D  2017-02-04 10:41:00.0000000
4    E  2017-02-03 06:13:00.0000000
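For context, the explicit route described above looks roughly like this; the in-memory SQLite table, the table name, and the column list are placeholders standing in for the real server:

```python
import sqlite3
import pandas as pd

# In-memory SQLite table standing in for the real server (placeholder).
conn = sqlite3.connect(":memory:")
pd.DataFrame({
    "col1": ["A", "B"],
    "col2": ["2017-02-04 10:41:00.0000000", "2017-02-04 10:41:00.0000000"],
}).to_sql("my_table", conn, index=False)

# parse_dates requires naming every datetime column up front.
df = pd.read_sql("SELECT * FROM my_table", conn, parse_dates=["col2"])
print(df.dtypes)  # col1 stays object, col2 becomes datetime64[ns]
```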
Is there a built-in Pandas function to automatically infer columns which should be datetime64[ns] WITHOUT having to specify the column names?
I've tried:
df.apply(pd.to_datetime(df, infer_datetime_format=True), axis=1)
which results in an error:
to assemble mappings requires at least that [year, month, day] be specified: [day,month,year] is missing
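(The error arises because pd.to_datetime applied to a whole DataFrame does not convert its columns; it tries to assemble one datetime per row from columns named year, month, day, and optionally hour/minute/second. A minimal illustration:)

```python
import pandas as pd

# Passing an entire DataFrame to pd.to_datetime asks it to assemble one
# datetime per row from columns named year/month/day -- hence the error
# complaining that those columns are missing.
parts = pd.DataFrame({"year": [2017], "month": [2], "day": [4]})
print(pd.to_datetime(parts))  # a Series of datetime64[ns]
```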
I also tried:
pd.to_datetime(df.stack(), errors='ignore', format='%Y%m%d% H%M%S%f').unstack()
and
pd.to_datetime(df.stack(), errors='coerce', format='%Y%m%d% H%M%S%f').unstack()
Neither works here: the format string ('%Y%m%d% H%M%S%f') does not match the data (it is missing the hyphens, colons, and decimal point), so errors='ignore' returns the values unchanged, and even with a matching format, coercing the stacked frame would turn the non-date values in col1 into NaT.
Any suggestions about how to infer datetime columns automatically after the DataFrame is constructed?
That said, I would consider fixing the data types on the DB side...
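One possible sketch, assuming a "mostly parseable" heuristic (the function name and the 0.9 threshold are arbitrary choices, not a pandas built-in): try pd.to_datetime on each object column with errors='coerce', and keep the conversion only when enough values parse.

```python
import pandas as pd

def infer_datetime_columns(df, threshold=0.9):
    """Convert object columns to datetime64[ns] when most values parse.

    threshold is the minimum fraction of values that must parse as
    dates for the column to be converted (an arbitrary cutoff).
    """
    out = df.copy()
    for col in out.columns[out.dtypes == object]:
        parsed = pd.to_datetime(out[col], errors="coerce")
        # Keep the conversion only if enough values parsed successfully,
        # so columns like col1 below are left untouched.
        if parsed.notna().mean() >= threshold:
            out[col] = parsed
    return out

df = pd.DataFrame({
    "col1": list("ABCDE"),
    "col2": ["2017-02-04 10:41:00.0000000"] * 4
            + ["2017-02-03 06:13:00.0000000"],
})
df = infer_datetime_columns(df)
print(df.dtypes)  # col1: object, col2: datetime64[ns]
```

Note the trade-off: a column of strings that merely *look* like dates (e.g. product codes such as "2017-01") would also be converted, which is why tightening the types at the database level is the more robust fix.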