This is basically a duplicate of this with s/mysql/postgresql/g.
I created a table with a TIMESTAMP column named timestamp, and I am trying to import data from CSV files whose rows carry Unix timestamps.
However, when I try to COPY the file into the table, I get errors to the tune of
2:1: conversion failed: "1394755260" to timestamp
3:1: conversion failed: "1394755320" to timestamp
4:1: conversion failed: "1394755800" to timestamp
5:1: conversion failed: "1394755920" to timestamp
Obviously this works if I set the column to be INT.
In the MySQL variant, I solved this with a trick like
LOAD DATA LOCAL INFILE 'file.csv'
INTO TABLE raw_data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@timestamp, other_column)
SET timestamp = FROM_UNIXTIME(@timestamp),
third_column = 'SomeSpecialValue'
;
Note two things: I can map the @timestamp variable from the CSV file using a function to turn it into a proper DATETIME, and I can set extra columns to certain values (this is necessary because I have more columns in the database than in the CSV).
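In PostgreSQL, COPY itself cannot apply a function to a field or fill in extra columns during the load, but the same two tricks can be reproduced by copying into a staging table and transforming on insert. A rough sketch, with table and column names assumed from the MySQL example above:

```sql
-- Staging table mirrors the CSV layout; the timestamp arrives as a raw epoch.
CREATE TEMP TABLE raw_data_staging (
    epoch        bigint,
    other_column text
);

-- HEADER true skips the first line, like IGNORE 1 LINES in MySQL.
COPY raw_data_staging FROM '/path/to/file.csv' WITH (FORMAT csv, HEADER true);

-- Transform while moving into the real table: to_timestamp() plays the role
-- of FROM_UNIXTIME(), and extra columns get constant values here.
INSERT INTO raw_data ("timestamp", other_column, third_column)
SELECT to_timestamp(epoch), other_column, 'SomeSpecialValue'
FROM raw_data_staging;
```

Note that to_timestamp() returns timestamptz; if the target column is a plain timestamp, add a cast such as to_timestamp(epoch)::timestamp.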
I'm switching to postgresql because mysql lacks some functions that make my life so much easier with the queries I need to write.
Is there a way of configuring the table so that the conversion happens automatically?
Create the column as bigint, import the data, then convert the column to a proper timestamp.
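Assuming the column was created as bigint and the epochs are already loaded, the in-place conversion can be done with ALTER TABLE ... USING (column name assumed from the question):

```sql
-- Convert the bigint epoch column into a timestamp column in place.
-- to_timestamp() returns timestamptz; the cast keeps a plain timestamp.
ALTER TABLE raw_data
    ALTER COLUMN "timestamp" TYPE timestamp
    USING to_timestamp("timestamp")::timestamp;
```

This rewrites the whole table once, which is usually fine for a bulk import; for very large tables the staging-table approach avoids the second rewrite.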