
I have a CSV file in the following format:

data, data, "timestamp", data, data, data, data, data

I need to remove the double quotes from around the timestamp data, then insert it into the table as a DATETIME data type.

After researching format files, I have come up with this:

10.0
8
1   SQLCHAR 0   12  ","     1   Data        SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR 0   12  ","     2   Data        SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR 0   26  "",""   3   Timestamp   SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR 0   41  ","     4   Data        SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR 0   41  ","     5   Data        SQL_Latin1_General_CP1_CI_AS
6   SQLCHAR 0   41  ","     6   Data        SQL_Latin1_General_CP1_CI_AS
7   SQLCHAR 0   5   ","     7   Data        SQL_Latin1_General_CP1_CI_AS
8   SQLCHAR 0   12  "0x0a"  6   Data        SQL_Latin1_General_CP1_CI_AS

where the 3rd row, Timestamp, is the item with the double quotes around it.
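For orientation, each line of a non-XML format file has this column layout (annotations added here for illustration only; real format files do not allow inline comments):

```
10.0    <- format file version (10.0 = SQL Server 2008)
8       <- number of fields in the data file
field#  host-type  prefix-len  max-len  terminator  dest-col#  dest-col-name  collation
1       SQLCHAR    0           12       ","         1          Data           SQL_Latin1_General_CP1_CI_AS
```

The sixth column (dest-col#) is the ordinal of the destination table column each field maps to, which matters for the error below.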

Attempting to use this file in a Bulk Insert results in the error message

Msg 4823, Level 16, State 1, Line 2 Cannot bulk load. Invalid column number in the format file.

Is there a way I can alter the format file to do what I need? I'm using MSSQL.

2 Answers


Well, your invalid-column-number error is probably caused by the destination column number 6 being repeated on the last line, where it should be column number 8.

But to remove the "" you need to use \" in your delimiters for columns 2 and 3, like so...

SQLCHAR 0   12  ","     1   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   12  ",\""   2   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   26  "\","   3   Timestamp   SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   41  ","     4   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   41  ","     5   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   41  ","     6   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   5   ","     7   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   12  "\r\n"  8   Data        SQL_Latin1_General_CP1_CI_AS

-- note: use \r\n for row terminator for an Excel file saved as CSV

so the delimiter for column 2 is actually ," and the delimiter for column 3 (the timestamp) is ", - the quotes get removed from the data because they are part of the delimiters.
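A usage sketch of the bulk load itself, assuming the corrected format file is saved alongside the data (table name and file paths here are placeholders, not from the question):

```sql
-- Hypothetical table and paths: substitute your own.
BULK INSERT dbo.MyTable
FROM 'C:\data\input.csv'
WITH (
    FORMATFILE = 'C:\data\input.fmt'  -- the 8-field format file above
);
```

With the timestamp column in the destination table declared as DATETIME, SQL Server converts the now-unquoted string during the load, provided the string is in a format it can implicitly parse.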

Note: if you have column titles in your first row this will not work correctly i.e. if your first row contains column headers like...

Field1Name,Field2Name,Timestamp,Field3Name ...

then the delimiters above will not work for that row, because there are no quotes around the column header Timestamp. The result is that the first row gets correct data in columns 1 and 2, but column 3 has no valid delimiter on row one (there is no ",), so it swallows the rest of the column headers plus the first three fields of row 2 until it finally finds a valid delimiter (",) at the end of column 3 on row 2. The rest of row 2 then spills into the following columns. It's a mess, and you can't get around it by using

FIRSTROW = 2

You have to either remove the header row, or put quotes around your column 3 title -

Field1Name,Field2Name,"Timestamp",Field3Name ...

or remove the quotes via your SQL once you have finished the bulk insert.
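For that last option, a cleanup sketch, assuming the timestamp was first loaded into a VARCHAR column named Timestamp in a hypothetical dbo.Staging table (names and the datetime style are assumptions, not from the question):

```sql
-- Hypothetical staging table/column names.
UPDATE dbo.Staging
SET [Timestamp] = REPLACE([Timestamp], '"', '')
WHERE [Timestamp] LIKE '"%"';

-- Then convert when copying into the real DATETIME column,
-- e.g. style 120 assumes 'yyyy-mm-dd hh:mi:ss' strings:
-- INSERT INTO dbo.Final (EventTime, ...)
-- SELECT CONVERT(DATETIME, [Timestamp], 120), ...
-- FROM dbo.Staging;
```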




would this work:

SQLCHAR 0   12  ","     1   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   12  ","     2   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   26  '","'   3   Timestamp   SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   41  ","     4   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   41  ","     5   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   41  ","     6   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   5   ","     7   Data        SQL_Latin1_General_CP1_CI_AS
SQLCHAR 0   12  "0x0a"  6   Data        SQL_Latin1_General_CP1_CI_AS

1 Comment

No, I've tried that variation. I get "Cannot bulk load because the file could not be read. Operating system error code (null)."
