33

I am trying to load my database with a large amount of data from a 1.4 GB .csv file, but when I run my code I get errors.

Here's my code:

USE [Intradata NYSE] 
GO
CREATE TABLE CSVTest1
(Ticker varchar(10) NULL,
dateval date NULL,
timevale time(0) NULL,
Openval varchar(10) NULL,
Highval varchar(10) NULL,
Lowval varchar(10) NULL,
Closeval varchar(10) NULL,
Volume varchar(10) NULL
)
GO

BULK
INSERT CSVTest1
FROM 'c:\intramerge.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
--Check the content of the table.
SELECT *
FROM CSVTest1
GO
--Drop the table to clean up database.
DROP TABLE CSVTest1
GO

I am trying to build a database with lots of stock quotes, but I get this error message:

Msg 4832, Level 16, State 1, Line 2
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 2
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 2
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

I do not understand much SQL, but I hope to pick up a thing or two. I hope someone can spot what might be very obvious.

21 Answers

26

Resurrecting an old question, but in case this helps someone else: after much trial-and-error I was finally (finally!) able to get rid of this error by changing this:

ROWTERMINATOR = '\n'

To this:

ROWTERMINATOR = '0x0A'
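
For context, a minimal sketch of the full statement with that single change, using the same table and file path as the question:

BULK INSERT CSVTest1
FROM 'c:\intramerge.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0A'  -- hex for the line feed character, instead of '\n'
)
GO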

3 Comments

This worked for me. I also recommend running the more command in CMD to inspect the end of larger files for footers or other issues. If you run more +n [filename], CMD shows all lines after line n; use this to inspect near the end of the file.
That's what I have, but it doesn't work.
The 0x0A fix has worked for me in the past. But not this time. Still upvoted the answer because it does work.
16

I had the same issue.

Solution:

Verify the CSV or text file in a text editor like Notepad++. The last line might be incomplete. Remove it.

Comments

8

I got the same error when I had a different number of delimited fields in my CSV than columns I had in my table. Check if you have the right number of fields in intramerge.csv.

Methods to determine rows with issues:

  1. Open the CSV in a spreadsheet, add a filter to all the data, and look for empty values; rows with fewer columns than expected will stand out.

  2. Use https://csvlint.com to create your validation rules; it can detect the problems in your CSV as well.


Comments

8

This is my solution: just give up.

I always end up using SSMS and [ Tasks > Import Data ].

I have never managed to get a real world .csv file to import using this method. This is an utterly useless function that only works on pristine datasets that don't exist in the real world. Perhaps I've never had any luck because the datasets I deal with are quite messy and are generated by third parties.

And if it goes wrong, it doesn't give any clue as to why. Microsoft, you sadden me with your utter incompetence in this area.

Microsoft, perhaps add some error messages, so it says why it rejected it? Which line did it fail on? Which column did it fail on? It's almost impossible to fix the issue if the reason it failed is concealed!

3 Comments

This should be the accepted answer.
I'm in this boat; a whole day wasted. Why is this so hard?
It's beneficial to use BULK INSERT with the MAXERRORS and ERRORFILE parameters, which tell you whether the error is due to corruption in the file.
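
A minimal sketch of that approach, using the table and file path from the question (the ERRORFILE path is just an example):

BULK INSERT CSVTest1
FROM 'c:\intramerge.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0A',
    MAXERRORS = 10,                         -- tolerate up to 10 bad rows before the load fails
    ERRORFILE = 'c:\intramerge_errors.log'  -- rejected rows are written here for inspection
)
GO
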
4

This is an old question, but it seems my finding might enlighten some other people having a similar issue.

The default SSIS timeout value appears to be 30 seconds. Any service-bound or IO-bound operation in your package that runs well beyond that timeout value causes a timeout. Increasing that timeout value (change it to "0" for no timeout) will resolve the issue.

Comments

3

I got this error when my format file (i.e. specified using the FORMATFILE param) had a column width that was smaller than the actual column size (e.g. varchar(50) instead of varchar(100)).
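
For reference, a rough sketch of how the format file is specified; the .fmt path here is hypothetical, and the point is that the widths declared inside it must be at least as wide as the actual data:

BULK INSERT CSVTest1
FROM 'c:\intramerge.csv'
WITH
(
    FORMATFILE = 'c:\intramerge.fmt'  -- hypothetical path; check the per-column width declared inside it
)
GO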

3 Comments

Similar case for me: changing the column's int datatype to bigint solved the problem. However, I think this should raise an overflow error.
Same for me. Nothing to do with column size, but an incorrect format file all the same.
I have all of them set to max. I'm still getting this error.
1

I got this exception when the char field in my SQL table was too small for the text coming in. Try making the column bigger.
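
A minimal sketch of widening a column, using the table from the question (the column and new size are just examples):

ALTER TABLE CSVTest1
ALTER COLUMN Ticker varchar(100) NULL
GO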

Comments

1

This might be a bad idea with the full 1.4 GB, but you can try it on a subset (start with a few rows):

CREATE TABLE CSVTest1
(Ticker varchar(MAX) NULL,
    dateval varchar(MAX) NULL,
    timevale varchar(MAX) NULL,
    Openval varchar(MAX) NULL,
    Highval varchar(MAX) NULL,
    Lowval varchar(MAX) NULL,
    Closeval varchar(MAX) NULL,
    Volume varchar(MAX) NULL
)

... do your BULK INSERT, then

SELECT MAX(LEN(Ticker)),
    MAX(LEN(dateval)),
    MAX(LEN(timevale)),
    MAX(LEN(Openval)),
    MAX(LEN(Highval)),
    MAX(LEN(Lowval)),
    MAX(LEN(Closeval)),
    MAX(LEN(Volume))
FROM CSVTest1

This will help tell you if your estimates of the column sizes are way off. You might also find your columns are out of order, or the BULK INSERT might still fail for some other reason.

Comments

1

I encountered a similar issue, but in this case the file being loaded contained some blank lines. Removing the blank lines solved it.

Alternatively, as the file was delimited, I added the correct number of delimiters to the blank lines, which again allowed the file to import successfully - use this option if the blank lines need to be loaded.

Comments

1

This can also happen if your file's columns are separated with ";" but you are using "," as the FIELDTERMINATOR (or the other way around).
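
A quick sketch of the semicolon case, using the table and file from the question:

BULK INSERT CSVTest1
FROM 'c:\intramerge.csv'
WITH
(
    FIELDTERMINATOR = ';',  -- must match the separator actually used in the file
    ROWTERMINATOR = '0x0A'
)
GO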

Comments

0

I just want to share my solution to this. The problem was the size of the table columns; use varchar(255) and all should work.

Comments

0

The bulk insert will not tell you if the import values will "fit" into the field format of the target table.

For example: I tried to import decimal values into a float field. But as the values all had a comma as decimal point, it was unable to insert them into the table (it was expecting a point).

These unexpected results often happen when the provided CSV is an export from an Excel file. Your computer's regional settings decide which decimal separator is used when Excel saves to CSV, so CSVs provided by different people will produce different results.

Solution: import all fields as VARCHAR, and try to deal with the values afterwards.
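
A rough sketch of that approach, assuming a varchar staging table; the table names StagingQuotes and Quotes and the column names are hypothetical:

-- Bulk insert into StagingQuotes (all varchar columns) first, then convert.
INSERT INTO Quotes (Price)
SELECT TRY_CONVERT(float, REPLACE(PriceText, ',', '.'))  -- swap the decimal comma for a point
FROM StagingQuotes
GO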

Comments

0

For anyone who happens to come across this post, my problem was a simple oversight in regard to syntax. I had this inline with some Python, and brought it straight into SSMS:

BULK 
INSERT access_log
FROM '[my path]'
WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n');

The problem being, of course, the double backslashes which were needed in Python for the way I had this embedded as a string in the script. Correcting to '\t' and '\n' obviously fixed it.

Comments

0

The same happened with me. It turns out this was due to duplicate column names. I renamed the columns to be unique and it works fine.

1 Comment

Please add further details to expand on your answer, such as working code or documentation citations.
0

Please look at your file; if there are any special characters or spaces at the end of the file, remove them and try again.

Comments

0

I came across another potential reason. I got this error when my table column's data type was int but the values in the csv file contained commas. After changing them to a plain number format, the data imported.

Comments

0

In my case, I was using a txt file to import data into SQL Server. All the columns matched and I couldn't find what was wrong. In the end, it was an encoding problem.

Solution: use Notepad++ to change the file to the right encoding.
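
Alternatively, BULK INSERT can sometimes be told the file's code page directly; a hedged sketch (CODEPAGE = '65001' for UTF-8 only works on newer SQL Server versions):

BULK INSERT CSVTest1
FROM 'c:\intramerge.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0A',
    CODEPAGE = '65001'  -- use the code page that matches your file's actual encoding
)
GO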

Comments

0

I got this error when I tried to pass NULL for int columns, even though those columns are nullable.

So I opened the csv file in an editor and replaced all NULL values with empty values, and it worked.

Before Data:

636,NULL,NULL,1,5,K0007,105,NULL,2023-02-15 11:27:11.563

After Data:

636,,,1,5,K0007,105,,2023-02-15 11:27:11.563
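
If editing the file is not an option, an alternative sketch is to load the raw text into a varchar staging table and convert afterwards; StagingRows, TargetTable, and the column names here are hypothetical:

INSERT INTO TargetTable (Col2)
SELECT TRY_CONVERT(int, NULLIF(Col2Text, 'NULL'))  -- turn the literal string 'NULL' into a real NULL
FROM StagingRows
GO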

Comments

0

I had the same issue. The root cause is that the destination table's field structure and total number of fields always have to match the file, or the BULK INSERT will fail.

Comments

0

Make sure there are no NULLs in your CSV file. SQL Server threw the same error in our case because NULL was present in some cells of the csv file.

Comments

0

I usually fix this with 0x0a as proposed by J.Perkins above. Actually, I don't fix it: all of my scripts use 0x0a. I hit this problem so rarely that I always have to search for the fixes because it is too long between breakages.

The problem this time was that the file had no CRLF on the last line. I added a CRLF and ... it runs like a champ.

As for Contango's comment that he has "never managed to get a real world .csv file to import using this method": he is correct. If your "real world" is 3rd-party datasets, then BULK INSERT is a really, really bad idea. My real world is very different. My team creates the file using bcp. Yep. Always pristine data. Well ... almost always. A developer modified the file after it was created.

Comments
