
I have tried to upload a CSV file into BigQuery using the Google Cloud Client Library. One of the CSV files has the text 'null' in its columns, and while uploading that file BigQuery returns an error message saying "Too Few Columns".

Sample File Data:

column1, column2, column3, column4

1, null, 3, null,

2, null, null, null

I have verified the configuration JSON that is sent; it has four table fields for the 4 columns. The error message says 'Expected 4 column(s) but got 2 column(s)'.

Is there any specific configuration required to handle this scenario?

  • What is the schema (the type of each column)? Commented Mar 10, 2014 at 16:56

2 Answers


If the columns are numeric, then you specify a null with an empty value.

For example, this works.

$ echo 2,,, > rows.csv
$ bq load lotsOdata.lfdhjv2 rows.csv c1:integer,c2:integer,c3:integer,c4:float
Waiting on bqjob_r4f71e9aebbf9cb57_00000144acfa7622_1 ... (23s) Current status: DONE   

Note that in your example above, the line 1, null, 3, null, would have an extra value because of the trailing comma. Also note that if your .csv file has a header row, you should use the --skip_leading_rows=1 parameter so that the header doesn't get interpreted as data.
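
If you are loading through the Google Cloud client library instead of the bq tool, a minimal sketch of the equivalent load in Python (assuming a recent google-cloud-bigquery package; the project, dataset, table, and file names below are placeholders) would be:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Explicit schema matching the four columns; empty CSV fields load as NULL.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # skip the header row
        schema=[
            bigquery.SchemaField("c1", "INTEGER"),
            bigquery.SchemaField("c2", "INTEGER"),
            bigquery.SchemaField("c3", "INTEGER"),
            bigquery.SchemaField("c4", "FLOAT"),
        ],
    )

    with open("rows.csv", "rb") as f:
        job = client.load_table_from_file(
            f, "my-project.lotsOdata.lfdhjv2", job_config=job_config
        )
    job.result()  # wait for the load job to finish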



If your CSV file uses the string "null" as the marker for null values, you will need this parameter on bq load:

--null_marker="null"
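
In the Python client library, the same thing would be expressed through the null_marker setting on the load job configuration; a minimal sketch (again assuming the google-cloud-bigquery package, with other job settings omitted):

    from google.cloud import bigquery

    # Treat the literal string "null" in the CSV as a NULL value.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        null_marker="null",
    )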

