
I'm trying to intentionally insert one dummy record, with data for the full schema, into a BigQuery GA360 table.

One way I figured out is to "INSERT" all fields:

INSERT INTO <project.dataset.table> (visitorId,visitStartTime,date,totals,customDimension.......)
VALUES ( 1234, 12345, 20200101, ( 1,2,3,4,5,6,7,8,9,10,11,12,13),[(1,"asd"),....].....)

I gave up when I reached the "hits" field, which has a complicated nested structure.

Is there any other way (for example, something I can imagine is to code against the client API using the table schema JSON file), or has anyone succeeded with DDL?

1 Answer


How I would solve this:

  1. Get a sample row as a JSON object:

SELECT TO_JSON_STRING(a)
FROM `bigquery-public-data.google_analytics_sample.ga_sessions_20170801` a
LIMIT 1

  2. Save that long JSON string to a file.

  3. Modify that file as much as you want.

  4. Create an empty table with the same schema:

CREATE TABLE `temp.analy` AS
SELECT *
FROM `bigquery-public-data.google_analytics_sample.ga_sessions_20170801`
LIMIT 0

  5. Load the file with your manually modified values:

bq load --source_format=NEWLINE_DELIMITED_JSON temp.analy my.json

Or

  5b. Create a federated table that reads the JSON file, so you can:

INSERT INTO `table`
SELECT *
FROM `federated_table_that_reads_the_json_file`
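Step 3 (modifying the saved JSON) can also be done programmatically, which helps with the deeply nested "hits" and "totals" records. A minimal sketch in Python, assuming a heavily truncated sample row (real GA360 export rows contain many more fields; the values below are made up for illustration):

```python
import json

# Pretend this is the one line saved to a file in step 2
# (truncated; field names follow the GA360 BigQuery export schema).
sample = (
    '{"visitorId": null, "visitStartTime": "1501583974", "date": "20170801",'
    ' "totals": {"visits": "1", "hits": "2"},'
    ' "hits": [{"hitNumber": "1", "page": {"pagePath": "/home"}}]}'
)

record = json.loads(sample)

# Overwrite the scalar fields for the dummy row.
record["visitorId"] = 1234
record["date"] = "20200101"

# Nested RECORD fields become plain dicts, and REPEATED RECORD fields
# become lists, so even the nested "hits" structure is easy to edit.
record["totals"]["visits"] = "1"
record["hits"][0]["page"]["pagePath"] = "/dummy"

# Write it back as one line of newline-delimited JSON for `bq load`.
with open("my.json", "w") as f:
    f.write(json.dumps(record) + "\n")
```

The rewritten file can then be loaded exactly as in step 5.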

1 Comment

Thanks for the tip! It took some time to modify the values, but it was much easier than inserting them all by hand.
