
I am integrating BigQuery into my Google Cloud project. I have set up all the requirements needed to integrate BigQuery. Now I want to perform an insert operation through my PHP file. I have created a dataset and a table in BigQuery.

  • Dataset name: userDetails
  • Table name: userInfo

I want to insert into this table through my PHP file. Previously I was saving user details in Cloud Datastore, but my requirement has changed and I now want to save these details in BigQuery. Here is my code for inserting values into Cloud Datastore:

$datastore = new Google\Cloud\Datastore\DatastoreClient(['projectId' => 'google_project_id']);
$key = $datastore->key($entity_kind);

// Attach the parent entity as an ancestor of this key
$key->ancestor($parent_kind, $parent_key);
$entity = $datastore->entity($key);

/*------------- Set user entity properties --------------*/
$entity['name'] = $username;
$entity['date_of_birth'] = strtotime(date('Y-m-d H:i'));
$entity['religion'] = $religion;

$entity->setExcludeFromIndexes(['religion']);

$datastore->insert($entity);

Similarly, how can I do this in BigQuery rather than Datastore?

Thanks!


1 Answer


In BigQuery this process is called a streaming insert.

There are plenty of examples in the GitHub samples repository:

/**
 * For instructions on how to run the full sample:
 *
 * @see https://github.com/GoogleCloudPlatform/php-docs-samples/tree/master/bigquery/api/README.md
 */
namespace Google\Cloud\Samples\BigQuery;
// Include Google Cloud dependencies using Composer
require_once __DIR__ . '/../vendor/autoload.php';
if (count($argv) < 4 || count($argv) > 5) {
    return print("Usage: php snippets/stream_row.php PROJECT_ID DATASET_ID TABLE_ID [DATA]\n");
}
list($_, $projectId, $datasetId, $tableId) = $argv;
$data = isset($argv[4]) ? json_decode($argv[4], true) : ["field1" => "value1"];
# [START bigquery_table_insert_rows]
use Google\Cloud\BigQuery\BigQueryClient;
/** Uncomment and populate these variables in your code */
// $projectId = 'The Google project ID';
// $datasetId = 'The BigQuery dataset ID';
// $tableId   = 'The BigQuery table ID';
// $data = [
//     "field1" => "value1",
//     "field2" => "value2",
// ];
// instantiate the bigquery table service
$bigQuery = new BigQueryClient([
    'projectId' => $projectId,
]);
$dataset = $bigQuery->dataset($datasetId);
$table = $dataset->table($tableId);
$insertResponse = $table->insertRows([
    ['data' => $data],
    // additional rows can go here
]);
if ($insertResponse->isSuccessful()) {
    print('Data streamed into BigQuery successfully' . PHP_EOL);
} else {
    foreach ($insertResponse->failedRows() as $row) {
        foreach ($row['errors'] as $error) {
            printf('%s: %s' . PHP_EOL, $error['reason'], $error['message']);
        }
    }
}
# [END bigquery_table_insert_rows]

4 Comments

Pentium, thanks, it's working, but data will only display in the table after up to 90 minutes, as mentioned in the doc. How can the data be displayed instantly after the insert operation? Thanks
It's a maximum of 90 minutes; usually it's 10-30 seconds. BigQuery is a big data tool, not designed for transactional databases, so don't ever expect instant answers.
OK, so we are using the insertRows method to insert values into the table. Why aren't we using the runQuery method to insert values? Is there a reason behind it?
Just look at their names: one is to insert, the other is to query; they are different wrappers.
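To illustrate the distinction discussed above: an insert can also be expressed as a SQL DML statement and executed through runQuery, though it then runs as a query job rather than through the streaming API. A minimal sketch, assuming the dataset/table from the question; the project ID and the column names (name, date_of_birth, religion) are placeholders:

```php
<?php
require __DIR__ . '/vendor/autoload.php';

use Google\Cloud\BigQuery\BigQueryClient;

$bigQuery = new BigQueryClient(['projectId' => 'your-project-id']);

// A DML INSERT run as a query job. Parameterized queries avoid
// injecting raw user input into the SQL string.
$queryJobConfig = $bigQuery->query(
    'INSERT INTO `userDetails.userInfo` (name, date_of_birth, religion)
     VALUES (@name, @dob, @religion)'
)->parameters([
    'name'     => 'Alice',
    'dob'      => '1990-01-01',
    'religion' => 'none',
]);

$queryResults = $bigQuery->runQuery($queryJobConfig);
```

DML statements are subject to BigQuery's query-job quotas and pricing, whereas insertRows uses the streaming insert API, which is built for high-throughput row-by-row ingestion; that is why the answer uses insertRows.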
