cloud.google.com/go/bigquery v1.3.0
Package bigquery provides a client for the BigQuery service.
The following assumes a basic familiarity with BigQuery concepts.
See https://cloud.google.com/bigquery/docs.
See https://godoc.org/cloud.google.com/go for authentication, timeouts,
connection pooling and similar aspects of this package.
To start working with this package, create a client:
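For example (a minimal sketch; "my-project-id" is a placeholder for your project ID):

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/bigquery"
)

func main() {
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, "my-project-id")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()
	_ = client // used in the examples that follow
}
```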
To query existing tables, create a Query and call its Read method:
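A sketch, assuming the ctx and client from the previous paragraph (the public-dataset query is illustrative):

```go
q := client.Query(
	"SELECT year, SUM(number) AS num " +
		"FROM `bigquery-public-data.usa_names.usa_1910_2013` " +
		"WHERE name = \"William\" " +
		"GROUP BY year ORDER BY year")
it, err := q.Read(ctx)
if err != nil {
	// TODO: Handle error.
}
_ = it // iterate over the rows, as shown next
```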
Then iterate through the resulting rows. You can store each row in any type
that implements the ValueLoader interface, or in a slice or map of bigquery.Value.
A slice is simplest:
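For example, assuming the iterator it obtained from Query.Read above (iterator.Done comes from google.golang.org/api/iterator):

```go
for {
	var values []bigquery.Value
	err := it.Next(&values)
	if err == iterator.Done {
		break
	}
	if err != nil {
		// TODO: Handle error.
	}
	fmt.Println(values)
}
```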
You can also use a struct whose exported fields match the query:
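For example, a struct whose fields match the year/num query sketched earlier:

```go
type Count struct {
	Year int
	Num  int
}

for {
	var c Count
	err := it.Next(&c)
	if err == iterator.Done {
		break
	}
	if err != nil {
		// TODO: Handle error.
	}
	fmt.Println(c)
}
```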
You can also start the query running and get the results later.
Create the query as above, but call Run instead of Read. This returns a Job,
which represents an asynchronous operation.
Get the job's ID, a printable string. You can save this string to retrieve
the results at a later time, even in another process.
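A sketch, assuming the query q from earlier:

```go
job, err := q.Run(ctx)
if err != nil {
	// TODO: Handle error.
}
jobID := job.ID()
fmt.Printf("The job ID is %s\n", jobID)
// Save jobID somewhere; any process in the same project can use it.
```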
To retrieve the job's results from the ID, first look up the Job:
Use the Job.Read method to obtain an iterator, and loop over the rows.
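A sketch, assuming a job ID saved earlier in jobID:

```go
job, err := client.JobFromID(ctx, jobID)
if err != nil {
	// TODO: Handle error.
}
it, err := job.Read(ctx)
if err != nil {
	// TODO: Handle error.
}
// Loop over it.Next, as shown earlier.
```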
Query.Read is just a convenience method that combines Query.Run and Job.Read.
You can refer to datasets in the client's project with the Dataset method, and
in other projects with the DatasetInProject method:
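For example (the dataset and project names are placeholders):

```go
myDataset := client.Dataset("my_dataset")
otherDataset := client.DatasetInProject("other-project", "other_dataset")
```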
These methods create references to datasets, not the datasets themselves. You can have
a dataset reference even if the dataset doesn't exist yet. Use Dataset.Create to
create a dataset from a reference:
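For example, assuming the myDataset reference from above (passing nil for the metadata accepts the defaults):

```go
if err := myDataset.Create(ctx, nil); err != nil {
	// TODO: Handle error.
}
```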
You can refer to tables with Dataset.Table. Like bigquery.Dataset, bigquery.Table is a reference
to an object in BigQuery that may or may not exist.
You can create, delete and update the metadata of tables with methods on Table.
For instance, you could create a temporary table with:
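A sketch, assuming myDataset from above; the one-hour expiration is illustrative:

```go
table := myDataset.Table("my_table")

// A table that BigQuery deletes automatically after an hour:
err := myDataset.Table("temp").Create(ctx, &bigquery.TableMetadata{
	ExpirationTime: time.Now().Add(1 * time.Hour),
})
if err != nil {
	// TODO: Handle error.
}
```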
We'll see how to create a table with a schema in the next section.
There are two ways to construct schemas with this package.
You can build a schema by hand, like so:
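For example:

```go
schema1 := bigquery.Schema{
	{Name: "Name", Required: true, Type: bigquery.StringFieldType},
	{Name: "Grades", Repeated: true, Type: bigquery.IntegerFieldType},
	{Name: "Optional", Required: false, Type: bigquery.IntegerFieldType},
}
```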
Or you can infer the schema from a struct:
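A sketch (the student type is illustrative):

```go
type student struct {
	Name   string
	Grades []int
}

schema2, err := bigquery.InferSchema(student{})
if err != nil {
	// TODO: Handle error.
}
```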
Struct inference supports tags like those of the encoding/json package, so you can
change names, ignore fields, or mark a field as nullable (non-required). Fields
declared as one of the Null types (NullInt64, NullFloat64, NullString, NullBool,
NullTimestamp, NullDate, NullTime, NullDateTime, and NullGeography) are
automatically inferred as nullable, so the "nullable" tag is only needed for []byte,
*big.Rat and pointer-to-struct fields.
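For example, a sketch showing the tag forms described above (the field names are illustrative):

```go
type student struct {
	Name   string `bigquery:"full_name"` // rename the field
	Grades []int
	Secret string `bigquery:"-"`         // ignore the field
	Notes  []byte `bigquery:",nullable"` // mark as non-required
}

schema3, err := bigquery.InferSchema(student{})
if err != nil {
	// TODO: Handle error.
}
```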
Having constructed a schema, you can create a table with it like so:
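A sketch, assuming the schema1 value built above:

```go
t := myDataset.Table("new_table")
if err := t.Create(ctx, &bigquery.TableMetadata{Schema: schema1}); err != nil {
	// TODO: Handle error.
}
```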
You can copy one or more tables to another table. Begin by constructing a Copier
describing the copy. Then set any desired copy options, and finally call Run to get a Job:
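A sketch, assuming myDataset from above; WriteTruncate is one of several write dispositions:

```go
copier := myDataset.Table("dest").CopierFrom(myDataset.Table("src"))
copier.WriteDisposition = bigquery.WriteTruncate
job, err := copier.Run(ctx)
if err != nil {
	// TODO: Handle error.
}
```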
You can chain the call to Run if you don't want to set options:
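For example:

```go
job, err := myDataset.Table("dest").CopierFrom(myDataset.Table("src")).Run(ctx)
```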
You can wait for your job to complete:
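For example, assuming a job obtained from Run:

```go
status, err := job.Wait(ctx)
if err != nil {
	// TODO: Handle error.
}
if status.Err() != nil {
	// The job completed, but failed.
}
```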
Job.Wait polls with exponential backoff. You can also poll yourself, if you
wish:
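A sketch, assuming a job from Run and a pollInterval duration of your choosing:

```go
for {
	status, err := job.Status(ctx)
	if err != nil {
		// TODO: Handle error.
	}
	if status.Done() {
		if status.Err() != nil {
			log.Fatalf("Job failed with error %v", status.Err())
		}
		break
	}
	time.Sleep(pollInterval)
}
```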
There are two ways to populate a table with this package: load the data from a Google Cloud Storage
object, or upload rows directly from your program.
For loading, first create a GCSReference, configuring it if desired. Then make a Loader, optionally configure
it as well, and call its Run method.
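A sketch; the bucket, object, and option choices are illustrative:

```go
gcsRef := bigquery.NewGCSReference("gs://my-bucket/my-object")
gcsRef.AllowJaggedRows = true
loader := myDataset.Table("dest").LoaderFrom(gcsRef)
loader.CreateDisposition = bigquery.CreateNever
job, err := loader.Run(ctx)
if err != nil {
	// TODO: Handle error.
}
// Poll the job, or call its Wait method, as described above.
```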
To upload, first define a type that implements the ValueSaver interface, which has a single method named Save.
Then create an Inserter, and call its Put method with a slice of values.
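A sketch, assuming the table reference from earlier; the Item type and its fields are illustrative:

```go
type Item struct {
	Name  string
	Size  float64
	Count int
}

// Save implements the ValueSaver interface.
// The empty string means no insertion ID is provided.
func (i *Item) Save() (map[string]bigquery.Value, string, error) {
	return map[string]bigquery.Value{
		"Name":  i.Name,
		"Size":  i.Size,
		"Count": i.Count,
	}, "", nil
}

ins := table.Inserter()
items := []*Item{
	{Name: "n1", Size: 32.6, Count: 7},
	{Name: "n2", Size: 4, Count: 2},
}
if err := ins.Put(ctx, items); err != nil {
	// TODO: Handle error.
}
```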
You can also upload a struct that doesn't implement ValueSaver. Use the StructSaver type
to specify the schema and insert ID by hand, or just supply the struct or struct pointer
directly and the schema will be inferred:
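For example, assuming the ins inserter from the previous example (the score type is illustrative):

```go
type score struct {
	Name string
	Num  int
}

scores := []score{
	{Name: "n1", Num: 12},
	{Name: "n2", Num: 31},
}
// The schema is inferred from the score type.
if err := ins.Put(ctx, scores); err != nil {
	// TODO: Handle error.
}
```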
BigQuery allows for higher throughput when omitting insertion IDs. To enable this,
specify the sentinel `NoDedupeID` value for the insertion ID when implementing a ValueSaver.
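A sketch of a Save method that opts out of deduplication (the Item type is illustrative):

```go
// Save implements ValueSaver, opting out of best-effort deduplication.
func (i *Item) Save() (map[string]bigquery.Value, string, error) {
	return map[string]bigquery.Value{
		"Name": i.Name,
	}, bigquery.NoDedupeID, nil
}
```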
If you've been following so far, extracting data from a BigQuery table
into a Google Cloud Storage object will feel familiar. First create an
Extractor, then optionally configure it, and lastly call its Run method.
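A sketch, assuming the table and gcsRef values from earlier:

```go
extractor := table.ExtractorTo(gcsRef)
extractor.DisableHeader = true
job, err := extractor.Run(ctx)
if err != nil {
	// TODO: Handle error.
}
// Poll the job, or call its Wait method, as described above.
```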
Errors returned by this client are often of the type googleapi.Error: https://godoc.org/google.golang.org/api/googleapi#Error
These errors can be introspected for more information by using `xerrors.As` with the richer *googleapi.Error type. For example:
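A sketch, assuming the client from earlier (xerrors is golang.org/x/xerrors; googleapi is google.golang.org/api/googleapi):

```go
if _, err := client.Dataset("nonexistent").Metadata(ctx); err != nil {
	var e *googleapi.Error
	if ok := xerrors.As(err, &e); ok {
		if e.Code == 404 {
			// Dataset not found.
		}
	}
}
```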
In some cases, your client may receive unstructured googleapi.Error responses. In such cases, it is likely that
you have exceeded BigQuery request limits, documented at: https://cloud.google.com/bigquery/quotas